Repository: karnikram/rp-vio
Branch: main
Commit: 42a0ac1c6541
Files: 106
Total size: 60.9 MB

Directory structure:

gitextract_7sq47s8w/
├── LICENCE
├── README.md
├── camera_model/
│   ├── CMakeLists.txt
│   ├── cmake/
│   │   └── FindEigen.cmake
│   ├── include/
│   │   └── camodocal/
│   │       ├── calib/
│   │       │   └── CameraCalibration.h
│   │       ├── camera_models/
│   │       │   ├── Camera.h
│   │       │   ├── CameraFactory.h
│   │       │   ├── CataCamera.h
│   │       │   ├── CostFunctionFactory.h
│   │       │   ├── EquidistantCamera.h
│   │       │   ├── PinholeCamera.h
│   │       │   └── ScaramuzzaCamera.h
│   │       ├── chessboard/
│   │       │   ├── Chessboard.h
│   │       │   ├── ChessboardCorner.h
│   │       │   ├── ChessboardQuad.h
│   │       │   └── Spline.h
│   │       ├── gpl/
│   │       │   ├── EigenQuaternionParameterization.h
│   │       │   ├── EigenUtils.h
│   │       │   └── gpl.h
│   │       └── sparse_graph/
│   │           └── Transform.h
│   ├── instruction
│   ├── package.xml
│   ├── readme.md
│   └── src/
│       ├── calib/
│       │   └── CameraCalibration.cc
│       ├── camera_models/
│       │   ├── Camera.cc
│       │   ├── CameraFactory.cc
│       │   ├── CataCamera.cc
│       │   ├── CostFunctionFactory.cc
│       │   ├── EquidistantCamera.cc
│       │   ├── PinholeCamera.cc
│       │   └── ScaramuzzaCamera.cc
│       ├── chessboard/
│       │   └── Chessboard.cc
│       ├── gpl/
│       │   ├── EigenQuaternionParameterization.cc
│       │   └── gpl.cc
│       ├── intrinsic_calib.cc
│       └── sparse_graph/
│           └── Transform.cc
├── config/
│   ├── advio_12_config.yaml
│   ├── ol_market1_config.yaml
│   ├── rpvio_rviz_config.rviz
│   └── rpvio_sim_config.yaml
├── plane_segmentation/
│   ├── RecoverPlane_perpendicular.py
│   ├── crf_inference.py
│   ├── data_loader_new.py
│   ├── inference.py
│   ├── net.py
│   ├── openloris.txt
│   ├── pretrained_model/
│   │   ├── model.data-00000-of-00001
│   │   ├── model.index
│   │   └── model.meta
│   ├── requirements.txt
│   ├── train.py
│   └── utils.py
├── rpvio.patch
├── rpvio_estimator/
│   ├── CMakeLists.txt
│   ├── cmake/
│   │   └── FindEigen.cmake
│   ├── launch/
│   │   ├── advio_12.launch
│   │   ├── ol_market1.launch
│   │   ├── rpvio_rviz.launch
│   │   └── rpvio_sim.launch
│   ├── package.xml
│   └── src/
│       ├── estimator.cpp
│       ├── estimator.h
│       ├── estimator_node.cpp
│       ├── factor/
│       │   ├── homography_factor.h
│       │   ├── imu_factor.h
│       │   ├── integration_base.h
│       │   ├── marginalization_factor.cpp
│       │   ├── marginalization_factor.h
│       │   ├── pose_local_parameterization.cpp
│       │   ├── pose_local_parameterization.h
│       │   ├── projection_factor.cpp
│       │   ├── projection_factor.h
│       │   ├── projection_td_factor.cpp
│       │   └── projection_td_factor.h
│       ├── feature_manager.cpp
│       ├── feature_manager.h
│       ├── initial/
│       │   ├── initial_aligment.cpp
│       │   ├── initial_alignment.h
│       │   ├── initial_ex_rotation.cpp
│       │   ├── initial_ex_rotation.h
│       │   ├── initial_sfm.cpp
│       │   ├── initial_sfm.h
│       │   ├── solve_5pts.cpp
│       │   └── solve_5pts.h
│       ├── parameters.cpp
│       ├── parameters.h
│       └── utility/
│           ├── CameraPoseVisualization.cpp
│           ├── CameraPoseVisualization.h
│           ├── tic_toc.h
│           ├── utility.cpp
│           ├── utility.h
│           ├── visualization.cpp
│           └── visualization.h
├── rpvio_feature_tracker/
│   ├── CMakeLists.txt
│   ├── cmake/
│   │   └── FindEigen.cmake
│   ├── package.xml
│   └── src/
│       ├── feature_tracker.cpp
│       ├── feature_tracker.h
│       ├── feature_tracker_node.cpp
│       ├── parameters.cpp
│       ├── parameters.h
│       └── tic_toc.h
└── scripts/
    ├── convert_vins_to_tum.py
    ├── run_advio_12.sh
    ├── run_ol_market1.sh
    └── run_rpvio_sim.sh

================================================
FILE CONTENTS
================================================

================================================
FILE: LICENCE
================================================

GNU GENERAL PUBLIC LICENSE
Version 3, 29 June 2007

Copyright (C) 2007 Free Software Foundation, Inc.

Everyone is permitted to copy and distribute verbatim copies of this license document, but changing it is not allowed.

Preamble

The GNU General Public License is a free, copyleft license for software and other kinds of works.

The licenses for most software and other practical works are designed to take away your freedom to share and change the works.
By contrast, the GNU General Public License is intended to guarantee your freedom to share and change all versions of a program--to make sure it remains free software for all its users. We, the Free Software Foundation, use the GNU General Public License for most of our software; it applies also to any other work released this way by its authors. You can apply it to your programs, too.

When we speak of free software, we are referring to freedom, not price. Our General Public Licenses are designed to make sure that you have the freedom to distribute copies of free software (and charge for them if you wish), that you receive source code or can get it if you want it, that you can change the software or use pieces of it in new free programs, and that you know you can do these things.

To protect your rights, we need to prevent others from denying you these rights or asking you to surrender the rights. Therefore, you have certain responsibilities if you distribute copies of the software, or if you modify it: responsibilities to respect the freedom of others.

For example, if you distribute copies of such a program, whether gratis or for a fee, you must pass on to the recipients the same freedoms that you received. You must make sure that they, too, receive or can get the source code. And you must show them these terms so they know their rights.

Developers that use the GNU GPL protect your rights with two steps: (1) assert copyright on the software, and (2) offer you this License giving you legal permission to copy, distribute and/or modify it.

For the developers' and authors' protection, the GPL clearly explains that there is no warranty for this free software. For both users' and authors' sake, the GPL requires that modified versions be marked as changed, so that their problems will not be attributed erroneously to authors of previous versions.

Some devices are designed to deny users access to install or run modified versions of the software inside them, although the manufacturer can do so. This is fundamentally incompatible with the aim of protecting users' freedom to change the software. The systematic pattern of such abuse occurs in the area of products for individuals to use, which is precisely where it is most unacceptable. Therefore, we have designed this version of the GPL to prohibit the practice for those products. If such problems arise substantially in other domains, we stand ready to extend this provision to those domains in future versions of the GPL, as needed to protect the freedom of users.

Finally, every program is threatened constantly by software patents. States should not allow patents to restrict development and use of software on general-purpose computers, but in those that do, we wish to avoid the special danger that patents applied to a free program could make it effectively proprietary. To prevent this, the GPL assures that patents cannot be used to render the program non-free.

The precise terms and conditions for copying, distribution and modification follow.

TERMS AND CONDITIONS

0. Definitions.

"This License" refers to version 3 of the GNU General Public License.

"Copyright" also means copyright-like laws that apply to other kinds of works, such as semiconductor masks.

"The Program" refers to any copyrightable work licensed under this License. Each licensee is addressed as "you". "Licensees" and "recipients" may be individuals or organizations.

To "modify" a work means to copy from or adapt all or part of the work in a fashion requiring copyright permission, other than the making of an exact copy. The resulting work is called a "modified version" of the earlier work or a work "based on" the earlier work.

A "covered work" means either the unmodified Program or a work based on the Program.

To "propagate" a work means to do anything with it that, without permission, would make you directly or secondarily liable for infringement under applicable copyright law, except executing it on a computer or modifying a private copy. Propagation includes copying, distribution (with or without modification), making available to the public, and in some countries other activities as well.

To "convey" a work means any kind of propagation that enables other parties to make or receive copies. Mere interaction with a user through a computer network, with no transfer of a copy, is not conveying.

An interactive user interface displays "Appropriate Legal Notices" to the extent that it includes a convenient and prominently visible feature that (1) displays an appropriate copyright notice, and (2) tells the user that there is no warranty for the work (except to the extent that warranties are provided), that licensees may convey the work under this License, and how to view a copy of this License. If the interface presents a list of user commands or options, such as a menu, a prominent item in the list meets this criterion.

1. Source Code.

The "source code" for a work means the preferred form of the work for making modifications to it. "Object code" means any non-source form of a work.

A "Standard Interface" means an interface that either is an official standard defined by a recognized standards body, or, in the case of interfaces specified for a particular programming language, one that is widely used among developers working in that language.

The "System Libraries" of an executable work include anything, other than the work as a whole, that (a) is included in the normal form of packaging a Major Component, but which is not part of that Major Component, and (b) serves only to enable use of the work with that Major Component, or to implement a Standard Interface for which an implementation is available to the public in source code form. A "Major Component", in this context, means a major essential component (kernel, window system, and so on) of the specific operating system (if any) on which the executable work runs, or a compiler used to produce the work, or an object code interpreter used to run it.

The "Corresponding Source" for a work in object code form means all the source code needed to generate, install, and (for an executable work) run the object code and to modify the work, including scripts to control those activities. However, it does not include the work's System Libraries, or general-purpose tools or generally available free programs which are used unmodified in performing those activities but which are not part of the work. For example, Corresponding Source includes interface definition files associated with source files for the work, and the source code for shared libraries and dynamically linked subprograms that the work is specifically designed to require, such as by intimate data communication or control flow between those subprograms and other parts of the work.

The Corresponding Source need not include anything that users can regenerate automatically from other parts of the Corresponding Source.

The Corresponding Source for a work in source code form is that same work.

2. Basic Permissions.

All rights granted under this License are granted for the term of copyright on the Program, and are irrevocable provided the stated conditions are met. This License explicitly affirms your unlimited permission to run the unmodified Program. The output from running a covered work is covered by this License only if the output, given its content, constitutes a covered work. This License acknowledges your rights of fair use or other equivalent, as provided by copyright law.

You may make, run and propagate covered works that you do not convey, without conditions so long as your license otherwise remains in force. You may convey covered works to others for the sole purpose of having them make modifications exclusively for you, or provide you with facilities for running those works, provided that you comply with the terms of this License in conveying all material for which you do not control copyright. Those thus making or running the covered works for you must do so exclusively on your behalf, under your direction and control, on terms that prohibit them from making any copies of your copyrighted material outside their relationship with you.

Conveying under any other circumstances is permitted solely under the conditions stated below. Sublicensing is not allowed; section 10 makes it unnecessary.

3. Protecting Users' Legal Rights From Anti-Circumvention Law.

No covered work shall be deemed part of an effective technological measure under any applicable law fulfilling obligations under article 11 of the WIPO copyright treaty adopted on 20 December 1996, or similar laws prohibiting or restricting circumvention of such measures.

When you convey a covered work, you waive any legal power to forbid circumvention of technological measures to the extent such circumvention is effected by exercising rights under this License with respect to the covered work, and you disclaim any intention to limit operation or modification of the work as a means of enforcing, against the work's users, your or third parties' legal rights to forbid circumvention of technological measures.

4. Conveying Verbatim Copies.

You may convey verbatim copies of the Program's source code as you receive it, in any medium, provided that you conspicuously and appropriately publish on each copy an appropriate copyright notice; keep intact all notices stating that this License and any non-permissive terms added in accord with section 7 apply to the code; keep intact all notices of the absence of any warranty; and give all recipients a copy of this License along with the Program.

You may charge any price or no price for each copy that you convey, and you may offer support or warranty protection for a fee.

5. Conveying Modified Source Versions.

You may convey a work based on the Program, or the modifications to produce it from the Program, in the form of source code under the terms of section 4, provided that you also meet all of these conditions:

    a) The work must carry prominent notices stating that you modified it, and giving a relevant date.

    b) The work must carry prominent notices stating that it is released under this License and any conditions added under section 7. This requirement modifies the requirement in section 4 to "keep intact all notices".

    c) You must license the entire work, as a whole, under this License to anyone who comes into possession of a copy. This License will therefore apply, along with any applicable section 7 additional terms, to the whole of the work, and all its parts, regardless of how they are packaged. This License gives no permission to license the work in any other way, but it does not invalidate such permission if you have separately received it.

    d) If the work has interactive user interfaces, each must display Appropriate Legal Notices; however, if the Program has interactive interfaces that do not display Appropriate Legal Notices, your work need not make them do so.

A compilation of a covered work with other separate and independent works, which are not by their nature extensions of the covered work, and which are not combined with it such as to form a larger program, in or on a volume of a storage or distribution medium, is called an "aggregate" if the compilation and its resulting copyright are not used to limit the access or legal rights of the compilation's users beyond what the individual works permit. Inclusion of a covered work in an aggregate does not cause this License to apply to the other parts of the aggregate.

6. Conveying Non-Source Forms.

You may convey a covered work in object code form under the terms of sections 4 and 5, provided that you also convey the machine-readable Corresponding Source under the terms of this License, in one of these ways:

    a) Convey the object code in, or embodied in, a physical product (including a physical distribution medium), accompanied by the Corresponding Source fixed on a durable physical medium customarily used for software interchange.

    b) Convey the object code in, or embodied in, a physical product (including a physical distribution medium), accompanied by a written offer, valid for at least three years and valid for as long as you offer spare parts or customer support for that product model, to give anyone who possesses the object code either (1) a copy of the Corresponding Source for all the software in the product that is covered by this License, on a durable physical medium customarily used for software interchange, for a price no more than your reasonable cost of physically performing this conveying of source, or (2) access to copy the Corresponding Source from a network server at no charge.

    c) Convey individual copies of the object code with a copy of the written offer to provide the Corresponding Source. This alternative is allowed only occasionally and noncommercially, and only if you received the object code with such an offer, in accord with subsection 6b.

    d) Convey the object code by offering access from a designated place (gratis or for a charge), and offer equivalent access to the Corresponding Source in the same way through the same place at no further charge. You need not require recipients to copy the Corresponding Source along with the object code. If the place to copy the object code is a network server, the Corresponding Source may be on a different server (operated by you or a third party) that supports equivalent copying facilities, provided you maintain clear directions next to the object code saying where to find the Corresponding Source. Regardless of what server hosts the Corresponding Source, you remain obligated to ensure that it is available for as long as needed to satisfy these requirements.

    e) Convey the object code using peer-to-peer transmission, provided you inform other peers where the object code and Corresponding Source of the work are being offered to the general public at no charge under subsection 6d.

A separable portion of the object code, whose source code is excluded from the Corresponding Source as a System Library, need not be included in conveying the object code work.

A "User Product" is either (1) a "consumer product", which means any tangible personal property which is normally used for personal, family, or household purposes, or (2) anything designed or sold for incorporation into a dwelling. In determining whether a product is a consumer product, doubtful cases shall be resolved in favor of coverage. For a particular product received by a particular user, "normally used" refers to a typical or common use of that class of product, regardless of the status of the particular user or of the way in which the particular user actually uses, or expects or is expected to use, the product. A product is a consumer product regardless of whether the product has substantial commercial, industrial or non-consumer uses, unless such uses represent the only significant mode of use of the product.

"Installation Information" for a User Product means any methods, procedures, authorization keys, or other information required to install and execute modified versions of a covered work in that User Product from a modified version of its Corresponding Source. The information must suffice to ensure that the continued functioning of the modified object code is in no case prevented or interfered with solely because modification has been made.

If you convey an object code work under this section in, or with, or specifically for use in, a User Product, and the conveying occurs as part of a transaction in which the right of possession and use of the User Product is transferred to the recipient in perpetuity or for a fixed term (regardless of how the transaction is characterized), the Corresponding Source conveyed under this section must be accompanied by the Installation Information. But this requirement does not apply if neither you nor any third party retains the ability to install modified object code on the User Product (for example, the work has been installed in ROM).

The requirement to provide Installation Information does not include a requirement to continue to provide support service, warranty, or updates for a work that has been modified or installed by the recipient, or for the User Product in which it has been modified or installed. Access to a network may be denied when the modification itself materially and adversely affects the operation of the network or violates the rules and protocols for communication across the network.

Corresponding Source conveyed, and Installation Information provided, in accord with this section must be in a format that is publicly documented (and with an implementation available to the public in source code form), and must require no special password or key for unpacking, reading or copying.

7. Additional Terms.

"Additional permissions" are terms that supplement the terms of this License by making exceptions from one or more of its conditions. Additional permissions that are applicable to the entire Program shall be treated as though they were included in this License, to the extent that they are valid under applicable law. If additional permissions apply only to part of the Program, that part may be used separately under those permissions, but the entire Program remains governed by this License without regard to the additional permissions.

When you convey a copy of a covered work, you may at your option remove any additional permissions from that copy, or from any part of it. (Additional permissions may be written to require their own removal in certain cases when you modify the work.) You may place additional permissions on material, added by you to a covered work, for which you have or can give appropriate copyright permission.

Notwithstanding any other provision of this License, for material you add to a covered work, you may (if authorized by the copyright holders of that material) supplement the terms of this License with terms:

    a) Disclaiming warranty or limiting liability differently from the terms of sections 15 and 16 of this License; or

    b) Requiring preservation of specified reasonable legal notices or author attributions in that material or in the Appropriate Legal Notices displayed by works containing it; or

    c) Prohibiting misrepresentation of the origin of that material, or requiring that modified versions of such material be marked in reasonable ways as different from the original version; or

    d) Limiting the use for publicity purposes of names of licensors or authors of the material; or

    e) Declining to grant rights under trademark law for use of some trade names, trademarks, or service marks; or

    f) Requiring indemnification of licensors and authors of that material by anyone who conveys the material (or modified versions of it) with contractual assumptions of liability to the recipient, for any liability that these contractual assumptions directly impose on those licensors and authors.

All other non-permissive additional terms are considered "further restrictions" within the meaning of section 10. If the Program as you received it, or any part of it, contains a notice stating that it is governed by this License along with a term that is a further restriction, you may remove that term. If a license document contains a further restriction but permits relicensing or conveying under this License, you may add to a covered work material governed by the terms of that license document, provided that the further restriction does not survive such relicensing or conveying.

If you add terms to a covered work in accord with this section, you must place, in the relevant source files, a statement of the additional terms that apply to those files, or a notice indicating where to find the applicable terms.

Additional terms, permissive or non-permissive, may be stated in the form of a separately written license, or stated as exceptions; the above requirements apply either way.

8. Termination.

You may not propagate or modify a covered work except as expressly provided under this License. Any attempt otherwise to propagate or modify it is void, and will automatically terminate your rights under this License (including any patent licenses granted under the third paragraph of section 11).

However, if you cease all violation of this License, then your license from a particular copyright holder is reinstated (a) provisionally, unless and until the copyright holder explicitly and finally terminates your license, and (b) permanently, if the copyright holder fails to notify you of the violation by some reasonable means prior to 60 days after the cessation.

Moreover, your license from a particular copyright holder is reinstated permanently if the copyright holder notifies you of the violation by some reasonable means, this is the first time you have received notice of violation of this License (for any work) from that copyright holder, and you cure the violation prior to 30 days after your receipt of the notice.

Termination of your rights under this section does not terminate the licenses of parties who have received copies or rights from you under this License. If your rights have been terminated and not permanently reinstated, you do not qualify to receive new licenses for the same material under section 10.

9. Acceptance Not Required for Having Copies.

You are not required to accept this License in order to receive or run a copy of the Program. Ancillary propagation of a covered work occurring solely as a consequence of using peer-to-peer transmission to receive a copy likewise does not require acceptance. However, nothing other than this License grants you permission to propagate or modify any covered work. These actions infringe copyright if you do not accept this License. Therefore, by modifying or propagating a covered work, you indicate your acceptance of this License to do so.

10. Automatic Licensing of Downstream Recipients.

Each time you convey a covered work, the recipient automatically receives a license from the original licensors, to run, modify and propagate that work, subject to this License. You are not responsible for enforcing compliance by third parties with this License.

An "entity transaction" is a transaction transferring control of an organization, or substantially all assets of one, or subdividing an organization, or merging organizations. If propagation of a covered work results from an entity transaction, each party to that transaction who receives a copy of the work also receives whatever licenses to the work the party's predecessor in interest had or could give under the previous paragraph, plus a right to possession of the Corresponding Source of the work from the predecessor in interest, if the predecessor has it or can get it with reasonable efforts.

You may not impose any further restrictions on the exercise of the rights granted or affirmed under this License. For example, you may not impose a license fee, royalty, or other charge for exercise of rights granted under this License, and you may not initiate litigation (including a cross-claim or counterclaim in a lawsuit) alleging that any patent claim is infringed by making, using, selling, offering for sale, or importing the Program or any portion of it.

11. Patents.

A "contributor" is a copyright holder who authorizes use under this License of the Program or a work on which the Program is based. The work thus licensed is called the contributor's "contributor version".

A contributor's "essential patent claims" are all patent claims owned or controlled by the contributor, whether already acquired or hereafter acquired, that would be infringed by some manner, permitted by this License, of making, using, or selling its contributor version, but do not include claims that would be infringed only as a consequence of further modification of the contributor version. For purposes of this definition, "control" includes the right to grant patent sublicenses in a manner consistent with the requirements of this License.

Each contributor grants you a non-exclusive, worldwide, royalty-free patent license under the contributor's essential patent claims, to make, use, sell, offer for sale, import and otherwise run, modify and propagate the contents of its contributor version.

In the following three paragraphs, a "patent license" is any express agreement or commitment, however denominated, not to enforce a patent (such as an express permission to practice a patent or covenant not to sue for patent infringement). To "grant" such a patent license to a party means to make such an agreement or commitment not to enforce a patent against the party.

If you convey a covered work, knowingly relying on a patent license, and the Corresponding Source of the work is not available for anyone to copy, free of charge and under the terms of this License, through a publicly available network server or other readily accessible means, then you must either (1) cause the Corresponding Source to be so available, or (2) arrange to deprive yourself of the benefit of the patent license for this particular work, or (3) arrange, in a manner consistent with the requirements of this License, to extend the patent license to downstream recipients. "Knowingly relying" means you have actual knowledge that, but for the patent license, your conveying the covered work in a country, or your recipient's use of the covered work in a country, would infringe one or more identifiable patents in that country that you have reason to believe are valid.

If, pursuant to or in connection with a single transaction or arrangement, you convey, or propagate by procuring conveyance of, a covered work, and grant a patent license to some of the parties receiving the covered work authorizing them to use, propagate, modify or convey a specific copy of the covered work, then the patent license you grant is automatically extended to all recipients of the covered work and works based on it.

A patent license is "discriminatory" if it does not include within the scope of its coverage, prohibits the exercise of, or is conditioned on the non-exercise of one or more of the rights that are specifically granted under this License. You may not convey a covered work if you are a party to an arrangement with a third party that is in the business of distributing software, under which you make payment to the third party based on the extent of your activity of conveying the work, and under which the third party grants, to any of the parties who would receive the covered work from you, a discriminatory patent license (a) in connection with copies of the covered work conveyed by you (or copies made from those copies), or (b) primarily for and in connection with specific products or compilations that contain the covered work, unless you entered into that arrangement, or that patent license was granted, prior to 28 March 2007.

Nothing in this License shall be construed as excluding or limiting any implied license or other defenses to infringement that may otherwise be available to you under applicable patent law.

12. No Surrender of Others' Freedom.

If conditions are imposed on you (whether by court order, agreement or otherwise) that contradict the conditions of this License, they do not excuse you from the conditions of this License. If you cannot convey a covered work so as to satisfy simultaneously your obligations under this License and any other pertinent obligations, then as a consequence you may not convey it at all. For example, if you agree to terms that obligate you to collect a royalty for further conveying from those to whom you convey the Program, the only way you could satisfy both those terms and this License would be to refrain entirely from conveying the Program.

13. Use with the GNU Affero General Public License.

Notwithstanding any other provision of this License, you have permission to link or combine any covered work with a work licensed under version 3 of the GNU Affero General Public License into a single combined work, and to convey the resulting work. The terms of this License will continue to apply to the part which is the covered work, but the special requirements of the GNU Affero General Public License, section 13, concerning interaction through a network will apply to the combination as such.

14. Revised Versions of this License.

The Free Software Foundation may publish revised and/or new versions of the GNU General Public License from time to time. Such new versions will be similar in spirit to the present version, but may differ in detail to address new problems or concerns.

Each version is given a distinguishing version number. If the Program specifies that a certain numbered version of the GNU General Public License "or any later version" applies to it, you have the option of following the terms and conditions either of that numbered version or of any later version published by the Free Software Foundation. If the Program does not specify a version number of the GNU General Public License, you may choose any version ever published by the Free Software Foundation.

If the Program specifies that a proxy can decide which future versions of the GNU General Public License can be used, that proxy's public statement of acceptance of a version permanently authorizes you to choose that version for the Program.

Later license versions may give you additional or different permissions. However, no additional obligations are imposed on any author or copyright holder as a result of your choosing to follow a later version.

15. Disclaimer of Warranty.

THERE IS NO WARRANTY FOR THE PROGRAM, TO THE EXTENT PERMITTED BY APPLICABLE LAW. EXCEPT WHEN OTHERWISE STATED IN WRITING THE COPYRIGHT HOLDERS AND/OR OTHER PARTIES PROVIDE THE PROGRAM "AS IS" WITHOUT WARRANTY OF ANY KIND, EITHER EXPRESSED OR IMPLIED, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE. THE ENTIRE RISK AS TO THE QUALITY AND PERFORMANCE OF THE PROGRAM IS WITH YOU. SHOULD THE PROGRAM PROVE DEFECTIVE, YOU ASSUME THE COST OF ALL NECESSARY SERVICING, REPAIR OR CORRECTION.

16. Limitation of Liability.

IN NO EVENT UNLESS REQUIRED BY APPLICABLE LAW OR AGREED TO IN WRITING WILL ANY COPYRIGHT HOLDER, OR ANY OTHER PARTY WHO MODIFIES AND/OR CONVEYS THE PROGRAM AS PERMITTED ABOVE, BE LIABLE TO YOU FOR DAMAGES, INCLUDING ANY GENERAL, SPECIAL, INCIDENTAL OR CONSEQUENTIAL DAMAGES ARISING OUT OF THE USE OR INABILITY TO USE THE PROGRAM (INCLUDING BUT NOT LIMITED TO LOSS OF DATA OR DATA BEING RENDERED INACCURATE OR LOSSES SUSTAINED BY YOU OR THIRD PARTIES OR A FAILURE OF THE PROGRAM TO OPERATE WITH ANY OTHER PROGRAMS), EVEN IF SUCH HOLDER OR OTHER PARTY HAS BEEN ADVISED OF THE POSSIBILITY OF SUCH DAMAGES.

17. Interpretation of Sections 15 and 16.

If the disclaimer of warranty and limitation of liability provided above cannot be given local legal effect according to their terms, reviewing courts shall apply local law that most closely approximates an absolute waiver of all civil liability in connection with the Program, unless a warranty or assumption of liability accompanies a copy of the Program in return for a fee.

END OF TERMS AND CONDITIONS

How to Apply These Terms to Your New Programs

If you develop a new program, and you want it to be of the greatest possible use to the public, the best way to achieve this is to make it free software which everyone can redistribute and change under these terms.

To do so, attach the following notices to the program. It is safest to attach them to the start of each source file to most effectively state the exclusion of warranty; and each file should have at least the "copyright" line and a pointer to where the full notice is found.
{one line to give the program's name and a brief idea of what it does.} Copyright (C) {year} {name of author} This program is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by the Free Software Foundation, either version 3 of the License, or (at your option) any later version. This program is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU General Public License for more details. You should have received a copy of the GNU General Public License along with this program. If not, see <https://www.gnu.org/licenses/>. Also add information on how to contact you by electronic and paper mail. If the program does terminal interaction, make it output a short notice like this when it starts in an interactive mode: {project} Copyright (C) {year} {fullname} This program comes with ABSOLUTELY NO WARRANTY; for details type `show w'. This is free software, and you are welcome to redistribute it under certain conditions; type `show c' for details. The hypothetical commands `show w' and `show c' should show the appropriate parts of the General Public License. Of course, your program's commands might be different; for a GUI interface, you would use an "about box". You should also get your employer (if you work as a programmer) or school, if any, to sign a "copyright disclaimer" for the program, if necessary. For more information on this, and how to apply and follow the GNU GPL, see <https://www.gnu.org/licenses/>. The GNU General Public License does not permit incorporating your program into proprietary programs. If your program is a subroutine library, you may consider it more useful to permit linking proprietary applications with the library. If this is what you want to do, use the GNU Lesser General Public License instead of this License. But first, please read <https://www.gnu.org/licenses/why-not-lgpl.html>.
================================================ FILE: README.md ================================================

## RP-VIO: Robust Plane-based Visual-Inertial Odometry for Dynamic Environments

Karnik Ram, Chaitanya Kharyal, Sudarshan S. Harithas, K. Madhava Krishna

[[`arXiv`](https://arxiv.org/pdf/2103.10400.pdf)] [[`Project Page`](https://karnikram.info/rp-vio/)]

In IROS 2021

RP-VIO is a monocular visual-inertial odometry (VIO) system that uses only planar features and their induced homographies, during both initialization and sliding-window estimation, for increased robustness and accuracy in dynamic environments.

## Setup

Our evaluation setup is a 6-core Intel Core i5-8400 CPU with 8 GB RAM and a 1 TB HDD, running Ubuntu 18.04.1. We recommend using a more powerful setup, especially for heavy datasets like ADVIO or OpenLORIS.

### Pre-requisites

[ROS Melodic](http://wiki.ros.org/melodic) (OpenCV 3.2.0, Eigen 3.3.4-4)
[Ceres Solver 1.14.0](https://github.com/ceres-solver/ceres-solver/releases)
[EVO](https://github.com/MichaelGrupp/evo)

The versions indicated are the ones used in our evaluation setup; we do not guarantee that our code runs on newer versions like ROS Noetic (OpenCV 4.2).

### Build

Run the following commands in your terminal to clone our project and build it:

```
cd ~/catkin_ws/src
git clone https://github.com/karnikram/rp-vio.git
cd ../
catkin_make -j4
source ~/catkin_ws/devel/setup.bash
```

## Evaluation

We provide evaluation scripts to run RP-VIO on the [RPVIO-Sim](https://github.com/karnikram/rp-vio#rpvio-sim-dataset-1) dataset, and on select sequences from the [OpenLORIS-Scene](https://lifelong-robotic-vision.github.io/dataset/scene.html), [ADVIO](https://github.com/AaltoVision/ADVIO), and [VIODE](https://github.com/kminoda/VIODE) datasets. The output errors from your evaluation might not exactly match the ones reported in our paper, but should be similar.

### RPVIO-Sim Dataset

Download the [dataset](https://github.com/karnikram/rp-vio#rpvio-sim-dataset-1) files to a parent folder, and then run the following commands to launch our evaluation script. The script runs RP-VIO once on each of the six sequences and computes the ATE error statistics.

```
cd ~/catkin_ws/src/rp-vio/scripts/
./run_rpvio_sim.sh
```

To run the multiple-planes version (RPVIO-Multi), check out the corresponding branch by running `git checkout rpvio-multi`, and re-run the above script.

### Real-world sequences

We evaluate on two real-world sequences: the market1-1 sequence from the OpenLORIS-Scene dataset and the metro station sequence (12) from the ADVIO dataset. Both sequences, along with their segmented plane masks, are available as bagfiles for download [here](https://iiitaphyd-my.sharepoint.com/:f:/g/personal/robotics_iiit_ac_in/EozZ6vJP5UZFmZhA-9w0bBcBvTXpszD42fPx3x3ZlKvD6A?e=FtzFRz).
After downloading and extracting the files, run the following commands for evaluation:

```
cd ~/catkin_ws/src/rp-vio/scripts/
./run_ol_market1.sh
./run_advio_12.sh
```

### Own data

To run RP-VIO on your own data, you need to provide synchronized monocular images, IMU readings, and plane masks on three separate ROS topics. The camera and IMU need to be properly calibrated and synchronized, as there is no online calibration. A plane segmentation model for segmenting plane masks from images is provided [below](https://github.com/karnikram/rp-vio#plane-segmentation). A semantic segmentation model can also be used, as long as the RGB labels of the (static) planar semantic classes are provided.

As an example, we evaluate on a sequence from the VIODE dataset (provided [here](https://iiitaphyd-my.sharepoint.com/:f:/g/personal/robotics_iiit_ac_in/EoxFVvuAxUdFsnXu0XJY0egBMFxB9D8XbNqe0jkUkRdjVg?e=G18fDo)) using semantic segmentation labels, which are specified in the [config file](https://github.com/karnikram/rp-vio/blob/semantic-viode/config/viode_config.yaml). To run:

```
cd ~/catkin_ws/src/rp-vio/scripts
git checkout semantic-viode
./run_viode_night.sh
```

## Plane segmentation

We provide a pre-trained plane instance segmentation model based on the [Plane-Recover](https://github.com/fuy34/planerecover) model. We retrained their model, with an added inter-plane constraint, on their SYNTHIA training data and two additional sequences (00, 01) from the ScanNet dataset. The model was trained on a single Titan X (Maxwell) GPU for about 700K iterations. We also provide the training script.

We run the model offline, after extracting and [processing](https://github.com/fuy34/planerecover#preparing-training-data) the input RGB images from their ROS bagfiles.
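When a semantic segmentation model is used in place of the plane segmentation model, the per-frame plane mask reduces to checking each pixel's label against the configured planar classes. A minimal sketch of that idea in plain Python; the RGB class colors below are made-up placeholders, not actual labels from any config file:

```python
# Hypothetical RGB labels for static planar classes (e.g. road, wall);
# substitute the actual labels configured for your segmentation model.
PLANAR_CLASS_COLORS = {(128, 64, 128), (70, 70, 70)}

def plane_mask(label_image):
    """Convert an H x W image of (r, g, b) label tuples into a binary
    plane mask: 255 for planar-class pixels, 0 for everything else."""
    return [[255 if px in PLANAR_CLASS_COLORS else 0 for px in row]
            for row in label_image]

labels = [[(128, 64, 128), (0, 0, 0)],
          [(70, 70, 70), (128, 64, 128)]]
print(plane_mask(labels))  # [[255, 0], [255, 255]]
```

A per-frame mask image like this, published on its own ROS topic alongside the image and IMU topics, matches the three-topic input described above.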
Follow the steps given below to run the pre-trained model on your custom dataset (requires CUDA 9.0).

#### Create Environment

Run the following commands to create a suitable conda environment:

```
cd plane_segmentation/
conda create --name plane_seg --file requirements.txt
conda activate plane_seg
```

#### Run inference

Now extract images from your dataset to a test folder, resize them to (192, 320) (height, width), and run the following:

```
python inference.py --dataset=<PATH_TO_DATASET> --output_dir=<OUTPUT_DIR> --test_list=<TEST_DATA_LIST.txt> --ckpt_file=<CKPT_FILE> --use_preprocessed=true
```

*TEST_DATA_LIST.txt* is a file that points to every image within the test dataset; an example can be found [here](./plane_segmentation/openloris.txt). *PATH_TO_DATASET* is the path to the parent directory of the test folder.

The inference results are stored in three folders: *plane_sgmts* (predicted masks in grayscale), *plane_sgmts_vis* (predicted masks in color), and *plane_sgmts_modified* (grayscale masks suitable for visualization; feed this output to the CRF inference).

#### Run CRF inference

We also use a dense CRF model (from [PyDenseCRF](https://github.com/lucasb-eyer/pydensecrf)) to further refine the output masks. To run:

```
python crf_inference.py
```

where *labels_dir* is the path to the *plane_sgmts_modified* folder. We then write these output mask images back into the original bagfile on a separate topic for running with RP-VIO.

## RPVIO-Sim Dataset

For an effective evaluation of the capabilities of modern VINS systems, we generate a highly dynamic visual-inertial dataset using [AirSim](https://github.com/microsoft/AirSim/), with dynamic characters present throughout the sequences (including during initialization) and with sufficient IMU excitation. Dynamic characters are progressively added, keeping everything else fixed, starting from no characters in the `static` sequence up to eight characters in the `C8` sequence. All six generated sequences, in rosbag format and along with their groundtruth files, are available via [Zenodo](https://zenodo.org/record/4603494#.YE4BzlMzZH4).

Each rosbag contains RGB images published on the `/image` topic at 20 Hz, IMU measurements published on the `/imu` topic at ~1000 Hz (which we sub-sample to 200 Hz for our evaluations), and plane-instance mask images published on the `/mask` topic at 20 Hz. The groundtruth trajectory is saved as a txt file in TUM format. The parameters of the camera and IMU used in our dataset are as follows,
To quantify the dynamic nature of our generated sequences, we compute the percentage of dynamic pixels out of all the pixels present in every image. We report these values in the following table,
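The reported value for each image is simply the dynamic-pixel count over the total pixel count. A minimal, illustrative sketch on a binary mask (the actual labelling of which pixels belong to dynamic characters comes from the simulator's segmentation, not shown here):

```python
def dynamic_pixel_percentage(mask):
    """mask: H x W nested list with 1 marking a dynamic-character pixel
    and 0 otherwise; returns the percentage of dynamic pixels."""
    total = sum(len(row) for row in mask)
    dynamic = sum(sum(row) for row in mask)
    return 100.0 * dynamic / total

mask = [[0, 1, 1, 0],
        [0, 0, 1, 0]]
print(dynamic_pixel_percentage(mask))  # 37.5
```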
### TO-DO

- [ ] Provide Unreal Engine environment
- [ ] Provide AirSim recording scripts

## Acknowledgement

Our code is built upon [VINS-Mono](https://github.com/HKUST-Aerial-Robotics/VINS-Mono). Its implementations of feature tracking, IMU preintegration, IMU state initialization, the reprojection factor, and marginalization are used as such. Our contributions include planar feature tracking, planar-homography-based initialization, and the planar homography factor. All these changes (corresponding to a slightly older version) are available as a [git patch file](./rpvio.patch).

For our simulated dataset, we imported several high-quality assets from the [FlightGoggles](https://flightgoggles.mit.edu/) project into [Unreal Engine](https://unrealengine.com) before integrating it with AirSim. The dynamic characters were downloaded from [Mixamo](https://mixamo.com).

================================================ FILE: camera_model/CMakeLists.txt ================================================ cmake_minimum_required(VERSION 2.8.3) project(camera_model) set(CMAKE_BUILD_TYPE "Release") set(CMAKE_CXX_FLAGS "-std=c++11") set(CMAKE_CXX_FLAGS_RELEASE "-O3 -fPIC") find_package(catkin REQUIRED COMPONENTS roscpp std_msgs ) find_package(Boost REQUIRED COMPONENTS filesystem program_options system) include_directories(${Boost_INCLUDE_DIRS}) find_package(OpenCV REQUIRED) # set(EIGEN_INCLUDE_DIR "/usr/local/include/eigen3") find_package(Ceres REQUIRED) include_directories(${CERES_INCLUDE_DIRS}) catkin_package( INCLUDE_DIRS include LIBRARIES camera_model CATKIN_DEPENDS roscpp std_msgs # DEPENDS system_lib ) include_directories( ${catkin_INCLUDE_DIRS} ) include_directories("include") add_executable(Calibration src/intrinsic_calib.cc src/chessboard/Chessboard.cc src/calib/CameraCalibration.cc src/camera_models/Camera.cc src/camera_models/CameraFactory.cc src/camera_models/CostFunctionFactory.cc src/camera_models/PinholeCamera.cc src/camera_models/CataCamera.cc
src/camera_models/EquidistantCamera.cc src/camera_models/ScaramuzzaCamera.cc src/sparse_graph/Transform.cc src/gpl/gpl.cc src/gpl/EigenQuaternionParameterization.cc) add_library(camera_model src/chessboard/Chessboard.cc src/calib/CameraCalibration.cc src/camera_models/Camera.cc src/camera_models/CameraFactory.cc src/camera_models/CostFunctionFactory.cc src/camera_models/PinholeCamera.cc src/camera_models/CataCamera.cc src/camera_models/EquidistantCamera.cc src/camera_models/ScaramuzzaCamera.cc src/sparse_graph/Transform.cc src/gpl/gpl.cc src/gpl/EigenQuaternionParameterization.cc) target_link_libraries(Calibration ${Boost_LIBRARIES} ${OpenCV_LIBS} ${CERES_LIBRARIES}) target_link_libraries(camera_model ${Boost_LIBRARIES} ${OpenCV_LIBS} ${CERES_LIBRARIES}) ================================================ FILE: camera_model/cmake/FindEigen.cmake ================================================ # Ceres Solver - A fast non-linear least squares minimizer # Copyright 2015 Google Inc. All rights reserved. # http://ceres-solver.org/ # # Redistribution and use in source and binary forms, with or without # modification, are permitted provided that the following conditions are met: # # * Redistributions of source code must retain the above copyright notice, # this list of conditions and the following disclaimer. # * Redistributions in binary form must reproduce the above copyright notice, # this list of conditions and the following disclaimer in the documentation # and/or other materials provided with the distribution. # * Neither the name of Google Inc. nor the names of its contributors may be # used to endorse or promote products derived from this software without # specific prior written permission. # # THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS" # AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE # IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE # ARE DISCLAIMED. 
IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE # LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR # CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF # SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS # INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN # CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) # ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE # POSSIBILITY OF SUCH DAMAGE. # # Author: alexs.mac@gmail.com (Alex Stewart) # # FindEigen.cmake - Find Eigen library, version >= 3. # # This module defines the following variables: # # EIGEN_FOUND: TRUE iff Eigen is found. # EIGEN_INCLUDE_DIRS: Include directories for Eigen. # # EIGEN_VERSION: Extracted from Eigen/src/Core/util/Macros.h # EIGEN_WORLD_VERSION: Equal to 3 if EIGEN_VERSION = 3.2.0 # EIGEN_MAJOR_VERSION: Equal to 2 if EIGEN_VERSION = 3.2.0 # EIGEN_MINOR_VERSION: Equal to 0 if EIGEN_VERSION = 3.2.0 # # The following variables control the behaviour of this module: # # EIGEN_INCLUDE_DIR_HINTS: List of additional directories in which to # search for eigen includes, e.g: /timbuktu/eigen3. # # The following variables are also defined by this module, but in line with # CMake recommended FindPackage() module style should NOT be referenced directly # by callers (use the plural variables detailed above instead). These variables # do however affect the behaviour of the module via FIND_[PATH/LIBRARY]() which # are NOT re-called (i.e. search for library is not repeated) if these variables # are set with valid values _in the CMake cache_. This means that if these # variables are set directly in the cache, either by the user in the CMake GUI, # or by the user passing -DVAR=VALUE directives to CMake when called (which # explicitly defines a cache variable), then they will be used verbatim, # bypassing the HINTS variables and other hard-coded search locations. 
# # EIGEN_INCLUDE_DIR: Include directory for CXSparse, not including the # include directory of any dependencies. # Called if we failed to find Eigen or any of it's required dependencies, # unsets all public (designed to be used externally) variables and reports # error message at priority depending upon [REQUIRED/QUIET/] argument. macro(EIGEN_REPORT_NOT_FOUND REASON_MSG) unset(EIGEN_FOUND) unset(EIGEN_INCLUDE_DIRS) # Make results of search visible in the CMake GUI if Eigen has not # been found so that user does not have to toggle to advanced view. mark_as_advanced(CLEAR EIGEN_INCLUDE_DIR) # Note _FIND_[REQUIRED/QUIETLY] variables defined by FindPackage() # use the camelcase library name, not uppercase. if (Eigen_FIND_QUIETLY) message(STATUS "Failed to find Eigen - " ${REASON_MSG} ${ARGN}) elseif (Eigen_FIND_REQUIRED) message(FATAL_ERROR "Failed to find Eigen - " ${REASON_MSG} ${ARGN}) else() # Neither QUIETLY nor REQUIRED, use no priority which emits a message # but continues configuration and allows generation. message("-- Failed to find Eigen - " ${REASON_MSG} ${ARGN}) endif () return() endmacro(EIGEN_REPORT_NOT_FOUND) # Protect against any alternative find_package scripts for this library having # been called previously (in a client project) which set EIGEN_FOUND, but not # the other variables we require / set here which could cause the search logic # here to fail. unset(EIGEN_FOUND) # Search user-installed locations first, so that we prefer user installs # to system installs where both exist. list(APPEND EIGEN_CHECK_INCLUDE_DIRS /usr/local/include /usr/local/homebrew/include # Mac OS X /opt/local/var/macports/software # Mac OS X. /opt/local/include /usr/include) # Additional suffixes to try appending to each search path. list(APPEND EIGEN_CHECK_PATH_SUFFIXES eigen3 # Default root directory for Eigen. 
Eigen/include/eigen3 # Windows (for C:/Program Files prefix) < 3.3 Eigen3/include/eigen3 ) # Windows (for C:/Program Files prefix) >= 3.3 # Search supplied hint directories first if supplied. find_path(EIGEN_INCLUDE_DIR NAMES Eigen/Core PATHS ${EIGEN_INCLUDE_DIR_HINTS} ${EIGEN_CHECK_INCLUDE_DIRS} PATH_SUFFIXES ${EIGEN_CHECK_PATH_SUFFIXES}) if (NOT EIGEN_INCLUDE_DIR OR NOT EXISTS ${EIGEN_INCLUDE_DIR}) eigen_report_not_found( "Could not find eigen3 include directory, set EIGEN_INCLUDE_DIR to " "path to eigen3 include directory, e.g. /usr/local/include/eigen3.") endif (NOT EIGEN_INCLUDE_DIR OR NOT EXISTS ${EIGEN_INCLUDE_DIR}) # Mark internally as found, then verify. EIGEN_REPORT_NOT_FOUND() unsets # if called. set(EIGEN_FOUND TRUE) # Extract Eigen version from Eigen/src/Core/util/Macros.h if (EIGEN_INCLUDE_DIR) set(EIGEN_VERSION_FILE ${EIGEN_INCLUDE_DIR}/Eigen/src/Core/util/Macros.h) if (NOT EXISTS ${EIGEN_VERSION_FILE}) eigen_report_not_found( "Could not find file: ${EIGEN_VERSION_FILE} " "containing version information in Eigen install located at: " "${EIGEN_INCLUDE_DIR}.") else (NOT EXISTS ${EIGEN_VERSION_FILE}) file(READ ${EIGEN_VERSION_FILE} EIGEN_VERSION_FILE_CONTENTS) string(REGEX MATCH "#define EIGEN_WORLD_VERSION [0-9]+" EIGEN_WORLD_VERSION "${EIGEN_VERSION_FILE_CONTENTS}") string(REGEX REPLACE "#define EIGEN_WORLD_VERSION ([0-9]+)" "\\1" EIGEN_WORLD_VERSION "${EIGEN_WORLD_VERSION}") string(REGEX MATCH "#define EIGEN_MAJOR_VERSION [0-9]+" EIGEN_MAJOR_VERSION "${EIGEN_VERSION_FILE_CONTENTS}") string(REGEX REPLACE "#define EIGEN_MAJOR_VERSION ([0-9]+)" "\\1" EIGEN_MAJOR_VERSION "${EIGEN_MAJOR_VERSION}") string(REGEX MATCH "#define EIGEN_MINOR_VERSION [0-9]+" EIGEN_MINOR_VERSION "${EIGEN_VERSION_FILE_CONTENTS}") string(REGEX REPLACE "#define EIGEN_MINOR_VERSION ([0-9]+)" "\\1" EIGEN_MINOR_VERSION "${EIGEN_MINOR_VERSION}") # This is on a single line s/t CMake does not interpret it as a list of # elements and insert ';' separators which would result in 3.;2.;0 
nonsense. set(EIGEN_VERSION "${EIGEN_WORLD_VERSION}.${EIGEN_MAJOR_VERSION}.${EIGEN_MINOR_VERSION}") endif (NOT EXISTS ${EIGEN_VERSION_FILE}) endif (EIGEN_INCLUDE_DIR) # Set standard CMake FindPackage variables if found. if (EIGEN_FOUND) set(EIGEN_INCLUDE_DIRS ${EIGEN_INCLUDE_DIR}) endif (EIGEN_FOUND) # Handle REQUIRED / QUIET optional arguments and version. include(FindPackageHandleStandardArgs) find_package_handle_standard_args(Eigen REQUIRED_VARS EIGEN_INCLUDE_DIRS VERSION_VAR EIGEN_VERSION) # Only mark internal variables as advanced if we found Eigen, otherwise # leave it visible in the standard GUI for the user to set manually. if (EIGEN_FOUND) mark_as_advanced(FORCE EIGEN_INCLUDE_DIR) endif (EIGEN_FOUND) ================================================ FILE: camera_model/include/camodocal/calib/CameraCalibration.h ================================================ #ifndef CAMERACALIBRATION_H #define CAMERACALIBRATION_H #include #include "camodocal/camera_models/Camera.h" namespace camodocal { class CameraCalibration { public: EIGEN_MAKE_ALIGNED_OPERATOR_NEW CameraCalibration(); CameraCalibration(Camera::ModelType modelType, const std::string& cameraName, const cv::Size& imageSize, const cv::Size& boardSize, float squareSize); void clear(void); void addChessboardData(const std::vector& corners); bool calibrate(void); int sampleCount(void) const; std::vector >& imagePoints(void); const std::vector >& imagePoints(void) const; std::vector >& scenePoints(void); const std::vector >& scenePoints(void) const; CameraPtr& camera(void); const CameraConstPtr camera(void) const; Eigen::Matrix2d& measurementCovariance(void); const Eigen::Matrix2d& measurementCovariance(void) const; cv::Mat& cameraPoses(void); const cv::Mat& cameraPoses(void) const; void drawResults(std::vector& images) const; void writeParams(const std::string& filename) const; bool writeChessboardData(const std::string& filename) const; bool readChessboardData(const std::string& filename); void 
setVerbose(bool verbose); private: bool calibrateHelper(CameraPtr& camera, std::vector& rvecs, std::vector& tvecs) const; void optimize(CameraPtr& camera, std::vector& rvecs, std::vector& tvecs) const; template void readData(std::ifstream& ifs, T& data) const; template void writeData(std::ofstream& ofs, T data) const; cv::Size m_boardSize; float m_squareSize; CameraPtr m_camera; cv::Mat m_cameraPoses; std::vector > m_imagePoints; std::vector > m_scenePoints; Eigen::Matrix2d m_measurementCovariance; bool m_verbose; }; } #endif ================================================ FILE: camera_model/include/camodocal/camera_models/Camera.h ================================================ #ifndef CAMERA_H #define CAMERA_H #include #include #include #include namespace camodocal { class Camera { public: EIGEN_MAKE_ALIGNED_OPERATOR_NEW enum ModelType { KANNALA_BRANDT, MEI, PINHOLE, SCARAMUZZA }; class Parameters { public: EIGEN_MAKE_ALIGNED_OPERATOR_NEW Parameters(ModelType modelType); Parameters(ModelType modelType, const std::string& cameraName, int w, int h); ModelType& modelType(void); std::string& cameraName(void); int& imageWidth(void); int& imageHeight(void); ModelType modelType(void) const; const std::string& cameraName(void) const; int imageWidth(void) const; int imageHeight(void) const; int nIntrinsics(void) const; virtual bool readFromYamlFile(const std::string& filename) = 0; virtual void writeToYamlFile(const std::string& filename) const = 0; protected: ModelType m_modelType; int m_nIntrinsics; std::string m_cameraName; int m_imageWidth; int m_imageHeight; }; virtual ModelType modelType(void) const = 0; virtual const std::string& cameraName(void) const = 0; virtual int imageWidth(void) const = 0; virtual int imageHeight(void) const = 0; virtual cv::Mat& mask(void); virtual const cv::Mat& mask(void) const; virtual void estimateIntrinsics(const cv::Size& boardSize, const std::vector< std::vector >& objectPoints, const std::vector< std::vector >& imagePoints) = 0; 
virtual void estimateExtrinsics(const std::vector& objectPoints, const std::vector& imagePoints, cv::Mat& rvec, cv::Mat& tvec) const; // Lift points from the image plane to the sphere virtual void liftSphere(const Eigen::Vector2d& p, Eigen::Vector3d& P) const = 0; //%output P // Lift points from the image plane to the projective space virtual void liftProjective(const Eigen::Vector2d& p, Eigen::Vector3d& P) const = 0; //%output P // Projects 3D points to the image plane (Pi function) virtual void spaceToPlane(const Eigen::Vector3d& P, Eigen::Vector2d& p) const = 0; //%output p // Projects 3D points to the image plane (Pi function) // and calculates jacobian //virtual void spaceToPlane(const Eigen::Vector3d& P, Eigen::Vector2d& p, // Eigen::Matrix& J) const = 0; //%output p //%output J virtual void undistToPlane(const Eigen::Vector2d& p_u, Eigen::Vector2d& p) const = 0; //%output p //virtual void initUndistortMap(cv::Mat& map1, cv::Mat& map2, double fScale = 1.0) const = 0; virtual cv::Mat initUndistortRectifyMap(cv::Mat& map1, cv::Mat& map2, float fx = -1.0f, float fy = -1.0f, cv::Size imageSize = cv::Size(0, 0), float cx = -1.0f, float cy = -1.0f, cv::Mat rmat = cv::Mat::eye(3, 3, CV_32F)) const = 0; virtual int parameterCount(void) const = 0; virtual void readParameters(const std::vector& parameters) = 0; virtual void writeParameters(std::vector& parameters) const = 0; virtual void writeParametersToYamlFile(const std::string& filename) const = 0; virtual std::string parametersToString(void) const = 0; /** * \brief Calculates the reprojection distance between points * * \param P1 first 3D point coordinates * \param P2 second 3D point coordinates * \return euclidean distance in the plane */ double reprojectionDist(const Eigen::Vector3d& P1, const Eigen::Vector3d& P2) const; double reprojectionError(const std::vector< std::vector >& objectPoints, const std::vector< std::vector >& imagePoints, const std::vector& rvecs, const std::vector& tvecs, cv::OutputArray 
perViewErrors = cv::noArray()) const; double reprojectionError(const Eigen::Vector3d& P, const Eigen::Quaterniond& camera_q, const Eigen::Vector3d& camera_t, const Eigen::Vector2d& observed_p) const; void projectPoints(const std::vector& objectPoints, const cv::Mat& rvec, const cv::Mat& tvec, std::vector& imagePoints) const; protected: cv::Mat m_mask; }; typedef boost::shared_ptr CameraPtr; typedef boost::shared_ptr CameraConstPtr; } #endif ================================================ FILE: camera_model/include/camodocal/camera_models/CameraFactory.h ================================================ #ifndef CAMERAFACTORY_H #define CAMERAFACTORY_H #include #include #include "camodocal/camera_models/Camera.h" namespace camodocal { class CameraFactory { public: EIGEN_MAKE_ALIGNED_OPERATOR_NEW CameraFactory(); static boost::shared_ptr instance(void); CameraPtr generateCamera(Camera::ModelType modelType, const std::string& cameraName, cv::Size imageSize) const; CameraPtr generateCameraFromYamlFile(const std::string& filename); private: static boost::shared_ptr m_instance; }; } #endif ================================================ FILE: camera_model/include/camodocal/camera_models/CataCamera.h ================================================ #ifndef CATACAMERA_H #define CATACAMERA_H #include #include #include "ceres/rotation.h" #include "Camera.h" namespace camodocal { /** * C. Mei, and P. 
Rives, Single View Point Omnidirectional Camera Calibration * from Planar Grids, ICRA 2007 */ class CataCamera: public Camera { public: class Parameters: public Camera::Parameters { public: Parameters(); Parameters(const std::string& cameraName, int w, int h, double xi, double k1, double k2, double p1, double p2, double gamma1, double gamma2, double u0, double v0); double& xi(void); double& k1(void); double& k2(void); double& p1(void); double& p2(void); double& gamma1(void); double& gamma2(void); double& u0(void); double& v0(void); double xi(void) const; double k1(void) const; double k2(void) const; double p1(void) const; double p2(void) const; double gamma1(void) const; double gamma2(void) const; double u0(void) const; double v0(void) const; bool readFromYamlFile(const std::string& filename); void writeToYamlFile(const std::string& filename) const; Parameters& operator=(const Parameters& other); friend std::ostream& operator<< (std::ostream& out, const Parameters& params); private: double m_xi; double m_k1; double m_k2; double m_p1; double m_p2; double m_gamma1; double m_gamma2; double m_u0; double m_v0; }; CataCamera(); /** * \brief Constructor from the projection model parameters */ CataCamera(const std::string& cameraName, int imageWidth, int imageHeight, double xi, double k1, double k2, double p1, double p2, double gamma1, double gamma2, double u0, double v0); /** * \brief Constructor from the projection model parameters */ CataCamera(const Parameters& params); Camera::ModelType modelType(void) const; const std::string& cameraName(void) const; int imageWidth(void) const; int imageHeight(void) const; void estimateIntrinsics(const cv::Size& boardSize, const std::vector< std::vector >& objectPoints, const std::vector< std::vector >& imagePoints); // Lift points from the image plane to the sphere void liftSphere(const Eigen::Vector2d& p, Eigen::Vector3d& P) const; //%output P // Lift points from the image plane to the projective space void liftProjective(const 
Eigen::Vector2d& p, Eigen::Vector3d& P) const; //%output P // Projects 3D points to the image plane (Pi function) void spaceToPlane(const Eigen::Vector3d& P, Eigen::Vector2d& p) const; //%output p // Projects 3D points to the image plane (Pi function) // and calculates jacobian void spaceToPlane(const Eigen::Vector3d& P, Eigen::Vector2d& p, Eigen::Matrix& J) const; //%output p //%output J void undistToPlane(const Eigen::Vector2d& p_u, Eigen::Vector2d& p) const; //%output p template static void spaceToPlane(const T* const params, const T* const q, const T* const t, const Eigen::Matrix& P, Eigen::Matrix& p); void distortion(const Eigen::Vector2d& p_u, Eigen::Vector2d& d_u) const; void distortion(const Eigen::Vector2d& p_u, Eigen::Vector2d& d_u, Eigen::Matrix2d& J) const; void initUndistortMap(cv::Mat& map1, cv::Mat& map2, double fScale = 1.0) const; cv::Mat initUndistortRectifyMap(cv::Mat& map1, cv::Mat& map2, float fx = -1.0f, float fy = -1.0f, cv::Size imageSize = cv::Size(0, 0), float cx = -1.0f, float cy = -1.0f, cv::Mat rmat = cv::Mat::eye(3, 3, CV_32F)) const; int parameterCount(void) const; const Parameters& getParameters(void) const; void setParameters(const Parameters& parameters); void readParameters(const std::vector& parameterVec); void writeParameters(std::vector& parameterVec) const; void writeParametersToYamlFile(const std::string& filename) const; std::string parametersToString(void) const; private: Parameters mParameters; double m_inv_K11, m_inv_K13, m_inv_K22, m_inv_K23; bool m_noDistortion; }; typedef boost::shared_ptr CataCameraPtr; typedef boost::shared_ptr CataCameraConstPtr; template void CataCamera::spaceToPlane(const T* const params, const T* const q, const T* const t, const Eigen::Matrix& P, Eigen::Matrix& p) { T P_w[3]; P_w[0] = T(P(0)); P_w[1] = T(P(1)); P_w[2] = T(P(2)); // Convert quaternion from Eigen convention (x, y, z, w) // to Ceres convention (w, x, y, z) T q_ceres[4] = {q[3], q[0], q[1], q[2]}; T P_c[3]; 
ceres::QuaternionRotatePoint(q_ceres, P_w, P_c); P_c[0] += t[0]; P_c[1] += t[1]; P_c[2] += t[2]; // project 3D object point to the image plane T xi = params[0]; T k1 = params[1]; T k2 = params[2]; T p1 = params[3]; T p2 = params[4]; T gamma1 = params[5]; T gamma2 = params[6]; T alpha = T(0); //cameraParams.alpha(); T u0 = params[7]; T v0 = params[8]; // Transform to model plane T len = sqrt(P_c[0] * P_c[0] + P_c[1] * P_c[1] + P_c[2] * P_c[2]); P_c[0] /= len; P_c[1] /= len; P_c[2] /= len; T u = P_c[0] / (P_c[2] + xi); T v = P_c[1] / (P_c[2] + xi); T rho_sqr = u * u + v * v; T L = T(1.0) + k1 * rho_sqr + k2 * rho_sqr * rho_sqr; T du = T(2.0) * p1 * u * v + p2 * (rho_sqr + T(2.0) * u * u); T dv = p1 * (rho_sqr + T(2.0) * v * v) + T(2.0) * p2 * u * v; u = L * u + du; v = L * v + dv; p(0) = gamma1 * (u + alpha * v) + u0; p(1) = gamma2 * v + v0; } } #endif ================================================ FILE: camera_model/include/camodocal/camera_models/CostFunctionFactory.h ================================================ #ifndef COSTFUNCTIONFACTORY_H #define COSTFUNCTIONFACTORY_H #include #include #include "camodocal/camera_models/Camera.h" namespace ceres { class CostFunction; } namespace camodocal { enum { CAMERA_INTRINSICS = 1 << 0, CAMERA_POSE = 1 << 1, POINT_3D = 1 << 2, ODOMETRY_INTRINSICS = 1 << 3, ODOMETRY_3D_POSE = 1 << 4, ODOMETRY_6D_POSE = 1 << 5, CAMERA_ODOMETRY_TRANSFORM = 1 << 6 }; class CostFunctionFactory { public: EIGEN_MAKE_ALIGNED_OPERATOR_NEW CostFunctionFactory(); static boost::shared_ptr instance(void); ceres::CostFunction* generateCostFunction(const CameraConstPtr& camera, const Eigen::Vector3d& observed_P, const Eigen::Vector2d& observed_p, int flags) const; ceres::CostFunction* generateCostFunction(const CameraConstPtr& camera, const Eigen::Vector3d& observed_P, const Eigen::Vector2d& observed_p, const Eigen::Matrix2d& sqrtPrecisionMat, int flags) const; ceres::CostFunction* generateCostFunction(const CameraConstPtr& camera, const 
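The `CataCamera::spaceToPlane` template above is the MEI unified (catadioptric) projection: the camera-frame point is normalized onto the unit sphere, projected through the point at distance `xi` along the mirror axis, distorted radially and tangentially, and finally scaled by the generalized focal lengths `gamma1`/`gamma2`. A pure-Python sketch of the same math, with an illustrative function name and `alpha = 0` as in the header:

```python
import math

def mei_project(P, xi, k1, k2, p1, p2, gamma1, gamma2, u0, v0):
    """Project a camera-frame 3D point with the MEI unified model
    (mirrors CataCamera::spaceToPlane with the skew alpha fixed to 0)."""
    X, Y, Z = P
    n = math.sqrt(X * X + Y * Y + Z * Z)
    X, Y, Z = X / n, Y / n, Z / n             # point on the unit sphere
    u, v = X / (Z + xi), Y / (Z + xi)         # normalized image coordinates
    r2 = u * u + v * v
    L = 1.0 + k1 * r2 + k2 * r2 * r2          # radial factor
    du = 2.0 * p1 * u * v + p2 * (r2 + 2.0 * u * u)  # tangential terms
    dv = p1 * (r2 + 2.0 * v * v) + 2.0 * p2 * u * v
    u, v = L * u + du, L * v + dv
    return (gamma1 * u + u0, gamma2 * v + v0)
```

Setting `xi = 0` reduces this to the plumb-bob pinhole projection, which is why the same distortion block reappears in `PinholeCamera::spaceToPlane`.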
Eigen::Vector2d& observed_p, int flags, bool optimize_cam_odo_z = true) const; ceres::CostFunction* generateCostFunction(const CameraConstPtr& camera, const Eigen::Vector2d& observed_p, const Eigen::Matrix2d& sqrtPrecisionMat, int flags, bool optimize_cam_odo_z = true) const; ceres::CostFunction* generateCostFunction(const CameraConstPtr& camera, const Eigen::Vector3d& odo_pos, const Eigen::Vector3d& odo_att, const Eigen::Vector2d& observed_p, int flags, bool optimize_cam_odo_z = true) const; ceres::CostFunction* generateCostFunction(const CameraConstPtr& camera, const Eigen::Quaterniond& cam_odo_q, const Eigen::Vector3d& cam_odo_t, const Eigen::Vector3d& odo_pos, const Eigen::Vector3d& odo_att, const Eigen::Vector2d& observed_p, int flags) const; ceres::CostFunction* generateCostFunction(const CameraConstPtr& cameraLeft, const CameraConstPtr& cameraRight, const Eigen::Vector3d& observed_P, const Eigen::Vector2d& observed_p_left, const Eigen::Vector2d& observed_p_right) const; private: static boost::shared_ptr m_instance; }; } #endif ================================================ FILE: camera_model/include/camodocal/camera_models/EquidistantCamera.h ================================================ #ifndef EQUIDISTANTCAMERA_H #define EQUIDISTANTCAMERA_H #include #include #include "ceres/rotation.h" #include "Camera.h" namespace camodocal { /** * J. Kannala, and S. 
Brandt, A Generic Camera Model and Calibration Method * for Conventional, Wide-Angle, and Fish-Eye Lenses, PAMI 2006 */ class EquidistantCamera: public Camera { public: class Parameters: public Camera::Parameters { public: Parameters(); Parameters(const std::string& cameraName, int w, int h, double k2, double k3, double k4, double k5, double mu, double mv, double u0, double v0); double& k2(void); double& k3(void); double& k4(void); double& k5(void); double& mu(void); double& mv(void); double& u0(void); double& v0(void); double k2(void) const; double k3(void) const; double k4(void) const; double k5(void) const; double mu(void) const; double mv(void) const; double u0(void) const; double v0(void) const; bool readFromYamlFile(const std::string& filename); void writeToYamlFile(const std::string& filename) const; Parameters& operator=(const Parameters& other); friend std::ostream& operator<< (std::ostream& out, const Parameters& params); private: // projection double m_k2; double m_k3; double m_k4; double m_k5; double m_mu; double m_mv; double m_u0; double m_v0; }; EquidistantCamera(); /** * \brief Constructor from the projection model parameters */ EquidistantCamera(const std::string& cameraName, int imageWidth, int imageHeight, double k2, double k3, double k4, double k5, double mu, double mv, double u0, double v0); /** * \brief Constructor from the projection model parameters */ EquidistantCamera(const Parameters& params); Camera::ModelType modelType(void) const; const std::string& cameraName(void) const; int imageWidth(void) const; int imageHeight(void) const; void estimateIntrinsics(const cv::Size& boardSize, const std::vector< std::vector >& objectPoints, const std::vector< std::vector >& imagePoints); // Lift points from the image plane to the sphere virtual void liftSphere(const Eigen::Vector2d& p, Eigen::Vector3d& P) const; //%output P // Lift points from the image plane to the projective space void liftProjective(const Eigen::Vector2d& p, Eigen::Vector3d& P) 
const; //%output P // Projects 3D points to the image plane (Pi function) void spaceToPlane(const Eigen::Vector3d& P, Eigen::Vector2d& p) const; //%output p // Projects 3D points to the image plane (Pi function) // and calculates jacobian void spaceToPlane(const Eigen::Vector3d& P, Eigen::Vector2d& p, Eigen::Matrix& J) const; //%output p //%output J void undistToPlane(const Eigen::Vector2d& p_u, Eigen::Vector2d& p) const; //%output p template static void spaceToPlane(const T* const params, const T* const q, const T* const t, const Eigen::Matrix& P, Eigen::Matrix& p); void initUndistortMap(cv::Mat& map1, cv::Mat& map2, double fScale = 1.0) const; cv::Mat initUndistortRectifyMap(cv::Mat& map1, cv::Mat& map2, float fx = -1.0f, float fy = -1.0f, cv::Size imageSize = cv::Size(0, 0), float cx = -1.0f, float cy = -1.0f, cv::Mat rmat = cv::Mat::eye(3, 3, CV_32F)) const; int parameterCount(void) const; const Parameters& getParameters(void) const; void setParameters(const Parameters& parameters); void readParameters(const std::vector& parameterVec); void writeParameters(std::vector& parameterVec) const; void writeParametersToYamlFile(const std::string& filename) const; std::string parametersToString(void) const; private: template static T r(T k2, T k3, T k4, T k5, T theta); void fitOddPoly(const std::vector& x, const std::vector& y, int n, std::vector& coeffs) const; void backprojectSymmetric(const Eigen::Vector2d& p_u, double& theta, double& phi) const; Parameters mParameters; double m_inv_K11, m_inv_K13, m_inv_K22, m_inv_K23; }; typedef boost::shared_ptr EquidistantCameraPtr; typedef boost::shared_ptr EquidistantCameraConstPtr; template T EquidistantCamera::r(T k2, T k3, T k4, T k5, T theta) { // k1 = 1 return theta + k2 * theta * theta * theta + k3 * theta * theta * theta * theta * theta + k4 * theta * theta * theta * theta * theta * theta * theta + k5 * theta * theta * theta * theta * theta * theta * theta * theta * theta; } template void 
EquidistantCamera::spaceToPlane(const T* const params, const T* const q, const T* const t, const Eigen::Matrix& P, Eigen::Matrix& p) { T P_w[3]; P_w[0] = T(P(0)); P_w[1] = T(P(1)); P_w[2] = T(P(2)); // Convert quaternion from Eigen convention (x, y, z, w) // to Ceres convention (w, x, y, z) T q_ceres[4] = {q[3], q[0], q[1], q[2]}; T P_c[3]; ceres::QuaternionRotatePoint(q_ceres, P_w, P_c); P_c[0] += t[0]; P_c[1] += t[1]; P_c[2] += t[2]; // project 3D object point to the image plane; T k2 = params[0]; T k3 = params[1]; T k4 = params[2]; T k5 = params[3]; T mu = params[4]; T mv = params[5]; T u0 = params[6]; T v0 = params[7]; T len = sqrt(P_c[0] * P_c[0] + P_c[1] * P_c[1] + P_c[2] * P_c[2]); T theta = acos(P_c[2] / len); T phi = atan2(P_c[1], P_c[0]); Eigen::Matrix p_u = r(k2, k3, k4, k5, theta) * Eigen::Matrix(cos(phi), sin(phi)); p(0) = mu * p_u(0) + u0; p(1) = mv * p_u(1) + v0; } } #endif ================================================ FILE: camera_model/include/camodocal/camera_models/PinholeCamera.h ================================================ #ifndef PINHOLECAMERA_H #define PINHOLECAMERA_H #include #include #include "ceres/rotation.h" #include "Camera.h" namespace camodocal { class PinholeCamera: public Camera { public: class Parameters: public Camera::Parameters { public: Parameters(); Parameters(const std::string& cameraName, int w, int h, double k1, double k2, double p1, double p2, double fx, double fy, double cx, double cy); double& k1(void); double& k2(void); double& p1(void); double& p2(void); double& fx(void); double& fy(void); double& cx(void); double& cy(void); double xi(void) const; double k1(void) const; double k2(void) const; double p1(void) const; double p2(void) const; double fx(void) const; double fy(void) const; double cx(void) const; double cy(void) const; bool readFromYamlFile(const std::string& filename); void writeToYamlFile(const std::string& filename) const; Parameters& operator=(const Parameters& other); friend std::ostream& 
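`EquidistantCamera::spaceToPlane` above applies the Kannala-Brandt model: the angle `theta` from the optical axis is mapped through the odd polynomial `r(theta) = theta + k2*theta^3 + k3*theta^5 + k4*theta^7 + k5*theta^9` (the `r` helper, with `k1` fixed to 1) and the result is placed along the azimuth direction `phi`. A pure-Python sketch with an illustrative function name:

```python
import math

def kb_project(P, k2, k3, k4, k5, mu, mv, u0, v0):
    """Project a camera-frame 3D point with the Kannala-Brandt equidistant
    model (mirrors EquidistantCamera::spaceToPlane; k1 is fixed to 1)."""
    X, Y, Z = P
    theta = math.acos(Z / math.sqrt(X * X + Y * Y + Z * Z))  # angle off the optical axis
    phi = math.atan2(Y, X)                                   # azimuth in the image plane
    # odd polynomial r(theta), exactly as in EquidistantCamera::r
    r = theta + k2 * theta**3 + k3 * theta**5 + k4 * theta**7 + k5 * theta**9
    return (mu * r * math.cos(phi) + u0, mv * r * math.sin(phi) + v0)
```

For a point on the optical axis `theta` is zero and the projection lands exactly on the principal point `(u0, v0)`.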
operator<< (std::ostream& out, const Parameters& params); private: double m_k1; double m_k2; double m_p1; double m_p2; double m_fx; double m_fy; double m_cx; double m_cy; }; PinholeCamera(); /** * \brief Constructor from the projection model parameters */ PinholeCamera(const std::string& cameraName, int imageWidth, int imageHeight, double k1, double k2, double p1, double p2, double fx, double fy, double cx, double cy); /** * \brief Constructor from the projection model parameters */ PinholeCamera(const Parameters& params); Camera::ModelType modelType(void) const; const std::string& cameraName(void) const; int imageWidth(void) const; int imageHeight(void) const; void estimateIntrinsics(const cv::Size& boardSize, const std::vector< std::vector >& objectPoints, const std::vector< std::vector >& imagePoints); // Lift points from the image plane to the sphere virtual void liftSphere(const Eigen::Vector2d& p, Eigen::Vector3d& P) const; //%output P // Lift points from the image plane to the projective space void liftProjective(const Eigen::Vector2d& p, Eigen::Vector3d& P) const; //%output P // Projects 3D points to the image plane (Pi function) void spaceToPlane(const Eigen::Vector3d& P, Eigen::Vector2d& p) const; //%output p // Projects 3D points to the image plane (Pi function) // and calculates jacobian void spaceToPlane(const Eigen::Vector3d& P, Eigen::Vector2d& p, Eigen::Matrix& J) const; //%output p //%output J void undistToPlane(const Eigen::Vector2d& p_u, Eigen::Vector2d& p) const; //%output p template static void spaceToPlane(const T* const params, const T* const q, const T* const t, const Eigen::Matrix& P, Eigen::Matrix& p); void distortion(const Eigen::Vector2d& p_u, Eigen::Vector2d& d_u) const; void distortion(const Eigen::Vector2d& p_u, Eigen::Vector2d& d_u, Eigen::Matrix2d& J) const; void initUndistortMap(cv::Mat& map1, cv::Mat& map2, double fScale = 1.0) const; cv::Mat initUndistortRectifyMap(cv::Mat& map1, cv::Mat& map2, float fx = -1.0f, float fy = -1.0f, 
cv::Size imageSize = cv::Size(0, 0), float cx = -1.0f, float cy = -1.0f, cv::Mat rmat = cv::Mat::eye(3, 3, CV_32F)) const; int parameterCount(void) const; const Parameters& getParameters(void) const; void setParameters(const Parameters& parameters); void readParameters(const std::vector& parameterVec); void writeParameters(std::vector& parameterVec) const; void writeParametersToYamlFile(const std::string& filename) const; std::string parametersToString(void) const; private: Parameters mParameters; double m_inv_K11, m_inv_K13, m_inv_K22, m_inv_K23; bool m_noDistortion; }; typedef boost::shared_ptr PinholeCameraPtr; typedef boost::shared_ptr PinholeCameraConstPtr; template void PinholeCamera::spaceToPlane(const T* const params, const T* const q, const T* const t, const Eigen::Matrix& P, Eigen::Matrix& p) { T P_w[3]; P_w[0] = T(P(0)); P_w[1] = T(P(1)); P_w[2] = T(P(2)); // Convert quaternion from Eigen convention (x, y, z, w) // to Ceres convention (w, x, y, z) T q_ceres[4] = {q[3], q[0], q[1], q[2]}; T P_c[3]; ceres::QuaternionRotatePoint(q_ceres, P_w, P_c); P_c[0] += t[0]; P_c[1] += t[1]; P_c[2] += t[2]; // project 3D object point to the image plane T k1 = params[0]; T k2 = params[1]; T p1 = params[2]; T p2 = params[3]; T fx = params[4]; T fy = params[5]; T alpha = T(0); //cameraParams.alpha(); T cx = params[6]; T cy = params[7]; // Transform to model plane T u = P_c[0] / P_c[2]; T v = P_c[1] / P_c[2]; T rho_sqr = u * u + v * v; T L = T(1.0) + k1 * rho_sqr + k2 * rho_sqr * rho_sqr; T du = T(2.0) * p1 * u * v + p2 * (rho_sqr + T(2.0) * u * u); T dv = p1 * (rho_sqr + T(2.0) * v * v) + T(2.0) * p2 * u * v; u = L * u + du; v = L * v + dv; p(0) = fx * (u + alpha * v) + cx; p(1) = fy * v + cy; } } #endif ================================================ FILE: camera_model/include/camodocal/camera_models/ScaramuzzaCamera.h ================================================ #ifndef SCARAMUZZACAMERA_H #define SCARAMUZZACAMERA_H #include #include #include "ceres/rotation.h" 
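`PinholeCamera::spaceToPlane` above is the standard plumb-bob projection: perspective division, radial factor `L`, tangential terms `du`/`dv`, then the focal lengths and principal point. A pure-Python sketch (illustrative function name; `alpha = 0` as in the header):

```python
def pinhole_project(P, k1, k2, p1, p2, fx, fy, cx, cy):
    """Project a camera-frame 3D point with the plumb-bob pinhole model
    (mirrors PinholeCamera::spaceToPlane with the skew alpha fixed to 0)."""
    X, Y, Z = P
    u, v = X / Z, Y / Z                       # perspective division
    r2 = u * u + v * v
    L = 1.0 + k1 * r2 + k2 * r2 * r2          # radial distortion factor
    du = 2.0 * p1 * u * v + p2 * (r2 + 2.0 * u * u)  # tangential distortion
    dv = p1 * (r2 + 2.0 * v * v) + 2.0 * p2 * u * v
    return (fx * (L * u + du) + cx, fy * (L * v + dv) + cy)
```

With all distortion coefficients at zero this is the plain pinhole map `(fx*X/Z + cx, fy*Y/Z + cy)`.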
#include "Camera.h" namespace camodocal { #define SCARAMUZZA_POLY_SIZE 5 #define SCARAMUZZA_INV_POLY_SIZE 20 #define SCARAMUZZA_CAMERA_NUM_PARAMS (SCARAMUZZA_POLY_SIZE + SCARAMUZZA_INV_POLY_SIZE + 2 /*center*/ + 3 /*affine*/) /** * Scaramuzza Camera (Omnidirectional) * https://sites.google.com/site/scarabotix/ocamcalib-toolbox */ class OCAMCamera: public Camera { public: class Parameters: public Camera::Parameters { public: Parameters(); double& C(void) { return m_C; } double& D(void) { return m_D; } double& E(void) { return m_E; } double& center_x(void) { return m_center_x; } double& center_y(void) { return m_center_y; } double& poly(int idx) { return m_poly[idx]; } double& inv_poly(int idx) { return m_inv_poly[idx]; } double C(void) const { return m_C; } double D(void) const { return m_D; } double E(void) const { return m_E; } double center_x(void) const { return m_center_x; } double center_y(void) const { return m_center_y; } double poly(int idx) const { return m_poly[idx]; } double inv_poly(int idx) const { return m_inv_poly[idx]; } bool readFromYamlFile(const std::string& filename); void writeToYamlFile(const std::string& filename) const; Parameters& operator=(const Parameters& other); friend std::ostream& operator<< (std::ostream& out, const Parameters& params); private: double m_poly[SCARAMUZZA_POLY_SIZE]; double m_inv_poly[SCARAMUZZA_INV_POLY_SIZE]; double m_C; double m_D; double m_E; double m_center_x; double m_center_y; }; OCAMCamera(); /** * \brief Constructor from the projection model parameters */ OCAMCamera(const Parameters& params); Camera::ModelType modelType(void) const; const std::string& cameraName(void) const; int imageWidth(void) const; int imageHeight(void) const; void estimateIntrinsics(const cv::Size& boardSize, const std::vector< std::vector >& objectPoints, const std::vector< std::vector >& imagePoints); // Lift points from the image plane to the sphere void liftSphere(const Eigen::Vector2d& p, Eigen::Vector3d& P) const; //%output P // 
Lift points from the image plane to the projective space void liftProjective(const Eigen::Vector2d& p, Eigen::Vector3d& P) const; //%output P // Projects 3D points to the image plane (Pi function) void spaceToPlane(const Eigen::Vector3d& P, Eigen::Vector2d& p) const; //%output p // Projects 3D points to the image plane (Pi function) // and calculates jacobian //void spaceToPlane(const Eigen::Vector3d& P, Eigen::Vector2d& p, // Eigen::Matrix& J) const; //%output p //%output J void undistToPlane(const Eigen::Vector2d& p_u, Eigen::Vector2d& p) const; //%output p template static void spaceToPlane(const T* const params, const T* const q, const T* const t, const Eigen::Matrix& P, Eigen::Matrix& p); template static void spaceToSphere(const T* const params, const T* const q, const T* const t, const Eigen::Matrix& P, Eigen::Matrix& P_s); template static void LiftToSphere(const T* const params, const Eigen::Matrix& p, Eigen::Matrix& P); template static void SphereToPlane(const T* const params, const Eigen::Matrix& P, Eigen::Matrix& p); void initUndistortMap(cv::Mat& map1, cv::Mat& map2, double fScale = 1.0) const; cv::Mat initUndistortRectifyMap(cv::Mat& map1, cv::Mat& map2, float fx = -1.0f, float fy = -1.0f, cv::Size imageSize = cv::Size(0, 0), float cx = -1.0f, float cy = -1.0f, cv::Mat rmat = cv::Mat::eye(3, 3, CV_32F)) const; int parameterCount(void) const; const Parameters& getParameters(void) const; void setParameters(const Parameters& parameters); void readParameters(const std::vector& parameterVec); void writeParameters(std::vector& parameterVec) const; void writeParametersToYamlFile(const std::string& filename) const; std::string parametersToString(void) const; private: Parameters mParameters; double m_inv_scale; }; typedef boost::shared_ptr OCAMCameraPtr; typedef boost::shared_ptr OCAMCameraConstPtr; template void OCAMCamera::spaceToPlane(const T* const params, const T* const q, const T* const t, const Eigen::Matrix& P, Eigen::Matrix& p) { T P_c[3]; { T P_w[3]; 
P_w[0] = T(P(0)); P_w[1] = T(P(1)); P_w[2] = T(P(2)); // Convert quaternion from Eigen convention (x, y, z, w) // to Ceres convention (w, x, y, z) T q_ceres[4] = {q[3], q[0], q[1], q[2]}; ceres::QuaternionRotatePoint(q_ceres, P_w, P_c); P_c[0] += t[0]; P_c[1] += t[1]; P_c[2] += t[2]; } T c = params[0]; T d = params[1]; T e = params[2]; T xc[2] = { params[3], params[4] }; //T poly[SCARAMUZZA_POLY_SIZE]; //for (int i=0; i < SCARAMUZZA_POLY_SIZE; i++) // poly[i] = params[5+i]; T inv_poly[SCARAMUZZA_INV_POLY_SIZE]; for (int i=0; i < SCARAMUZZA_INV_POLY_SIZE; i++) inv_poly[i] = params[5 + SCARAMUZZA_POLY_SIZE + i]; T norm_sqr = P_c[0] * P_c[0] + P_c[1] * P_c[1]; T norm = T(0.0); if (norm_sqr > T(0.0)) norm = sqrt(norm_sqr); T theta = atan2(-P_c[2], norm); T rho = T(0.0); T theta_i = T(1.0); for (int i = 0; i < SCARAMUZZA_INV_POLY_SIZE; i++) { rho += theta_i * inv_poly[i]; theta_i *= theta; } T invNorm = T(1.0) / norm; T xn[2] = { P_c[0] * invNorm * rho, P_c[1] * invNorm * rho }; p(0) = xn[0] * c + xn[1] * d + xc[0]; p(1) = xn[0] * e + xn[1] + xc[1]; } template void OCAMCamera::spaceToSphere(const T* const params, const T* const q, const T* const t, const Eigen::Matrix& P, Eigen::Matrix& P_s) { T P_c[3]; { T P_w[3]; P_w[0] = T(P(0)); P_w[1] = T(P(1)); P_w[2] = T(P(2)); // Convert quaternion from Eigen convention (x, y, z, w) // to Ceres convention (w, x, y, z) T q_ceres[4] = {q[3], q[0], q[1], q[2]}; ceres::QuaternionRotatePoint(q_ceres, P_w, P_c); P_c[0] += t[0]; P_c[1] += t[1]; P_c[2] += t[2]; } //T poly[SCARAMUZZA_POLY_SIZE]; //for (int i=0; i < SCARAMUZZA_POLY_SIZE; i++) // poly[i] = params[5+i]; T norm_sqr = P_c[0] * P_c[0] + P_c[1] * P_c[1] + P_c[2] * P_c[2]; T norm = T(0.0); if (norm_sqr > T(0.0)) norm = sqrt(norm_sqr); P_s(0) = P_c[0] / norm; P_s(1) = P_c[1] / norm; P_s(2) = P_c[2] / norm; } template void OCAMCamera::LiftToSphere(const T* const params, const Eigen::Matrix& p, Eigen::Matrix& P) { T c = params[0]; T d = params[1]; T e = params[2]; T cc[2] = { 
params[3], params[4] }; T poly[SCARAMUZZA_POLY_SIZE]; for (int i=0; i < SCARAMUZZA_POLY_SIZE; i++) poly[i] = params[5+i]; // Relative to Center T p_2d[2]; p_2d[0] = T(p(0)); p_2d[1] = T(p(1)); T xc[2] = { p_2d[0] - cc[0], p_2d[1] - cc[1]}; T inv_scale = T(1.0) / (c - d * e); // Affine Transformation T xc_a[2]; xc_a[0] = inv_scale * (xc[0] - d * xc[1]); xc_a[1] = inv_scale * (-e * xc[0] + c * xc[1]); T norm_sqr = xc_a[0] * xc_a[0] + xc_a[1] * xc_a[1]; T phi = sqrt(norm_sqr); T phi_i = T(1.0); T z = T(0.0); for (int i = 0; i < SCARAMUZZA_POLY_SIZE; i++) { if (i!=1) { z += phi_i * poly[i]; } phi_i *= phi; } T p_3d[3]; p_3d[0] = xc[0]; p_3d[1] = xc[1]; p_3d[2] = -z; T p_3d_norm_sqr = p_3d[0] * p_3d[0] + p_3d[1] * p_3d[1] + p_3d[2] * p_3d[2]; T p_3d_norm = sqrt(p_3d_norm_sqr); P << p_3d[0] / p_3d_norm, p_3d[1] / p_3d_norm, p_3d[2] / p_3d_norm; } template void OCAMCamera::SphereToPlane(const T* const params, const Eigen::Matrix& P, Eigen::Matrix& p) { T P_c[3]; { P_c[0] = T(P(0)); P_c[1] = T(P(1)); P_c[2] = T(P(2)); } T c = params[0]; T d = params[1]; T e = params[2]; T xc[2] = {params[3], params[4]}; T inv_poly[SCARAMUZZA_INV_POLY_SIZE]; for (int i = 0; i < SCARAMUZZA_INV_POLY_SIZE; i++) inv_poly[i] = params[5 + SCARAMUZZA_POLY_SIZE + i]; T norm_sqr = P_c[0] * P_c[0] + P_c[1] * P_c[1]; T norm = T(0.0); if (norm_sqr > T(0.0)) norm = sqrt(norm_sqr); T theta = atan2(-P_c[2], norm); T rho = T(0.0); T theta_i = T(1.0); for (int i = 0; i < SCARAMUZZA_INV_POLY_SIZE; i++) { rho += theta_i * inv_poly[i]; theta_i *= theta; } T invNorm = T(1.0) / norm; T xn[2] = {P_c[0] * invNorm * rho, P_c[1] * invNorm * rho}; p(0) = xn[0] * c + xn[1] * d + xc[0]; p(1) = xn[0] * e + xn[1] + xc[1]; } } #endif ================================================ FILE: camera_model/include/camodocal/chessboard/Chessboard.h ================================================ #ifndef CHESSBOARD_H #define CHESSBOARD_H #include #include namespace camodocal { // forward declarations class ChessboardCorner; 
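The `OCAMCamera` templates above follow Scaramuzza's convention: `spaceToPlane` and `SphereToPlane` evaluate the inverse polynomial `rho(theta)` with `theta = atan2(-Z, norm)`, then apply the affine matrix `[c d; e 1]` and the image center. A pure-Python sketch of that forward projection (illustrative function name; `norm` must be nonzero, just as in the header's unguarded division):

```python
import math

def ocam_project(P, inv_poly, c, d, e, cx, cy):
    """Project a camera-frame 3D point with the Scaramuzza omnidirectional
    model (mirrors OCAMCamera::spaceToPlane / SphereToPlane)."""
    X, Y, Z = P
    norm = math.hypot(X, Y)                   # distance from the optical axis
    theta = math.atan2(-Z, norm)              # angle measured from the image plane
    rho = sum(a * theta**i for i, a in enumerate(inv_poly))  # inverse polynomial
    xn = (X / norm * rho, Y / norm * rho)
    return (c * xn[0] + d * xn[1] + cx, e * xn[0] + xn[1] + cy)
```

The `c`, `d`, `e` affine terms absorb sensor misalignment; for an ideal sensor `c = 1`, `d = e = 0` and only `rho(theta)` and the center remain.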
typedef boost::shared_ptr ChessboardCornerPtr; class ChessboardQuad; typedef boost::shared_ptr ChessboardQuadPtr; class Chessboard { public: Chessboard(cv::Size boardSize, cv::Mat& image); void findCorners(bool useOpenCV = false); const std::vector& getCorners(void) const; bool cornersFound(void) const; const cv::Mat& getImage(void) const; const cv::Mat& getSketch(void) const; private: bool findChessboardCorners(const cv::Mat& image, const cv::Size& patternSize, std::vector& corners, int flags, bool useOpenCV); bool findChessboardCornersImproved(const cv::Mat& image, const cv::Size& patternSize, std::vector& corners, int flags); void cleanFoundConnectedQuads(std::vector& quadGroup, cv::Size patternSize); void findConnectedQuads(std::vector& quads, std::vector& group, int group_idx, int dilation); // int checkQuadGroup(std::vector& quadGroup, // std::vector& outCorners, // cv::Size patternSize); void labelQuadGroup(std::vector& quad_group, cv::Size patternSize, bool firstRun); void findQuadNeighbors(std::vector& quads, int dilation); int augmentBestRun(std::vector& candidateQuads, int candidateDilation, std::vector& existingQuads, int existingDilation); void generateQuads(std::vector& quads, cv::Mat& image, int flags, int dilation, bool firstRun); bool checkQuadGroup(std::vector& quads, std::vector& corners, cv::Size patternSize); void getQuadrangleHypotheses(const std::vector< std::vector >& contours, std::vector< std::pair >& quads, int classId) const; bool checkChessboard(const cv::Mat& image, cv::Size patternSize) const; bool checkBoardMonotony(std::vector& corners, cv::Size patternSize); bool matchCorners(ChessboardQuadPtr& quad1, int corner1, ChessboardQuadPtr& quad2, int corner2) const; cv::Mat mImage; cv::Mat mSketch; std::vector mCorners; cv::Size mBoardSize; bool mCornersFound; }; } #endif ================================================ FILE: camera_model/include/camodocal/chessboard/ChessboardCorner.h ================================================ 
#ifndef CHESSBOARDCORNER_H
#define CHESSBOARDCORNER_H

#include <boost/shared_ptr.hpp>
#include <opencv2/core/core.hpp>

namespace camodocal
{

class ChessboardCorner;
typedef boost::shared_ptr<ChessboardCorner> ChessboardCornerPtr;

class ChessboardCorner
{
public:
    ChessboardCorner() : row(0), column(0), needsNeighbor(true), count(0) {}

    float meanDist(int &n) const
    {
        float sum = 0;
        n = 0;
        for (int i = 0; i < 4; ++i)
        {
            if (neighbors[i].get())
            {
                float dx = neighbors[i]->pt.x - pt.x;
                float dy = neighbors[i]->pt.y - pt.y;
                sum += sqrt(dx * dx + dy * dy);
                n++;
            }
        }
        return sum / std::max(n, 1);
    }

    cv::Point2f pt;                   // X and y coordinates
    int row;                          // Row and column of the corner
    int column;                       // in the found pattern
    bool needsNeighbor;               // Does the corner require a neighbor?
    int count;                        // number of corner neighbors
    ChessboardCornerPtr neighbors[4]; // pointer to all corner neighbors
};

}

#endif

================================================
FILE: camera_model/include/camodocal/chessboard/ChessboardQuad.h
================================================
#ifndef CHESSBOARDQUAD_H
#define CHESSBOARDQUAD_H

#include <boost/shared_ptr.hpp>

#include "camodocal/chessboard/ChessboardCorner.h"

namespace camodocal
{

class ChessboardQuad;
typedef boost::shared_ptr<ChessboardQuad> ChessboardQuadPtr;

class ChessboardQuad
{
public:
    ChessboardQuad() : count(0), group_idx(-1), edge_len(FLT_MAX), labeled(false) {}

    int count;                      // Number of quad neighbors
    int group_idx;                  // Quad group ID
    float edge_len;                 // Smallest side length^2
    ChessboardCornerPtr corners[4]; // Coordinates of quad corners
    ChessboardQuadPtr neighbors[4]; // Pointers of quad neighbors
    bool labeled;                   // Has this corner been labeled?
}; } #endif ================================================ FILE: camera_model/include/camodocal/chessboard/Spline.h ================================================ /* dynamo:- Event driven molecular dynamics simulator http://www.marcusbannerman.co.uk/dynamo Copyright (C) 2011 Marcus N Campbell Bannerman This program is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License version 3 as published by the Free Software Foundation. This program is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU General Public License for more details. You should have received a copy of the GNU General Public License along with this program. If not, see . */ #pragma once #include #include #include #include #include #include namespace ublas = boost::numeric::ublas; class Spline : private std::vector > { public: //The boundary conditions available enum BC_type { FIXED_1ST_DERIV_BC, FIXED_2ND_DERIV_BC, PARABOLIC_RUNOUT_BC }; enum Spline_type { LINEAR, CUBIC }; //Constructor takes the boundary conditions as arguments, this //sets the first derivative (gradient) at the lower and upper //end points Spline(): _valid(false), _BCLow(FIXED_2ND_DERIV_BC), _BCHigh(FIXED_2ND_DERIV_BC), _BCLowVal(0), _BCHighVal(0), _type(CUBIC) {} typedef std::vector > base; typedef base::const_iterator const_iterator; //Standard STL read-only container stuff const_iterator begin() const { return base::begin(); } const_iterator end() const { return base::end(); } void clear() { _valid = false; base::clear(); _data.clear(); } size_t size() const { return base::size(); } size_t max_size() const { return base::max_size(); } size_t capacity() const { return base::capacity(); } bool empty() const { return base::empty(); } //Add a point to the spline, and invalidate it so its //recalculated on the next access inline void addPoint(double x, 
double y) { _valid = false; base::push_back(std::pair(x,y)); } //Reset the boundary conditions inline void setLowBC(BC_type BC, double val = 0) { _BCLow = BC; _BCLowVal = val; _valid = false; } inline void setHighBC(BC_type BC, double val = 0) { _BCHigh = BC; _BCHighVal = val; _valid = false; } void setType(Spline_type type) { _type = type; _valid = false; } //Check if the spline has been calculated, then generate the //spline interpolated value double operator()(double xval) { if (!_valid) generate(); //Special cases when we're outside the range of the spline points if (xval <= x(0)) return lowCalc(xval); if (xval >= x(size()-1)) return highCalc(xval); //Check all intervals except the last one for (std::vector::const_iterator iPtr = _data.begin(); iPtr != _data.end()-1; ++iPtr) if ((xval >= iPtr->x) && (xval <= (iPtr+1)->x)) return splineCalc(iPtr, xval); return splineCalc(_data.end() - 1, xval); } private: ///////PRIVATE DATA MEMBERS struct SplineData { double x,a,b,c,d; }; //vector of calculated spline data std::vector _data; //Second derivative at each point ublas::vector _ddy; //Tracks whether the spline parameters have been calculated for //the current set of points bool _valid; //The boundary conditions BC_type _BCLow, _BCHigh; //The values of the boundary conditions double _BCLowVal, _BCHighVal; Spline_type _type; ///////PRIVATE FUNCTIONS //Function to calculate the value of a given spline at a point xval inline double splineCalc(std::vector::const_iterator i, double xval) { const double lx = xval - i->x; return ((i->a * lx + i->b) * lx + i->c) * lx + i->d; } inline double lowCalc(double xval) { const double lx = xval - x(0); if (_type == LINEAR) return lx * _BCHighVal + y(0); const double firstDeriv = (y(1) - y(0)) / h(0) - 2 * h(0) * (_data[0].b + 2 * _data[1].b) / 6; switch(_BCLow) { case FIXED_1ST_DERIV_BC: return lx * _BCLowVal + y(0); case FIXED_2ND_DERIV_BC: return lx * lx * _BCLowVal + firstDeriv * lx + y(0); case PARABOLIC_RUNOUT_BC: return lx * lx 
* _ddy[0] + lx * firstDeriv + y(0); } throw std::runtime_error("Unknown BC"); } inline double highCalc(double xval) { const double lx = xval - x(size() - 1); if (_type == LINEAR) return lx * _BCHighVal + y(size() - 1); const double firstDeriv = 2 * h(size() - 2) * (_ddy[size() - 2] + 2 * _ddy[size() - 1]) / 6 + (y(size() - 1) - y(size() - 2)) / h(size() - 2); switch(_BCHigh) { case FIXED_1ST_DERIV_BC: return lx * _BCHighVal + y(size() - 1); case FIXED_2ND_DERIV_BC: return lx * lx * _BCHighVal + firstDeriv * lx + y(size() - 1); case PARABOLIC_RUNOUT_BC: return lx * lx * _ddy[size()-1] + lx * firstDeriv + y(size() - 1); } throw std::runtime_error("Unknown BC"); } //These just provide access to the point data in a clean way inline double x(size_t i) const { return operator[](i).first; } inline double y(size_t i) const { return operator[](i).second; } inline double h(size_t i) const { return x(i+1) - x(i); } //Invert a arbitrary matrix using the boost ublas library template bool InvertMatrix(ublas::matrix A, ublas::matrix& inverse) { using namespace ublas; // create a permutation matrix for the LU-factorization permutation_matrix pm(A.size1()); // perform LU-factorization int res = lu_factorize(A,pm); if( res != 0 ) return false; // create identity matrix of "inverse" inverse.assign(ublas::identity_matrix(A.size1())); // backsubstitute to get the inverse lu_substitute(A, pm, inverse); return true; } //This function will recalculate the spline parameters and store //them in _data, ready for spline interpolation void generate() { if (size() < 2) throw std::runtime_error("Spline requires at least 2 points"); //If any spline points are at the same x location, we have to //just slightly seperate them { bool testPassed(false); while (!testPassed) { testPassed = true; std::sort(base::begin(), base::end()); for (base::iterator iPtr = base::begin(); iPtr != base::end() - 1; ++iPtr) if (iPtr->first == (iPtr+1)->first) { if ((iPtr+1)->first != 0) (iPtr+1)->first += 
(iPtr+1)->first * std::numeric_limits::epsilon() * 10; else (iPtr+1)->first = std::numeric_limits::epsilon() * 10; testPassed = false; break; } } } const size_t e = size() - 1; switch (_type) { case LINEAR: { _data.resize(e); for (size_t i(0); i < e; ++i) { _data[i].x = x(i); _data[i].a = 0; _data[i].b = 0; _data[i].c = (y(i+1) - y(i)) / (x(i+1) - x(i)); _data[i].d = y(i); } break; } case CUBIC: { ublas::matrix A(size(), size()); for (size_t yv(0); yv <= e; ++yv) for (size_t xv(0); xv <= e; ++xv) A(xv,yv) = 0; for (size_t i(1); i < e; ++i) { A(i-1,i) = h(i-1); A(i,i) = 2 * (h(i-1) + h(i)); A(i+1,i) = h(i); } ublas::vector C(size()); for (size_t xv(0); xv <= e; ++xv) C(xv) = 0; for (size_t i(1); i < e; ++i) C(i) = 6 * ((y(i+1) - y(i)) / h(i) - (y(i) - y(i-1)) / h(i-1)); //Boundary conditions switch(_BCLow) { case FIXED_1ST_DERIV_BC: C(0) = 6 * ((y(1) - y(0)) / h(0) - _BCLowVal); A(0,0) = 2 * h(0); A(1,0) = h(0); break; case FIXED_2ND_DERIV_BC: C(0) = _BCLowVal; A(0,0) = 1; break; case PARABOLIC_RUNOUT_BC: C(0) = 0; A(0,0) = 1; A(1,0) = -1; break; } switch(_BCHigh) { case FIXED_1ST_DERIV_BC: C(e) = 6 * (_BCHighVal - (y(e) - y(e-1)) / h(e-1)); A(e,e) = 2 * h(e - 1); A(e-1,e) = h(e - 1); break; case FIXED_2ND_DERIV_BC: C(e) = _BCHighVal; A(e,e) = 1; break; case PARABOLIC_RUNOUT_BC: C(e) = 0; A(e,e) = 1; A(e-1,e) = -1; break; } ublas::matrix AInv(size(), size()); InvertMatrix(A,AInv); _ddy = ublas::prod(C, AInv); _data.resize(size()-1); for (size_t i(0); i < e; ++i) { _data[i].x = x(i); _data[i].a = (_ddy(i+1) - _ddy(i)) / (6 * h(i)); _data[i].b = _ddy(i) / 2; _data[i].c = (y(i+1) - y(i)) / h(i) - _ddy(i+1) * h(i) / 6 - _ddy(i) * h(i) / 3; _data[i].d = y(i); } } } _valid = true; } }; ================================================ FILE: camera_model/include/camodocal/gpl/EigenQuaternionParameterization.h ================================================ #ifndef EIGENQUATERNIONPARAMETERIZATION_H #define EIGENQUATERNIONPARAMETERIZATION_H #include 
"ceres/local_parameterization.h" namespace camodocal { class EigenQuaternionParameterization : public ceres::LocalParameterization { public: virtual ~EigenQuaternionParameterization() {} virtual bool Plus(const double* x, const double* delta, double* x_plus_delta) const; virtual bool ComputeJacobian(const double* x, double* jacobian) const; virtual int GlobalSize() const { return 4; } virtual int LocalSize() const { return 3; } private: template void EigenQuaternionProduct(const T z[4], const T w[4], T zw[4]) const; }; template void EigenQuaternionParameterization::EigenQuaternionProduct(const T z[4], const T w[4], T zw[4]) const { zw[0] = z[3] * w[0] + z[0] * w[3] + z[1] * w[2] - z[2] * w[1]; zw[1] = z[3] * w[1] - z[0] * w[2] + z[1] * w[3] + z[2] * w[0]; zw[2] = z[3] * w[2] + z[0] * w[1] - z[1] * w[0] + z[2] * w[3]; zw[3] = z[3] * w[3] - z[0] * w[0] - z[1] * w[1] - z[2] * w[2]; } } #endif ================================================ FILE: camera_model/include/camodocal/gpl/EigenUtils.h ================================================ #ifndef EIGENUTILS_H #define EIGENUTILS_H #include #include "ceres/rotation.h" #include "camodocal/gpl/gpl.h" namespace camodocal { // Returns the 3D cross product skew symmetric matrix of a given 3D vector template Eigen::Matrix skew(const Eigen::Matrix& vec) { return (Eigen::Matrix() << T(0), -vec(2), vec(1), vec(2), T(0), -vec(0), -vec(1), vec(0), T(0)).finished(); } template typename Eigen::MatrixBase::PlainObject sqrtm(const Eigen::MatrixBase& A) { Eigen::SelfAdjointEigenSolver es(A); return es.operatorSqrt(); } template Eigen::Matrix AngleAxisToRotationMatrix(const Eigen::Matrix& rvec) { T angle = rvec.norm(); if (angle == T(0)) { return Eigen::Matrix::Identity(); } Eigen::Matrix axis; axis = rvec.normalized(); Eigen::Matrix rmat; rmat = Eigen::AngleAxis(angle, axis); return rmat; } template Eigen::Quaternion AngleAxisToQuaternion(const Eigen::Matrix& rvec) { Eigen::Matrix rmat = AngleAxisToRotationMatrix(rvec); return 
Eigen::Quaternion(rmat); } template void AngleAxisToQuaternion(const Eigen::Matrix& rvec, T* q) { Eigen::Quaternion quat = AngleAxisToQuaternion(rvec); q[0] = quat.x(); q[1] = quat.y(); q[2] = quat.z(); q[3] = quat.w(); } template Eigen::Matrix RotationToAngleAxis(const Eigen::Matrix & rmat) { Eigen::AngleAxis angleaxis; angleaxis.fromRotationMatrix(rmat); return angleaxis.angle() * angleaxis.axis(); } template void QuaternionToAngleAxis(const T* const q, Eigen::Matrix& rvec) { Eigen::Quaternion quat(q[3], q[0], q[1], q[2]); Eigen::Matrix rmat = quat.toRotationMatrix(); Eigen::AngleAxis angleaxis; angleaxis.fromRotationMatrix(rmat); rvec = angleaxis.angle() * angleaxis.axis(); } template Eigen::Matrix QuaternionToRotation(const T* const q) { T R[9]; ceres::QuaternionToRotation(q, R); Eigen::Matrix rmat; for (int i = 0; i < 3; ++i) { for (int j = 0; j < 3; ++j) { rmat(i,j) = R[i * 3 + j]; } } return rmat; } template void QuaternionToRotation(const T* const q, T* rot) { ceres::QuaternionToRotation(q, rot); } template Eigen::Matrix QuaternionMultMatLeft(const Eigen::Quaternion& q) { return (Eigen::Matrix() << q.w(), -q.z(), q.y(), q.x(), q.z(), q.w(), -q.x(), q.y(), -q.y(), q.x(), q.w(), q.z(), -q.x(), -q.y(), -q.z(), q.w()).finished(); } template Eigen::Matrix QuaternionMultMatRight(const Eigen::Quaternion& q) { return (Eigen::Matrix() << q.w(), q.z(), -q.y(), q.x(), -q.z(), q.w(), q.x(), q.y(), q.y(), -q.x(), q.w(), q.z(), -q.x(), -q.y(), -q.z(), q.w()).finished(); } /// @param theta - rotation about screw axis /// @param d - projection of tvec on the rotation axis /// @param l - screw axis direction /// @param m - screw axis moment template void AngleAxisAndTranslationToScrew(const Eigen::Matrix& rvec, const Eigen::Matrix& tvec, T& theta, T& d, Eigen::Matrix& l, Eigen::Matrix& m) { theta = rvec.norm(); if (theta == 0) { l.setZero(); m.setZero(); std::cout << "Warning: Undefined screw! Returned 0. 
" << std::endl; } l = rvec.normalized(); Eigen::Matrix t = tvec; d = t.transpose() * l; // point on screw axis - projection of origin on screw axis Eigen::Matrix c; c = 0.5 * (t - d * l + (1.0 / tan(theta / 2.0) * l).cross(t)); // c and hence the screw axis is not defined if theta is either 0 or M_PI m = c.cross(l); } template Eigen::Matrix RPY2mat(T roll, T pitch, T yaw) { Eigen::Matrix m; T cr = cos(roll); T sr = sin(roll); T cp = cos(pitch); T sp = sin(pitch); T cy = cos(yaw); T sy = sin(yaw); m(0,0) = cy * cp; m(0,1) = cy * sp * sr - sy * cr; m(0,2) = cy * sp * cr + sy * sr; m(1,0) = sy * cp; m(1,1) = sy * sp * sr + cy * cr; m(1,2) = sy * sp * cr - cy * sr; m(2,0) = - sp; m(2,1) = cp * sr; m(2,2) = cp * cr; return m; } template void mat2RPY(const Eigen::Matrix& m, T& roll, T& pitch, T& yaw) { roll = atan2(m(2,1), m(2,2)); pitch = atan2(-m(2,0), sqrt(m(2,1) * m(2,1) + m(2,2) * m(2,2))); yaw = atan2(m(1,0), m(0,0)); } template Eigen::Matrix homogeneousTransform(const Eigen::Matrix& R, const Eigen::Matrix& t) { Eigen::Matrix H; H.setIdentity(); H.block(0,0,3,3) = R; H.block(0,3,3,1) = t; return H; } template Eigen::Matrix poseWithCartesianTranslation(const T* const q, const T* const p) { Eigen::Matrix pose = Eigen::Matrix::Identity(); T rotation[9]; ceres::QuaternionToRotation(q, rotation); for (int i = 0; i < 3; ++i) { for (int j = 0; j < 3; ++j) { pose(i,j) = rotation[i * 3 + j]; } } pose(0,3) = p[0]; pose(1,3) = p[1]; pose(2,3) = p[2]; return pose; } template Eigen::Matrix poseWithSphericalTranslation(const T* const q, const T* const p, const T scale = T(1.0)) { Eigen::Matrix pose = Eigen::Matrix::Identity(); T rotation[9]; ceres::QuaternionToRotation(q, rotation); for (int i = 0; i < 3; ++i) { for (int j = 0; j < 3; ++j) { pose(i,j) = rotation[i * 3 + j]; } } T theta = p[0]; T phi = p[1]; pose(0,3) = sin(theta) * cos(phi) * scale; pose(1,3) = sin(theta) * sin(phi) * scale; pose(2,3) = cos(theta) * scale; return pose; } // Returns the Sampson error of a given 
essential matrix and 2 image points template T sampsonError(const Eigen::Matrix& E, const Eigen::Matrix& p1, const Eigen::Matrix& p2) { Eigen::Matrix Ex1 = E * p1; Eigen::Matrix Etx2 = E.transpose() * p2; T x2tEx1 = p2.dot(Ex1); // compute Sampson error T err = square(x2tEx1) / (square(Ex1(0,0)) + square(Ex1(1,0)) + square(Etx2(0,0)) + square(Etx2(1,0))); return err; } // Returns the Sampson error of a given rotation/translation and 2 image points template T sampsonError(const Eigen::Matrix& R, const Eigen::Matrix& t, const Eigen::Matrix& p1, const Eigen::Matrix& p2) { // construct essential matrix Eigen::Matrix E = skew(t) * R; Eigen::Matrix Ex1 = E * p1; Eigen::Matrix Etx2 = E.transpose() * p2; T x2tEx1 = p2.dot(Ex1); // compute Sampson error T err = square(x2tEx1) / (square(Ex1(0,0)) + square(Ex1(1,0)) + square(Etx2(0,0)) + square(Etx2(1,0))); return err; } // Returns the Sampson error of a given rotation/translation and 2 image points template T sampsonError(const Eigen::Matrix& H, const Eigen::Matrix& p1, const Eigen::Matrix& p2) { Eigen::Matrix R = H.block(0, 0, 3, 3); Eigen::Matrix t = H.block(0, 3, 3, 1); return sampsonError(R, t, p1, p2); } template Eigen::Matrix transformPoint(const Eigen::Matrix& H, const Eigen::Matrix& P) { Eigen::Matrix P_trans = H.block(0, 0, 3, 3) * P + H.block(0, 3, 3, 1); return P_trans; } template Eigen::Matrix estimate3DRigidTransform(const std::vector, Eigen::aligned_allocator > >& points1, const std::vector, Eigen::aligned_allocator > >& points2) { // compute centroids Eigen::Matrix c1, c2; c1.setZero(); c2.setZero(); for (size_t i = 0; i < points1.size(); ++i) { c1 += points1.at(i); c2 += points2.at(i); } c1 /= points1.size(); c2 /= points1.size(); Eigen::Matrix X(3, points1.size()); Eigen::Matrix Y(3, points1.size()); for (size_t i = 0; i < points1.size(); ++i) { X.col(i) = points1.at(i) - c1; Y.col(i) = points2.at(i) - c2; } Eigen::Matrix H = X * Y.transpose(); Eigen::JacobiSVD< Eigen::Matrix > svd(H, Eigen::ComputeFullU | 
Eigen::ComputeFullV); Eigen::Matrix U = svd.matrixU(); Eigen::Matrix V = svd.matrixV(); if (U.determinant() * V.determinant() < 0.0) { V.col(2) *= -1.0; } Eigen::Matrix R = V * U.transpose(); Eigen::Matrix t = c2 - R * c1; return homogeneousTransform(R, t); } template Eigen::Matrix estimate3DRigidSimilarityTransform(const std::vector, Eigen::aligned_allocator > >& points1, const std::vector, Eigen::aligned_allocator > >& points2) { // compute centroids Eigen::Matrix c1, c2; c1.setZero(); c2.setZero(); for (size_t i = 0; i < points1.size(); ++i) { c1 += points1.at(i); c2 += points2.at(i); } c1 /= points1.size(); c2 /= points1.size(); Eigen::Matrix X(3, points1.size()); Eigen::Matrix Y(3, points1.size()); for (size_t i = 0; i < points1.size(); ++i) { X.col(i) = points1.at(i) - c1; Y.col(i) = points2.at(i) - c2; } Eigen::Matrix H = X * Y.transpose(); Eigen::JacobiSVD< Eigen::Matrix > svd(H, Eigen::ComputeFullU | Eigen::ComputeFullV); Eigen::Matrix U = svd.matrixU(); Eigen::Matrix V = svd.matrixV(); if (U.determinant() * V.determinant() < 0.0) { V.col(2) *= -1.0; } Eigen::Matrix R = V * U.transpose(); std::vector, Eigen::aligned_allocator > > rotatedPoints1(points1.size()); for (size_t i = 0; i < points1.size(); ++i) { rotatedPoints1.at(i) = R * (points1.at(i) - c1); } double sum_ss = 0.0, sum_tt = 0.0; for (size_t i = 0; i < points1.size(); ++i) { sum_ss += (points1.at(i) - c1).squaredNorm(); sum_tt += (points2.at(i) - c2).dot(rotatedPoints1.at(i)); } double scale = sum_tt / sum_ss; Eigen::Matrix sR = scale * R; Eigen::Matrix t = c2 - sR * c1; return homogeneousTransform(sR, t); } } #endif ================================================ FILE: camera_model/include/camodocal/gpl/gpl.h ================================================ #ifndef GPL_H #define GPL_H #include #include #include namespace camodocal { template const T clamp(const T& v, const T& a, const T& b) { return std::min(b, std::max(a, v)); } double hypot3(double x, double y, double z); float hypot3f(float 
x, float y, float z); template const T normalizeTheta(const T& theta) { T normTheta = theta; while (normTheta < - M_PI) { normTheta += 2.0 * M_PI; } while (normTheta > M_PI) { normTheta -= 2.0 * M_PI; } return normTheta; } double d2r(double deg); float d2r(float deg); double r2d(double rad); float r2d(float rad); double sinc(double theta); template const T square(const T& x) { return x * x; } template const T cube(const T& x) { return x * x * x; } template const T random(const T& a, const T& b) { return static_cast(rand()) / RAND_MAX * (b - a) + a; } template const T randomNormal(const T& sigma) { T x1, x2, w; do { x1 = 2.0 * random(0.0, 1.0) - 1.0; x2 = 2.0 * random(0.0, 1.0) - 1.0; w = x1 * x1 + x2 * x2; } while (w >= 1.0 || w == 0.0); w = sqrt((-2.0 * log(w)) / w); return x1 * w * sigma; } unsigned long long timeInMicroseconds(void); double timeInSeconds(void); void colorDepthImage(cv::Mat& imgDepth, cv::Mat& imgColoredDepth, float minRange, float maxRange); bool colormap(const std::string& name, unsigned char idx, float& r, float& g, float& b); std::vector bresLine(int x0, int y0, int x1, int y1); std::vector bresCircle(int x0, int y0, int r); void fitCircle(const std::vector& points, double& centerX, double& centerY, double& radius); std::vector intersectCircles(double x1, double y1, double r1, double x2, double y2, double r2); void LLtoUTM(double latitude, double longitude, double& utmNorthing, double& utmEasting, std::string& utmZone); void UTMtoLL(double utmNorthing, double utmEasting, const std::string& utmZone, double& latitude, double& longitude); long int timestampDiff(uint64_t t1, uint64_t t2); } #endif ================================================ FILE: camera_model/include/camodocal/sparse_graph/Transform.h ================================================ #ifndef TRANSFORM_H #define TRANSFORM_H #include #include #include namespace camodocal { class Transform { public: EIGEN_MAKE_ALIGNED_OPERATOR_NEW Transform(); Transform(const Eigen::Matrix4d& 
H); Eigen::Quaterniond& rotation(void); const Eigen::Quaterniond& rotation(void) const; double* rotationData(void); const double* const rotationData(void) const; Eigen::Vector3d& translation(void); const Eigen::Vector3d& translation(void) const; double* translationData(void); const double* const translationData(void) const; Eigen::Matrix4d toMatrix(void) const; private: Eigen::Quaterniond m_q; Eigen::Vector3d m_t; }; } #endif ================================================ FILE: camera_model/instruction ================================================ rosrun camera_model Calibration -w 8 -h 11 -s 70 -i ~/bag/PX/calib/ ================================================ FILE: camera_model/package.xml ================================================ camera_model 0.0.0 The camera_model package lionel GPLv3 catkin roscpp std_msgs roscpp std_msgs ================================================ FILE: camera_model/readme.md ================================================ Part of [camodocal](https://github.com/hengli/camodocal). [Google Ceres](http://ceres-solver.org) is needed. # Calibration: Use [intrinsic_calib.cc](https://github.com/dvorak0/camera_model/blob/master/src/intrinsic_calib.cc) to calibrate your camera. # Undistortion: See [Camera.h](https://github.com/dvorak0/camera_model/blob/master/include/camodocal/camera_models/Camera.h) for the general interface: - liftProjective: Lifts points from the image plane to the projective space. 
- spaceToPlane: Projects 3D points to the image plane (Pi function) ================================================ FILE: camera_model/src/calib/CameraCalibration.cc ================================================ #include "camodocal/calib/CameraCalibration.h" #include #include #include #include #include #include #include #include #include #include #include "camodocal/camera_models/CameraFactory.h" #include "camodocal/sparse_graph/Transform.h" #include "camodocal/gpl/EigenQuaternionParameterization.h" #include "camodocal/gpl/EigenUtils.h" #include "camodocal/camera_models/CostFunctionFactory.h" #include "ceres/ceres.h" namespace camodocal { CameraCalibration::CameraCalibration() : m_boardSize(cv::Size(0,0)) , m_squareSize(0.0f) , m_verbose(false) { } CameraCalibration::CameraCalibration(const Camera::ModelType modelType, const std::string& cameraName, const cv::Size& imageSize, const cv::Size& boardSize, float squareSize) : m_boardSize(boardSize) , m_squareSize(squareSize) , m_verbose(false) { m_camera = CameraFactory::instance()->generateCamera(modelType, cameraName, imageSize); } void CameraCalibration::clear(void) { m_imagePoints.clear(); m_scenePoints.clear(); } void CameraCalibration::addChessboardData(const std::vector& corners) { m_imagePoints.push_back(corners); std::vector scenePointsInView; for (int i = 0; i < m_boardSize.height; ++i) { for (int j = 0; j < m_boardSize.width; ++j) { scenePointsInView.push_back(cv::Point3f(i * m_squareSize, j * m_squareSize, 0.0)); } } m_scenePoints.push_back(scenePointsInView); } bool CameraCalibration::calibrate(void) { int imageCount = m_imagePoints.size(); // compute intrinsic camera parameters and extrinsic parameters for each of the views std::vector rvecs; std::vector tvecs; bool ret = calibrateHelper(m_camera, rvecs, tvecs); m_cameraPoses = cv::Mat(imageCount, 6, CV_64F); for (int i = 0; i < imageCount; ++i) { m_cameraPoses.at(i,0) = rvecs.at(i).at(0); m_cameraPoses.at(i,1) = rvecs.at(i).at(1); 
m_cameraPoses.at(i,2) = rvecs.at(i).at(2); m_cameraPoses.at(i,3) = tvecs.at(i).at(0); m_cameraPoses.at(i,4) = tvecs.at(i).at(1); m_cameraPoses.at(i,5) = tvecs.at(i).at(2); } // Compute measurement covariance. std::vector > errVec(m_imagePoints.size()); Eigen::Vector2d errSum = Eigen::Vector2d::Zero(); size_t errCount = 0; for (size_t i = 0; i < m_imagePoints.size(); ++i) { std::vector estImagePoints; m_camera->projectPoints(m_scenePoints.at(i), rvecs.at(i), tvecs.at(i), estImagePoints); for (size_t j = 0; j < m_imagePoints.at(i).size(); ++j) { cv::Point2f pObs = m_imagePoints.at(i).at(j); cv::Point2f pEst = estImagePoints.at(j); cv::Point2f err = pObs - pEst; errVec.at(i).push_back(err); errSum += Eigen::Vector2d(err.x, err.y); } errCount += m_imagePoints.at(i).size(); } Eigen::Vector2d errMean = errSum / static_cast(errCount); Eigen::Matrix2d measurementCovariance = Eigen::Matrix2d::Zero(); for (size_t i = 0; i < errVec.size(); ++i) { for (size_t j = 0; j < errVec.at(i).size(); ++j) { cv::Point2f err = errVec.at(i).at(j); double d0 = err.x - errMean(0); double d1 = err.y - errMean(1); measurementCovariance(0,0) += d0 * d0; measurementCovariance(0,1) += d0 * d1; measurementCovariance(1,1) += d1 * d1; } } measurementCovariance /= static_cast(errCount); measurementCovariance(1,0) = measurementCovariance(0,1); m_measurementCovariance = measurementCovariance; return ret; } int CameraCalibration::sampleCount(void) const { return m_imagePoints.size(); } std::vector >& CameraCalibration::imagePoints(void) { return m_imagePoints; } const std::vector >& CameraCalibration::imagePoints(void) const { return m_imagePoints; } std::vector >& CameraCalibration::scenePoints(void) { return m_scenePoints; } const std::vector >& CameraCalibration::scenePoints(void) const { return m_scenePoints; } CameraPtr& CameraCalibration::camera(void) { return m_camera; } const CameraConstPtr CameraCalibration::camera(void) const { return m_camera; } Eigen::Matrix2d& 
CameraCalibration::measurementCovariance(void) { return m_measurementCovariance; } const Eigen::Matrix2d& CameraCalibration::measurementCovariance(void) const { return m_measurementCovariance; } cv::Mat& CameraCalibration::cameraPoses(void) { return m_cameraPoses; } const cv::Mat& CameraCalibration::cameraPoses(void) const { return m_cameraPoses; } void CameraCalibration::drawResults(std::vector& images) const { std::vector rvecs, tvecs; for (size_t i = 0; i < images.size(); ++i) { cv::Mat rvec(3, 1, CV_64F); rvec.at(0) = m_cameraPoses.at(i,0); rvec.at(1) = m_cameraPoses.at(i,1); rvec.at(2) = m_cameraPoses.at(i,2); cv::Mat tvec(3, 1, CV_64F); tvec.at(0) = m_cameraPoses.at(i,3); tvec.at(1) = m_cameraPoses.at(i,4); tvec.at(2) = m_cameraPoses.at(i,5); rvecs.push_back(rvec); tvecs.push_back(tvec); } int drawShiftBits = 4; int drawMultiplier = 1 << drawShiftBits; cv::Scalar green(0, 255, 0); cv::Scalar red(0, 0, 255); for (size_t i = 0; i < images.size(); ++i) { cv::Mat& image = images.at(i); if (image.channels() == 1) { cv::cvtColor(image, image, CV_GRAY2RGB); } std::vector estImagePoints; m_camera->projectPoints(m_scenePoints.at(i), rvecs.at(i), tvecs.at(i), estImagePoints); float errorSum = 0.0f; float errorMax = std::numeric_limits::min(); for (size_t j = 0; j < m_imagePoints.at(i).size(); ++j) { cv::Point2f pObs = m_imagePoints.at(i).at(j); cv::Point2f pEst = estImagePoints.at(j); cv::circle(image, cv::Point(cvRound(pObs.x * drawMultiplier), cvRound(pObs.y * drawMultiplier)), 5, green, 2, CV_AA, drawShiftBits); cv::circle(image, cv::Point(cvRound(pEst.x * drawMultiplier), cvRound(pEst.y * drawMultiplier)), 5, red, 2, CV_AA, drawShiftBits); float error = cv::norm(pObs - pEst); errorSum += error; if (error > errorMax) { errorMax = error; } } std::ostringstream oss; oss << "Reprojection error: avg = " << errorSum / m_imagePoints.at(i).size() << " max = " << errorMax; cv::putText(image, oss.str(), cv::Point(10, image.rows - 10), cv::FONT_HERSHEY_COMPLEX, 0.5, 
cv::Scalar(255, 255, 255), 1, CV_AA); } } void CameraCalibration::writeParams(const std::string& filename) const { m_camera->writeParametersToYamlFile(filename); } bool CameraCalibration::writeChessboardData(const std::string& filename) const { std::ofstream ofs(filename.c_str(), std::ios::out | std::ios::binary); if (!ofs.is_open()) { return false; } writeData(ofs, m_boardSize.width); writeData(ofs, m_boardSize.height); writeData(ofs, m_squareSize); writeData(ofs, m_measurementCovariance(0,0)); writeData(ofs, m_measurementCovariance(0,1)); writeData(ofs, m_measurementCovariance(1,0)); writeData(ofs, m_measurementCovariance(1,1)); writeData(ofs, m_cameraPoses.rows); writeData(ofs, m_cameraPoses.cols); writeData(ofs, m_cameraPoses.type()); for (int i = 0; i < m_cameraPoses.rows; ++i) { for (int j = 0; j < m_cameraPoses.cols; ++j) { writeData(ofs, m_cameraPoses.at(i,j)); } } writeData(ofs, m_imagePoints.size()); for (size_t i = 0; i < m_imagePoints.size(); ++i) { writeData(ofs, m_imagePoints.at(i).size()); for (size_t j = 0; j < m_imagePoints.at(i).size(); ++j) { const cv::Point2f& ipt = m_imagePoints.at(i).at(j); writeData(ofs, ipt.x); writeData(ofs, ipt.y); } } writeData(ofs, m_scenePoints.size()); for (size_t i = 0; i < m_scenePoints.size(); ++i) { writeData(ofs, m_scenePoints.at(i).size()); for (size_t j = 0; j < m_scenePoints.at(i).size(); ++j) { const cv::Point3f& spt = m_scenePoints.at(i).at(j); writeData(ofs, spt.x); writeData(ofs, spt.y); writeData(ofs, spt.z); } } return true; } bool CameraCalibration::readChessboardData(const std::string& filename) { std::ifstream ifs(filename.c_str(), std::ios::in | std::ios::binary); if (!ifs.is_open()) { return false; } readData(ifs, m_boardSize.width); readData(ifs, m_boardSize.height); readData(ifs, m_squareSize); readData(ifs, m_measurementCovariance(0,0)); readData(ifs, m_measurementCovariance(0,1)); readData(ifs, m_measurementCovariance(1,0)); readData(ifs, m_measurementCovariance(1,1)); int rows, cols, type; 
readData(ifs, rows); readData(ifs, cols); readData(ifs, type); m_cameraPoses = cv::Mat(rows, cols, type); for (int i = 0; i < m_cameraPoses.rows; ++i) { for (int j = 0; j < m_cameraPoses.cols; ++j) { readData(ifs, m_cameraPoses.at(i,j)); } } size_t nImagePointSets; readData(ifs, nImagePointSets); m_imagePoints.clear(); m_imagePoints.resize(nImagePointSets); for (size_t i = 0; i < m_imagePoints.size(); ++i) { size_t nImagePoints; readData(ifs, nImagePoints); m_imagePoints.at(i).resize(nImagePoints); for (size_t j = 0; j < m_imagePoints.at(i).size(); ++j) { cv::Point2f& ipt = m_imagePoints.at(i).at(j); readData(ifs, ipt.x); readData(ifs, ipt.y); } } size_t nScenePointSets; readData(ifs, nScenePointSets); m_scenePoints.clear(); m_scenePoints.resize(nScenePointSets); for (size_t i = 0; i < m_scenePoints.size(); ++i) { size_t nScenePoints; readData(ifs, nScenePoints); m_scenePoints.at(i).resize(nScenePoints); for (size_t j = 0; j < m_scenePoints.at(i).size(); ++j) { cv::Point3f& spt = m_scenePoints.at(i).at(j); readData(ifs, spt.x); readData(ifs, spt.y); readData(ifs, spt.z); } } return true; } void CameraCalibration::setVerbose(bool verbose) { m_verbose = verbose; } bool CameraCalibration::calibrateHelper(CameraPtr& camera, std::vector& rvecs, std::vector& tvecs) const { rvecs.assign(m_scenePoints.size(), cv::Mat()); tvecs.assign(m_scenePoints.size(), cv::Mat()); // STEP 1: Estimate intrinsics camera->estimateIntrinsics(m_boardSize, m_scenePoints, m_imagePoints); // STEP 2: Estimate extrinsics for (size_t i = 0; i < m_scenePoints.size(); ++i) { camera->estimateExtrinsics(m_scenePoints.at(i), m_imagePoints.at(i), rvecs.at(i), tvecs.at(i)); } if (m_verbose) { std::cout << "[" << camera->cameraName() << "] " << "# INFO: " << "Initial reprojection error: " << std::fixed << std::setprecision(3) << camera->reprojectionError(m_scenePoints, m_imagePoints, rvecs, tvecs) << " pixels" << std::endl; } // STEP 3: optimization using ceres optimize(camera, rvecs, tvecs); if 
(m_verbose) { double err = camera->reprojectionError(m_scenePoints, m_imagePoints, rvecs, tvecs); std::cout << "[" << camera->cameraName() << "] " << "# INFO: Final reprojection error: " << err << " pixels" << std::endl; std::cout << "[" << camera->cameraName() << "] " << "# INFO: " << camera->parametersToString() << std::endl; } return true; } void CameraCalibration::optimize(CameraPtr& camera, std::vector& rvecs, std::vector& tvecs) const { // Use ceres to do optimization ceres::Problem problem; std::vector > transformVec(rvecs.size()); for (size_t i = 0; i < rvecs.size(); ++i) { Eigen::Vector3d rvec; cv::cv2eigen(rvecs.at(i), rvec); transformVec.at(i).rotation() = Eigen::AngleAxisd(rvec.norm(), rvec.normalized()); transformVec.at(i).translation() << tvecs[i].at(0), tvecs[i].at(1), tvecs[i].at(2); } std::vector intrinsicCameraParams; m_camera->writeParameters(intrinsicCameraParams); // create residuals for each observation for (size_t i = 0; i < m_imagePoints.size(); ++i) { for (size_t j = 0; j < m_imagePoints.at(i).size(); ++j) { const cv::Point3f& spt = m_scenePoints.at(i).at(j); const cv::Point2f& ipt = m_imagePoints.at(i).at(j); ceres::CostFunction* costFunction = CostFunctionFactory::instance()->generateCostFunction(camera, Eigen::Vector3d(spt.x, spt.y, spt.z), Eigen::Vector2d(ipt.x, ipt.y), CAMERA_INTRINSICS | CAMERA_POSE); ceres::LossFunction* lossFunction = new ceres::CauchyLoss(1.0); problem.AddResidualBlock(costFunction, lossFunction, intrinsicCameraParams.data(), transformVec.at(i).rotationData(), transformVec.at(i).translationData()); } ceres::LocalParameterization* quaternionParameterization = new EigenQuaternionParameterization; problem.SetParameterization(transformVec.at(i).rotationData(), quaternionParameterization); } std::cout << "begin ceres" << std::endl; ceres::Solver::Options options; options.max_num_iterations = 1000; options.num_threads = 1; if (m_verbose) { options.minimizer_progress_to_stdout = true; } ceres::Solver::Summary summary; 
ceres::Solve(options, &problem, &summary); std::cout << "end ceres" << std::endl; if (m_verbose) { std::cout << summary.FullReport() << std::endl; } camera->readParameters(intrinsicCameraParams); for (size_t i = 0; i < rvecs.size(); ++i) { Eigen::AngleAxisd aa(transformVec.at(i).rotation()); Eigen::Vector3d rvec = aa.angle() * aa.axis(); cv::eigen2cv(rvec, rvecs.at(i)); cv::Mat& tvec = tvecs.at(i); tvec.at(0) = transformVec.at(i).translation()(0); tvec.at(1) = transformVec.at(i).translation()(1); tvec.at(2) = transformVec.at(i).translation()(2); } } template void CameraCalibration::readData(std::ifstream& ifs, T& data) const { char* buffer = new char[sizeof(T)]; ifs.read(buffer, sizeof(T)); data = *(reinterpret_cast(buffer)); delete[] buffer; } template void CameraCalibration::writeData(std::ofstream& ofs, T data) const { char* pData = reinterpret_cast(&data); ofs.write(pData, sizeof(T)); } } ================================================ FILE: camera_model/src/camera_models/Camera.cc ================================================ #include "camodocal/camera_models/Camera.h" #include "camodocal/camera_models/ScaramuzzaCamera.h" #include namespace camodocal { Camera::Parameters::Parameters(ModelType modelType) : m_modelType(modelType) , m_imageWidth(0) , m_imageHeight(0) { switch (modelType) { case KANNALA_BRANDT: m_nIntrinsics = 8; break; case PINHOLE: m_nIntrinsics = 8; break; case SCARAMUZZA: m_nIntrinsics = SCARAMUZZA_CAMERA_NUM_PARAMS; break; case MEI: default: m_nIntrinsics = 9; } } Camera::Parameters::Parameters(ModelType modelType, const std::string& cameraName, int w, int h) : m_modelType(modelType) , m_cameraName(cameraName) , m_imageWidth(w) , m_imageHeight(h) { switch (modelType) { case KANNALA_BRANDT: m_nIntrinsics = 8; break; case PINHOLE: m_nIntrinsics = 8; break; case SCARAMUZZA: m_nIntrinsics = SCARAMUZZA_CAMERA_NUM_PARAMS; break; case MEI: default: m_nIntrinsics = 9; } } Camera::ModelType& Camera::Parameters::modelType(void) { return 
m_modelType; } std::string& Camera::Parameters::cameraName(void) { return m_cameraName; } int& Camera::Parameters::imageWidth(void) { return m_imageWidth; } int& Camera::Parameters::imageHeight(void) { return m_imageHeight; } Camera::ModelType Camera::Parameters::modelType(void) const { return m_modelType; } const std::string& Camera::Parameters::cameraName(void) const { return m_cameraName; } int Camera::Parameters::imageWidth(void) const { return m_imageWidth; } int Camera::Parameters::imageHeight(void) const { return m_imageHeight; } int Camera::Parameters::nIntrinsics(void) const { return m_nIntrinsics; } cv::Mat& Camera::mask(void) { return m_mask; } const cv::Mat& Camera::mask(void) const { return m_mask; } void Camera::estimateExtrinsics(const std::vector& objectPoints, const std::vector& imagePoints, cv::Mat& rvec, cv::Mat& tvec) const { std::vector Ms(imagePoints.size()); for (size_t i = 0; i < Ms.size(); ++i) { Eigen::Vector3d P; liftProjective(Eigen::Vector2d(imagePoints.at(i).x, imagePoints.at(i).y), P); P /= P(2); Ms.at(i).x = P(0); Ms.at(i).y = P(1); } // assume unit focal length, zero principal point, and zero distortion cv::solvePnP(objectPoints, Ms, cv::Mat::eye(3, 3, CV_64F), cv::noArray(), rvec, tvec); } double Camera::reprojectionDist(const Eigen::Vector3d& P1, const Eigen::Vector3d& P2) const { Eigen::Vector2d p1, p2; spaceToPlane(P1, p1); spaceToPlane(P2, p2); return (p1 - p2).norm(); } double Camera::reprojectionError(const std::vector< std::vector >& objectPoints, const std::vector< std::vector >& imagePoints, const std::vector& rvecs, const std::vector& tvecs, cv::OutputArray _perViewErrors) const { int imageCount = objectPoints.size(); size_t pointsSoFar = 0; double totalErr = 0.0; bool computePerViewErrors = _perViewErrors.needed(); cv::Mat perViewErrors; if (computePerViewErrors) { _perViewErrors.create(imageCount, 1, CV_64F); perViewErrors = _perViewErrors.getMat(); } for (int i = 0; i < imageCount; ++i) { size_t pointCount = 
imagePoints.at(i).size(); pointsSoFar += pointCount; std::vector estImagePoints; projectPoints(objectPoints.at(i), rvecs.at(i), tvecs.at(i), estImagePoints); double err = 0.0; for (size_t j = 0; j < imagePoints.at(i).size(); ++j) { err += cv::norm(imagePoints.at(i).at(j) - estImagePoints.at(j)); } if (computePerViewErrors) { perViewErrors.at(i) = err / pointCount; } totalErr += err; } return totalErr / pointsSoFar; } double Camera::reprojectionError(const Eigen::Vector3d& P, const Eigen::Quaterniond& camera_q, const Eigen::Vector3d& camera_t, const Eigen::Vector2d& observed_p) const { Eigen::Vector3d P_cam = camera_q.toRotationMatrix() * P + camera_t; Eigen::Vector2d p; spaceToPlane(P_cam, p); return (p - observed_p).norm(); } void Camera::projectPoints(const std::vector& objectPoints, const cv::Mat& rvec, const cv::Mat& tvec, std::vector& imagePoints) const { // project 3D object points to the image plane imagePoints.reserve(objectPoints.size()); //double cv::Mat R0; cv::Rodrigues(rvec, R0); Eigen::MatrixXd R(3,3); R << R0.at(0,0), R0.at(0,1), R0.at(0,2), R0.at(1,0), R0.at(1,1), R0.at(1,2), R0.at(2,0), R0.at(2,1), R0.at(2,2); Eigen::Vector3d t; t << tvec.at(0), tvec.at(1), tvec.at(2); for (size_t i = 0; i < objectPoints.size(); ++i) { const cv::Point3f& objectPoint = objectPoints.at(i); // Rotate and translate Eigen::Vector3d P; P << objectPoint.x, objectPoint.y, objectPoint.z; P = R * P + t; Eigen::Vector2d p; spaceToPlane(P, p); imagePoints.push_back(cv::Point2f(p(0), p(1))); } } } ================================================ FILE: camera_model/src/camera_models/CameraFactory.cc ================================================ #include "camodocal/camera_models/CameraFactory.h" #include #include "camodocal/camera_models/CataCamera.h" #include "camodocal/camera_models/EquidistantCamera.h" #include "camodocal/camera_models/PinholeCamera.h" #include "camodocal/camera_models/ScaramuzzaCamera.h" #include "ceres/ceres.h" namespace camodocal { boost::shared_ptr 
CameraFactory::m_instance; CameraFactory::CameraFactory() { } boost::shared_ptr CameraFactory::instance(void) { if (m_instance.get() == 0) { m_instance.reset(new CameraFactory); } return m_instance; } CameraPtr CameraFactory::generateCamera(Camera::ModelType modelType, const std::string& cameraName, cv::Size imageSize) const { switch (modelType) { case Camera::KANNALA_BRANDT: { EquidistantCameraPtr camera(new EquidistantCamera); EquidistantCamera::Parameters params = camera->getParameters(); params.cameraName() = cameraName; params.imageWidth() = imageSize.width; params.imageHeight() = imageSize.height; camera->setParameters(params); return camera; } case Camera::PINHOLE: { PinholeCameraPtr camera(new PinholeCamera); PinholeCamera::Parameters params = camera->getParameters(); params.cameraName() = cameraName; params.imageWidth() = imageSize.width; params.imageHeight() = imageSize.height; camera->setParameters(params); return camera; } case Camera::SCARAMUZZA: { OCAMCameraPtr camera(new OCAMCamera); OCAMCamera::Parameters params = camera->getParameters(); params.cameraName() = cameraName; params.imageWidth() = imageSize.width; params.imageHeight() = imageSize.height; camera->setParameters(params); return camera; } case Camera::MEI: default: { CataCameraPtr camera(new CataCamera); CataCamera::Parameters params = camera->getParameters(); params.cameraName() = cameraName; params.imageWidth() = imageSize.width; params.imageHeight() = imageSize.height; camera->setParameters(params); return camera; } } } CameraPtr CameraFactory::generateCameraFromYamlFile(const std::string& filename) { cv::FileStorage fs(filename, cv::FileStorage::READ); if (!fs.isOpened()) { return CameraPtr(); } Camera::ModelType modelType = Camera::MEI; if (!fs["model_type"].isNone()) { std::string sModelType; fs["model_type"] >> sModelType; if (boost::iequals(sModelType, "kannala_brandt")) { modelType = Camera::KANNALA_BRANDT; } else if (boost::iequals(sModelType, "mei")) { modelType = Camera::MEI; } 
        else if (boost::iequals(sModelType, "scaramuzza"))
        {
            modelType = Camera::SCARAMUZZA;
        }
        else if (boost::iequals(sModelType, "pinhole"))
        {
            modelType = Camera::PINHOLE;
        }
        else
        {
            std::cerr << "# ERROR: Unknown camera model: " << sModelType << std::endl;
            return CameraPtr();
        }
    }

    switch (modelType)
    {
    case Camera::KANNALA_BRANDT:
    {
        EquidistantCameraPtr camera(new EquidistantCamera);

        EquidistantCamera::Parameters params = camera->getParameters();
        params.readFromYamlFile(filename);
        camera->setParameters(params);
        return camera;
    }
    case Camera::PINHOLE:
    {
        PinholeCameraPtr camera(new PinholeCamera);

        PinholeCamera::Parameters params = camera->getParameters();
        params.readFromYamlFile(filename);
        camera->setParameters(params);
        return camera;
    }
    case Camera::SCARAMUZZA:
    {
        OCAMCameraPtr camera(new OCAMCamera);

        OCAMCamera::Parameters params = camera->getParameters();
        params.readFromYamlFile(filename);
        camera->setParameters(params);
        return camera;
    }
    case Camera::MEI:
    default:
    {
        CataCameraPtr camera(new CataCamera);

        CataCamera::Parameters params = camera->getParameters();
        params.readFromYamlFile(filename);
        camera->setParameters(params);
        return camera;
    }
    }

    return CameraPtr();
}

}

================================================
FILE: camera_model/src/camera_models/CataCamera.cc
================================================
#include "camodocal/camera_models/CataCamera.h"

#include <cmath>
#include <cstdio>
#include <eigen3/Eigen/Dense>
#include <iomanip>
#include <iostream>
#include <opencv2/calib3d/calib3d.hpp>
#include <opencv2/core/eigen.hpp>
#include <opencv2/imgproc/imgproc.hpp>

#include "camodocal/gpl/gpl.h"

namespace camodocal
{

CataCamera::Parameters::Parameters()
 : Camera::Parameters(MEI)
 , m_xi(0.0)
 , m_k1(0.0)
 , m_k2(0.0)
 , m_p1(0.0)
 , m_p2(0.0)
 , m_gamma1(0.0)
 , m_gamma2(0.0)
 , m_u0(0.0)
 , m_v0(0.0)
{
}

CataCamera::Parameters::Parameters(const std::string& cameraName,
                                   int w, int h,
                                   double xi,
                                   double k1, double k2,
                                   double p1, double p2,
                                   double gamma1, double gamma2,
                                   double u0, double v0)
 : Camera::Parameters(MEI, cameraName, w, h)
 , m_xi(xi)
 , m_k1(k1)
 , m_k2(k2)
 , m_p1(p1)
 , m_p2(p2)
 , m_gamma1(gamma1) ,
 m_gamma2(gamma2)
 , m_u0(u0)
 , m_v0(v0)
{
}

double& CataCamera::Parameters::xi(void) { return m_xi; }
double& CataCamera::Parameters::k1(void) { return m_k1; }
double& CataCamera::Parameters::k2(void) { return m_k2; }
double& CataCamera::Parameters::p1(void) { return m_p1; }
double& CataCamera::Parameters::p2(void) { return m_p2; }
double& CataCamera::Parameters::gamma1(void) { return m_gamma1; }
double& CataCamera::Parameters::gamma2(void) { return m_gamma2; }
double& CataCamera::Parameters::u0(void) { return m_u0; }
double& CataCamera::Parameters::v0(void) { return m_v0; }

double CataCamera::Parameters::xi(void) const { return m_xi; }
double CataCamera::Parameters::k1(void) const { return m_k1; }
double CataCamera::Parameters::k2(void) const { return m_k2; }
double CataCamera::Parameters::p1(void) const { return m_p1; }
double CataCamera::Parameters::p2(void) const { return m_p2; }
double CataCamera::Parameters::gamma1(void) const { return m_gamma1; }
double CataCamera::Parameters::gamma2(void) const { return m_gamma2; }
double CataCamera::Parameters::u0(void) const { return m_u0; }
double CataCamera::Parameters::v0(void) const { return m_v0; }

bool
CataCamera::Parameters::readFromYamlFile(const std::string& filename)
{
    cv::FileStorage fs(filename, cv::FileStorage::READ);

    if (!fs.isOpened())
    {
        return false;
    }

    if (!fs["model_type"].isNone())
    {
        std::string sModelType;
        fs["model_type"] >> sModelType;

        if (sModelType.compare("MEI") != 0)
        {
            return false;
        }
    }

    m_modelType = MEI;
    fs["camera_name"] >> m_cameraName;
    m_imageWidth = static_cast<int>(fs["image_width"]);
    m_imageHeight = static_cast<int>(fs["image_height"]);

    cv::FileNode n = fs["mirror_parameters"];
    m_xi = static_cast<double>(n["xi"]);

    n = fs["distortion_parameters"];
    m_k1 = static_cast<double>(n["k1"]);
    m_k2 = static_cast<double>(n["k2"]);
    m_p1 = static_cast<double>(n["p1"]);
    m_p2 = static_cast<double>(n["p2"]);

    n = fs["projection_parameters"];
    m_gamma1 = static_cast<double>(n["gamma1"]);
    m_gamma2 = static_cast<double>(n["gamma2"]);
    m_u0 = static_cast<double>(n["u0"]);
    m_v0 = static_cast<double>(n["v0"]);

    return true;
}

void
CataCamera::Parameters::writeToYamlFile(const std::string& filename) const
{
    cv::FileStorage fs(filename, cv::FileStorage::WRITE);

    fs << "model_type" << "MEI";
    fs << "camera_name" << m_cameraName;
    fs << "image_width" << m_imageWidth;
    fs << "image_height" << m_imageHeight;

    // mirror: xi
    fs << "mirror_parameters";
    fs << "{" << "xi" << m_xi << "}";

    // radial distortion: k1, k2
    // tangential distortion: p1, p2
    fs << "distortion_parameters";
    fs << "{" << "k1" << m_k1
              << "k2" << m_k2
              << "p1" << m_p1
              << "p2" << m_p2 << "}";

    // projection: gamma1, gamma2, u0, v0
    fs << "projection_parameters";
    fs << "{" << "gamma1" << m_gamma1
              << "gamma2" << m_gamma2
              << "u0" << m_u0
              << "v0" << m_v0 << "}";

    fs.release();
}

CataCamera::Parameters&
CataCamera::Parameters::operator=(const CataCamera::Parameters& other)
{
    if (this != &other)
    {
        m_modelType = other.m_modelType;
        m_cameraName = other.m_cameraName;
        m_imageWidth = other.m_imageWidth;
        m_imageHeight = other.m_imageHeight;
        m_xi = other.m_xi;
        m_k1 = other.m_k1;
        m_k2 = other.m_k2;
        m_p1 = other.m_p1;
        m_p2 = other.m_p2;
        m_gamma1 = other.m_gamma1;
        m_gamma2 = other.m_gamma2;
        m_u0 = other.m_u0;
        m_v0 = other.m_v0;
    }

    return *this;
}

std::ostream&
operator<< (std::ostream& out, const CataCamera::Parameters& params)
{
    out << "Camera Parameters:" << std::endl;
    out << "    model_type " << "MEI" << std::endl;
    out << "   camera_name " << params.m_cameraName << std::endl;
    out << "   image_width " << params.m_imageWidth << std::endl;
    out << "  image_height " << params.m_imageHeight << std::endl;

    out << "Mirror Parameters" << std::endl;
    out << std::fixed << std::setprecision(10);
    out << "            xi " << params.m_xi << std::endl;

    // radial distortion: k1, k2
    // tangential distortion: p1, p2
    out << "Distortion Parameters" << std::endl;
    out << "            k1 " << params.m_k1 << std::endl
        << "            k2 " << params.m_k2 << std::endl
        << "            p1 " << params.m_p1 << std::endl
        << "            p2 " << params.m_p2 << std::endl;

    // projection: gamma1, gamma2, u0, v0
    out
<< "Projection Parameters" << std::endl; out << " gamma1 " << params.m_gamma1 << std::endl << " gamma2 " << params.m_gamma2 << std::endl << " u0 " << params.m_u0 << std::endl << " v0 " << params.m_v0 << std::endl; return out; } CataCamera::CataCamera() : m_inv_K11(1.0) , m_inv_K13(0.0) , m_inv_K22(1.0) , m_inv_K23(0.0) , m_noDistortion(true) { } CataCamera::CataCamera(const std::string& cameraName, int imageWidth, int imageHeight, double xi, double k1, double k2, double p1, double p2, double gamma1, double gamma2, double u0, double v0) : mParameters(cameraName, imageWidth, imageHeight, xi, k1, k2, p1, p2, gamma1, gamma2, u0, v0) { if ((mParameters.k1() == 0.0) && (mParameters.k2() == 0.0) && (mParameters.p1() == 0.0) && (mParameters.p2() == 0.0)) { m_noDistortion = true; } else { m_noDistortion = false; } // Inverse camera projection matrix parameters m_inv_K11 = 1.0 / mParameters.gamma1(); m_inv_K13 = -mParameters.u0() / mParameters.gamma1(); m_inv_K22 = 1.0 / mParameters.gamma2(); m_inv_K23 = -mParameters.v0() / mParameters.gamma2(); } CataCamera::CataCamera(const CataCamera::Parameters& params) : mParameters(params) { if ((mParameters.k1() == 0.0) && (mParameters.k2() == 0.0) && (mParameters.p1() == 0.0) && (mParameters.p2() == 0.0)) { m_noDistortion = true; } else { m_noDistortion = false; } // Inverse camera projection matrix parameters m_inv_K11 = 1.0 / mParameters.gamma1(); m_inv_K13 = -mParameters.u0() / mParameters.gamma1(); m_inv_K22 = 1.0 / mParameters.gamma2(); m_inv_K23 = -mParameters.v0() / mParameters.gamma2(); } Camera::ModelType CataCamera::modelType(void) const { return mParameters.modelType(); } const std::string& CataCamera::cameraName(void) const { return mParameters.cameraName(); } int CataCamera::imageWidth(void) const { return mParameters.imageWidth(); } int CataCamera::imageHeight(void) const { return mParameters.imageHeight(); } void CataCamera::estimateIntrinsics(const cv::Size& boardSize, const std::vector< std::vector >& objectPoints, 
                               const std::vector< std::vector<cv::Point2f> >& imagePoints)
{
    Parameters params = getParameters();

    double u0 = params.imageWidth() / 2.0;
    double v0 = params.imageHeight() / 2.0;

    double gamma0 = 0.0;
    double minReprojErr = std::numeric_limits<double>::max();

    std::vector<cv::Mat> rvecs, tvecs;
    rvecs.assign(objectPoints.size(), cv::Mat());
    tvecs.assign(objectPoints.size(), cv::Mat());

    params.xi() = 1.0;
    params.k1() = 0.0;
    params.k2() = 0.0;
    params.p1() = 0.0;
    params.p2() = 0.0;
    params.u0() = u0;
    params.v0() = v0;

    // Initialize gamma (focal length)
    // Use non-radial line image and xi = 1
    for (size_t i = 0; i < imagePoints.size(); ++i)
    {
        for (int r = 0; r < boardSize.height; ++r)
        {
            cv::Mat P(boardSize.width, 4, CV_64F);
            for (int c = 0; c < boardSize.width; ++c)
            {
                const cv::Point2f& imagePoint = imagePoints.at(i).at(r * boardSize.width + c);

                double u = imagePoint.x - u0;
                double v = imagePoint.y - v0;

                P.at<double>(c, 0) = u;
                P.at<double>(c, 1) = v;
                P.at<double>(c, 2) = 0.5;
                P.at<double>(c, 3) = -0.5 * (square(u) + square(v));
            }

            cv::Mat C;
            cv::SVD::solveZ(P, C);

            double t = square(C.at<double>(0)) + square(C.at<double>(1)) + C.at<double>(2) * C.at<double>(3);
            if (t < 0.0)
            {
                continue;
            }

            // check that line image is not radial
            double d = sqrt(1.0 / t);
            double nx = C.at<double>(0) * d;
            double ny = C.at<double>(1) * d;
            if (hypot(nx, ny) > 0.95)
            {
                continue;
            }

            double gamma = sqrt(C.at<double>(2) / C.at<double>(3));

            params.gamma1() = gamma;
            params.gamma2() = gamma;
            setParameters(params);

            for (size_t j = 0; j < objectPoints.size(); ++j)
            {
                estimateExtrinsics(objectPoints.at(j), imagePoints.at(j), rvecs.at(j), tvecs.at(j));
            }

            double reprojErr = reprojectionError(objectPoints, imagePoints, rvecs, tvecs, cv::noArray());

            if (reprojErr < minReprojErr)
            {
                minReprojErr = reprojErr;
                gamma0 = gamma;
            }
        }
    }

    if (gamma0 <= 0.0 && minReprojErr >= std::numeric_limits<double>::max())
    {
        std::cout << "[" << params.cameraName() << "] "
                  << "# INFO: CataCamera model fails with given data." << std::endl;
        return;
    }

    params.gamma1() = gamma0;
    params.gamma2() = gamma0;
    setParameters(params);
}

/**
 * \brief Lifts a point from the image plane to the unit sphere
 *
 * \param p image coordinates
 * \param P coordinates of the point on the sphere
 */
void
CataCamera::liftSphere(const Eigen::Vector2d& p, Eigen::Vector3d& P) const
{
    double mx_d, my_d, mx2_d, mxy_d, my2_d, mx_u, my_u;
    double rho2_d, rho4_d, radDist_d, Dx_d, Dy_d, inv_denom_d;
    double lambda;

    // Lift points to normalised plane
    mx_d = m_inv_K11 * p(0) + m_inv_K13;
    my_d = m_inv_K22 * p(1) + m_inv_K23;

    if (m_noDistortion)
    {
        mx_u = mx_d;
        my_u = my_d;
    }
    else
    {
        // Apply inverse distortion model
        if (0)
        {
            double k1 = mParameters.k1();
            double k2 = mParameters.k2();
            double p1 = mParameters.p1();
            double p2 = mParameters.p2();

            // Inverse distortion model
            // proposed by Heikkila
            mx2_d = mx_d * mx_d;
            my2_d = my_d * my_d;
            mxy_d = mx_d * my_d;
            rho2_d = mx2_d + my2_d;
            rho4_d = rho2_d * rho2_d;
            radDist_d = k1 * rho2_d + k2 * rho4_d;
            Dx_d = mx_d * radDist_d + p2 * (rho2_d + 2 * mx2_d) + 2 * p1 * mxy_d;
            Dy_d = my_d * radDist_d + p1 * (rho2_d + 2 * my2_d) + 2 * p2 * mxy_d;
            inv_denom_d = 1 / (1 + 4 * k1 * rho2_d + 6 * k2 * rho4_d + 8 * p1 * my_d + 8 * p2 * mx_d);

            mx_u = mx_d - inv_denom_d * Dx_d;
            my_u = my_d - inv_denom_d * Dy_d;
        }
        else
        {
            // Recursive distortion model
            int n = 6;
            Eigen::Vector2d d_u;
            distortion(Eigen::Vector2d(mx_d, my_d), d_u);
            // Approximate value
            mx_u = mx_d - d_u(0);
            my_u = my_d - d_u(1);

            for (int i = 1; i < n; ++i)
            {
                distortion(Eigen::Vector2d(mx_u, my_u), d_u);
                mx_u = mx_d - d_u(0);
                my_u = my_d - d_u(1);
            }
        }
    }

    // Lift normalised points to the sphere (inv_hslash)
    double xi = mParameters.xi();
    if (xi == 1.0)
    {
        lambda = 2.0 / (mx_u * mx_u + my_u * my_u + 1.0);
        P << lambda * mx_u, lambda * my_u, lambda - 1.0;
    }
    else
    {
        lambda = (xi + sqrt(1.0 + (1.0 - xi * xi) * (mx_u * mx_u + my_u * my_u))) /
                 (1.0 + mx_u * mx_u + my_u * my_u);
        P << lambda * mx_u, lambda * my_u, lambda - xi;
    }
}

/**
 * \brief Lifts a point from the image plane to its projective ray
 *
 * \param p image
coordinates * \param P coordinates of the projective ray */ void CataCamera::liftProjective(const Eigen::Vector2d& p, Eigen::Vector3d& P) const { double mx_d, my_d,mx2_d, mxy_d, my2_d, mx_u, my_u; double rho2_d, rho4_d, radDist_d, Dx_d, Dy_d, inv_denom_d; //double lambda; // Lift points to normalised plane mx_d = m_inv_K11 * p(0) + m_inv_K13; my_d = m_inv_K22 * p(1) + m_inv_K23; if (m_noDistortion) { mx_u = mx_d; my_u = my_d; } else { if (0) { double k1 = mParameters.k1(); double k2 = mParameters.k2(); double p1 = mParameters.p1(); double p2 = mParameters.p2(); // Apply inverse distortion model // proposed by Heikkila mx2_d = mx_d*mx_d; my2_d = my_d*my_d; mxy_d = mx_d*my_d; rho2_d = mx2_d+my2_d; rho4_d = rho2_d*rho2_d; radDist_d = k1*rho2_d+k2*rho4_d; Dx_d = mx_d*radDist_d + p2*(rho2_d+2*mx2_d) + 2*p1*mxy_d; Dy_d = my_d*radDist_d + p1*(rho2_d+2*my2_d) + 2*p2*mxy_d; inv_denom_d = 1/(1+4*k1*rho2_d+6*k2*rho4_d+8*p1*my_d+8*p2*mx_d); mx_u = mx_d - inv_denom_d*Dx_d; my_u = my_d - inv_denom_d*Dy_d; } else { // Recursive distortion model int n = 8; Eigen::Vector2d d_u; distortion(Eigen::Vector2d(mx_d, my_d), d_u); // Approximate value mx_u = mx_d - d_u(0); my_u = my_d - d_u(1); for (int i = 1; i < n; ++i) { distortion(Eigen::Vector2d(mx_u, my_u), d_u); mx_u = mx_d - d_u(0); my_u = my_d - d_u(1); } } } // Obtain a projective ray double xi = mParameters.xi(); if (xi == 1.0) { P << mx_u, my_u, (1.0 - mx_u * mx_u - my_u * my_u) / 2.0; } else { // Reuse variable rho2_d = mx_u * mx_u + my_u * my_u; P << mx_u, my_u, 1.0 - xi * (rho2_d + 1.0) / (xi + sqrt(1.0 + (1.0 - xi * xi) * rho2_d)); } } /** * \brief Project a 3D point (\a x,\a y,\a z) to the image plane in (\a u,\a v) * * \param P 3D point coordinates * \param p return value, contains the image point coordinates */ void CataCamera::spaceToPlane(const Eigen::Vector3d& P, Eigen::Vector2d& p) const { Eigen::Vector2d p_u, p_d; // Project points to the normalised plane double z = P(2) + mParameters.xi() * P.norm(); p_u << P(0) / 
z, P(1) / z; if (m_noDistortion) { p_d = p_u; } else { // Apply distortion Eigen::Vector2d d_u; distortion(p_u, d_u); p_d = p_u + d_u; } // Apply generalised projection matrix p << mParameters.gamma1() * p_d(0) + mParameters.u0(), mParameters.gamma2() * p_d(1) + mParameters.v0(); } #if 0 /** * \brief Project a 3D point to the image plane and calculate Jacobian * * \param P 3D point coordinates * \param p return value, contains the image point coordinates */ void CataCamera::spaceToPlane(const Eigen::Vector3d& P, Eigen::Vector2d& p, Eigen::Matrix& J) const { double xi = mParameters.xi(); Eigen::Vector2d p_u, p_d; double norm, inv_denom; double dxdmx, dydmx, dxdmy, dydmy; norm = P.norm(); // Project points to the normalised plane inv_denom = 1.0 / (P(2) + xi * norm); p_u << inv_denom * P(0), inv_denom * P(1); // Calculate jacobian inv_denom = inv_denom * inv_denom / norm; double dudx = inv_denom * (norm * P(2) + xi * (P(1) * P(1) + P(2) * P(2))); double dvdx = -inv_denom * xi * P(0) * P(1); double dudy = dvdx; double dvdy = inv_denom * (norm * P(2) + xi * (P(0) * P(0) + P(2) * P(2))); inv_denom = inv_denom * (-xi * P(2) - norm); // reuse variable double dudz = P(0) * inv_denom; double dvdz = P(1) * inv_denom; if (m_noDistortion) { p_d = p_u; } else { // Apply distortion Eigen::Vector2d d_u; distortion(p_u, d_u); p_d = p_u + d_u; } double gamma1 = mParameters.gamma1(); double gamma2 = mParameters.gamma2(); // Make the product of the jacobians // and add projection matrix jacobian inv_denom = gamma1 * (dudx * dxdmx + dvdx * dxdmy); // reuse dvdx = gamma2 * (dudx * dydmx + dvdx * dydmy); dudx = inv_denom; inv_denom = gamma1 * (dudy * dxdmx + dvdy * dxdmy); // reuse dvdy = gamma2 * (dudy * dydmx + dvdy * dydmy); dudy = inv_denom; inv_denom = gamma1 * (dudz * dxdmx + dvdz * dxdmy); // reuse dvdz = gamma2 * (dudz * dydmx + dvdz * dydmy); dudz = inv_denom; // Apply generalised projection matrix p << gamma1 * p_d(0) + mParameters.u0(), gamma2 * p_d(1) + mParameters.v0(); J 
<< dudx, dudy, dudz, dvdx, dvdy, dvdz; } #endif /** * \brief Projects an undistorted 2D point p_u to the image plane * * \param p_u 2D point coordinates * \return image point coordinates */ void CataCamera::undistToPlane(const Eigen::Vector2d& p_u, Eigen::Vector2d& p) const { Eigen::Vector2d p_d; if (m_noDistortion) { p_d = p_u; } else { // Apply distortion Eigen::Vector2d d_u; distortion(p_u, d_u); p_d = p_u + d_u; } // Apply generalised projection matrix p << mParameters.gamma1() * p_d(0) + mParameters.u0(), mParameters.gamma2() * p_d(1) + mParameters.v0(); } /** * \brief Apply distortion to input point (from the normalised plane) * * \param p_u undistorted coordinates of point on the normalised plane * \return to obtain the distorted point: p_d = p_u + d_u */ void CataCamera::distortion(const Eigen::Vector2d& p_u, Eigen::Vector2d& d_u) const { double k1 = mParameters.k1(); double k2 = mParameters.k2(); double p1 = mParameters.p1(); double p2 = mParameters.p2(); double mx2_u, my2_u, mxy_u, rho2_u, rad_dist_u; mx2_u = p_u(0) * p_u(0); my2_u = p_u(1) * p_u(1); mxy_u = p_u(0) * p_u(1); rho2_u = mx2_u + my2_u; rad_dist_u = k1 * rho2_u + k2 * rho2_u * rho2_u; d_u << p_u(0) * rad_dist_u + 2.0 * p1 * mxy_u + p2 * (rho2_u + 2.0 * mx2_u), p_u(1) * rad_dist_u + 2.0 * p2 * mxy_u + p1 * (rho2_u + 2.0 * my2_u); } /** * \brief Apply distortion to input point (from the normalised plane) * and calculate Jacobian * * \param p_u undistorted coordinates of point on the normalised plane * \return to obtain the distorted point: p_d = p_u + d_u */ void CataCamera::distortion(const Eigen::Vector2d& p_u, Eigen::Vector2d& d_u, Eigen::Matrix2d& J) const { double k1 = mParameters.k1(); double k2 = mParameters.k2(); double p1 = mParameters.p1(); double p2 = mParameters.p2(); double mx2_u, my2_u, mxy_u, rho2_u, rad_dist_u; mx2_u = p_u(0) * p_u(0); my2_u = p_u(1) * p_u(1); mxy_u = p_u(0) * p_u(1); rho2_u = mx2_u + my2_u; rad_dist_u = k1 * rho2_u + k2 * rho2_u * rho2_u; d_u << p_u(0) * 
rad_dist_u + 2.0 * p1 * mxy_u + p2 * (rho2_u + 2.0 * mx2_u), p_u(1) * rad_dist_u + 2.0 * p2 * mxy_u + p1 * (rho2_u + 2.0 * my2_u); double dxdmx = 1.0 + rad_dist_u + k1 * 2.0 * mx2_u + k2 * rho2_u * 4.0 * mx2_u + 2.0 * p1 * p_u(1) + 6.0 * p2 * p_u(0); double dydmx = k1 * 2.0 * p_u(0) * p_u(1) + k2 * 4.0 * rho2_u * p_u(0) * p_u(1) + p1 * 2.0 * p_u(0) + 2.0 * p2 * p_u(1); double dxdmy = dydmx; double dydmy = 1.0 + rad_dist_u + k1 * 2.0 * my2_u + k2 * rho2_u * 4.0 * my2_u + 6.0 * p1 * p_u(1) + 2.0 * p2 * p_u(0); J << dxdmx, dxdmy, dydmx, dydmy; } void CataCamera::initUndistortMap(cv::Mat& map1, cv::Mat& map2, double fScale) const { cv::Size imageSize(mParameters.imageWidth(), mParameters.imageHeight()); cv::Mat mapX = cv::Mat::zeros(imageSize, CV_32F); cv::Mat mapY = cv::Mat::zeros(imageSize, CV_32F); for (int v = 0; v < imageSize.height; ++v) { for (int u = 0; u < imageSize.width; ++u) { double mx_u = m_inv_K11 / fScale * u + m_inv_K13 / fScale; double my_u = m_inv_K22 / fScale * v + m_inv_K23 / fScale; double xi = mParameters.xi(); double d2 = mx_u * mx_u + my_u * my_u; Eigen::Vector3d P; P << mx_u, my_u, 1.0 - xi * (d2 + 1.0) / (xi + sqrt(1.0 + (1.0 - xi * xi) * d2)); Eigen::Vector2d p; spaceToPlane(P, p); mapX.at(v,u) = p(0); mapY.at(v,u) = p(1); } } cv::convertMaps(mapX, mapY, map1, map2, CV_32FC1, false); } cv::Mat CataCamera::initUndistortRectifyMap(cv::Mat& map1, cv::Mat& map2, float fx, float fy, cv::Size imageSize, float cx, float cy, cv::Mat rmat) const { if (imageSize == cv::Size(0, 0)) { imageSize = cv::Size(mParameters.imageWidth(), mParameters.imageHeight()); } cv::Mat mapX = cv::Mat::zeros(imageSize.height, imageSize.width, CV_32F); cv::Mat mapY = cv::Mat::zeros(imageSize.height, imageSize.width, CV_32F); Eigen::Matrix3f K_rect; if (cx == -1.0f && cy == -1.0f) { K_rect << fx, 0, imageSize.width / 2, 0, fy, imageSize.height / 2, 0, 0, 1; } else { K_rect << fx, 0, cx, 0, fy, cy, 0, 0, 1; } if (fx == -1.0f || fy == -1.0f) { K_rect(0,0) = 
mParameters.gamma1();
        K_rect(1,1) = mParameters.gamma2();
    }

    Eigen::Matrix3f K_rect_inv = K_rect.inverse();

    Eigen::Matrix3f R, R_inv;
    cv::cv2eigen(rmat, R);
    R_inv = R.inverse();

    for (int v = 0; v < imageSize.height; ++v)
    {
        for (int u = 0; u < imageSize.width; ++u)
        {
            Eigen::Vector3f xo;
            xo << u, v, 1;

            Eigen::Vector3f uo = R_inv * K_rect_inv * xo;

            Eigen::Vector2d p;
            spaceToPlane(uo.cast<double>(), p);

            mapX.at<float>(v,u) = p(0);
            mapY.at<float>(v,u) = p(1);
        }
    }

    cv::convertMaps(mapX, mapY, map1, map2, CV_32FC1, false);

    cv::Mat K_rect_cv;
    cv::eigen2cv(K_rect, K_rect_cv);
    return K_rect_cv;
}

int
CataCamera::parameterCount(void) const
{
    return 9;
}

const CataCamera::Parameters&
CataCamera::getParameters(void) const
{
    return mParameters;
}

void
CataCamera::setParameters(const CataCamera::Parameters& parameters)
{
    mParameters = parameters;

    if ((mParameters.k1() == 0.0) && (mParameters.k2() == 0.0) &&
        (mParameters.p1() == 0.0) && (mParameters.p2() == 0.0))
    {
        m_noDistortion = true;
    }
    else
    {
        m_noDistortion = false;
    }

    m_inv_K11 = 1.0 / mParameters.gamma1();
    m_inv_K13 = -mParameters.u0() / mParameters.gamma1();
    m_inv_K22 = 1.0 / mParameters.gamma2();
    m_inv_K23 = -mParameters.v0() / mParameters.gamma2();
}

void
CataCamera::readParameters(const std::vector<double>& parameterVec)
{
    if ((int)parameterVec.size() != parameterCount())
    {
        return;
    }

    Parameters params = getParameters();

    params.xi() = parameterVec.at(0);
    params.k1() = parameterVec.at(1);
    params.k2() = parameterVec.at(2);
    params.p1() = parameterVec.at(3);
    params.p2() = parameterVec.at(4);
    params.gamma1() = parameterVec.at(5);
    params.gamma2() = parameterVec.at(6);
    params.u0() = parameterVec.at(7);
    params.v0() = parameterVec.at(8);

    setParameters(params);
}

void
CataCamera::writeParameters(std::vector<double>& parameterVec) const
{
    parameterVec.resize(parameterCount());
    parameterVec.at(0) = mParameters.xi();
    parameterVec.at(1) = mParameters.k1();
    parameterVec.at(2) = mParameters.k2();
    parameterVec.at(3) = mParameters.p1();
    parameterVec.at(4) = mParameters.p2();
parameterVec.at(5) = mParameters.gamma1(); parameterVec.at(6) = mParameters.gamma2(); parameterVec.at(7) = mParameters.u0(); parameterVec.at(8) = mParameters.v0(); } void CataCamera::writeParametersToYamlFile(const std::string& filename) const { mParameters.writeToYamlFile(filename); } std::string CataCamera::parametersToString(void) const { std::ostringstream oss; oss << mParameters; return oss.str(); } } ================================================ FILE: camera_model/src/camera_models/CostFunctionFactory.cc ================================================ #include "camodocal/camera_models/CostFunctionFactory.h" #include "ceres/ceres.h" #include "camodocal/camera_models/CataCamera.h" #include "camodocal/camera_models/EquidistantCamera.h" #include "camodocal/camera_models/PinholeCamera.h" #include "camodocal/camera_models/ScaramuzzaCamera.h" namespace camodocal { template void worldToCameraTransform(const T* const q_cam_odo, const T* const t_cam_odo, const T* const p_odo, const T* const att_odo, T* q, T* t, bool optimize_cam_odo_z = true) { Eigen::Quaternion q_z_inv(cos(att_odo[0] / T(2)), T(0), T(0), -sin(att_odo[0] / T(2))); Eigen::Quaternion q_y_inv(cos(att_odo[1] / T(2)), T(0), -sin(att_odo[1] / T(2)), T(0)); Eigen::Quaternion q_x_inv(cos(att_odo[2] / T(2)), -sin(att_odo[2] / T(2)), T(0), T(0)); Eigen::Quaternion q_zyx_inv = q_x_inv * q_y_inv * q_z_inv; T q_odo[4] = {q_zyx_inv.w(), q_zyx_inv.x(), q_zyx_inv.y(), q_zyx_inv.z()}; T q_odo_cam[4] = {q_cam_odo[3], -q_cam_odo[0], -q_cam_odo[1], -q_cam_odo[2]}; T q0[4]; ceres::QuaternionProduct(q_odo_cam, q_odo, q0); T t0[3]; T t_odo[3] = {p_odo[0], p_odo[1], p_odo[2]}; ceres::QuaternionRotatePoint(q_odo, t_odo, t0); t0[0] += t_cam_odo[0]; t0[1] += t_cam_odo[1]; if (optimize_cam_odo_z) { t0[2] += t_cam_odo[2]; } ceres::QuaternionRotatePoint(q_odo_cam, t0, t); t[0] = -t[0]; t[1] = -t[1]; t[2] = -t[2]; // Convert quaternion from Ceres convention (w, x, y, z) // to Eigen convention (x, y, z, w) q[0] = q0[1]; q[1] = 
q0[2]; q[2] = q0[3]; q[3] = q0[0]; } template class ReprojectionError1 { public: EIGEN_MAKE_ALIGNED_OPERATOR_NEW ReprojectionError1(const Eigen::Vector3d& observed_P, const Eigen::Vector2d& observed_p) : m_observed_P(observed_P), m_observed_p(observed_p) , m_sqrtPrecisionMat(Eigen::Matrix2d::Identity()) {} ReprojectionError1(const Eigen::Vector3d& observed_P, const Eigen::Vector2d& observed_p, const Eigen::Matrix2d& sqrtPrecisionMat) : m_observed_P(observed_P), m_observed_p(observed_p) , m_sqrtPrecisionMat(sqrtPrecisionMat) {} ReprojectionError1(const std::vector& intrinsic_params, const Eigen::Vector3d& observed_P, const Eigen::Vector2d& observed_p) : m_intrinsic_params(intrinsic_params) , m_observed_P(observed_P), m_observed_p(observed_p) {} // variables: camera intrinsics and camera extrinsics template bool operator()(const T* const intrinsic_params, const T* const q, const T* const t, T* residuals) const { Eigen::Matrix P = m_observed_P.cast(); Eigen::Matrix predicted_p; CameraT::spaceToPlane(intrinsic_params, q, t, P, predicted_p); Eigen::Matrix e = predicted_p - m_observed_p.cast(); Eigen::Matrix e_weighted = m_sqrtPrecisionMat.cast() * e; residuals[0] = e_weighted(0); residuals[1] = e_weighted(1); return true; } // variables: camera-odometry transforms and odometry poses template bool operator()(const T* const q_cam_odo, const T* const t_cam_odo, const T* const p_odo, const T* const att_odo, T* residuals) const { T q[4], t[3]; worldToCameraTransform(q_cam_odo, t_cam_odo, p_odo, att_odo, q, t); Eigen::Matrix P = m_observed_P.cast(); std::vector intrinsic_params(m_intrinsic_params.begin(), m_intrinsic_params.end()); // project 3D object point to the image plane Eigen::Matrix predicted_p; CameraT::spaceToPlane(intrinsic_params.data(), q, t, P, predicted_p); residuals[0] = predicted_p(0) - T(m_observed_p(0)); residuals[1] = predicted_p(1) - T(m_observed_p(1)); return true; } //private: // camera intrinsics std::vector m_intrinsic_params; // observed 3D point 
Eigen::Vector3d m_observed_P; // observed 2D point Eigen::Vector2d m_observed_p; // square root of precision matrix Eigen::Matrix2d m_sqrtPrecisionMat; }; // variables: camera extrinsics, 3D point template class ReprojectionError2 { public: EIGEN_MAKE_ALIGNED_OPERATOR_NEW ReprojectionError2(const std::vector& intrinsic_params, const Eigen::Vector2d& observed_p) : m_intrinsic_params(intrinsic_params), m_observed_p(observed_p) {} template bool operator()(const T* const q, const T* const t, const T* const point, T* residuals) const { Eigen::Matrix P; P(0) = T(point[0]); P(1) = T(point[1]); P(2) = T(point[2]); std::vector intrinsic_params(m_intrinsic_params.begin(), m_intrinsic_params.end()); // project 3D object point to the image plane Eigen::Matrix predicted_p; CameraT::spaceToPlane(intrinsic_params.data(), q, t, P, predicted_p); residuals[0] = predicted_p(0) - T(m_observed_p(0)); residuals[1] = predicted_p(1) - T(m_observed_p(1)); return true; } private: // camera intrinsics std::vector m_intrinsic_params; // observed 2D point Eigen::Vector2d m_observed_p; }; template class ReprojectionError3 { public: EIGEN_MAKE_ALIGNED_OPERATOR_NEW ReprojectionError3(const Eigen::Vector2d& observed_p) : m_observed_p(observed_p) , m_sqrtPrecisionMat(Eigen::Matrix2d::Identity()) , m_optimize_cam_odo_z(true) {} ReprojectionError3(const Eigen::Vector2d& observed_p, const Eigen::Matrix2d& sqrtPrecisionMat) : m_observed_p(observed_p) , m_sqrtPrecisionMat(sqrtPrecisionMat) , m_optimize_cam_odo_z(true) {} ReprojectionError3(const std::vector& intrinsic_params, const Eigen::Vector2d& observed_p) : m_intrinsic_params(intrinsic_params) , m_observed_p(observed_p) , m_sqrtPrecisionMat(Eigen::Matrix2d::Identity()) , m_optimize_cam_odo_z(true) {} ReprojectionError3(const std::vector& intrinsic_params, const Eigen::Vector2d& observed_p, const Eigen::Matrix2d& sqrtPrecisionMat) : m_intrinsic_params(intrinsic_params) , m_observed_p(observed_p) , m_sqrtPrecisionMat(sqrtPrecisionMat) , 
m_optimize_cam_odo_z(true) {} ReprojectionError3(const std::vector& intrinsic_params, const Eigen::Vector3d& odo_pos, const Eigen::Vector3d& odo_att, const Eigen::Vector2d& observed_p, bool optimize_cam_odo_z) : m_intrinsic_params(intrinsic_params) , m_odo_pos(odo_pos), m_odo_att(odo_att) , m_observed_p(observed_p) , m_optimize_cam_odo_z(optimize_cam_odo_z) {} ReprojectionError3(const std::vector& intrinsic_params, const Eigen::Quaterniond& cam_odo_q, const Eigen::Vector3d& cam_odo_t, const Eigen::Vector3d& odo_pos, const Eigen::Vector3d& odo_att, const Eigen::Vector2d& observed_p) : m_intrinsic_params(intrinsic_params) , m_cam_odo_q(cam_odo_q), m_cam_odo_t(cam_odo_t) , m_odo_pos(odo_pos), m_odo_att(odo_att) , m_observed_p(observed_p) , m_optimize_cam_odo_z(true) {} // variables: camera intrinsics, camera-to-odometry transform, // odometry extrinsics, 3D point template bool operator()(const T* const intrinsic_params, const T* const q_cam_odo, const T* const t_cam_odo, const T* const p_odo, const T* const att_odo, const T* const point, T* residuals) const { T q[4], t[3]; worldToCameraTransform(q_cam_odo, t_cam_odo, p_odo, att_odo, q, t, m_optimize_cam_odo_z); Eigen::Matrix P(point[0], point[1], point[2]); // project 3D object point to the image plane Eigen::Matrix predicted_p; CameraT::spaceToPlane(intrinsic_params, q, t, P, predicted_p); Eigen::Matrix err = predicted_p - m_observed_p.cast(); Eigen::Matrix err_weighted = m_sqrtPrecisionMat.cast() * err; residuals[0] = err_weighted(0); residuals[1] = err_weighted(1); return true; } // variables: camera-to-odometry transform, 3D point template bool operator()(const T* const q_cam_odo, const T* const t_cam_odo, const T* const point, T* residuals) const { T p_odo[3] = {T(m_odo_pos(0)), T(m_odo_pos(1)), T(m_odo_pos(2))}; T att_odo[3] = {T(m_odo_att(0)), T(m_odo_att(1)), T(m_odo_att(2))}; T q[4], t[3]; worldToCameraTransform(q_cam_odo, t_cam_odo, p_odo, att_odo, q, t, m_optimize_cam_odo_z); std::vector 
intrinsic_params(m_intrinsic_params.begin(), m_intrinsic_params.end()); Eigen::Matrix P(point[0], point[1], point[2]); // project 3D object point to the image plane Eigen::Matrix predicted_p; CameraT::spaceToPlane(intrinsic_params.data(), q, t, P, predicted_p); residuals[0] = predicted_p(0) - T(m_observed_p(0)); residuals[1] = predicted_p(1) - T(m_observed_p(1)); return true; } // variables: camera-to-odometry transform, odometry extrinsics, 3D point template bool operator()(const T* const q_cam_odo, const T* const t_cam_odo, const T* const p_odo, const T* const att_odo, const T* const point, T* residuals) const { T q[4], t[3]; worldToCameraTransform(q_cam_odo, t_cam_odo, p_odo, att_odo, q, t, m_optimize_cam_odo_z); std::vector intrinsic_params(m_intrinsic_params.begin(), m_intrinsic_params.end()); Eigen::Matrix P(point[0], point[1], point[2]); // project 3D object point to the image plane Eigen::Matrix predicted_p; CameraT::spaceToPlane(intrinsic_params.data(), q, t, P, predicted_p); Eigen::Matrix err = predicted_p - m_observed_p.cast(); Eigen::Matrix err_weighted = m_sqrtPrecisionMat.cast() * err; residuals[0] = err_weighted(0); residuals[1] = err_weighted(1); return true; } // variables: 3D point template bool operator()(const T* const point, T* residuals) const { T q_cam_odo[4] = {T(m_cam_odo_q.coeffs()(0)), T(m_cam_odo_q.coeffs()(1)), T(m_cam_odo_q.coeffs()(2)), T(m_cam_odo_q.coeffs()(3))}; T t_cam_odo[3] = {T(m_cam_odo_t(0)), T(m_cam_odo_t(1)), T(m_cam_odo_t(2))}; T p_odo[3] = {T(m_odo_pos(0)), T(m_odo_pos(1)), T(m_odo_pos(2))}; T att_odo[3] = {T(m_odo_att(0)), T(m_odo_att(1)), T(m_odo_att(2))}; T q[4], t[3]; worldToCameraTransform(q_cam_odo, t_cam_odo, p_odo, att_odo, q, t, m_optimize_cam_odo_z); std::vector intrinsic_params(m_intrinsic_params.begin(), m_intrinsic_params.end()); Eigen::Matrix P(point[0], point[1], point[2]); // project 3D object point to the image plane Eigen::Matrix predicted_p; CameraT::spaceToPlane(intrinsic_params.data(), q, t, P, 
predicted_p); residuals[0] = predicted_p(0) - T(m_observed_p(0)); residuals[1] = predicted_p(1) - T(m_observed_p(1)); return true; } private: // camera intrinsics std::vector m_intrinsic_params; // observed camera-odometry transform Eigen::Quaterniond m_cam_odo_q; Eigen::Vector3d m_cam_odo_t; // observed odometry Eigen::Vector3d m_odo_pos; Eigen::Vector3d m_odo_att; // observed 2D point Eigen::Vector2d m_observed_p; Eigen::Matrix2d m_sqrtPrecisionMat; bool m_optimize_cam_odo_z; }; // variables: camera intrinsics and camera extrinsics template class StereoReprojectionError { public: EIGEN_MAKE_ALIGNED_OPERATOR_NEW StereoReprojectionError(const Eigen::Vector3d& observed_P, const Eigen::Vector2d& observed_p_l, const Eigen::Vector2d& observed_p_r) : m_observed_P(observed_P) , m_observed_p_l(observed_p_l) , m_observed_p_r(observed_p_r) { } template bool operator()(const T* const intrinsic_params_l, const T* const intrinsic_params_r, const T* const q_l, const T* const t_l, const T* const q_l_r, const T* const t_l_r, T* residuals) const { Eigen::Matrix P; P(0) = T(m_observed_P(0)); P(1) = T(m_observed_P(1)); P(2) = T(m_observed_P(2)); Eigen::Matrix predicted_p_l; CameraT::spaceToPlane(intrinsic_params_l, q_l, t_l, P, predicted_p_l); Eigen::Quaternion q_r = Eigen::Quaternion(q_l_r) * Eigen::Quaternion(q_l); Eigen::Matrix t_r; t_r(0) = t_l[0]; t_r(1) = t_l[1]; t_r(2) = t_l[2]; t_r = Eigen::Quaternion(q_l_r) * t_r; t_r(0) += t_l_r[0]; t_r(1) += t_l_r[1]; t_r(2) += t_l_r[2]; Eigen::Matrix predicted_p_r; CameraT::spaceToPlane(intrinsic_params_r, q_r.coeffs().data(), t_r.data(), P, predicted_p_r); residuals[0] = predicted_p_l(0) - T(m_observed_p_l(0)); residuals[1] = predicted_p_l(1) - T(m_observed_p_l(1)); residuals[2] = predicted_p_r(0) - T(m_observed_p_r(0)); residuals[3] = predicted_p_r(1) - T(m_observed_p_r(1)); return true; } private: // observed 3D point Eigen::Vector3d m_observed_P; // observed 2D point Eigen::Vector2d m_observed_p_l; Eigen::Vector2d m_observed_p_r; }; 
template<class CameraT>
class ComprehensionError
{
public:
    EIGEN_MAKE_ALIGNED_OPERATOR_NEW

    ComprehensionError(const Eigen::Vector3d& observed_P,
                       const Eigen::Vector2d& observed_p)
        : m_observed_P(observed_P), m_observed_p(observed_p),
          m_sqrtPrecisionMat(Eigen::Matrix2d::Identity())
    {
    }

    template <typename T>
    bool operator()(const T* const intrinsic_params,
                    const T* const q, const T* const t,
                    T* residuals) const
    {
        {
            Eigen::Matrix<T, 2, 1> p = m_observed_p.cast<T>();
            Eigen::Matrix<T, 3, 1> predicted_img_P;
            CameraT::LiftToSphere(intrinsic_params, p, predicted_img_P);

            Eigen::Matrix<T, 2, 1> predicted_p;
            CameraT::SphereToPlane(intrinsic_params, predicted_img_P, predicted_p);

            Eigen::Matrix<T, 2, 1> e = predicted_p - m_observed_p.cast<T>();
            Eigen::Matrix<T, 2, 1> e_weighted = m_sqrtPrecisionMat.cast<T>() * e;
            residuals[0] = e_weighted(0);
            residuals[1] = e_weighted(1);
        }
        {
            Eigen::Matrix<T, 3, 1> P = m_observed_P.cast<T>();
            Eigen::Matrix<T, 2, 1> predicted_p;
            CameraT::spaceToPlane(intrinsic_params, q, t, P, predicted_p);

            Eigen::Matrix<T, 2, 1> e = predicted_p - m_observed_p.cast<T>();
            Eigen::Matrix<T, 2, 1> e_weighted = m_sqrtPrecisionMat.cast<T>() * e;
            residuals[2] = e_weighted(0);
            residuals[3] = e_weighted(1);
        }
        return true;
    }

private:
    // camera intrinsics
    // std::vector<double> m_intrinsic_params;

    // observed 3D point
    Eigen::Vector3d m_observed_P;

    // observed 2D point
    Eigen::Vector2d m_observed_p;

    // square root of precision matrix
    Eigen::Matrix2d m_sqrtPrecisionMat;
};

boost::shared_ptr<CostFunctionFactory> CostFunctionFactory::m_instance;

CostFunctionFactory::CostFunctionFactory()
{
}

boost::shared_ptr<CostFunctionFactory>
CostFunctionFactory::instance(void)
{
    if (m_instance.get() == 0)
    {
        m_instance.reset(new CostFunctionFactory);
    }
    return m_instance;
}

ceres::CostFunction*
CostFunctionFactory::generateCostFunction(const CameraConstPtr& camera,
                                          const Eigen::Vector3d& observed_P,
                                          const Eigen::Vector2d& observed_p,
                                          int flags) const
{
    ceres::CostFunction* costFunction = 0;

    std::vector<double> intrinsic_params;
    camera->writeParameters(intrinsic_params);

    switch (flags)
    {
    case CAMERA_INTRINSICS | CAMERA_POSE:
        switch (camera->modelType())
        {
        case
Camera::KANNALA_BRANDT:
            costFunction = new ceres::AutoDiffCostFunction<ReprojectionError1<EquidistantCamera>, 2, 8, 4, 3>(
                new ReprojectionError1<EquidistantCamera>(observed_P, observed_p));
            break;
        case Camera::PINHOLE:
            costFunction = new ceres::AutoDiffCostFunction<ReprojectionError1<PinholeCamera>, 2, 8, 4, 3>(
                new ReprojectionError1<PinholeCamera>(observed_P, observed_p));
            break;
        case Camera::MEI:
            costFunction = new ceres::AutoDiffCostFunction<ReprojectionError1<CataCamera>, 2, 9, 4, 3>(
                new ReprojectionError1<CataCamera>(observed_P, observed_p));
            break;
        case Camera::SCARAMUZZA:
            costFunction = new ceres::AutoDiffCostFunction<ComprehensionError<OCAMCamera>, 4, SCARAMUZZA_CAMERA_NUM_PARAMS, 4, 3>(
                new ComprehensionError<OCAMCamera>(observed_P, observed_p));
            // new ceres::AutoDiffCostFunction<ReprojectionError1<OCAMCamera>, 2, SCARAMUZZA_CAMERA_NUM_PARAMS, 4, 3>(
            //     new ReprojectionError1<OCAMCamera>(observed_P, observed_p));
            break;
        }
        break;
    case CAMERA_ODOMETRY_TRANSFORM | ODOMETRY_6D_POSE:
        switch (camera->modelType())
        {
        case Camera::KANNALA_BRANDT:
            costFunction = new ceres::AutoDiffCostFunction<ReprojectionError1<EquidistantCamera>, 2, 4, 3, 3, 3>(
                new ReprojectionError1<EquidistantCamera>(intrinsic_params, observed_P, observed_p));
            break;
        case Camera::PINHOLE:
            costFunction = new ceres::AutoDiffCostFunction<ReprojectionError1<PinholeCamera>, 2, 4, 3, 3, 3>(
                new ReprojectionError1<PinholeCamera>(intrinsic_params, observed_P, observed_p));
            break;
        case Camera::MEI:
            costFunction = new ceres::AutoDiffCostFunction<ReprojectionError1<CataCamera>, 2, 4, 3, 3, 3>(
                new ReprojectionError1<CataCamera>(intrinsic_params, observed_P, observed_p));
            break;
        case Camera::SCARAMUZZA:
            costFunction = new ceres::AutoDiffCostFunction<ReprojectionError1<OCAMCamera>, 2, 4, 3, 3, 3>(
                new ReprojectionError1<OCAMCamera>(intrinsic_params, observed_P, observed_p));
            break;
        }
        break;
    }

    return costFunction;
}

ceres::CostFunction*
CostFunctionFactory::generateCostFunction(const CameraConstPtr& camera,
                                          const Eigen::Vector3d& observed_P,
                                          const Eigen::Vector2d& observed_p,
                                          const Eigen::Matrix2d& sqrtPrecisionMat,
                                          int flags) const
{
    ceres::CostFunction* costFunction = 0;

    std::vector<double> intrinsic_params;
    camera->writeParameters(intrinsic_params);

    switch (flags)
    {
    case CAMERA_INTRINSICS | CAMERA_POSE:
        switch (camera->modelType())
        {
        case Camera::KANNALA_BRANDT:
            costFunction = new ceres::AutoDiffCostFunction<ReprojectionError1<EquidistantCamera>, 2, 8, 4, 3>(
                new
ReprojectionError1<EquidistantCamera>(observed_P, observed_p, sqrtPrecisionMat));
            break;
        case Camera::PINHOLE:
            costFunction = new ceres::AutoDiffCostFunction<ReprojectionError1<PinholeCamera>, 2, 8, 4, 3>(
                new ReprojectionError1<PinholeCamera>(observed_P, observed_p, sqrtPrecisionMat));
            break;
        case Camera::MEI:
            costFunction = new ceres::AutoDiffCostFunction<ReprojectionError1<CataCamera>, 2, 9, 4, 3>(
                new ReprojectionError1<CataCamera>(observed_P, observed_p, sqrtPrecisionMat));
            break;
        case Camera::SCARAMUZZA:
            costFunction = new ceres::AutoDiffCostFunction<ReprojectionError1<OCAMCamera>, 2, SCARAMUZZA_CAMERA_NUM_PARAMS, 4, 3>(
                new ReprojectionError1<OCAMCamera>(observed_P, observed_p, sqrtPrecisionMat));
            break;
        }
        break;
    }

    return costFunction;
}

ceres::CostFunction*
CostFunctionFactory::generateCostFunction(const CameraConstPtr& camera,
                                          const Eigen::Vector2d& observed_p,
                                          int flags, bool optimize_cam_odo_z) const
{
    ceres::CostFunction* costFunction = 0;

    std::vector<double> intrinsic_params;
    camera->writeParameters(intrinsic_params);

    switch (flags)
    {
    case CAMERA_POSE | POINT_3D:
        switch (camera->modelType())
        {
        case Camera::KANNALA_BRANDT:
            costFunction = new ceres::AutoDiffCostFunction<ReprojectionError2<EquidistantCamera>, 2, 4, 3, 3>(
                new ReprojectionError2<EquidistantCamera>(intrinsic_params, observed_p));
            break;
        case Camera::PINHOLE:
            costFunction = new ceres::AutoDiffCostFunction<ReprojectionError2<PinholeCamera>, 2, 4, 3, 3>(
                new ReprojectionError2<PinholeCamera>(intrinsic_params, observed_p));
            break;
        case Camera::MEI:
            costFunction = new ceres::AutoDiffCostFunction<ReprojectionError2<CataCamera>, 2, 4, 3, 3>(
                new ReprojectionError2<CataCamera>(intrinsic_params, observed_p));
            break;
        case Camera::SCARAMUZZA:
            costFunction = new ceres::AutoDiffCostFunction<ReprojectionError2<OCAMCamera>, 2, 4, 3, 3>(
                new ReprojectionError2<OCAMCamera>(intrinsic_params, observed_p));
            break;
        }
        break;
    case CAMERA_ODOMETRY_TRANSFORM | ODOMETRY_3D_POSE | POINT_3D:
        switch (camera->modelType())
        {
        case Camera::KANNALA_BRANDT:
            if (optimize_cam_odo_z)
            {
                costFunction = new ceres::AutoDiffCostFunction<ReprojectionError3<EquidistantCamera>, 2, 4, 3, 2, 1, 3>(
                    new ReprojectionError3<EquidistantCamera>(intrinsic_params, observed_p));
            }
            else
            {
                costFunction = new ceres::AutoDiffCostFunction<ReprojectionError3<EquidistantCamera>, 2, 4, 2, 2, 1, 3>(
                    new ReprojectionError3<EquidistantCamera>(intrinsic_params, observed_p));
            }
            break;
        case Camera::PINHOLE:
            if
(optimize_cam_odo_z)
            {
                costFunction = new ceres::AutoDiffCostFunction<ReprojectionError3<PinholeCamera>, 2, 4, 3, 2, 1, 3>(
                    new ReprojectionError3<PinholeCamera>(intrinsic_params, observed_p));
            }
            else
            {
                costFunction = new ceres::AutoDiffCostFunction<ReprojectionError3<PinholeCamera>, 2, 4, 2, 2, 1, 3>(
                    new ReprojectionError3<PinholeCamera>(intrinsic_params, observed_p));
            }
            break;
        case Camera::MEI:
            if (optimize_cam_odo_z)
            {
                costFunction = new ceres::AutoDiffCostFunction<ReprojectionError3<CataCamera>, 2, 4, 3, 2, 1, 3>(
                    new ReprojectionError3<CataCamera>(intrinsic_params, observed_p));
            }
            else
            {
                costFunction = new ceres::AutoDiffCostFunction<ReprojectionError3<CataCamera>, 2, 4, 2, 2, 1, 3>(
                    new ReprojectionError3<CataCamera>(intrinsic_params, observed_p));
            }
            break;
        case Camera::SCARAMUZZA:
            if (optimize_cam_odo_z)
            {
                costFunction = new ceres::AutoDiffCostFunction<ReprojectionError3<OCAMCamera>, 2, 4, 3, 2, 1, 3>(
                    new ReprojectionError3<OCAMCamera>(intrinsic_params, observed_p));
            }
            else
            {
                costFunction = new ceres::AutoDiffCostFunction<ReprojectionError3<OCAMCamera>, 2, 4, 2, 2, 1, 3>(
                    new ReprojectionError3<OCAMCamera>(intrinsic_params, observed_p));
            }
            break;
        }
        break;
    case CAMERA_ODOMETRY_TRANSFORM | ODOMETRY_6D_POSE | POINT_3D:
        switch (camera->modelType())
        {
        case Camera::KANNALA_BRANDT:
            if (optimize_cam_odo_z)
            {
                costFunction = new ceres::AutoDiffCostFunction<ReprojectionError3<EquidistantCamera>, 2, 4, 3, 3, 3, 3>(
                    new ReprojectionError3<EquidistantCamera>(intrinsic_params, observed_p));
            }
            else
            {
                costFunction = new ceres::AutoDiffCostFunction<ReprojectionError3<EquidistantCamera>, 2, 4, 2, 3, 3, 3>(
                    new ReprojectionError3<EquidistantCamera>(intrinsic_params, observed_p));
            }
            break;
        case Camera::PINHOLE:
            if (optimize_cam_odo_z)
            {
                costFunction = new ceres::AutoDiffCostFunction<ReprojectionError3<PinholeCamera>, 2, 4, 3, 3, 3, 3>(
                    new ReprojectionError3<PinholeCamera>(intrinsic_params, observed_p));
            }
            else
            {
                costFunction = new ceres::AutoDiffCostFunction<ReprojectionError3<PinholeCamera>, 2, 4, 2, 3, 3, 3>(
                    new ReprojectionError3<PinholeCamera>(intrinsic_params, observed_p));
            }
            break;
        case Camera::MEI:
            if (optimize_cam_odo_z)
            {
                costFunction = new ceres::AutoDiffCostFunction<ReprojectionError3<CataCamera>, 2, 4, 3, 3, 3, 3>(
                    new ReprojectionError3<CataCamera>(intrinsic_params, observed_p));
            }
            else
            {
                costFunction = new ceres::AutoDiffCostFunction<ReprojectionError3<CataCamera>, 2, 4, 2, 3, 3, 3>(
                    new ReprojectionError3<CataCamera>(intrinsic_params, observed_p));
            }
            break;
        case Camera::SCARAMUZZA:
            if (optimize_cam_odo_z)
            {
costFunction = new ceres::AutoDiffCostFunction<ReprojectionError3<OCAMCamera>, 2, 4, 3, 3, 3, 3>(
                    new ReprojectionError3<OCAMCamera>(intrinsic_params, observed_p));
            }
            else
            {
                costFunction = new ceres::AutoDiffCostFunction<ReprojectionError3<OCAMCamera>, 2, 4, 2, 3, 3, 3>(
                    new ReprojectionError3<OCAMCamera>(intrinsic_params, observed_p));
            }
            break;
        }
        break;
    case CAMERA_INTRINSICS | CAMERA_ODOMETRY_TRANSFORM | ODOMETRY_3D_POSE | POINT_3D:
        switch (camera->modelType())
        {
        case Camera::KANNALA_BRANDT:
            if (optimize_cam_odo_z)
            {
                costFunction = new ceres::AutoDiffCostFunction<ReprojectionError3<EquidistantCamera>, 2, 8, 4, 3, 2, 1, 3>(
                    new ReprojectionError3<EquidistantCamera>(observed_p));
            }
            else
            {
                costFunction = new ceres::AutoDiffCostFunction<ReprojectionError3<EquidistantCamera>, 2, 8, 4, 2, 2, 1, 3>(
                    new ReprojectionError3<EquidistantCamera>(observed_p));
            }
            break;
        case Camera::PINHOLE:
            if (optimize_cam_odo_z)
            {
                costFunction = new ceres::AutoDiffCostFunction<ReprojectionError3<PinholeCamera>, 2, 8, 4, 3, 2, 1, 3>(
                    new ReprojectionError3<PinholeCamera>(observed_p));
            }
            else
            {
                costFunction = new ceres::AutoDiffCostFunction<ReprojectionError3<PinholeCamera>, 2, 8, 4, 2, 2, 1, 3>(
                    new ReprojectionError3<PinholeCamera>(observed_p));
            }
            break;
        case Camera::MEI:
            if (optimize_cam_odo_z)
            {
                costFunction = new ceres::AutoDiffCostFunction<ReprojectionError3<CataCamera>, 2, 9, 4, 3, 2, 1, 3>(
                    new ReprojectionError3<CataCamera>(observed_p));
            }
            else
            {
                costFunction = new ceres::AutoDiffCostFunction<ReprojectionError3<CataCamera>, 2, 9, 4, 2, 2, 1, 3>(
                    new ReprojectionError3<CataCamera>(observed_p));
            }
            break;
        case Camera::SCARAMUZZA:
            if (optimize_cam_odo_z)
            {
                costFunction = new ceres::AutoDiffCostFunction<ReprojectionError3<OCAMCamera>, 2, SCARAMUZZA_CAMERA_NUM_PARAMS, 4, 3, 2, 1, 3>(
                    new ReprojectionError3<OCAMCamera>(observed_p));
            }
            else
            {
                costFunction = new ceres::AutoDiffCostFunction<ReprojectionError3<OCAMCamera>, 2, SCARAMUZZA_CAMERA_NUM_PARAMS, 4, 2, 2, 1, 3>(
                    new ReprojectionError3<OCAMCamera>(observed_p));
            }
            break;
        }
        break;
    case CAMERA_INTRINSICS | CAMERA_ODOMETRY_TRANSFORM | ODOMETRY_6D_POSE | POINT_3D:
        switch (camera->modelType())
        {
        case Camera::KANNALA_BRANDT:
            if (optimize_cam_odo_z)
            {
                costFunction = new ceres::AutoDiffCostFunction<ReprojectionError3<EquidistantCamera>, 2, 8, 4, 3, 3, 3, 3>(
                    new ReprojectionError3<EquidistantCamera>(observed_p));
            }
            else
            {
                costFunction = new ceres::AutoDiffCostFunction<ReprojectionError3<EquidistantCamera>, 2, 8, 4, 2, 3, 3, 3>(
                    new ReprojectionError3<EquidistantCamera>(observed_p));
            }
            break;
        case Camera::PINHOLE:
            if
(optimize_cam_odo_z)
            {
                costFunction = new ceres::AutoDiffCostFunction<ReprojectionError3<PinholeCamera>, 2, 8, 4, 3, 3, 3, 3>(
                    new ReprojectionError3<PinholeCamera>(observed_p));
            }
            else
            {
                costFunction = new ceres::AutoDiffCostFunction<ReprojectionError3<PinholeCamera>, 2, 8, 4, 2, 3, 3, 3>(
                    new ReprojectionError3<PinholeCamera>(observed_p));
            }
            break;
        case Camera::MEI:
            if (optimize_cam_odo_z)
            {
                costFunction = new ceres::AutoDiffCostFunction<ReprojectionError3<CataCamera>, 2, 9, 4, 3, 3, 3, 3>(
                    new ReprojectionError3<CataCamera>(observed_p));
            }
            else
            {
                costFunction = new ceres::AutoDiffCostFunction<ReprojectionError3<CataCamera>, 2, 9, 4, 2, 3, 3, 3>(
                    new ReprojectionError3<CataCamera>(observed_p));
            }
            break;
        case Camera::SCARAMUZZA:
            if (optimize_cam_odo_z)
            {
                costFunction = new ceres::AutoDiffCostFunction<ReprojectionError3<OCAMCamera>, 2, SCARAMUZZA_CAMERA_NUM_PARAMS, 4, 3, 3, 3, 3>(
                    new ReprojectionError3<OCAMCamera>(observed_p));
            }
            else
            {
                costFunction = new ceres::AutoDiffCostFunction<ReprojectionError3<OCAMCamera>, 2, SCARAMUZZA_CAMERA_NUM_PARAMS, 4, 2, 3, 3, 3>(
                    new ReprojectionError3<OCAMCamera>(observed_p));
            }
            break;
        }
        break;
    }
    return costFunction;
}

ceres::CostFunction*
CostFunctionFactory::generateCostFunction(const CameraConstPtr& camera,
                                          const Eigen::Vector2d& observed_p,
                                          const Eigen::Matrix2d& sqrtPrecisionMat,
                                          int flags, bool optimize_cam_odo_z) const
{
    ceres::CostFunction* costFunction = 0;

    std::vector<double> intrinsic_params;
    camera->writeParameters(intrinsic_params);

    switch (flags)
    {
    case CAMERA_ODOMETRY_TRANSFORM | ODOMETRY_6D_POSE | POINT_3D:
        switch (camera->modelType())
        {
        case Camera::KANNALA_BRANDT:
            if (optimize_cam_odo_z)
            {
                costFunction = new ceres::AutoDiffCostFunction<ReprojectionError3<EquidistantCamera>, 2, 4, 3, 3, 3, 3>(
                    new ReprojectionError3<EquidistantCamera>(intrinsic_params, observed_p, sqrtPrecisionMat));
            }
            else
            {
                costFunction = new ceres::AutoDiffCostFunction<ReprojectionError3<EquidistantCamera>, 2, 4, 2, 3, 3, 3>(
                    new ReprojectionError3<EquidistantCamera>(intrinsic_params, observed_p, sqrtPrecisionMat));
            }
            break;
        case Camera::PINHOLE:
            if (optimize_cam_odo_z)
            {
                costFunction = new ceres::AutoDiffCostFunction<ReprojectionError3<PinholeCamera>, 2, 4, 3, 3, 3, 3>(
                    new ReprojectionError3<PinholeCamera>(intrinsic_params, observed_p, sqrtPrecisionMat));
            }
            else
            {
                costFunction = new ceres::AutoDiffCostFunction<ReprojectionError3<PinholeCamera>, 2, 4, 2, 3, 3, 3>(
                    new ReprojectionError3<PinholeCamera>(intrinsic_params,
observed_p, sqrtPrecisionMat));
            }
            break;
        case Camera::MEI:
            if (optimize_cam_odo_z)
            {
                costFunction = new ceres::AutoDiffCostFunction<ReprojectionError3<CataCamera>, 2, 4, 3, 3, 3, 3>(
                    new ReprojectionError3<CataCamera>(intrinsic_params, observed_p, sqrtPrecisionMat));
            }
            else
            {
                costFunction = new ceres::AutoDiffCostFunction<ReprojectionError3<CataCamera>, 2, 4, 2, 3, 3, 3>(
                    new ReprojectionError3<CataCamera>(intrinsic_params, observed_p, sqrtPrecisionMat));
            }
            break;
        case Camera::SCARAMUZZA:
            if (optimize_cam_odo_z)
            {
                costFunction = new ceres::AutoDiffCostFunction<ReprojectionError3<OCAMCamera>, 2, 4, 3, 3, 3, 3>(
                    new ReprojectionError3<OCAMCamera>(intrinsic_params, observed_p, sqrtPrecisionMat));
            }
            else
            {
                costFunction = new ceres::AutoDiffCostFunction<ReprojectionError3<OCAMCamera>, 2, 4, 2, 3, 3, 3>(
                    new ReprojectionError3<OCAMCamera>(intrinsic_params, observed_p, sqrtPrecisionMat));
            }
            break;
        }
        break;
    case CAMERA_INTRINSICS | CAMERA_ODOMETRY_TRANSFORM | ODOMETRY_6D_POSE | POINT_3D:
        switch (camera->modelType())
        {
        case Camera::KANNALA_BRANDT:
            if (optimize_cam_odo_z)
            {
                costFunction = new ceres::AutoDiffCostFunction<ReprojectionError3<EquidistantCamera>, 2, 8, 4, 3, 3, 3, 3>(
                    new ReprojectionError3<EquidistantCamera>(observed_p, sqrtPrecisionMat));
            }
            else
            {
                costFunction = new ceres::AutoDiffCostFunction<ReprojectionError3<EquidistantCamera>, 2, 8, 4, 2, 3, 3, 3>(
                    new ReprojectionError3<EquidistantCamera>(observed_p, sqrtPrecisionMat));
            }
            break;
        case Camera::PINHOLE:
            if (optimize_cam_odo_z)
            {
                costFunction = new ceres::AutoDiffCostFunction<ReprojectionError3<PinholeCamera>, 2, 8, 4, 3, 3, 3, 3>(
                    new ReprojectionError3<PinholeCamera>(observed_p, sqrtPrecisionMat));
            }
            else
            {
                costFunction = new ceres::AutoDiffCostFunction<ReprojectionError3<PinholeCamera>, 2, 8, 4, 2, 3, 3, 3>(
                    new ReprojectionError3<PinholeCamera>(observed_p, sqrtPrecisionMat));
            }
            break;
        case Camera::MEI:
            if (optimize_cam_odo_z)
            {
                costFunction = new ceres::AutoDiffCostFunction<ReprojectionError3<CataCamera>, 2, 9, 4, 3, 3, 3, 3>(
                    new ReprojectionError3<CataCamera>(observed_p, sqrtPrecisionMat));
            }
            else
            {
                costFunction = new ceres::AutoDiffCostFunction<ReprojectionError3<CataCamera>, 2, 9, 4, 2, 3, 3, 3>(
                    new ReprojectionError3<CataCamera>(observed_p, sqrtPrecisionMat));
            }
            break;
        case Camera::SCARAMUZZA:
            if (optimize_cam_odo_z)
            {
                costFunction = new ceres::AutoDiffCostFunction<ReprojectionError3<OCAMCamera>, 2, SCARAMUZZA_CAMERA_NUM_PARAMS, 4, 3, 3, 3, 3>(
                    new ReprojectionError3<OCAMCamera>(observed_p,
sqrtPrecisionMat));
            }
            else
            {
                costFunction = new ceres::AutoDiffCostFunction<ReprojectionError3<OCAMCamera>, 2, SCARAMUZZA_CAMERA_NUM_PARAMS, 4, 2, 3, 3, 3>(
                    new ReprojectionError3<OCAMCamera>(observed_p, sqrtPrecisionMat));
            }
            break;
        }
        break;
    }
    return costFunction;
}

ceres::CostFunction*
CostFunctionFactory::generateCostFunction(const CameraConstPtr& camera,
                                          const Eigen::Vector3d& odo_pos,
                                          const Eigen::Vector3d& odo_att,
                                          const Eigen::Vector2d& observed_p,
                                          int flags, bool optimize_cam_odo_z) const
{
    ceres::CostFunction* costFunction = 0;

    std::vector<double> intrinsic_params;
    camera->writeParameters(intrinsic_params);

    switch (flags)
    {
    case CAMERA_ODOMETRY_TRANSFORM | POINT_3D:
        switch (camera->modelType())
        {
        case Camera::KANNALA_BRANDT:
            if (optimize_cam_odo_z)
            {
                costFunction = new ceres::AutoDiffCostFunction<ReprojectionError3<EquidistantCamera>, 2, 4, 3, 3>(
                    new ReprojectionError3<EquidistantCamera>(intrinsic_params, odo_pos, odo_att, observed_p, optimize_cam_odo_z));
            }
            else
            {
                costFunction = new ceres::AutoDiffCostFunction<ReprojectionError3<EquidistantCamera>, 2, 4, 2, 3>(
                    new ReprojectionError3<EquidistantCamera>(intrinsic_params, odo_pos, odo_att, observed_p, optimize_cam_odo_z));
            }
            break;
        case Camera::PINHOLE:
            if (optimize_cam_odo_z)
            {
                costFunction = new ceres::AutoDiffCostFunction<ReprojectionError3<PinholeCamera>, 2, 4, 3, 3>(
                    new ReprojectionError3<PinholeCamera>(intrinsic_params, odo_pos, odo_att, observed_p, optimize_cam_odo_z));
            }
            else
            {
                costFunction = new ceres::AutoDiffCostFunction<ReprojectionError3<PinholeCamera>, 2, 4, 2, 3>(
                    new ReprojectionError3<PinholeCamera>(intrinsic_params, odo_pos, odo_att, observed_p, optimize_cam_odo_z));
            }
            break;
        case Camera::MEI:
            if (optimize_cam_odo_z)
            {
                costFunction = new ceres::AutoDiffCostFunction<ReprojectionError3<CataCamera>, 2, 4, 3, 3>(
                    new ReprojectionError3<CataCamera>(intrinsic_params, odo_pos, odo_att, observed_p, optimize_cam_odo_z));
            }
            else
            {
                costFunction = new ceres::AutoDiffCostFunction<ReprojectionError3<CataCamera>, 2, 4, 2, 3>(
                    new ReprojectionError3<CataCamera>(intrinsic_params, odo_pos, odo_att, observed_p, optimize_cam_odo_z));
            }
            break;
        case Camera::SCARAMUZZA:
            if (optimize_cam_odo_z)
            {
                costFunction = new ceres::AutoDiffCostFunction<ReprojectionError3<OCAMCamera>, 2, 4, 3, 3>(
                    new ReprojectionError3<OCAMCamera>(intrinsic_params, odo_pos, odo_att, observed_p, optimize_cam_odo_z));
            }
            else
{
                costFunction = new ceres::AutoDiffCostFunction<ReprojectionError3<OCAMCamera>, 2, 4, 2, 3>(
                    new ReprojectionError3<OCAMCamera>(intrinsic_params, odo_pos, odo_att, observed_p, optimize_cam_odo_z));
            }
            break;
        }
        break;
    }
    return costFunction;
}

ceres::CostFunction*
CostFunctionFactory::generateCostFunction(const CameraConstPtr& camera,
                                          const Eigen::Quaterniond& cam_odo_q,
                                          const Eigen::Vector3d& cam_odo_t,
                                          const Eigen::Vector3d& odo_pos,
                                          const Eigen::Vector3d& odo_att,
                                          const Eigen::Vector2d& observed_p,
                                          int flags) const
{
    ceres::CostFunction* costFunction = 0;

    std::vector<double> intrinsic_params;
    camera->writeParameters(intrinsic_params);

    switch (flags)
    {
    case POINT_3D:
        switch (camera->modelType())
        {
        case Camera::KANNALA_BRANDT:
            costFunction = new ceres::AutoDiffCostFunction<ReprojectionError3<EquidistantCamera>, 2, 3>(
                new ReprojectionError3<EquidistantCamera>(intrinsic_params, cam_odo_q, cam_odo_t, odo_pos, odo_att, observed_p));
            break;
        case Camera::PINHOLE:
            costFunction = new ceres::AutoDiffCostFunction<ReprojectionError3<PinholeCamera>, 2, 3>(
                new ReprojectionError3<PinholeCamera>(intrinsic_params, cam_odo_q, cam_odo_t, odo_pos, odo_att, observed_p));
            break;
        case Camera::MEI:
            costFunction = new ceres::AutoDiffCostFunction<ReprojectionError3<CataCamera>, 2, 3>(
                new ReprojectionError3<CataCamera>(intrinsic_params, cam_odo_q, cam_odo_t, odo_pos, odo_att, observed_p));
            break;
        case Camera::SCARAMUZZA:
            costFunction = new ceres::AutoDiffCostFunction<ReprojectionError3<OCAMCamera>, 2, 3>(
                new ReprojectionError3<OCAMCamera>(intrinsic_params, cam_odo_q, cam_odo_t, odo_pos, odo_att, observed_p));
            break;
        }
        break;
    }
    return costFunction;
}

ceres::CostFunction*
CostFunctionFactory::generateCostFunction(const CameraConstPtr& cameraL,
                                          const CameraConstPtr& cameraR,
                                          const Eigen::Vector3d& observed_P,
                                          const Eigen::Vector2d& observed_p_l,
                                          const Eigen::Vector2d& observed_p_r) const
{
    ceres::CostFunction* costFunction = 0;

    if (cameraL->modelType() != cameraR->modelType())
    {
        return costFunction;
    }

    switch (cameraL->modelType())
    {
    case Camera::KANNALA_BRANDT:
        costFunction = new ceres::AutoDiffCostFunction<StereoReprojectionError<EquidistantCamera>, 4, 8, 8, 4, 3, 4, 3>(
            new StereoReprojectionError<EquidistantCamera>(observed_P, observed_p_l, observed_p_r));
        break;
    case Camera::PINHOLE:
costFunction = new ceres::AutoDiffCostFunction<StereoReprojectionError<PinholeCamera>, 4, 8, 8, 4, 3, 4, 3>(
            new StereoReprojectionError<PinholeCamera>(observed_P, observed_p_l, observed_p_r));
        break;
    case Camera::MEI:
        costFunction = new ceres::AutoDiffCostFunction<StereoReprojectionError<CataCamera>, 4, 9, 9, 4, 3, 4, 3>(
            new StereoReprojectionError<CataCamera>(observed_P, observed_p_l, observed_p_r));
        break;
    case Camera::SCARAMUZZA:
        costFunction = new ceres::AutoDiffCostFunction<StereoReprojectionError<OCAMCamera>, 4, SCARAMUZZA_CAMERA_NUM_PARAMS, SCARAMUZZA_CAMERA_NUM_PARAMS, 4, 3, 4, 3>(
            new StereoReprojectionError<OCAMCamera>(observed_P, observed_p_l, observed_p_r));
        break;
    }
    return costFunction;
}

}

================================================ FILE: camera_model/src/camera_models/EquidistantCamera.cc ================================================
#include "camodocal/camera_models/EquidistantCamera.h"

#include <cmath>
#include <cstdio>
#include <eigen3/Eigen/Dense>
#include <iomanip>
#include <iostream>
#include <opencv2/calib3d/calib3d.hpp>
#include <opencv2/core/eigen.hpp>

#include "camodocal/gpl/gpl.h"

namespace camodocal
{

EquidistantCamera::Parameters::Parameters()
 : Camera::Parameters(KANNALA_BRANDT)
 , m_k2(0.0)
 , m_k3(0.0)
 , m_k4(0.0)
 , m_k5(0.0)
 , m_mu(0.0)
 , m_mv(0.0)
 , m_u0(0.0)
 , m_v0(0.0)
{
}

EquidistantCamera::Parameters::Parameters(const std::string& cameraName,
                                          int w, int h,
                                          double k2, double k3, double k4, double k5,
                                          double mu, double mv,
                                          double u0, double v0)
 : Camera::Parameters(KANNALA_BRANDT, cameraName, w, h)
 , m_k2(k2)
 , m_k3(k3)
 , m_k4(k4)
 , m_k5(k5)
 , m_mu(mu)
 , m_mv(mv)
 , m_u0(u0)
 , m_v0(v0)
{
}

double& EquidistantCamera::Parameters::k2(void) { return m_k2; }
double& EquidistantCamera::Parameters::k3(void) { return m_k3; }
double& EquidistantCamera::Parameters::k4(void) { return m_k4; }
double& EquidistantCamera::Parameters::k5(void) { return m_k5; }
double& EquidistantCamera::Parameters::mu(void) { return m_mu; }
double& EquidistantCamera::Parameters::mv(void) { return m_mv; }
double& EquidistantCamera::Parameters::u0(void) { return m_u0; }
double& EquidistantCamera::Parameters::v0(void) { return m_v0; }
double EquidistantCamera::Parameters::k2(void) const { return m_k2; }
double EquidistantCamera::Parameters::k3(void) const { return m_k3; }
double EquidistantCamera::Parameters::k4(void) const { return m_k4; }
double EquidistantCamera::Parameters::k5(void) const { return m_k5; }
double EquidistantCamera::Parameters::mu(void) const { return m_mu; }
double EquidistantCamera::Parameters::mv(void) const { return m_mv; }
double EquidistantCamera::Parameters::u0(void) const { return m_u0; }
double EquidistantCamera::Parameters::v0(void) const { return m_v0; }

bool
EquidistantCamera::Parameters::readFromYamlFile(const std::string& filename)
{
    cv::FileStorage fs(filename, cv::FileStorage::READ);
    if (!fs.isOpened())
    {
        return false;
    }

    if (!fs["model_type"].isNone())
    {
        std::string sModelType;
        fs["model_type"] >> sModelType;
        if (sModelType.compare("KANNALA_BRANDT") != 0)
        {
            return false;
        }
    }

    m_modelType = KANNALA_BRANDT;
    fs["camera_name"] >> m_cameraName;
    m_imageWidth = static_cast<int>(fs["image_width"]);
    m_imageHeight = static_cast<int>(fs["image_height"]);

    cv::FileNode n = fs["projection_parameters"];
    m_k2 = static_cast<double>(n["k2"]);
    m_k3 = static_cast<double>(n["k3"]);
    m_k4 = static_cast<double>(n["k4"]);
    m_k5 = static_cast<double>(n["k5"]);
    m_mu = static_cast<double>(n["mu"]);
    m_mv = static_cast<double>(n["mv"]);
    m_u0 = static_cast<double>(n["u0"]);
    m_v0 = static_cast<double>(n["v0"]);

    return true;
}

void
EquidistantCamera::Parameters::writeToYamlFile(const std::string& filename) const
{
    cv::FileStorage fs(filename, cv::FileStorage::WRITE);

    fs << "model_type" << "KANNALA_BRANDT";
    fs << "camera_name" << m_cameraName;
    fs << "image_width" << m_imageWidth;
    fs << "image_height" << m_imageHeight;

    // projection: k2, k3, k4, k5, mu, mv, u0, v0
    fs << "projection_parameters";
    fs << "{" << "k2" << m_k2
              << "k3" << m_k3
              << "k4" << m_k4
              << "k5" << m_k5
              << "mu" << m_mu
              << "mv" << m_mv
              << "u0" << m_u0
              << "v0" << m_v0 << "}";

    fs.release();
}

EquidistantCamera::Parameters&
EquidistantCamera::Parameters::operator=(const EquidistantCamera::Parameters& other)
{
    if (this != &other)
    {
        m_modelType = other.m_modelType;
        m_cameraName =
other.m_cameraName; m_imageWidth = other.m_imageWidth; m_imageHeight = other.m_imageHeight; m_k2 = other.m_k2; m_k3 = other.m_k3; m_k4 = other.m_k4; m_k5 = other.m_k5; m_mu = other.m_mu; m_mv = other.m_mv; m_u0 = other.m_u0; m_v0 = other.m_v0; } return *this; } std::ostream& operator<< (std::ostream& out, const EquidistantCamera::Parameters& params) { out << "Camera Parameters:" << std::endl; out << " model_type " << "KANNALA_BRANDT" << std::endl; out << " camera_name " << params.m_cameraName << std::endl; out << " image_width " << params.m_imageWidth << std::endl; out << " image_height " << params.m_imageHeight << std::endl; // projection: k2, k3, k4, k5, mu, mv, u0, v0 out << "Projection Parameters" << std::endl; out << " k2 " << params.m_k2 << std::endl << " k3 " << params.m_k3 << std::endl << " k4 " << params.m_k4 << std::endl << " k5 " << params.m_k5 << std::endl << " mu " << params.m_mu << std::endl << " mv " << params.m_mv << std::endl << " u0 " << params.m_u0 << std::endl << " v0 " << params.m_v0 << std::endl; return out; } EquidistantCamera::EquidistantCamera() : m_inv_K11(1.0) , m_inv_K13(0.0) , m_inv_K22(1.0) , m_inv_K23(0.0) { } EquidistantCamera::EquidistantCamera(const std::string& cameraName, int imageWidth, int imageHeight, double k2, double k3, double k4, double k5, double mu, double mv, double u0, double v0) : mParameters(cameraName, imageWidth, imageHeight, k2, k3, k4, k5, mu, mv, u0, v0) { // Inverse camera projection matrix parameters m_inv_K11 = 1.0 / mParameters.mu(); m_inv_K13 = -mParameters.u0() / mParameters.mu(); m_inv_K22 = 1.0 / mParameters.mv(); m_inv_K23 = -mParameters.v0() / mParameters.mv(); } EquidistantCamera::EquidistantCamera(const EquidistantCamera::Parameters& params) : mParameters(params) { // Inverse camera projection matrix parameters m_inv_K11 = 1.0 / mParameters.mu(); m_inv_K13 = -mParameters.u0() / mParameters.mu(); m_inv_K22 = 1.0 / mParameters.mv(); m_inv_K23 = -mParameters.v0() / mParameters.mv(); } Camera::ModelType 
EquidistantCamera::modelType(void) const
{
    return mParameters.modelType();
}

const std::string&
EquidistantCamera::cameraName(void) const
{
    return mParameters.cameraName();
}

int
EquidistantCamera::imageWidth(void) const
{
    return mParameters.imageWidth();
}

int
EquidistantCamera::imageHeight(void) const
{
    return mParameters.imageHeight();
}

void
EquidistantCamera::estimateIntrinsics(const cv::Size& boardSize,
                                      const std::vector< std::vector<cv::Point3f> >& objectPoints,
                                      const std::vector< std::vector<cv::Point2f> >& imagePoints)
{
    Parameters params = getParameters();

    double u0 = params.imageWidth() / 2.0;
    double v0 = params.imageHeight() / 2.0;

    double minReprojErr = std::numeric_limits<double>::max();

    std::vector<cv::Mat> rvecs, tvecs;
    rvecs.assign(objectPoints.size(), cv::Mat());
    tvecs.assign(objectPoints.size(), cv::Mat());

    params.k2() = 0.0;
    params.k3() = 0.0;
    params.k4() = 0.0;
    params.k5() = 0.0;
    params.u0() = u0;
    params.v0() = v0;

    // Initialize focal length
    // C. Hughes, P. Denny, M. Glavin, and E. Jones,
    // Equidistant Fish-Eye Calibration and Rectification by Vanishing Point
    // Extraction, PAMI 2010
    // Find circles from rows of chessboard corners, and for each pair
    // of circles, find vanishing points: v1 and v2.
    // f = ||v1 - v2|| / PI;
    double f0 = 0.0;
    for (size_t i = 0; i < imagePoints.size(); ++i)
    {
        std::vector<Eigen::Vector2d> center(boardSize.height);
        double radius[boardSize.height];
        for (int r = 0; r < boardSize.height; ++r)
        {
            std::vector<cv::Point2d> circle;
            for (int c = 0; c < boardSize.width; ++c)
            {
                circle.push_back(imagePoints.at(i).at(r * boardSize.width + c));
            }

            fitCircle(circle, center[r](0), center[r](1), radius[r]);
        }

        for (int j = 0; j < boardSize.height; ++j)
        {
            for (int k = j + 1; k < boardSize.height; ++k)
            {
                // find distance between pair of vanishing points which
                // correspond to intersection points of 2 circles
                std::vector<cv::Point2d> ipts;
                ipts = intersectCircles(center[j](0), center[j](1), radius[j],
                                        center[k](0), center[k](1), radius[k]);
                if (ipts.size() < 2)
                {
                    continue;
                }

                double f = cv::norm(ipts.at(0) - ipts.at(1)) / M_PI;

                params.mu() = f;
                params.mv() = f;
                setParameters(params);

                for (size_t l = 0; l < objectPoints.size(); ++l)
                {
                    estimateExtrinsics(objectPoints.at(l), imagePoints.at(l), rvecs.at(l), tvecs.at(l));
                }

                double reprojErr = reprojectionError(objectPoints, imagePoints, rvecs, tvecs, cv::noArray());

                if (reprojErr < minReprojErr)
                {
                    minReprojErr = reprojErr;
                    f0 = f;
                }
            }
        }
    }

    if (f0 <= 0.0 && minReprojErr >= std::numeric_limits<double>::max())
    {
        std::cout << "[" << params.cameraName() << "] "
                  << "# INFO: Kannala-Brandt model fails with given data.
" << std::endl;
        return;
    }

    params.mu() = f0;
    params.mv() = f0;
    setParameters(params);
}

/**
 * \brief Lifts a point from the image plane to the unit sphere
 *
 * \param p image coordinates
 * \param P coordinates of the point on the sphere
 */
void
EquidistantCamera::liftSphere(const Eigen::Vector2d& p, Eigen::Vector3d& P) const
{
    liftProjective(p, P);
}

/**
 * \brief Lifts a point from the image plane to its projective ray
 *
 * \param p image coordinates
 * \param P coordinates of the projective ray
 */
void
EquidistantCamera::liftProjective(const Eigen::Vector2d& p, Eigen::Vector3d& P) const
{
    // Lift points to normalised plane
    Eigen::Vector2d p_u;
    p_u << m_inv_K11 * p(0) + m_inv_K13,
           m_inv_K22 * p(1) + m_inv_K23;

    // Obtain a projective ray
    double theta, phi;
    backprojectSymmetric(p_u, theta, phi);

    P(0) = sin(theta) * cos(phi);
    P(1) = sin(theta) * sin(phi);
    P(2) = cos(theta);
}

/**
 * \brief Project a 3D point (\a x,\a y,\a z) to the image plane in (\a u,\a v)
 *
 * \param P 3D point coordinates
 * \param p return value, contains the image point coordinates
 */
void
EquidistantCamera::spaceToPlane(const Eigen::Vector3d& P, Eigen::Vector2d& p) const
{
    double theta = acos(P(2) / P.norm());
    double phi = atan2(P(1), P(0));

    Eigen::Vector2d p_u = r(mParameters.k2(), mParameters.k3(), mParameters.k4(), mParameters.k5(), theta) * Eigen::Vector2d(cos(phi), sin(phi));

    // Apply generalised projection matrix
    p << mParameters.mu() * p_u(0) + mParameters.u0(),
         mParameters.mv() * p_u(1) + mParameters.v0();
}

/**
 * \brief Project a 3D point to the image plane and calculate Jacobian
 *
 * \param P 3D point coordinates
 * \param p return value, contains the image point coordinates
 */
void
EquidistantCamera::spaceToPlane(const Eigen::Vector3d& P, Eigen::Vector2d& p,
                                Eigen::Matrix<double, 2, 3>& J) const
{
    double theta = acos(P(2) / P.norm());
    double phi = atan2(P(1), P(0));

    Eigen::Vector2d p_u = r(mParameters.k2(), mParameters.k3(), mParameters.k4(), mParameters.k5(), theta) * Eigen::Vector2d(cos(phi), sin(phi));
    // Apply generalised projection matrix
    p << mParameters.mu() * p_u(0) + mParameters.u0(),
         mParameters.mv() * p_u(1) + mParameters.v0();
}

/**
 * \brief Projects an undistorted 2D point p_u to the image plane
 *
 * \param p_u 2D point coordinates
 * \return image point coordinates
 */
void
EquidistantCamera::undistToPlane(const Eigen::Vector2d& p_u, Eigen::Vector2d& p) const
{
//    Eigen::Vector2d p_d;
//
//    if (m_noDistortion)
//    {
//        p_d = p_u;
//    }
//    else
//    {
//        // Apply distortion
//        Eigen::Vector2d d_u;
//        distortion(p_u, d_u);
//        p_d = p_u + d_u;
//    }
//
//    // Apply generalised projection matrix
//    p << mParameters.gamma1() * p_d(0) + mParameters.u0(),
//         mParameters.gamma2() * p_d(1) + mParameters.v0();
}

void
EquidistantCamera::initUndistortMap(cv::Mat& map1, cv::Mat& map2, double fScale) const
{
    cv::Size imageSize(mParameters.imageWidth(), mParameters.imageHeight());

    cv::Mat mapX = cv::Mat::zeros(imageSize, CV_32F);
    cv::Mat mapY = cv::Mat::zeros(imageSize, CV_32F);

    for (int v = 0; v < imageSize.height; ++v)
    {
        for (int u = 0; u < imageSize.width; ++u)
        {
            double mx_u = m_inv_K11 / fScale * u + m_inv_K13 / fScale;
            double my_u = m_inv_K22 / fScale * v + m_inv_K23 / fScale;

            double theta, phi;
            backprojectSymmetric(Eigen::Vector2d(mx_u, my_u), theta, phi);

            Eigen::Vector3d P;
            P << sin(theta) * cos(phi), sin(theta) * sin(phi), cos(theta);

            Eigen::Vector2d p;
            spaceToPlane(P, p);

            mapX.at<float>(v,u) = p(0);
            mapY.at<float>(v,u) = p(1);
        }
    }

    cv::convertMaps(mapX, mapY, map1, map2, CV_32FC1, false);
}

cv::Mat
EquidistantCamera::initUndistortRectifyMap(cv::Mat& map1, cv::Mat& map2,
                                           float fx, float fy,
                                           cv::Size imageSize,
                                           float cx, float cy,
                                           cv::Mat rmat) const
{
    if (imageSize == cv::Size(0, 0))
    {
        imageSize = cv::Size(mParameters.imageWidth(), mParameters.imageHeight());
    }

    cv::Mat mapX = cv::Mat::zeros(imageSize.height, imageSize.width, CV_32F);
    cv::Mat mapY = cv::Mat::zeros(imageSize.height, imageSize.width, CV_32F);

    Eigen::Matrix3f K_rect;

    if (cx == -1.0f && cy == -1.0f)
    {
        K_rect << fx, 0,
imageSize.width / 2,
                  0, fy, imageSize.height / 2,
                  0, 0, 1;
    }
    else
    {
        K_rect << fx, 0, cx,
                  0, fy, cy,
                  0, 0, 1;
    }

    if (fx == -1.0f || fy == -1.0f)
    {
        K_rect(0,0) = mParameters.mu();
        K_rect(1,1) = mParameters.mv();
    }

    Eigen::Matrix3f K_rect_inv = K_rect.inverse();

    Eigen::Matrix3f R, R_inv;
    cv::cv2eigen(rmat, R);
    R_inv = R.inverse();

    for (int v = 0; v < imageSize.height; ++v)
    {
        for (int u = 0; u < imageSize.width; ++u)
        {
            Eigen::Vector3f xo;
            xo << u, v, 1;

            Eigen::Vector3f uo = R_inv * K_rect_inv * xo;

            Eigen::Vector2d p;
            spaceToPlane(uo.cast<double>(), p);

            mapX.at<float>(v,u) = p(0);
            mapY.at<float>(v,u) = p(1);
        }
    }

    cv::convertMaps(mapX, mapY, map1, map2, CV_32FC1, false);

    cv::Mat K_rect_cv;
    cv::eigen2cv(K_rect, K_rect_cv);
    return K_rect_cv;
}

int
EquidistantCamera::parameterCount(void) const
{
    return 8;
}

const EquidistantCamera::Parameters&
EquidistantCamera::getParameters(void) const
{
    return mParameters;
}

void
EquidistantCamera::setParameters(const EquidistantCamera::Parameters& parameters)
{
    mParameters = parameters;

    // Inverse camera projection matrix parameters
    m_inv_K11 = 1.0 / mParameters.mu();
    m_inv_K13 = -mParameters.u0() / mParameters.mu();
    m_inv_K22 = 1.0 / mParameters.mv();
    m_inv_K23 = -mParameters.v0() / mParameters.mv();
}

void
EquidistantCamera::readParameters(const std::vector<double>& parameterVec)
{
    if (static_cast<int>(parameterVec.size()) != parameterCount())
    {
        return;
    }

    Parameters params = getParameters();

    params.k2() = parameterVec.at(0);
    params.k3() = parameterVec.at(1);
    params.k4() = parameterVec.at(2);
    params.k5() = parameterVec.at(3);
    params.mu() = parameterVec.at(4);
    params.mv() = parameterVec.at(5);
    params.u0() = parameterVec.at(6);
    params.v0() = parameterVec.at(7);

    setParameters(params);
}

void
EquidistantCamera::writeParameters(std::vector<double>& parameterVec) const
{
    parameterVec.resize(parameterCount());
    parameterVec.at(0) = mParameters.k2();
    parameterVec.at(1) = mParameters.k3();
    parameterVec.at(2) = mParameters.k4();
    parameterVec.at(3) = mParameters.k5();
    parameterVec.at(4) =
mParameters.mu();
    parameterVec.at(5) = mParameters.mv();
    parameterVec.at(6) = mParameters.u0();
    parameterVec.at(7) = mParameters.v0();
}

void
EquidistantCamera::writeParametersToYamlFile(const std::string& filename) const
{
    mParameters.writeToYamlFile(filename);
}

std::string
EquidistantCamera::parametersToString(void) const
{
    std::ostringstream oss;
    oss << mParameters;

    return oss.str();
}

void
EquidistantCamera::fitOddPoly(const std::vector<double>& x, const std::vector<double>& y,
                              int n, std::vector<double>& coeffs) const
{
    std::vector<int> pows;
    for (int i = 1; i <= n; i += 2)
    {
        pows.push_back(i);
    }

    Eigen::MatrixXd X(x.size(), pows.size());
    Eigen::MatrixXd Y(y.size(), 1);
    for (size_t i = 0; i < x.size(); ++i)
    {
        for (size_t j = 0; j < pows.size(); ++j)
        {
            X(i,j) = pow(x.at(i), pows.at(j));
        }
        Y(i,0) = y.at(i);
    }

    Eigen::MatrixXd A = (X.transpose() * X).inverse() * X.transpose() * Y;

    coeffs.resize(A.rows());
    for (int i = 0; i < A.rows(); ++i)
    {
        coeffs.at(i) = A(i,0);
    }
}

void
EquidistantCamera::backprojectSymmetric(const Eigen::Vector2d& p_u,
                                        double& theta, double& phi) const
{
    double tol = 1e-10;
    double p_u_norm = p_u.norm();

    if (p_u_norm < 1e-10)
    {
        phi = 0.0;
    }
    else
    {
        phi = atan2(p_u(1), p_u(0));
    }

    int npow = 9;
    if (mParameters.k5() == 0.0)
    {
        npow -= 2;
    }
    if (mParameters.k4() == 0.0)
    {
        npow -= 2;
    }
    if (mParameters.k3() == 0.0)
    {
        npow -= 2;
    }
    if (mParameters.k2() == 0.0)
    {
        npow -= 2;
    }

    Eigen::MatrixXd coeffs(npow + 1, 1);
    coeffs.setZero();
    coeffs(0) = -p_u_norm;
    coeffs(1) = 1.0;

    if (npow >= 3)
    {
        coeffs(3) = mParameters.k2();
    }
    if (npow >= 5)
    {
        coeffs(5) = mParameters.k3();
    }
    if (npow >= 7)
    {
        coeffs(7) = mParameters.k4();
    }
    if (npow >= 9)
    {
        coeffs(9) = mParameters.k5();
    }

    if (npow == 1)
    {
        theta = p_u_norm;
    }
    else
    {
        // Get eigenvalues of companion matrix corresponding to polynomial.
        // Eigenvalues correspond to roots of polynomial.
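        // Illustrative note (not in the original camodocal source): for the
        // simplest non-trivial case, where only k2 is non-zero (npow == 3),
        // the equation being solved for theta is
        //     k2 * theta^3 + theta = ||p_u||  (call it r).
        // The companion matrix assembled below then takes the form
        //     [ 0  0   r/k2 ]
        //     [ 1  0  -1/k2 ]
        //     [ 0  1    0   ]
        // whose eigenvalues are exactly the roots of that cubic; the smallest
        // real, non-negative root is kept as theta.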
Eigen::MatrixXd A(npow, npow);
        A.setZero();
        A.block(1, 0, npow - 1, npow - 1).setIdentity();
        A.col(npow - 1) = - coeffs.block(0, 0, npow, 1) / coeffs(npow);

        Eigen::EigenSolver<Eigen::MatrixXd> es(A);
        Eigen::MatrixXcd eigval = es.eigenvalues();

        std::vector<double> thetas;
        for (int i = 0; i < eigval.rows(); ++i)
        {
            if (fabs(eigval(i).imag()) > tol)
            {
                continue;
            }

            double t = eigval(i).real();

            if (t < -tol)
            {
                continue;
            }
            else if (t < 0.0)
            {
                t = 0.0;
            }

            thetas.push_back(t);
        }

        if (thetas.empty())
        {
            theta = p_u_norm;
        }
        else
        {
            theta = *std::min_element(thetas.begin(), thetas.end());
        }
    }
}

}

================================================
FILE: camera_model/src/camera_models/PinholeCamera.cc
================================================
#include "camodocal/camera_models/PinholeCamera.h"

#include <cmath>
#include <cstdio>
#include <eigen3/Eigen/Dense>
#include <iomanip>
#include <iostream>
#include <opencv2/calib3d/calib3d.hpp>
#include <opencv2/core/eigen.hpp>

#include "camodocal/gpl/gpl.h"

namespace camodocal
{

PinholeCamera::Parameters::Parameters()
 : Camera::Parameters(PINHOLE)
 , m_k1(0.0)
 , m_k2(0.0)
 , m_p1(0.0)
 , m_p2(0.0)
 , m_fx(0.0)
 , m_fy(0.0)
 , m_cx(0.0)
 , m_cy(0.0)
{

}

PinholeCamera::Parameters::Parameters(const std::string& cameraName,
                                      int w, int h,
                                      double k1, double k2,
                                      double p1, double p2,
                                      double fx, double fy,
                                      double cx, double cy)
 : Camera::Parameters(PINHOLE, cameraName, w, h)
 , m_k1(k1)
 , m_k2(k2)
 , m_p1(p1)
 , m_p2(p2)
 , m_fx(fx)
 , m_fy(fy)
 , m_cx(cx)
 , m_cy(cy)
{
}

double& PinholeCamera::Parameters::k1(void) { return m_k1; }
double& PinholeCamera::Parameters::k2(void) { return m_k2; }
double& PinholeCamera::Parameters::p1(void) { return m_p1; }
double& PinholeCamera::Parameters::p2(void) { return m_p2; }
double& PinholeCamera::Parameters::fx(void) { return m_fx; }
double& PinholeCamera::Parameters::fy(void) { return m_fy; }
double& PinholeCamera::Parameters::cx(void) { return m_cx; }
double& PinholeCamera::Parameters::cy(void) { return m_cy; }

double PinholeCamera::Parameters::k1(void) const { return m_k1; }
double PinholeCamera::Parameters::k2(void) const { return m_k2; }
double
PinholeCamera::Parameters::p1(void) const { return m_p1; }
double PinholeCamera::Parameters::p2(void) const { return m_p2; }
double PinholeCamera::Parameters::fx(void) const { return m_fx; }
double PinholeCamera::Parameters::fy(void) const { return m_fy; }
double PinholeCamera::Parameters::cx(void) const { return m_cx; }
double PinholeCamera::Parameters::cy(void) const { return m_cy; }

bool
PinholeCamera::Parameters::readFromYamlFile(const std::string& filename)
{
    cv::FileStorage fs(filename, cv::FileStorage::READ);

    if (!fs.isOpened())
    {
        return false;
    }

    if (!fs["model_type"].isNone())
    {
        std::string sModelType;
        fs["model_type"] >> sModelType;

        if (sModelType.compare("PINHOLE") != 0)
        {
            return false;
        }
    }

    m_modelType = PINHOLE;
    fs["camera_name"] >> m_cameraName;
    m_imageWidth = static_cast<int>(fs["image_width"]);
    m_imageHeight = static_cast<int>(fs["image_height"]);

    cv::FileNode n = fs["distortion_parameters"];
    m_k1 = static_cast<double>(n["k1"]);
    m_k2 = static_cast<double>(n["k2"]);
    m_p1 = static_cast<double>(n["p1"]);
    m_p2 = static_cast<double>(n["p2"]);

    n = fs["projection_parameters"];
    m_fx = static_cast<double>(n["fx"]);
    m_fy = static_cast<double>(n["fy"]);
    m_cx = static_cast<double>(n["cx"]);
    m_cy = static_cast<double>(n["cy"]);

    return true;
}

void
PinholeCamera::Parameters::writeToYamlFile(const std::string& filename) const
{
    cv::FileStorage fs(filename, cv::FileStorage::WRITE);

    fs << "model_type" << "PINHOLE";
    fs << "camera_name" << m_cameraName;
    fs << "image_width" << m_imageWidth;
    fs << "image_height" << m_imageHeight;

    // radial distortion: k1, k2
    // tangential distortion: p1, p2
    fs << "distortion_parameters";
    fs << "{" << "k1" << m_k1
              << "k2" << m_k2
              << "p1" << m_p1
              << "p2" << m_p2 << "}";

    // projection: fx, fy, cx, cy
    fs << "projection_parameters";
    fs << "{" << "fx" << m_fx
              << "fy" << m_fy
              << "cx" << m_cx
              << "cy" << m_cy << "}";

    fs.release();
}

PinholeCamera::Parameters&
PinholeCamera::Parameters::operator=(const PinholeCamera::Parameters& other)
{
    if (this != &other)
    {
        m_modelType = other.m_modelType;
        m_cameraName =
other.m_cameraName; m_imageWidth = other.m_imageWidth; m_imageHeight = other.m_imageHeight; m_k1 = other.m_k1; m_k2 = other.m_k2; m_p1 = other.m_p1; m_p2 = other.m_p2; m_fx = other.m_fx; m_fy = other.m_fy; m_cx = other.m_cx; m_cy = other.m_cy; } return *this; } std::ostream& operator<< (std::ostream& out, const PinholeCamera::Parameters& params) { out << "Camera Parameters:" << std::endl; out << " model_type " << "PINHOLE" << std::endl; out << " camera_name " << params.m_cameraName << std::endl; out << " image_width " << params.m_imageWidth << std::endl; out << " image_height " << params.m_imageHeight << std::endl; // radial distortion: k1, k2 // tangential distortion: p1, p2 out << "Distortion Parameters" << std::endl; out << " k1 " << params.m_k1 << std::endl << " k2 " << params.m_k2 << std::endl << " p1 " << params.m_p1 << std::endl << " p2 " << params.m_p2 << std::endl; // projection: fx, fy, cx, cy out << "Projection Parameters" << std::endl; out << " fx " << params.m_fx << std::endl << " fy " << params.m_fy << std::endl << " cx " << params.m_cx << std::endl << " cy " << params.m_cy << std::endl; return out; } PinholeCamera::PinholeCamera() : m_inv_K11(1.0) , m_inv_K13(0.0) , m_inv_K22(1.0) , m_inv_K23(0.0) , m_noDistortion(true) { } PinholeCamera::PinholeCamera(const std::string& cameraName, int imageWidth, int imageHeight, double k1, double k2, double p1, double p2, double fx, double fy, double cx, double cy) : mParameters(cameraName, imageWidth, imageHeight, k1, k2, p1, p2, fx, fy, cx, cy) { if ((mParameters.k1() == 0.0) && (mParameters.k2() == 0.0) && (mParameters.p1() == 0.0) && (mParameters.p2() == 0.0)) { m_noDistortion = true; } else { m_noDistortion = false; } // Inverse camera projection matrix parameters m_inv_K11 = 1.0 / mParameters.fx(); m_inv_K13 = -mParameters.cx() / mParameters.fx(); m_inv_K22 = 1.0 / mParameters.fy(); m_inv_K23 = -mParameters.cy() / mParameters.fy(); } PinholeCamera::PinholeCamera(const PinholeCamera::Parameters& params) : 
mParameters(params) { if ((mParameters.k1() == 0.0) && (mParameters.k2() == 0.0) && (mParameters.p1() == 0.0) && (mParameters.p2() == 0.0)) { m_noDistortion = true; } else { m_noDistortion = false; } // Inverse camera projection matrix parameters m_inv_K11 = 1.0 / mParameters.fx(); m_inv_K13 = -mParameters.cx() / mParameters.fx(); m_inv_K22 = 1.0 / mParameters.fy(); m_inv_K23 = -mParameters.cy() / mParameters.fy(); } Camera::ModelType PinholeCamera::modelType(void) const { return mParameters.modelType(); } const std::string& PinholeCamera::cameraName(void) const { return mParameters.cameraName(); } int PinholeCamera::imageWidth(void) const { return mParameters.imageWidth(); } int PinholeCamera::imageHeight(void) const { return mParameters.imageHeight(); } void PinholeCamera::estimateIntrinsics(const cv::Size& boardSize, const std::vector< std::vector >& objectPoints, const std::vector< std::vector >& imagePoints) { // Z. Zhang, A Flexible New Technique for Camera Calibration, PAMI 2000 Parameters params = getParameters(); params.k1() = 0.0; params.k2() = 0.0; params.p1() = 0.0; params.p2() = 0.0; double cx = params.imageWidth() / 2.0; double cy = params.imageHeight() / 2.0; params.cx() = cx; params.cy() = cy; size_t nImages = imagePoints.size(); cv::Mat A(nImages * 2, 2, CV_64F); cv::Mat b(nImages * 2, 1, CV_64F); for (size_t i = 0; i < nImages; ++i) { const std::vector& oPoints = objectPoints.at(i); std::vector M(oPoints.size()); for (size_t j = 0; j < M.size(); ++j) { M.at(j) = cv::Point2f(oPoints.at(j).x, oPoints.at(j).y); } cv::Mat H = cv::findHomography(M, imagePoints.at(i)); H.at(0,0) -= H.at(2,0) * cx; H.at(0,1) -= H.at(2,1) * cx; H.at(0,2) -= H.at(2,2) * cx; H.at(1,0) -= H.at(2,0) * cy; H.at(1,1) -= H.at(2,1) * cy; H.at(1,2) -= H.at(2,2) * cy; double h[3], v[3], d1[3], d2[3]; double n[4] = {0,0,0,0}; for (int j = 0; j < 3; ++j) { double t0 = H.at(j,0); double t1 = H.at(j,1); h[j] = t0; v[j] = t1; d1[j] = (t0 + t1) * 0.5; d2[j] = (t0 - t1) * 0.5; n[0] += t0 
* t0; n[1] += t1 * t1; n[2] += d1[j] * d1[j]; n[3] += d2[j] * d2[j]; } for (int j = 0; j < 4; ++j) { n[j] = 1.0 / sqrt(n[j]); } for (int j = 0; j < 3; ++j) { h[j] *= n[0]; v[j] *= n[1]; d1[j] *= n[2]; d2[j] *= n[3]; } A.at(i * 2, 0) = h[0] * v[0]; A.at(i * 2, 1) = h[1] * v[1]; A.at(i * 2 + 1, 0) = d1[0] * d2[0]; A.at(i * 2 + 1, 1) = d1[1] * d2[1]; b.at(i * 2, 0) = -h[2] * v[2]; b.at(i * 2 + 1, 0) = -d1[2] * d2[2]; } cv::Mat f(2, 1, CV_64F); cv::solve(A, b, f, cv::DECOMP_NORMAL | cv::DECOMP_LU); params.fx() = sqrt(fabs(1.0 / f.at(0))); params.fy() = sqrt(fabs(1.0 / f.at(1))); setParameters(params); } /** * \brief Lifts a point from the image plane to the unit sphere * * \param p image coordinates * \param P coordinates of the point on the sphere */ void PinholeCamera::liftSphere(const Eigen::Vector2d& p, Eigen::Vector3d& P) const { liftProjective(p, P); P.normalize(); } /** * \brief Lifts a point from the image plane to its projective ray * * \param p image coordinates * \param P coordinates of the projective ray */ void PinholeCamera::liftProjective(const Eigen::Vector2d& p, Eigen::Vector3d& P) const { double mx_d, my_d,mx2_d, mxy_d, my2_d, mx_u, my_u; double rho2_d, rho4_d, radDist_d, Dx_d, Dy_d, inv_denom_d; //double lambda; // Lift points to normalised plane mx_d = m_inv_K11 * p(0) + m_inv_K13; my_d = m_inv_K22 * p(1) + m_inv_K23; if (m_noDistortion) { mx_u = mx_d; my_u = my_d; } else { if (0) { double k1 = mParameters.k1(); double k2 = mParameters.k2(); double p1 = mParameters.p1(); double p2 = mParameters.p2(); // Apply inverse distortion model // proposed by Heikkila mx2_d = mx_d*mx_d; my2_d = my_d*my_d; mxy_d = mx_d*my_d; rho2_d = mx2_d+my2_d; rho4_d = rho2_d*rho2_d; radDist_d = k1*rho2_d+k2*rho4_d; Dx_d = mx_d*radDist_d + p2*(rho2_d+2*mx2_d) + 2*p1*mxy_d; Dy_d = my_d*radDist_d + p1*(rho2_d+2*my2_d) + 2*p2*mxy_d; inv_denom_d = 1/(1+4*k1*rho2_d+6*k2*rho4_d+8*p1*my_d+8*p2*mx_d); mx_u = mx_d - inv_denom_d*Dx_d; my_u = my_d - inv_denom_d*Dy_d; } else { // 
Recursive distortion model int n = 8; Eigen::Vector2d d_u; distortion(Eigen::Vector2d(mx_d, my_d), d_u); // Approximate value mx_u = mx_d - d_u(0); my_u = my_d - d_u(1); for (int i = 1; i < n; ++i) { distortion(Eigen::Vector2d(mx_u, my_u), d_u); mx_u = mx_d - d_u(0); my_u = my_d - d_u(1); } } } // Obtain a projective ray P << mx_u, my_u, 1.0; } /** * \brief Project a 3D point (\a x,\a y,\a z) to the image plane in (\a u,\a v) * * \param P 3D point coordinates * \param p return value, contains the image point coordinates */ void PinholeCamera::spaceToPlane(const Eigen::Vector3d& P, Eigen::Vector2d& p) const { Eigen::Vector2d p_u, p_d; // Project points to the normalised plane p_u << P(0) / P(2), P(1) / P(2); if (m_noDistortion) { p_d = p_u; } else { // Apply distortion Eigen::Vector2d d_u; distortion(p_u, d_u); p_d = p_u + d_u; } // Apply generalised projection matrix p << mParameters.fx() * p_d(0) + mParameters.cx(), mParameters.fy() * p_d(1) + mParameters.cy(); } #if 0 /** * \brief Project a 3D point to the image plane and calculate Jacobian * * \param P 3D point coordinates * \param p return value, contains the image point coordinates */ void PinholeCamera::spaceToPlane(const Eigen::Vector3d& P, Eigen::Vector2d& p, Eigen::Matrix& J) const { Eigen::Vector2d p_u, p_d; double norm, inv_denom; double dxdmx, dydmx, dxdmy, dydmy; norm = P.norm(); // Project points to the normalised plane inv_denom = 1.0 / P(2); p_u << inv_denom * P(0), inv_denom * P(1); // Calculate jacobian double dudx = inv_denom; double dvdx = 0.0; double dudy = 0.0; double dvdy = inv_denom; inv_denom = - inv_denom * inv_denom; double dudz = P(0) * inv_denom; double dvdz = P(1) * inv_denom; if (m_noDistortion) { p_d = p_u; } else { // Apply distortion Eigen::Vector2d d_u; distortion(p_u, d_u); p_d = p_u + d_u; } double fx = mParameters.fx(); double fy = mParameters.fy(); // Make the product of the jacobians // and add projection matrix jacobian inv_denom = fx * (dudx * dxdmx + dvdx * dxdmy); // 
reuse dvdx = fy * (dudx * dydmx + dvdx * dydmy); dudx = inv_denom; inv_denom = fx * (dudy * dxdmx + dvdy * dxdmy); // reuse dvdy = fy * (dudy * dydmx + dvdy * dydmy); dudy = inv_denom; inv_denom = fx * (dudz * dxdmx + dvdz * dxdmy); // reuse dvdz = fy * (dudz * dydmx + dvdz * dydmy); dudz = inv_denom; // Apply generalised projection matrix p << fx * p_d(0) + mParameters.cx(), fy * p_d(1) + mParameters.cy(); J << dudx, dudy, dudz, dvdx, dvdy, dvdz; } #endif /** * \brief Projects an undistorted 2D point p_u to the image plane * * \param p_u 2D point coordinates * \return image point coordinates */ void PinholeCamera::undistToPlane(const Eigen::Vector2d& p_u, Eigen::Vector2d& p) const { Eigen::Vector2d p_d; if (m_noDistortion) { p_d = p_u; } else { // Apply distortion Eigen::Vector2d d_u; distortion(p_u, d_u); p_d = p_u + d_u; } // Apply generalised projection matrix p << mParameters.fx() * p_d(0) + mParameters.cx(), mParameters.fy() * p_d(1) + mParameters.cy(); } /** * \brief Apply distortion to input point (from the normalised plane) * * \param p_u undistorted coordinates of point on the normalised plane * \return to obtain the distorted point: p_d = p_u + d_u */ void PinholeCamera::distortion(const Eigen::Vector2d& p_u, Eigen::Vector2d& d_u) const { double k1 = mParameters.k1(); double k2 = mParameters.k2(); double p1 = mParameters.p1(); double p2 = mParameters.p2(); double mx2_u, my2_u, mxy_u, rho2_u, rad_dist_u; mx2_u = p_u(0) * p_u(0); my2_u = p_u(1) * p_u(1); mxy_u = p_u(0) * p_u(1); rho2_u = mx2_u + my2_u; rad_dist_u = k1 * rho2_u + k2 * rho2_u * rho2_u; d_u << p_u(0) * rad_dist_u + 2.0 * p1 * mxy_u + p2 * (rho2_u + 2.0 * mx2_u), p_u(1) * rad_dist_u + 2.0 * p2 * mxy_u + p1 * (rho2_u + 2.0 * my2_u); } /** * \brief Apply distortion to input point (from the normalised plane) * and calculate Jacobian * * \param p_u undistorted coordinates of point on the normalised plane * \return to obtain the distorted point: p_d = p_u + d_u */ void 
PinholeCamera::distortion(const Eigen::Vector2d& p_u, Eigen::Vector2d& d_u, Eigen::Matrix2d& J) const { double k1 = mParameters.k1(); double k2 = mParameters.k2(); double p1 = mParameters.p1(); double p2 = mParameters.p2(); double mx2_u, my2_u, mxy_u, rho2_u, rad_dist_u; mx2_u = p_u(0) * p_u(0); my2_u = p_u(1) * p_u(1); mxy_u = p_u(0) * p_u(1); rho2_u = mx2_u + my2_u; rad_dist_u = k1 * rho2_u + k2 * rho2_u * rho2_u; d_u << p_u(0) * rad_dist_u + 2.0 * p1 * mxy_u + p2 * (rho2_u + 2.0 * mx2_u), p_u(1) * rad_dist_u + 2.0 * p2 * mxy_u + p1 * (rho2_u + 2.0 * my2_u); double dxdmx = 1.0 + rad_dist_u + k1 * 2.0 * mx2_u + k2 * rho2_u * 4.0 * mx2_u + 2.0 * p1 * p_u(1) + 6.0 * p2 * p_u(0); double dydmx = k1 * 2.0 * p_u(0) * p_u(1) + k2 * 4.0 * rho2_u * p_u(0) * p_u(1) + p1 * 2.0 * p_u(0) + 2.0 * p2 * p_u(1); double dxdmy = dydmx; double dydmy = 1.0 + rad_dist_u + k1 * 2.0 * my2_u + k2 * rho2_u * 4.0 * my2_u + 6.0 * p1 * p_u(1) + 2.0 * p2 * p_u(0); J << dxdmx, dxdmy, dydmx, dydmy; } void PinholeCamera::initUndistortMap(cv::Mat& map1, cv::Mat& map2, double fScale) const { cv::Size imageSize(mParameters.imageWidth(), mParameters.imageHeight()); cv::Mat mapX = cv::Mat::zeros(imageSize, CV_32F); cv::Mat mapY = cv::Mat::zeros(imageSize, CV_32F); for (int v = 0; v < imageSize.height; ++v) { for (int u = 0; u < imageSize.width; ++u) { double mx_u = m_inv_K11 / fScale * u + m_inv_K13 / fScale; double my_u = m_inv_K22 / fScale * v + m_inv_K23 / fScale; Eigen::Vector3d P; P << mx_u, my_u, 1.0; Eigen::Vector2d p; spaceToPlane(P, p); mapX.at(v,u) = p(0); mapY.at(v,u) = p(1); } } cv::convertMaps(mapX, mapY, map1, map2, CV_32FC1, false); } cv::Mat PinholeCamera::initUndistortRectifyMap(cv::Mat& map1, cv::Mat& map2, float fx, float fy, cv::Size imageSize, float cx, float cy, cv::Mat rmat) const { if (imageSize == cv::Size(0, 0)) { imageSize = cv::Size(mParameters.imageWidth(), mParameters.imageHeight()); } cv::Mat mapX = cv::Mat::zeros(imageSize.height, imageSize.width, CV_32F); cv::Mat mapY 
= cv::Mat::zeros(imageSize.height, imageSize.width, CV_32F);

    Eigen::Matrix3f R, R_inv;
    cv::cv2eigen(rmat, R);
    R_inv = R.inverse();

    // assume no skew
    Eigen::Matrix3f K_rect;

    if (cx == -1.0f || cy == -1.0f)
    {
        K_rect << fx, 0, imageSize.width / 2,
                  0, fy, imageSize.height / 2,
                  0, 0, 1;
    }
    else
    {
        K_rect << fx, 0, cx,
                  0, fy, cy,
                  0, 0, 1;
    }

    if (fx == -1.0f || fy == -1.0f)
    {
        K_rect(0,0) = mParameters.fx();
        K_rect(1,1) = mParameters.fy();
    }

    Eigen::Matrix3f K_rect_inv = K_rect.inverse();

    for (int v = 0; v < imageSize.height; ++v)
    {
        for (int u = 0; u < imageSize.width; ++u)
        {
            Eigen::Vector3f xo;
            xo << u, v, 1;

            Eigen::Vector3f uo = R_inv * K_rect_inv * xo;

            Eigen::Vector2d p;
            spaceToPlane(uo.cast<double>(), p);

            mapX.at<float>(v,u) = p(0);
            mapY.at<float>(v,u) = p(1);
        }
    }

    cv::convertMaps(mapX, mapY, map1, map2, CV_32FC1, false);

    cv::Mat K_rect_cv;
    cv::eigen2cv(K_rect, K_rect_cv);
    return K_rect_cv;
}

int
PinholeCamera::parameterCount(void) const
{
    return 8;
}

const PinholeCamera::Parameters&
PinholeCamera::getParameters(void) const
{
    return mParameters;
}

void
PinholeCamera::setParameters(const PinholeCamera::Parameters& parameters)
{
    mParameters = parameters;

    if ((mParameters.k1() == 0.0) &&
        (mParameters.k2() == 0.0) &&
        (mParameters.p1() == 0.0) &&
        (mParameters.p2() == 0.0))
    {
        m_noDistortion = true;
    }
    else
    {
        m_noDistortion = false;
    }

    m_inv_K11 = 1.0 / mParameters.fx();
    m_inv_K13 = -mParameters.cx() / mParameters.fx();
    m_inv_K22 = 1.0 / mParameters.fy();
    m_inv_K23 = -mParameters.cy() / mParameters.fy();
}

void
PinholeCamera::readParameters(const std::vector<double>& parameterVec)
{
    if ((int)parameterVec.size() != parameterCount())
    {
        return;
    }

    Parameters params = getParameters();

    params.k1() = parameterVec.at(0);
    params.k2() = parameterVec.at(1);
    params.p1() = parameterVec.at(2);
    params.p2() = parameterVec.at(3);
    params.fx() = parameterVec.at(4);
    params.fy() = parameterVec.at(5);
    params.cx() = parameterVec.at(6);
    params.cy() = parameterVec.at(7);

    setParameters(params);
}

void
PinholeCamera::writeParameters(std::vector& parameterVec) const { parameterVec.resize(parameterCount()); parameterVec.at(0) = mParameters.k1(); parameterVec.at(1) = mParameters.k2(); parameterVec.at(2) = mParameters.p1(); parameterVec.at(3) = mParameters.p2(); parameterVec.at(4) = mParameters.fx(); parameterVec.at(5) = mParameters.fy(); parameterVec.at(6) = mParameters.cx(); parameterVec.at(7) = mParameters.cy(); } void PinholeCamera::writeParametersToYamlFile(const std::string& filename) const { mParameters.writeToYamlFile(filename); } std::string PinholeCamera::parametersToString(void) const { std::ostringstream oss; oss << mParameters; return oss.str(); } } ================================================ FILE: camera_model/src/camera_models/ScaramuzzaCamera.cc ================================================ #include "camodocal/camera_models/ScaramuzzaCamera.h" #include #include #include #include #include #include #include #include #include #include #include #include "camodocal/gpl/gpl.h" Eigen::VectorXd polyfit(Eigen::VectorXd& xVec, Eigen::VectorXd& yVec, int poly_order) { assert(poly_order > 0); assert(xVec.size() > poly_order); assert(xVec.size() == yVec.size()); Eigen::MatrixXd A(xVec.size(), poly_order+1); Eigen::VectorXd B(xVec.size()); for(int i = 0; i < xVec.size(); ++i) { const double x = xVec(i); const double y = yVec(i); double x_pow_k = 1.0; for(int k=0; k<=poly_order; ++k) { A(i,k) = x_pow_k; x_pow_k *= x; } B(i) = y; } Eigen::JacobiSVD svd(A, Eigen::ComputeThinU | Eigen::ComputeThinV); Eigen::VectorXd x = svd.solve(B); return x; } namespace camodocal { OCAMCamera::Parameters::Parameters() : Camera::Parameters(SCARAMUZZA) , m_C(0.0) , m_D(0.0) , m_E(0.0) , m_center_x(0.0) , m_center_y(0.0) { memset(m_poly, 0, sizeof(double) * SCARAMUZZA_POLY_SIZE); memset(m_inv_poly, 0, sizeof(double) * SCARAMUZZA_INV_POLY_SIZE); } bool OCAMCamera::Parameters::readFromYamlFile(const std::string& filename) { cv::FileStorage fs(filename, cv::FileStorage::READ); if 
(!fs.isOpened())
    {
        return false;
    }

    if (!fs["model_type"].isNone())
    {
        std::string sModelType;
        fs["model_type"] >> sModelType;

        if (!boost::iequals(sModelType, "scaramuzza"))
        {
            return false;
        }
    }

    m_modelType = SCARAMUZZA;
    fs["camera_name"] >> m_cameraName;
    m_imageWidth = static_cast<int>(fs["image_width"]);
    m_imageHeight = static_cast<int>(fs["image_height"]);

    cv::FileNode n = fs["poly_parameters"];
    for (int i = 0; i < SCARAMUZZA_POLY_SIZE; i++)
        m_poly[i] = static_cast<double>(n[std::string("p") + boost::lexical_cast<std::string>(i)]);

    n = fs["inv_poly_parameters"];
    for (int i = 0; i < SCARAMUZZA_INV_POLY_SIZE; i++)
        m_inv_poly[i] = static_cast<double>(n[std::string("p") + boost::lexical_cast<std::string>(i)]);

    n = fs["affine_parameters"];
    m_C = static_cast<double>(n["ac"]);
    m_D = static_cast<double>(n["ad"]);
    m_E = static_cast<double>(n["ae"]);

    m_center_x = static_cast<double>(n["cx"]);
    m_center_y = static_cast<double>(n["cy"]);

    return true;
}

void
OCAMCamera::Parameters::writeToYamlFile(const std::string& filename) const
{
    cv::FileStorage fs(filename, cv::FileStorage::WRITE);

    fs << "model_type" << "scaramuzza";
    fs << "camera_name" << m_cameraName;
    fs << "image_width" << m_imageWidth;
    fs << "image_height" << m_imageHeight;

    fs << "poly_parameters";
    fs << "{";
    for (int i = 0; i < SCARAMUZZA_POLY_SIZE; i++)
        fs << std::string("p") + boost::lexical_cast<std::string>(i) << m_poly[i];
    fs << "}";

    fs << "inv_poly_parameters";
    fs << "{";
    for (int i = 0; i < SCARAMUZZA_INV_POLY_SIZE; i++)
        fs << std::string("p") + boost::lexical_cast<std::string>(i) << m_inv_poly[i];
    fs << "}";

    fs << "affine_parameters";
    fs << "{" << "ac" << m_C
              << "ad" << m_D
              << "ae" << m_E
              << "cx" << m_center_x
              << "cy" << m_center_y << "}";

    fs.release();
}

OCAMCamera::Parameters&
OCAMCamera::Parameters::operator=(const OCAMCamera::Parameters& other)
{
    if (this != &other)
    {
        m_modelType = other.m_modelType;
        m_cameraName = other.m_cameraName;
        m_imageWidth = other.m_imageWidth;
        m_imageHeight = other.m_imageHeight;
        m_C = other.m_C;
        m_D = other.m_D;
        m_E = other.m_E;
        m_center_x = other.m_center_x;
        m_center_y = other.m_center_y;

        memcpy(m_poly,
other.m_poly, sizeof(double) * SCARAMUZZA_POLY_SIZE); memcpy(m_inv_poly, other.m_inv_poly, sizeof(double) * SCARAMUZZA_INV_POLY_SIZE); } return *this; } std::ostream& operator<< (std::ostream& out, const OCAMCamera::Parameters& params) { out << "Camera Parameters:" << std::endl; out << " model_type " << "scaramuzza" << std::endl; out << " camera_name " << params.m_cameraName << std::endl; out << " image_width " << params.m_imageWidth << std::endl; out << " image_height " << params.m_imageHeight << std::endl; out << std::fixed << std::setprecision(10); out << "Poly Parameters" << std::endl; for(int i=0; i < SCARAMUZZA_POLY_SIZE; i++) out << std::string("p") + boost::lexical_cast(i) << ": " << params.m_poly[i] << std::endl; out << "Inverse Poly Parameters" << std::endl; for(int i=0; i < SCARAMUZZA_INV_POLY_SIZE; i++) out << std::string("p") + boost::lexical_cast(i) << ": " << params.m_inv_poly[i] << std::endl; out << "Affine Parameters" << std::endl; out << " ac " << params.m_C << std::endl << " ad " << params.m_D << std::endl << " ae " << params.m_E << std::endl; out << " cx " << params.m_center_x << std::endl << " cy " << params.m_center_y << std::endl; return out; } OCAMCamera::OCAMCamera() : m_inv_scale(0.0) { } OCAMCamera::OCAMCamera(const OCAMCamera::Parameters& params) : mParameters(params) { m_inv_scale = 1.0 / (params.C() - params.D() * params.E()); } Camera::ModelType OCAMCamera::modelType(void) const { return mParameters.modelType(); } const std::string& OCAMCamera::cameraName(void) const { return mParameters.cameraName(); } int OCAMCamera::imageWidth(void) const { return mParameters.imageWidth(); } int OCAMCamera::imageHeight(void) const { return mParameters.imageHeight(); } void OCAMCamera::estimateIntrinsics(const cv::Size& boardSize, const std::vector< std::vector >& objectPoints, const std::vector< std::vector >& imagePoints) { // std::cout << "OCAMCamera::estimateIntrinsics - NOT IMPLEMENTED" << std::endl; // throw 
// std::string("OCAMCamera::estimateIntrinsics - NOT IMPLEMENTED");

    // Reference: Page 30 of
    // " Scaramuzza, D. Omnidirectional Vision: from Calibration to Robot Motion Estimation, ETH Zurich. Thesis no. 17635."
    // http://e-collection.library.ethz.ch/eserv/eth:30301/eth-30301-02.pdf
    // Matlab code: calibrate.m

    // First, estimate every image's extrinsics parameters
    std::vector< Eigen::Matrix3d > RList;
    std::vector< Eigen::Vector3d > TList;
    RList.reserve(imagePoints.size());
    TList.reserve(imagePoints.size());

    // i-th image
    for (size_t image_index = 0; image_index < imagePoints.size(); ++image_index)
    {
        const std::vector<cv::Point3f>& objPts = objectPoints.at(image_index);
        const std::vector<cv::Point2f>& imgPts = imagePoints.at(image_index);

        assert(objPts.size() == imgPts.size());
        assert(objPts.size() == static_cast<size_t>(boardSize.width * boardSize.height));

        Eigen::MatrixXd M(objPts.size(), 6);

        for (size_t corner_index = 0; corner_index < objPts.size(); ++corner_index)
        {
            double X = objPts.at(corner_index).x;
            double Y = objPts.at(corner_index).y;
            assert(objPts.at(corner_index).z == 0.0);

            double u = imgPts.at(corner_index).x;
            double v = imgPts.at(corner_index).y;

            M(corner_index, 0) = -v * X;
            M(corner_index, 1) = -v * Y;
            M(corner_index, 2) = u * X;
            M(corner_index, 3) = u * Y;
            M(corner_index, 4) = -v;
            M(corner_index, 5) = u;
        }

        Eigen::JacobiSVD<Eigen::MatrixXd> svd(M, Eigen::ComputeFullU | Eigen::ComputeFullV);
        assert(svd.matrixV().cols() == 6);

        Eigen::VectorXd h = -svd.matrixV().col(5);

        // scaled version of R and T
        const double sr11 = h(0);
        const double sr12 = h(1);
        const double sr21 = h(2);
        const double sr22 = h(3);
        const double st1 = h(4);
        const double st2 = h(5);

        const double AA = square(sr11*sr12 + sr21*sr22);
        const double BB = square(sr11) + square(sr21);
        const double CC = square(sr12) + square(sr22);

        const double sr32_squared_1 = (- (CC-BB) + sqrt(square(CC-BB) + 4.0 * AA)) / 2.0;
        const double sr32_squared_2 = (- (CC-BB) - sqrt(square(CC-BB) + 4.0 * AA)) / 2.0;

        // printf("rst = %.12f\n", sr32_squared_1*sr32_squared_1 +
(CC-BB)*sr32_squared_1 - AA); std::vector sr32_squared_values; if (sr32_squared_1 > 0) sr32_squared_values.push_back(sr32_squared_1); if (sr32_squared_2 > 0) sr32_squared_values.push_back(sr32_squared_2); assert(!sr32_squared_values.empty()); std::vector sr32_values; std::vector sr31_values; for (auto sr32_squared : sr32_squared_values) { for(int sign = -1; sign <= 1; sign += 2) { const double sr32 = static_cast(sign) * std::sqrt(sr32_squared); sr32_values.push_back( sr32 ); if (sr32_squared == 0.0) { // sr31 can be calculated through norm equality, // but it has positive and negative posibilities // positive one sr31_values.push_back(std::sqrt(CC-BB)); // negative one sr32_values.push_back( sr32 ); sr31_values.push_back(-std::sqrt(CC-BB)); break; // skip the same situation } else { // sr31 can be calculated throught dot product == 0 sr31_values.push_back(- (sr11*sr12 + sr21*sr22) / sr32); } } } // std::cout << "h= " << std::setprecision(12) << h.transpose() << std::endl; // std::cout << "length: " << sr32_values.size() << " & " << sr31_values.size() << std::endl; assert(!sr31_values.empty()); assert(sr31_values.size() == sr32_values.size()); std::vector H_values; for(size_t i=0;i(v,u) = p(0); mapY.at(v,u) = p(1); } } cv::convertMaps(mapX, mapY, map1, map2, CV_32FC1, false); } #endif cv::Mat OCAMCamera::initUndistortRectifyMap(cv::Mat& map1, cv::Mat& map2, float fx, float fy, cv::Size imageSize, float cx, float cy, cv::Mat rmat) const { if (imageSize == cv::Size(0, 0)) { imageSize = cv::Size(mParameters.imageWidth(), mParameters.imageHeight()); } cv::Mat mapX = cv::Mat::zeros(imageSize.height, imageSize.width, CV_32F); cv::Mat mapY = cv::Mat::zeros(imageSize.height, imageSize.width, CV_32F); Eigen::Matrix3f K_rect; K_rect << fx, 0, cx < 0 ? imageSize.width / 2 : cx, 0, fy, cy < 0 ? 
imageSize.height / 2 : cy, 0, 0, 1; if (fx < 0 || fy < 0) { throw std::string(std::string(__FUNCTION__) + ": Focal length must be specified"); } Eigen::Matrix3f K_rect_inv = K_rect.inverse(); Eigen::Matrix3f R, R_inv; cv::cv2eigen(rmat, R); R_inv = R.inverse(); for (int v = 0; v < imageSize.height; ++v) { for (int u = 0; u < imageSize.width; ++u) { Eigen::Vector3f xo; xo << u, v, 1; Eigen::Vector3f uo = R_inv * K_rect_inv * xo; Eigen::Vector2d p; spaceToPlane(uo.cast(), p); mapX.at(v,u) = p(0); mapY.at(v,u) = p(1); } } cv::convertMaps(mapX, mapY, map1, map2, CV_32FC1, false); cv::Mat K_rect_cv; cv::eigen2cv(K_rect, K_rect_cv); return K_rect_cv; } int OCAMCamera::parameterCount(void) const { return SCARAMUZZA_CAMERA_NUM_PARAMS; } const OCAMCamera::Parameters& OCAMCamera::getParameters(void) const { return mParameters; } void OCAMCamera::setParameters(const OCAMCamera::Parameters& parameters) { mParameters = parameters; m_inv_scale = 1.0 / (parameters.C() - parameters.D() * parameters.E()); } void OCAMCamera::readParameters(const std::vector& parameterVec) { if ((int)parameterVec.size() != parameterCount()) { return; } Parameters params = getParameters(); params.C() = parameterVec.at(0); params.D() = parameterVec.at(1); params.E() = parameterVec.at(2); params.center_x() = parameterVec.at(3); params.center_y() = parameterVec.at(4); for (int i=0; i < SCARAMUZZA_POLY_SIZE; i++) params.poly(i) = parameterVec.at(5+i); for (int i=0; i < SCARAMUZZA_INV_POLY_SIZE; i++) params.inv_poly(i) = parameterVec.at(5 + SCARAMUZZA_POLY_SIZE + i); setParameters(params); } void OCAMCamera::writeParameters(std::vector& parameterVec) const { parameterVec.resize(parameterCount()); parameterVec.at(0) = mParameters.C(); parameterVec.at(1) = mParameters.D(); parameterVec.at(2) = mParameters.E(); parameterVec.at(3) = mParameters.center_x(); parameterVec.at(4) = mParameters.center_y(); for (int i=0; i < SCARAMUZZA_POLY_SIZE; i++) parameterVec.at(5+i) = mParameters.poly(i); for (int i=0; i < 
SCARAMUZZA_INV_POLY_SIZE; i++) parameterVec.at(5 + SCARAMUZZA_POLY_SIZE + i) = mParameters.inv_poly(i); } void OCAMCamera::writeParametersToYamlFile(const std::string& filename) const { mParameters.writeToYamlFile(filename); } std::string OCAMCamera::parametersToString(void) const { std::ostringstream oss; oss << mParameters; return oss.str(); } } ================================================ FILE: camera_model/src/chessboard/Chessboard.cc ================================================ #include "camodocal/chessboard/Chessboard.h" #include #include #include "camodocal/chessboard/ChessboardQuad.h" #include "camodocal/chessboard/Spline.h" #define MAX_CONTOUR_APPROX 7 namespace camodocal { Chessboard::Chessboard(cv::Size boardSize, cv::Mat& image) : mBoardSize(boardSize) , mCornersFound(false) { if (image.channels() == 1) { cv::cvtColor(image, mSketch, CV_GRAY2BGR); image.copyTo(mImage); } else { image.copyTo(mSketch); cv::cvtColor(image, mImage, CV_BGR2GRAY); } } void Chessboard::findCorners(bool useOpenCV) { mCornersFound = findChessboardCorners(mImage, mBoardSize, mCorners, CV_CALIB_CB_ADAPTIVE_THRESH + CV_CALIB_CB_NORMALIZE_IMAGE + CV_CALIB_CB_FILTER_QUADS + CV_CALIB_CB_FAST_CHECK, useOpenCV); if (mCornersFound) { // draw chessboard corners cv::drawChessboardCorners(mSketch, mBoardSize, mCorners, mCornersFound); } } const std::vector& Chessboard::getCorners(void) const { return mCorners; } bool Chessboard::cornersFound(void) const { return mCornersFound; } const cv::Mat& Chessboard::getImage(void) const { return mImage; } const cv::Mat& Chessboard::getSketch(void) const { return mSketch; } bool Chessboard::findChessboardCorners(const cv::Mat& image, const cv::Size& patternSize, std::vector& corners, int flags, bool useOpenCV) { if (useOpenCV) { return cv::findChessboardCorners(image, patternSize, corners, flags); } else { return findChessboardCornersImproved(image, patternSize, corners, flags); } } bool Chessboard::findChessboardCornersImproved(const cv::Mat& 
image, const cv::Size& patternSize, std::vector<cv::Point2f>& corners, int flags) { /************************************************************************************\ This is an improved variant of the chessboard corner detection algorithm that uses a graph of connected quads. It is based on the code contributed by Vladimir Vezhnevets and Philip Gruebele. Here is the copyright notice from the original Vladimir's code: =============================================================== The algorithms developed and implemented by Vezhnevets Vldimir aka Dead Moroz (vvp@graphics.cs.msu.ru) See http://graphics.cs.msu.su/en/research/calibration/opencv.html for detailed information. Reliability additions and modifications made by Philip Gruebele. pgruebele@cox.net Some improvements were made by: 1) Martin Rufli - increased chance of correct corner matching Rufli, M., Scaramuzza, D., and Siegwart, R. (2008), Automatic Detection of Checkerboards on Blurred and Distorted Images, Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS 2008), Nice, France, September 2008.
2) Lionel Heng - post-detection checks and better thresholding \************************************************************************************/ //int bestDilation = -1; const int minDilations = 0; const int maxDilations = 7; std::vector<ChessboardQuadPtr> outputQuadGroup; if (image.depth() != CV_8U || image.channels() == 2) { return false; } if (patternSize.width < 2 || patternSize.height < 2) { return false; } if (patternSize.width > 127 || patternSize.height > 127) { return false; } cv::Mat img = image; // Image histogram normalization and // BGR to Grayscale image conversion (if applicable) // MARTIN: Set to "false" if (image.channels() != 1 || (flags & CV_CALIB_CB_NORMALIZE_IMAGE)) { cv::Mat norm_img(image.rows, image.cols, CV_8UC1); if (image.channels() != 1) { cv::cvtColor(image, norm_img, CV_BGR2GRAY); img = norm_img; } if (flags & CV_CALIB_CB_NORMALIZE_IMAGE) { cv::equalizeHist(image, norm_img); img = norm_img; } } if (flags & CV_CALIB_CB_FAST_CHECK) { if (!checkChessboard(img, patternSize)) { return false; } } // PART 1: FIND LARGEST PATTERN //----------------------------------------------------------------------- // We try to find checker patterns by dilating the background and // then applying a canny edge finder on the closed contours (checkers). // Try one dilation run, but if the pattern is not found, repeat until // max_dilations is reached. int prevSqrSize = 0; bool found = false; std::vector<ChessboardCornerPtr> outputCorners; for (int k = 0; k < 6; ++k) { for (int dilations = minDilations; dilations <= maxDilations; ++dilations) { if (found) { break; } cv::Mat thresh_img; // convert the input grayscale image to binary (black-n-white) if (flags & CV_CALIB_CB_ADAPTIVE_THRESH) { int blockSize = lround(prevSqrSize == 0 ? std::min(img.cols,img.rows)*(k%2 == 0 ?
0.2 : 0.1): prevSqrSize*2)|1; // convert to binary cv::adaptiveThreshold(img, thresh_img, 255, CV_ADAPTIVE_THRESH_MEAN_C, CV_THRESH_BINARY, blockSize, (k/2)*5); } else { // empirical threshold level double mean = (cv::mean(img))[0]; int thresh_level = lround(mean - 10); thresh_level = std::max(thresh_level, 10); cv::threshold(img, thresh_img, thresh_level, 255, CV_THRESH_BINARY); } // MARTIN's Code // Use both a rectangular and a cross kernel. In this way, a more // homogeneous dilation is performed, which is crucial for small, // distorted checkers. Use the CROSS kernel first, since its action // on the image is more subtle cv::Mat kernel1 = cv::getStructuringElement(CV_SHAPE_CROSS, cv::Size(3,3), cv::Point(1,1)); cv::Mat kernel2 = cv::getStructuringElement(CV_SHAPE_RECT, cv::Size(3,3), cv::Point(1,1)); if (dilations >= 1) cv::dilate(thresh_img, thresh_img, kernel1); if (dilations >= 2) cv::dilate(thresh_img, thresh_img, kernel2); if (dilations >= 3) cv::dilate(thresh_img, thresh_img, kernel1); if (dilations >= 4) cv::dilate(thresh_img, thresh_img, kernel2); if (dilations >= 5) cv::dilate(thresh_img, thresh_img, kernel1); if (dilations >= 6) cv::dilate(thresh_img, thresh_img, kernel2); // In order to find rectangles that go to the edge, we draw a white // line around the image edge. Otherwise FindContours will miss those // clipped rectangle contours. The border color will be the image mean, // because otherwise we risk screwing up filters like cvSmooth() cv::rectangle(thresh_img, cv::Point(0,0), cv::Point(thresh_img.cols - 1, thresh_img.rows - 1), CV_RGB(255,255,255), 3, 8); // Generate quadrangles in the following function std::vector<ChessboardQuadPtr> quads; generateQuads(quads, thresh_img, flags, dilations, true); if (quads.empty()) { continue; } // The following function finds and assigns neighbor quads to every // quadrangle in the immediate vicinity fulfilling certain // prerequisites findQuadNeighbors(quads, dilations); // The connected quads will be organized in groups.
The following loop // increases a "group_idx" identifier. // The function "findConnectedQuads" assigns all connected quads // a unique group ID. // If more quadrangles were assigned to a given group (i.e. connected) // than are expected by the input variable "patternSize", the // function "cleanFoundConnectedQuads" erases the surplus // quadrangles by minimizing the convex hull of the remaining pattern. for (int group_idx = 0; ; ++group_idx) { std::vector<ChessboardQuadPtr> quadGroup; findConnectedQuads(quads, quadGroup, group_idx, dilations); if (quadGroup.empty()) { break; } cleanFoundConnectedQuads(quadGroup, patternSize); // The following function labels all corners of every quad // with a row and column entry. // "count" specifies the number of found quads in "quad_group" // with group identifier "group_idx" // The last parameter is set to "true", because this is the // first function call and some initializations need to be // made. labelQuadGroup(quadGroup, patternSize, true); found = checkQuadGroup(quadGroup, outputCorners, patternSize); float sumDist = 0; int total = 0; for (int i = 0; i < (int)outputCorners.size(); ++i) { int ni = 0; float avgi = outputCorners.at(i)->meanDist(ni); sumDist += avgi * ni; total += ni; } prevSqrSize = lround(sumDist / std::max(total, 1)); if (found && !checkBoardMonotony(outputCorners, patternSize)) { found = false; } } } } if (!found) { return false; } else { corners.clear(); corners.reserve(outputCorners.size()); for (size_t i = 0; i < outputCorners.size(); ++i) { corners.push_back(outputCorners.at(i)->pt); } cv::cornerSubPix(image, corners, cv::Size(11, 11), cv::Size(-1,-1), cv::TermCriteria(CV_TERMCRIT_EPS + CV_TERMCRIT_ITER, 30, 0.1)); return true; } } //=========================================================================== // ERASE OVERHEAD //=========================================================================== // If we found too many connected quads, remove those which probably do not // belong.
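The surplus-removal heuristic below (cleanFoundConnectedQuads) temporarily swaps each quad center with the group centroid and keeps the swap that yields the smallest convex hull area. The following is a minimal, stdlib-only sketch of that idea; the `Pt` struct and `worstQuadIndex` helper are hypothetical names, standing in for the OpenCV `cv::Point2f`, `cv::convexHull`, and `cv::contourArea` calls used in the actual code.

```cpp
#include <algorithm>
#include <cmath>
#include <cstddef>
#include <vector>

struct Pt { double x, y; };

// Andrew's monotone-chain convex hull (stand-in for cv::convexHull).
static std::vector<Pt> convexHull(std::vector<Pt> pts) {
    std::sort(pts.begin(), pts.end(), [](const Pt& a, const Pt& b) {
        return a.x < b.x || (a.x == b.x && a.y < b.y);
    });
    auto cross = [](const Pt& o, const Pt& a, const Pt& b) {
        return (a.x - o.x) * (b.y - o.y) - (a.y - o.y) * (b.x - o.x);
    };
    std::vector<Pt> h(2 * pts.size());
    std::size_t k = 0;
    for (std::size_t i = 0; i < pts.size(); ++i) {          // lower hull
        while (k >= 2 && cross(h[k-2], h[k-1], pts[i]) <= 0) --k;
        h[k++] = pts[i];
    }
    for (std::size_t i = pts.size() - 1, t = k + 1; i-- > 0; ) { // upper hull
        while (k >= t && cross(h[k-2], h[k-1], pts[i]) <= 0) --k;
        h[k++] = pts[i];
    }
    h.resize(k > 1 ? k - 1 : k);
    return h;
}

// Shoelace polygon area (stand-in for cv::contourArea).
static double polyArea(const std::vector<Pt>& p) {
    double a = 0.0;
    for (std::size_t i = 0; i < p.size(); ++i) {
        const Pt& u = p[i];
        const Pt& v = p[(i + 1) % p.size()];
        a += u.x * v.y - v.x * u.y;
    }
    return std::fabs(a) * 0.5;
}

// Index of the quad center whose "removal" (replacement by the group
// centroid, as cleanFoundConnectedQuads does) shrinks the hull the most.
int worstQuadIndex(const std::vector<Pt>& centers) {
    Pt centroid{0.0, 0.0};
    for (const Pt& c : centers) { centroid.x += c.x; centroid.y += c.y; }
    centroid.x /= centers.size();
    centroid.y /= centers.size();

    double minArea = 1e300;
    int minIndex = -1;
    for (std::size_t skip = 0; skip < centers.size(); ++skip) {
        std::vector<Pt> tmp = centers;
        tmp[skip] = centroid;   // temporarily replace one center, as in the real code
        double area = polyArea(convexHull(tmp));
        if (area < minArea) { minArea = area; minIndex = (int)skip; }
    }
    return minIndex;
}
```

For a unit square of centers plus a far outlier, the outlier is the one whose removal shrinks the hull most, which is exactly why stray quads picked up outside the board get pruned first.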
void Chessboard::cleanFoundConnectedQuads(std::vector<ChessboardQuadPtr>& quadGroup, cv::Size patternSize) { cv::Point2f center(0.0f, 0.0f); // Number of quads this pattern should contain int count = ((patternSize.width + 1)*(patternSize.height + 1) + 1)/2; // If we have more quadrangles than we should, try to eliminate duplicates // or ones which don't belong to the pattern rectangle. Else go to the end // of the function if ((int)quadGroup.size() <= count) { return; } // Create an array of quadrangle centers std::vector<cv::Point2f> centers; centers.resize(quadGroup.size()); for (size_t i = 0; i < quadGroup.size(); ++i) { cv::Point2f ci(0.0f, 0.0f); ChessboardQuadPtr& q = quadGroup[i]; for (int j = 0; j < 4; ++j) { ci += q->corners[j]->pt; } ci *= 0.25f; // Centers(i), is the geometric center of quad(i) // Center, is the center of all found quads centers[i] = ci; center += ci; } center *= 1.0f / quadGroup.size(); // If we have more quadrangles than we should, we try to eliminate bad // ones based on minimizing the bounding box.
We iteratively remove the // quadrangle whose removal shrinks the convex hull of the quad centers the // most (since we want the remaining pattern to be as compact as possible), // until we have the correct number while ((int)quadGroup.size() > count) { double minBoxArea = DBL_MAX; int minBoxAreaIndex = -1; // For each point, calculate box area without that point for (size_t skip = 0; skip < quadGroup.size(); ++skip) { // get bounding rectangle cv::Point2f temp = centers[skip]; centers[skip] = center; std::vector<cv::Point2f> hull; cv::convexHull(centers, hull, true, true); centers[skip] = temp; double hull_area = fabs(cv::contourArea(hull)); // remember smallest box area if (hull_area < minBoxArea) { minBoxArea = hull_area; minBoxAreaIndex = skip; } } ChessboardQuadPtr& q0 = quadGroup[minBoxAreaIndex]; // remove any references to this quad as a neighbor for (size_t i = 0; i < quadGroup.size(); ++i) { ChessboardQuadPtr& q = quadGroup.at(i); for (int j = 0; j < 4; ++j) { if (q->neighbors[j] == q0) { q->neighbors[j].reset(); q->count--; for (int k = 0; k < 4; ++k) { if (q0->neighbors[k] == q) { q0->neighbors[k].reset(); q0->count--; break; } } break; } } } // remove the quad quadGroup.at(minBoxAreaIndex) = quadGroup.back(); centers.at(minBoxAreaIndex) = centers.back(); quadGroup.pop_back(); centers.pop_back(); } } //=========================================================================== // FIND CONNECTED QUADS //=========================================================================== void Chessboard::findConnectedQuads(std::vector<ChessboardQuadPtr>& quads, std::vector<ChessboardQuadPtr>& group, int group_idx, int dilation) { ChessboardQuadPtr q; // Scan the array for a first unlabeled quad for (size_t i = 0; i < quads.size(); ++i) { ChessboardQuadPtr& quad = quads.at(i); if (quad->count > 0 && quad->group_idx < 0) { q = quad; break; } } if (q.get() == 0) { return; } // Recursively find a group of connected quads starting from the seed quad std::vector<ChessboardQuadPtr>
stack; stack.push_back(q); group.push_back(q); q->group_idx = group_idx; while (!stack.empty()) { q = stack.back(); stack.pop_back(); for (int i = 0; i < 4; ++i) { ChessboardQuadPtr& neighbor = q->neighbors[i]; // If the neighbor exists and the neighbor has more than 0 // neighbors and the neighbor has not been classified yet. if (neighbor.get() && neighbor->count > 0 && neighbor->group_idx < 0) { stack.push_back(neighbor); group.push_back(neighbor); neighbor->group_idx = group_idx; } } } } void Chessboard::labelQuadGroup(std::vector<ChessboardQuadPtr>& quadGroup, cv::Size patternSize, bool firstRun) { // If this is the first function call, a seed quad needs to be selected if (firstRun) { // Search for the (first) quad with the maximum number of neighbors // (usually 4). This will be our starting point. int mark = -1; int maxNeighborCount = 0; for (size_t i = 0; i < quadGroup.size(); ++i) { ChessboardQuadPtr& q = quadGroup.at(i); if (q->count > maxNeighborCount) { mark = i; maxNeighborCount = q->count; if (maxNeighborCount == 4) { break; } } } // Mark the starting quad's (per definition) upper left corner with // (0,0) and then proceed clockwise // The following labeling sequence ensures a "right coordinate system" ChessboardQuadPtr& q = quadGroup.at(mark); q->labeled = true; q->corners[0]->row = 0; q->corners[0]->column = 0; q->corners[1]->row = 0; q->corners[1]->column = 1; q->corners[2]->row = 1; q->corners[2]->column = 1; q->corners[3]->row = 1; q->corners[3]->column = 0; } // Mark all other corners with their respective row and column bool flagChanged = true; while (flagChanged) { // First reset the flag to "false" flagChanged = false; // Going through all quads top down is faster, since unlabeled quads will // be inserted at the end of the list for (int i = quadGroup.size() - 1; i >= 0; --i) { ChessboardQuadPtr& quad = quadGroup.at(i); // Check whether quad "i" has been labeled already if (!quad->labeled) { // Check its neighbors, whether some of them have been labeled // already
for (int j = 0; j < 4; j++) { // Check whether the neighbor exists (i.e. is not the NULL // pointer) if (quad->neighbors[j]) { ChessboardQuadPtr& quadNeighbor = quad->neighbors[j]; // Only proceed, if neighbor "j" was labeled if (quadNeighbor->labeled) { // For every quad it could happen to pass here // multiple times. We therefore "break" later. // Check which of the neighbor's corners is // connected to the current quad int connectedNeighborCornerId = -1; for (int k = 0; k < 4; ++k) { if (quadNeighbor->neighbors[k] == quad) { connectedNeighborCornerId = k; // there is only one, therefore break; } } // For the following calculations we need the row // and column of the connected neighbor corner and // all other corners of the connected quad "j", // clockwise (CW) ChessboardCornerPtr& conCorner = quadNeighbor->corners[connectedNeighborCornerId]; ChessboardCornerPtr& conCornerCW1 = quadNeighbor->corners[(connectedNeighborCornerId+1)%4]; ChessboardCornerPtr& conCornerCW2 = quadNeighbor->corners[(connectedNeighborCornerId+2)%4]; ChessboardCornerPtr& conCornerCW3 = quadNeighbor->corners[(connectedNeighborCornerId+3)%4]; quad->corners[j]->row = conCorner->row; quad->corners[j]->column = conCorner->column; quad->corners[(j+1)%4]->row = conCorner->row - conCornerCW2->row + conCornerCW3->row; quad->corners[(j+1)%4]->column = conCorner->column - conCornerCW2->column + conCornerCW3->column; quad->corners[(j+2)%4]->row = conCorner->row + conCorner->row - conCornerCW2->row; quad->corners[(j+2)%4]->column = conCorner->column + conCorner->column - conCornerCW2->column; quad->corners[(j+3)%4]->row = conCorner->row - conCornerCW2->row + conCornerCW1->row; quad->corners[(j+3)%4]->column = conCorner->column - conCornerCW2->column + conCornerCW1->column; // Mark this quad as labeled quad->labeled = true; // Changes have taken place, set the flag flagChanged = true; // once is enough!
break; } } } } } } // All corners are marked with row and column // Record the minimal and maximal row and column indices // It is unlikely that more than 8-bit checkers are used per dimension; if there // were, an error would have been thrown at the beginning of "cvFindChessboardCorners2" int min_row = 127; int max_row = -127; int min_column = 127; int max_column = -127; for (int i = 0; i < (int)quadGroup.size(); ++i) { ChessboardQuadPtr& q = quadGroup.at(i); for (int j = 0; j < 4; ++j) { ChessboardCornerPtr& c = q->corners[j]; if (c->row > max_row) { max_row = c->row; } if (c->row < min_row) { min_row = c->row; } if (c->column > max_column) { max_column = c->column; } if (c->column < min_column) { min_column = c->column; } } } // Label all internal corners with "needsNeighbor" = false // Label all external corners with "needsNeighbor" = true, // except if in a given dimension the pattern size is reached for (int i = min_row; i <= max_row; ++i) { for (int j = min_column; j <= max_column; ++j) { // A flag that indicates whether a row/column combination is // visited multiple times bool flag = false; // Remember corner and quad int cornerID; int quadID; for (int k = 0; k < (int)quadGroup.size(); ++k) { ChessboardQuadPtr& q = quadGroup.at(k); for (int l = 0; l < 4; ++l) { if ((q->corners[l]->row == i) && (q->corners[l]->column == j)) { if (flag) { // Passed at least twice through here q->corners[l]->needsNeighbor = false; quadGroup[quadID]->corners[cornerID]->needsNeighbor = false; } else { // Mark with needs a neighbor, but note the // address q->corners[l]->needsNeighbor = true; cornerID = l; quadID = k; } // set the flag to true flag = true; } } } } } // Complete Linking: // sometimes not all corners were properly linked in "findQuadNeighbors", // but after labeling each corner with its respective row and column, it is // possible to match them anyway.
for (int i = min_row; i <= max_row; ++i) { for (int j = min_column; j <= max_column; ++j) { // the following "number" indicates the number of corners which // correspond to the given (i,j) value // 1 is a border corner or a corner which still needs a neighbor // 2 is a fully connected internal corner // >2 something went wrong during labeling, report a warning int number = 1; // remember corner and quad int cornerID; int quadID; for (int k = 0; k < (int)quadGroup.size(); ++k) { ChessboardQuadPtr& q = quadGroup.at(k); for (int l = 0; l < 4; ++l) { if ((q->corners[l]->row == i) && (q->corners[l]->column == j)) { if (number == 1) { // First corner, note its ID cornerID = l; quadID = k; } else if (number == 2) { // Second corner, check whether this and the // first one have equal coordinates, else // interpolate cv::Point2f delta = q->corners[l]->pt - quadGroup[quadID]->corners[cornerID]->pt; if (delta.x != 0.0f || delta.y != 0.0f) { // Interpolate q->corners[l]->pt -= delta * 0.5f; quadGroup[quadID]->corners[cornerID]->pt += delta * 0.5f; } } else if (number > 2) { // Something went wrong during row/column labeling // Report a Warning // ->Implemented in the function "mrWriteCorners" } // increase the number by one ++number; } } } } } // Border corners don't need any neighbors, if the pattern size in the // respective direction is reached // The only time we can make sure that the target pattern size is reached in a given // dimension, is when the larger side has reached the target size in the maximal // direction, or if the larger side is larger than the smaller target size and the // smaller side equals the smaller target size int largerDimPattern = std::max(patternSize.height,patternSize.width); int smallerDimPattern = std::min(patternSize.height,patternSize.width); bool flagSmallerDim1 = false; bool flagSmallerDim2 = false; if ((largerDimPattern + 1) == max_column - min_column) { flagSmallerDim1 = true; // We found out that in the column direction the target
pattern size is reached // Therefore border column corners do not need a neighbor anymore // Go through all corners for (int k = 0; k < (int)quadGroup.size(); ++k) { ChessboardQuadPtr& q = quadGroup.at(k); for (int l = 0; l < 4; ++l) { ChessboardCornerPtr& c = q->corners[l]; if (c->column == min_column || c->column == max_column) { // Needs no neighbor anymore c->needsNeighbor = false; } } } } if ((largerDimPattern + 1) == max_row - min_row) { flagSmallerDim2 = true; // We found out that in the row direction the target pattern size is reached // Therefore border row corners do not need a neighbor anymore // Go through all corners for (int k = 0; k < (int)quadGroup.size(); ++k) { ChessboardQuadPtr& q = quadGroup.at(k); for (int l = 0; l < 4; ++l) { ChessboardCornerPtr& c = q->corners[l]; if (c->row == min_row || c->row == max_row) { // Needs no neighbor anymore c->needsNeighbor = false; } } } } // Check the two flags: // - If one is true and the other false, then the pattern target // size was reached in one direction -> We can check, whether the target // pattern size is also reached in the other direction // - If both are set to true, then we deal with a square board -> do nothing // - If both are set to false -> There is a possibility that the larger side is // larger than the smaller target size -> Check and if true, then check whether // the other side has the same size as the smaller target size if ((flagSmallerDim1 == false && flagSmallerDim2 == true)) { // Larger target pattern size is in row direction, check whether smaller target // pattern size is reached in column direction if ((smallerDimPattern + 1) == max_column - min_column) { for (int k = 0; k < (int)quadGroup.size(); ++k) { ChessboardQuadPtr& q = quadGroup.at(k); for (int l = 0; l < 4; ++l) { ChessboardCornerPtr& c = q->corners[l]; if (c->column == min_column || c->column == max_column) { // Needs no neighbor anymore c->needsNeighbor = false; } } } } } if ((flagSmallerDim1 == true &&
flagSmallerDim2 == false)) { // Larger target pattern size is in column direction, check whether smaller target // pattern size is reached in row direction if ((smallerDimPattern + 1) == max_row - min_row) { for (int k = 0; k < (int)quadGroup.size(); ++k) { ChessboardQuadPtr& q = quadGroup.at(k); for (int l = 0; l < 4; ++l) { ChessboardCornerPtr& c = q->corners[l]; if (c->row == min_row || c->row == max_row) { // Needs no neighbor anymore c->needsNeighbor = false; } } } } } if ((flagSmallerDim1 == false && flagSmallerDim2 == false) && smallerDimPattern + 1 < max_column - min_column) { // Larger target pattern size is in column direction, check whether smaller target // pattern size is reached in row direction if ((smallerDimPattern + 1) == max_row - min_row) { for (int k = 0; k < (int)quadGroup.size(); ++k) { ChessboardQuadPtr& q = quadGroup.at(k); for (int l = 0; l < 4; ++l) { ChessboardCornerPtr& c = q->corners[l]; if (c->row == min_row || c->row == max_row) { // Needs no neighbor anymore c->needsNeighbor = false; } } } } } if ((flagSmallerDim1 == false && flagSmallerDim2 == false) && smallerDimPattern + 1 < max_row - min_row) { // Larger target pattern size is in row direction, check whether smaller target // pattern size is reached in column direction if ((smallerDimPattern + 1) == max_column - min_column) { for (int k = 0; k < (int)quadGroup.size(); ++k) { ChessboardQuadPtr& q = quadGroup.at(k); for (int l = 0; l < 4; ++l) { ChessboardCornerPtr& c = q->corners[l]; if (c->column == min_column || c->column == max_column) { // Needs no neighbor anymore c->needsNeighbor = false; } } } } } } //=========================================================================== // GIVE A GROUP IDX //=========================================================================== void Chessboard::findQuadNeighbors(std::vector<ChessboardQuadPtr>& quads, int dilation) { // Thresh dilation is used to counter the effect of dilation on the // distance between 2 neighboring corners.
Since the distance below is // computed as its square, we do here the same. Additionally, we take the // conservative assumption that dilation was performed using the 3x3 CROSS // kernel, which corresponds to the 4-neighborhood. const float thresh_dilation = (float)(2*dilation+3)*(2*dilation+3)*2; // the "*2" is for the x and y component // the "3" is for initial corner mismatch // Find quad neighbors for (size_t idx = 0; idx < quads.size(); ++idx) { ChessboardQuadPtr& curQuad = quads.at(idx); // Go through all quadrangles and label them in groups // For each corner of this quadrangle for (int i = 0; i < 4; ++i) { float minDist = FLT_MAX; int closestCornerIdx = -1; ChessboardQuadPtr closestQuad; if (curQuad->neighbors[i]) { continue; } cv::Point2f pt = curQuad->corners[i]->pt; // Find the closest corner in all other quadrangles for (size_t k = 0; k < quads.size(); ++k) { if (k == idx) { continue; } ChessboardQuadPtr& quad = quads.at(k); for (int j = 0; j < 4; ++j) { // If it already has a neighbor if (quad->neighbors[j]) { continue; } cv::Point2f dp = pt - quad->corners[j]->pt; float dist = dp.dot(dp); // The following "if" checks, whether "dist" is the // shortest so far and smaller than the smallest // edge length of the current and target quads if (dist < minDist && dist <= (curQuad->edge_len + thresh_dilation) && dist <= (quad->edge_len + thresh_dilation) ) { // Check whether conditions are fulfilled if (matchCorners(curQuad, i, quad, j)) { closestCornerIdx = j; closestQuad = quad; minDist = dist; } } } } // Have we found a matching corner point?
if (closestCornerIdx >= 0 && minDist < FLT_MAX) { ChessboardCornerPtr closestCorner = closestQuad->corners[closestCornerIdx]; // Make sure that the closest quad does not have the current // quad as neighbor already bool valid = true; for (int j = 0; j < 4; ++j) { if (closestQuad->neighbors[j] == curQuad) { valid = false; break; } } if (!valid) { continue; } // We've found one more corner - remember it closestCorner->pt = (pt + closestCorner->pt) * 0.5f; curQuad->count++; curQuad->neighbors[i] = closestQuad; curQuad->corners[i] = closestCorner; closestQuad->count++; closestQuad->neighbors[closestCornerIdx] = curQuad; closestQuad->corners[closestCornerIdx] = closestCorner; } } } } //=========================================================================== // AUGMENT PATTERN WITH ADDITIONAL QUADS //=========================================================================== // The first part of the function is basically a copy of // "findQuadNeighbors" // The comparisons between two points and two lines could be computed in their // own function int Chessboard::augmentBestRun(std::vector<ChessboardQuadPtr>& candidateQuads, int candidateDilation, std::vector<ChessboardQuadPtr>& existingQuads, int existingDilation) { // thresh dilation is used to counter the effect of dilation on the // distance between 2 neighboring corners. Since the distance below is // computed as its square, we do here the same. Additionally, we take the // conservative assumption that dilation was performed using the 3x3 CROSS // kernel, which corresponds to the 4-neighborhood.
const float thresh_dilation = (2*candidateDilation+3)*(2*existingDilation+3)*2; // the "*2" is for the x and y component // Search all old quads which have a neighbor that needs to be linked for (size_t idx = 0; idx < existingQuads.size(); ++idx) { ChessboardQuadPtr& curQuad = existingQuads.at(idx); // For each corner of this quadrangle for (int i = 0; i < 4; ++i) { float minDist = FLT_MAX; int closestCornerIdx = -1; ChessboardQuadPtr closestQuad; // If curQuad corner[i] is already linked, continue if (!curQuad->corners[i]->needsNeighbor) { continue; } cv::Point2f pt = curQuad->corners[i]->pt; // Look for a match in all candidateQuads' corners for (size_t k = 0; k < candidateQuads.size(); ++k) { ChessboardQuadPtr& candidateQuad = candidateQuads.at(k); // Only look at unlabeled new quads if (candidateQuad->labeled) { continue; } for (int j = 0; j < 4; ++j) { // Only proceed if they are less than dist away from each // other cv::Point2f dp = pt - candidateQuad->corners[j]->pt; float dist = dp.dot(dp); if ((dist < minDist) && dist <= (curQuad->edge_len + thresh_dilation) && dist <= (candidateQuad->edge_len + thresh_dilation)) { if (matchCorners(curQuad, i, candidateQuad, j)) { closestCornerIdx = j; closestQuad = candidateQuad; minDist = dist; } } } } // Have we found a matching corner point? if (closestCornerIdx >= 0 && minDist < FLT_MAX) { ChessboardCornerPtr closestCorner = closestQuad->corners[closestCornerIdx]; closestCorner->pt = (pt + closestCorner->pt) * 0.5f; // We've found one more corner - remember it // ATTENTION: write the corner x and y coordinates separately, // else the crucial row/column entries will be overwritten !!! curQuad->corners[i]->pt = closestCorner->pt; curQuad->neighbors[i] = closestQuad; closestQuad->corners[closestCornerIdx]->pt = closestCorner->pt; // Label closest quad as labeled. 
In this way we exclude it from // being considered again during the next loop iteration closestQuad->labeled = true; // We have a new member of the final pattern, copy it over ChessboardQuadPtr newQuad(new ChessboardQuad); newQuad->count = 1; newQuad->edge_len = closestQuad->edge_len; newQuad->group_idx = curQuad->group_idx; //the same as the current quad newQuad->labeled = false; //do it right afterwards curQuad->neighbors[i] = newQuad; // We only know one neighbor for sure newQuad->neighbors[closestCornerIdx] = curQuad; for (int j = 0; j < 4; j++) { newQuad->corners[j].reset(new ChessboardCorner); newQuad->corners[j]->pt = closestQuad->corners[j]->pt; } existingQuads.push_back(newQuad); // Start the function again return -1; } } } // Finished, don't start the function again return 1; } //=========================================================================== // GENERATE QUADRANGLES //=========================================================================== void Chessboard::generateQuads(std::vector<ChessboardQuadPtr>& quads, cv::Mat& image, int flags, int dilation, bool firstRun) { // Empirical lower bound for the size of allowable quadrangles. // MARTIN, modified: Added "*0.1" in order to find smaller quads. int minSize = lround(image.cols * image.rows * .03 * 0.01 * 0.92 * 0.1); std::vector< std::vector<cv::Point> > contours; std::vector<cv::Vec4i> hierarchy; // Initialize contour retrieving routine cv::findContours(image, contours, hierarchy, CV_RETR_CCOMP, CV_CHAIN_APPROX_SIMPLE); std::vector< std::vector<cv::Point> > quadContours; for (size_t i = 0; i < contours.size(); ++i) { // Reject contours with a too small perimeter and contours which are // completely surrounded by another contour // MARTIN: If this function is called during PART 1, then the parameter "first run" // is set to "true". This guarantees that only "nice" squares are detected.
// During PART 2, we allow the polygonal approximation function below to // approximate more freely, which can result in recognized "squares" that are in // reality multiple blurred and stuck together squares. std::vector<cv::Point>& contour = contours.at(i); if (hierarchy[i][3] == -1 || cv::contourArea(contour) < minSize) { continue; } int min_approx_level = 1, max_approx_level; if (firstRun) { max_approx_level = 3; } else { max_approx_level = MAX_CONTOUR_APPROX; } std::vector<cv::Point> approxContour; for (int approx_level = min_approx_level; approx_level <= max_approx_level; approx_level++) { cv::approxPolyDP(contour, approxContour, approx_level, true); if (approxContour.size() == 4) { break; } } // Reject non-quadrangles if (approxContour.size() == 4 && cv::isContourConvex(approxContour)) { double p = cv::arcLength(approxContour, true); double area = cv::contourArea(approxContour); cv::Point pt[4]; for (int i = 0; i < 4; i++) { pt[i] = approxContour[i]; } cv::Point dp = pt[0] - pt[2]; double d1 = sqrt(dp.dot(dp)); dp = pt[1] - pt[3]; double d2 = sqrt(dp.dot(dp)); // PHILIPG: Only accept those quadrangles which are more // square than rectangular and which are big enough dp = pt[0] - pt[1]; double d3 = sqrt(dp.dot(dp)); dp = pt[1] - pt[2]; double d4 = sqrt(dp.dot(dp)); if (!(flags & CV_CALIB_CB_FILTER_QUADS) || (d3*4 > d4 && d4*4 > d3 && d3*d4 < area*1.5 && area > minSize && d1 >= 0.15 * p && d2 >= 0.15 * p)) { quadContours.push_back(approxContour); } } } // Allocate quad & corner buffers quads.resize(quadContours.size()); // Create array of quads structures for (size_t idx = 0; idx < quadContours.size(); ++idx) { ChessboardQuadPtr& q = quads.at(idx); std::vector<cv::Point>& contour = quadContours.at(idx); // Reset group ID q.reset(new ChessboardQuad); assert(contour.size() == 4); for (int i = 0; i < 4; ++i) { cv::Point2f pt = contour.at(i); ChessboardCornerPtr corner(new ChessboardCorner); corner->pt = pt; q->corners[i] = corner; } for (int i = 0; i < 4; ++i) { cv::Point2f dp =
q->corners[i]->pt - q->corners[(i+1)&3]->pt; float d = dp.dot(dp); if (q->edge_len > d) { q->edge_len = d; } } } } bool Chessboard::checkQuadGroup(std::vector<ChessboardQuadPtr>& quads, std::vector<ChessboardCornerPtr>& corners, cv::Size patternSize) { // Initialize bool flagRow = false; bool flagColumn = false; int height = -1; int width = -1; // Compute the minimum and maximum row / column ID // (it is unlikely that more than 8bit checkers are used per dimension) int min_row = 127; int max_row = -127; int min_col = 127; int max_col = -127; for (size_t i = 0; i < quads.size(); ++i) { ChessboardQuadPtr& q = quads.at(i); for (int j = 0; j < 4; ++j) { ChessboardCornerPtr& c = q->corners[j]; if (c->row > max_row) { max_row = c->row; } if (c->row < min_row) { min_row = c->row; } if (c->column > max_col) { max_col = c->column; } if (c->column < min_col) { min_col = c->column; } } } // If in a given direction the target pattern size is reached, we know exactly how // the checkerboard is oriented. // Else we need to prepare enough "dummy" corners for the worst case. for (size_t i = 0; i < quads.size(); ++i) { ChessboardQuadPtr& q = quads.at(i); for (int j = 0; j < 4; ++j) { ChessboardCornerPtr& c = q->corners[j]; if (c->column == max_col && c->row != min_row && c->row != max_row && !c->needsNeighbor) { flagColumn = true; } if (c->row == max_row && c->column != min_col && c->column != max_col && !c->needsNeighbor) { flagRow = true; } } } if (flagColumn) { if (max_col - min_col == patternSize.width + 1) { width = patternSize.width; height = patternSize.height; } else { width = patternSize.height; height = patternSize.width; } } else if (flagRow) { if (max_row - min_row == patternSize.width + 1) { height = patternSize.width; width = patternSize.height; } else { height = patternSize.height; width = patternSize.width; } } else { // If target pattern size is not reached in at least one of the two // directions, then we do not know where the remaining corners are // located. Account for this.
        width = std::max(patternSize.width, patternSize.height);
        height = std::max(patternSize.width, patternSize.height);
    }

    ++min_row;
    ++min_col;
    max_row = min_row + height - 1;
    max_col = min_col + width - 1;

    corners.clear();

    int linkedBorderCorners = 0;

    // Write the corners in increasing order to the output file
    for (int i = min_row; i <= max_row; ++i)
    {
        for (int j = min_col; j <= max_col; ++j)
        {
            // Reset the iterator
            int iter = 1;

            for (int k = 0; k < (int)quads.size(); ++k)
            {
                ChessboardQuadPtr& quad = quads.at(k);

                for (int l = 0; l < 4; ++l)
                {
                    ChessboardCornerPtr& c = quad->corners[l];

                    if (c->row == i && c->column == j)
                    {
                        bool boardEdge = false;
                        if (i == min_row || i == max_row || j == min_col || j == max_col)
                        {
                            boardEdge = true;
                        }

                        if ((iter == 1 && boardEdge) || (iter == 2 && !boardEdge))
                        {
                            // The respective row and column have been found
                            corners.push_back(quad->corners[l]);
                        }
                        if (iter == 2 && boardEdge)
                        {
                            ++linkedBorderCorners;
                        }

                        // If the iterator is larger than two, this means that more than
                        // two corners have the same row / column entries. Then some
                        // linking errors must have occurred and we should not use the
                        // found pattern.
                        if (iter > 2)
                        {
                            return false;
                        }

                        iter++;
                    }
                }
            }
        }
    }

    if ((int)corners.size() != patternSize.width * patternSize.height ||
        linkedBorderCorners < (patternSize.width * 2 + patternSize.height * 2 - 2) * 0.75f)
    {
        return false;
    }

    // check that no corners lie at image boundary
    float border = 5.0f;
    for (int i = 0; i < (int)corners.size(); ++i)
    {
        ChessboardCornerPtr& c = corners.at(i);

        if (c->pt.x < border || c->pt.x > mImage.cols - border ||
            c->pt.y < border || c->pt.y > mImage.rows - border)
        {
            return false;
        }
    }

    // check if we need to transpose the board
    if (width != patternSize.width)
    {
        std::swap(width, height);

        std::vector<ChessboardCornerPtr> outputCorners;
        outputCorners.resize(corners.size());
        for (int i = 0; i < height; ++i)
        {
            for (int j = 0; j < width; ++j)
            {
                outputCorners.at(i * width + j) = corners.at(j * height + i);
            }
        }
        corners = outputCorners;
    }

    // check if we need to revert the order in each row
    cv::Point2f p0 = corners.at(0)->pt;
    cv::Point2f p1 = corners.at(width-1)->pt;
    cv::Point2f p2 = corners.at(width)->pt;
    if ((p1 - p0).cross(p2 - p0) < 0.0f)
    {
        for (int i = 0; i < height; ++i)
        {
            for (int j = 0; j < width / 2; ++j)
            {
                std::swap(corners.at(i * width + j),
                          corners.at(i * width + width - j - 1));
            }
        }
    }

    p0 = corners.at(0)->pt;
    p2 = corners.at(width)->pt;

    // check if we need to rotate the board
    if (p2.y < p0.y)
    {
        std::vector<ChessboardCornerPtr> outputCorners;
        outputCorners.resize(corners.size());
        for (int i = 0; i < height; ++i)
        {
            for (int j = 0; j < width; ++j)
            {
                outputCorners.at(i * width + j) = corners.at((height - i - 1) * width + width - j - 1);
            }
        }
        corners = outputCorners;
    }

    return true;
}

void
Chessboard::getQuadrangleHypotheses(const std::vector< std::vector<cv::Point> >& contours,
                                    std::vector< std::pair<float, int> >& quads,
                                    int classId) const
{
    const float minAspectRatio = 0.2f;
    const float maxAspectRatio = 5.0f;
    const float minBoxSize = 10.0f;

    for (size_t i = 0; i < contours.size(); ++i)
    {
        cv::RotatedRect box =
            cv::minAreaRect(contours.at(i));
        float boxSize = std::max(box.size.width, box.size.height);
        if (boxSize < minBoxSize)
        {
            continue;
        }

        float aspectRatio = box.size.width / std::max(box.size.height, 1.0f);
        if (aspectRatio < minAspectRatio || aspectRatio > maxAspectRatio)
        {
            continue;
        }

        quads.push_back(std::pair<float, int>(boxSize, classId));
    }
}

bool
less_pred(const std::pair<float, int>& p1, const std::pair<float, int>& p2)
{
    return p1.first < p2.first;
}

void
countClasses(const std::vector< std::pair<float, int> >& pairs, size_t idx1, size_t idx2,
             std::vector<int>& counts)
{
    counts.assign(2, 0);
    for (size_t i = idx1; i != idx2; ++i)
    {
        counts[pairs[i].second]++;
    }
}

bool
Chessboard::checkChessboard(const cv::Mat& image, cv::Size patternSize) const
{
    const int erosionCount = 1;
    const float blackLevel = 20.f;
    const float whiteLevel = 130.f;
    const float blackWhiteGap = 70.f;

    cv::Mat white = image.clone();
    cv::Mat black = image.clone();

    cv::erode(white, white, cv::Mat(), cv::Point(-1,-1), erosionCount);
    cv::dilate(black, black, cv::Mat(), cv::Point(-1,-1), erosionCount);

    cv::Mat thresh(image.rows, image.cols, CV_8UC1);

    bool result = false;
    for (float threshLevel = blackLevel; threshLevel < whiteLevel && !result; threshLevel += 20.0f)
    {
        cv::threshold(white, thresh, threshLevel + blackWhiteGap, 255, CV_THRESH_BINARY);

        std::vector< std::vector<cv::Point> > contours;
        std::vector<cv::Vec4i> hierarchy;

        // Initialize contour retrieving routine
        cv::findContours(thresh, contours, hierarchy, CV_RETR_CCOMP, CV_CHAIN_APPROX_SIMPLE);

        std::vector< std::pair<float, int> > quads;
        getQuadrangleHypotheses(contours, quads, 1);

        cv::threshold(black, thresh, threshLevel, 255, CV_THRESH_BINARY_INV);
        cv::findContours(thresh, contours, hierarchy, CV_RETR_CCOMP, CV_CHAIN_APPROX_SIMPLE);
        getQuadrangleHypotheses(contours, quads, 0);

        const size_t min_quads_count = patternSize.width * patternSize.height / 2;
        std::sort(quads.begin(), quads.end(), less_pred);

        // now check if there are many hypotheses with similar sizes
        // do this by floodfill-style algorithm
        const float sizeRelDev = 0.4f;

        for (size_t i = 0; i < quads.size(); ++i)
        {
            size_t j = i + 1;
            for (; j < quads.size(); ++j)
            {
                if (quads[j].first / quads[i].first > 1.0f + sizeRelDev)
                {
                    break;
                }
            }

            if (j + 1 > min_quads_count + i)
            {
                // check the number of black and white squares
                std::vector<int> counts;
                countClasses(quads, i, j, counts);

                const int blackCount = lroundf(ceilf(patternSize.width / 2.0f) * ceilf(patternSize.height / 2.0f));
                const int whiteCount = lroundf(floorf(patternSize.width / 2.0f) * floorf(patternSize.height / 2.0f));
                if (counts[0] < blackCount * 0.75f || counts[1] < whiteCount * 0.75f)
                {
                    continue;
                }

                result = true;
                break;
            }
        }
    }

    return result;
}

bool
Chessboard::checkBoardMonotony(std::vector<ChessboardCornerPtr>& corners,
                               cv::Size patternSize)
{
    const float threshFactor = 0.2f;

    Spline splineXY, splineYX;
    splineXY.setLowBC(Spline::PARABOLIC_RUNOUT_BC);
    splineXY.setHighBC(Spline::PARABOLIC_RUNOUT_BC);
    splineYX.setLowBC(Spline::PARABOLIC_RUNOUT_BC);
    splineYX.setHighBC(Spline::PARABOLIC_RUNOUT_BC);

    // check if each row of corners approximates a cubic spline
    for (int i = 0; i < patternSize.height; ++i)
    {
        splineXY.clear();
        splineYX.clear();

        cv::Point2f p[3];
        p[0] = corners.at(i * patternSize.width)->pt;
        p[1] = corners.at(i * patternSize.width + patternSize.width / 2)->pt;
        p[2] = corners.at(i * patternSize.width + patternSize.width - 1)->pt;

        for (int j = 0; j < 3; ++j)
        {
            splineXY.addPoint(p[j].x, p[j].y);
            splineYX.addPoint(p[j].y, p[j].x);
        }

        for (int j = 1; j < patternSize.width - 1; ++j)
        {
            cv::Point2f& p_j = corners.at(i * patternSize.width + j)->pt;

            float thresh = std::numeric_limits<float>::max();

            // up-neighbor
            if (i > 0)
            {
                cv::Point2f& neighbor = corners.at((i - 1) * patternSize.width + j)->pt;
                thresh = fminf(thresh, cv::norm(neighbor - p_j));
            }
            // down-neighbor
            if (i < patternSize.height - 1)
            {
                cv::Point2f& neighbor = corners.at((i + 1) * patternSize.width + j)->pt;
                thresh = fminf(thresh, cv::norm(neighbor - p_j));
            }
            // left-neighbor
            {
                cv::Point2f& neighbor = corners.at(i * patternSize.width + j - 1)->pt;
                thresh = fminf(thresh,
                               cv::norm(neighbor - p_j));
            }
            // right-neighbor
            {
                cv::Point2f& neighbor = corners.at(i * patternSize.width + j + 1)->pt;
                thresh = fminf(thresh, cv::norm(neighbor - p_j));
            }

            thresh *= threshFactor;

            if (fminf(fabsf(splineXY(p_j.x) - p_j.y), fabsf(splineYX(p_j.y) - p_j.x)) > thresh)
            {
                return false;
            }
        }
    }

    // check if each column of corners approximates a cubic spline
    for (int j = 0; j < patternSize.width; ++j)
    {
        splineXY.clear();
        splineYX.clear();

        cv::Point2f p[3];
        p[0] = corners.at(j)->pt;
        p[1] = corners.at(patternSize.height / 2 * patternSize.width + j)->pt;
        p[2] = corners.at((patternSize.height - 1) * patternSize.width + j)->pt;

        for (int i = 0; i < 3; ++i)
        {
            splineXY.addPoint(p[i].x, p[i].y);
            splineYX.addPoint(p[i].y, p[i].x);
        }

        for (int i = 1; i < patternSize.height - 1; ++i)
        {
            cv::Point2f& p_i = corners.at(i * patternSize.width + j)->pt;

            float thresh = std::numeric_limits<float>::max();

            // up-neighbor
            {
                cv::Point2f& neighbor = corners.at((i - 1) * patternSize.width + j)->pt;
                thresh = fminf(thresh, cv::norm(neighbor - p_i));
            }
            // down-neighbor
            {
                cv::Point2f& neighbor = corners.at((i + 1) * patternSize.width + j)->pt;
                thresh = fminf(thresh, cv::norm(neighbor - p_i));
            }
            // left-neighbor
            if (j > 0)
            {
                cv::Point2f& neighbor = corners.at(i * patternSize.width + j - 1)->pt;
                thresh = fminf(thresh, cv::norm(neighbor - p_i));
            }
            // right-neighbor
            if (j < patternSize.width - 1)
            {
                cv::Point2f& neighbor = corners.at(i * patternSize.width + j + 1)->pt;
                thresh = fminf(thresh, cv::norm(neighbor - p_i));
            }

            thresh *= threshFactor;

            if (fminf(fabsf(splineXY(p_i.x) - p_i.y), fabsf(splineYX(p_i.y) - p_i.x)) > thresh)
            {
                return false;
            }
        }
    }

    return true;
}

bool
Chessboard::matchCorners(ChessboardQuadPtr& quad1, int corner1,
                         ChessboardQuadPtr& quad2, int corner2) const
{
    // First: Check everything from the viewpoint of the current quad.
    // Compute midpoints of "parallel" quad sides 1
    float x1 = (quad1->corners[corner1]->pt.x + quad1->corners[(corner1+1)%4]->pt.x)/2;
    float y1 =
        (quad1->corners[corner1]->pt.y + quad1->corners[(corner1+1)%4]->pt.y)/2;
    float x2 = (quad1->corners[(corner1+2)%4]->pt.x + quad1->corners[(corner1+3)%4]->pt.x)/2;
    float y2 = (quad1->corners[(corner1+2)%4]->pt.y + quad1->corners[(corner1+3)%4]->pt.y)/2;

    // compute midpoints of "parallel" quad sides 2
    float x3 = (quad1->corners[corner1]->pt.x + quad1->corners[(corner1+3)%4]->pt.x)/2;
    float y3 = (quad1->corners[corner1]->pt.y + quad1->corners[(corner1+3)%4]->pt.y)/2;
    float x4 = (quad1->corners[(corner1+1)%4]->pt.x + quad1->corners[(corner1+2)%4]->pt.x)/2;
    float y4 = (quad1->corners[(corner1+1)%4]->pt.y + quad1->corners[(corner1+2)%4]->pt.y)/2;

    // MARTIN: Heuristic
    // For corner2 of quad2 to be considered, it needs to be on the same side of
    // the two lines as corner1. This is given if the cross product has the same
    // sign for both computations below:
    float a1 = x1 - x2;
    float b1 = y1 - y2;
    // the current corner
    float c11 = quad1->corners[corner1]->pt.x - x2;
    float d11 = quad1->corners[corner1]->pt.y - y2;
    // the candidate corner
    float c12 = quad2->corners[corner2]->pt.x - x2;
    float d12 = quad2->corners[corner2]->pt.y - y2;
    float sign11 = a1*d11 - c11*b1;
    float sign12 = a1*d12 - c12*b1;

    float a2 = x3 - x4;
    float b2 = y3 - y4;
    // the current corner
    float c21 = quad1->corners[corner1]->pt.x - x4;
    float d21 = quad1->corners[corner1]->pt.y - y4;
    // the candidate corner
    float c22 = quad2->corners[corner2]->pt.x - x4;
    float d22 = quad2->corners[corner2]->pt.y - y4;
    float sign21 = a2*d21 - c21*b2;
    float sign22 = a2*d22 - c22*b2;

    // Also make sure that two border quads of the same row or
    // column don't link. Check from the current corner's view,
    // whether the corner diagonal from the candidate corner
    // is also on the same side of the two lines as the current
    // corner and the candidate corner.
float c13 = quad2->corners[(corner2+2)%4]->pt.x - x2; float d13 = quad2->corners[(corner2+2)%4]->pt.y - y2; float c23 = quad2->corners[(corner2+2)%4]->pt.x - x4; float d23 = quad2->corners[(corner2+2)%4]->pt.y - y4; float sign13 = a1*d13 - c13*b1; float sign23 = a2*d23 - c23*b2; // Second: Then check everything from the viewpoint of // the candidate quad. Compute midpoints of "parallel" // quad sides 1 float u1 = (quad2->corners[corner2]->pt.x + quad2->corners[(corner2+1)%4]->pt.x)/2; float v1 = (quad2->corners[corner2]->pt.y + quad2->corners[(corner2+1)%4]->pt.y)/2; float u2 = (quad2->corners[(corner2+2)%4]->pt.x + quad2->corners[(corner2+3)%4]->pt.x)/2; float v2 = (quad2->corners[(corner2+2)%4]->pt.y + quad2->corners[(corner2+3)%4]->pt.y)/2; // compute midpoints of "parallel" quad sides 2 float u3 = (quad2->corners[corner2]->pt.x + quad2->corners[(corner2+3)%4]->pt.x)/2; float v3 = (quad2->corners[corner2]->pt.y + quad2->corners[(corner2+3)%4]->pt.y)/2; float u4 = (quad2->corners[(corner2+1)%4]->pt.x + quad2->corners[(corner2+2)%4]->pt.x)/2; float v4 = (quad2->corners[(corner2+1)%4]->pt.y + quad2->corners[(corner2+2)%4]->pt.y)/2; // MARTIN: Heuristic // For corner2 of quad2 to be considered, // it needs to be on the same side of the two lines as // corner1. 
    // This is given if the cross product has the same sign for both
    // computations below:
    float a3 = u1 - u2;
    float b3 = v1 - v2;
    // the current corner
    float c31 = quad1->corners[corner1]->pt.x - u2;
    float d31 = quad1->corners[corner1]->pt.y - v2;
    // the candidate corner
    float c32 = quad2->corners[corner2]->pt.x - u2;
    float d32 = quad2->corners[corner2]->pt.y - v2;
    float sign31 = a3*d31 - c31*b3;
    float sign32 = a3*d32 - c32*b3;

    float a4 = u3 - u4;
    float b4 = v3 - v4;
    // the current corner
    float c41 = quad1->corners[corner1]->pt.x - u4;
    float d41 = quad1->corners[corner1]->pt.y - v4;
    // the candidate corner
    float c42 = quad2->corners[corner2]->pt.x - u4;
    float d42 = quad2->corners[corner2]->pt.y - v4;
    float sign41 = a4*d41 - c41*b4;
    float sign42 = a4*d42 - c42*b4;

    // Also make sure that two border quads of the same row or
    // column don't link. Check from the candidate corner's view,
    // whether the corner diagonal from the current corner
    // is also on the same side of the two lines as the current
    // corner and the candidate corner.
    float c33 = quad1->corners[(corner1+2)%4]->pt.x - u2;
    float d33 = quad1->corners[(corner1+2)%4]->pt.y - v2;
    float c43 = quad1->corners[(corner1+2)%4]->pt.x - u4;
    float d43 = quad1->corners[(corner1+2)%4]->pt.y - v4;
    float sign33 = a3*d33 - c33*b3;
    float sign43 = a4*d43 - c43*b4;

    // This time we also need to make sure that no quad
    // is linked to a quad of another dilation run which
    // may lie INSIDE it!!!
// Third: Therefore check everything from the viewpoint // of the current quad compute midpoints of "parallel" // quad sides 1 float x5 = quad1->corners[corner1]->pt.x; float y5 = quad1->corners[corner1]->pt.y; float x6 = quad1->corners[(corner1+1)%4]->pt.x; float y6 = quad1->corners[(corner1+1)%4]->pt.y; // compute midpoints of "parallel" quad sides 2 float x7 = x5; float y7 = y5; float x8 = quad1->corners[(corner1+3)%4]->pt.x; float y8 = quad1->corners[(corner1+3)%4]->pt.y; // MARTIN: Heuristic // For corner2 of quad2 to be considered, // it needs to be on the other side of the two lines than // corner1. This is given, if the cross product has // a different sign for both computations below: float a5 = x6 - x5; float b5 = y6 - y5; // the current corner float c51 = quad1->corners[(corner1+2)%4]->pt.x - x5; float d51 = quad1->corners[(corner1+2)%4]->pt.y - y5; // the candidate corner float c52 = quad2->corners[corner2]->pt.x - x5; float d52 = quad2->corners[corner2]->pt.y - y5; float sign51 = a5*d51 - c51*b5; float sign52 = a5*d52 - c52*b5; float a6 = x8 - x7; float b6 = y8 - y7; // the current corner float c61 = quad1->corners[(corner1+2)%4]->pt.x - x7; float d61 = quad1->corners[(corner1+2)%4]->pt.y - y7; // the candidate corner float c62 = quad2->corners[corner2]->pt.x - x7; float d62 = quad2->corners[corner2]->pt.y - y7; float sign61 = a6*d61 - c61*b6; float sign62 = a6*d62 - c62*b6; // Fourth: Then check everything from the viewpoint of // the candidate quad compute midpoints of "parallel" // quad sides 1 float u5 = quad2->corners[corner2]->pt.x; float v5 = quad2->corners[corner2]->pt.y; float u6 = quad2->corners[(corner2+1)%4]->pt.x; float v6 = quad2->corners[(corner2+1)%4]->pt.y; // compute midpoints of "parallel" quad sides 2 float u7 = u5; float v7 = v5; float u8 = quad2->corners[(corner2+3)%4]->pt.x; float v8 = quad2->corners[(corner2+3)%4]->pt.y; // MARTIN: Heuristic // For corner2 of quad2 to be considered, // it needs to be on the other side of the two 
    // lines than corner1. This is given if the cross product has
    // a different sign for both computations below:
    float a7 = u6 - u5;
    float b7 = v6 - v5;
    // the current corner
    float c71 = quad1->corners[corner1]->pt.x - u5;
    float d71 = quad1->corners[corner1]->pt.y - v5;
    // the candidate corner
    float c72 = quad2->corners[(corner2+2)%4]->pt.x - u5;
    float d72 = quad2->corners[(corner2+2)%4]->pt.y - v5;
    float sign71 = a7*d71 - c71*b7;
    float sign72 = a7*d72 - c72*b7;

    float a8 = u8 - u7;
    float b8 = v8 - v7;
    // the current corner
    float c81 = quad1->corners[corner1]->pt.x - u7;
    float d81 = quad1->corners[corner1]->pt.y - v7;
    // the candidate corner
    float c82 = quad2->corners[(corner2+2)%4]->pt.x - u7;
    float d82 = quad2->corners[(corner2+2)%4]->pt.y - v7;
    float sign81 = a8*d81 - c81*b8;
    float sign82 = a8*d82 - c82*b8;

    // Check whether conditions are fulfilled
    if (((sign11 < 0 && sign12 < 0) || (sign11 > 0 && sign12 > 0)) &&
        ((sign21 < 0 && sign22 < 0) || (sign21 > 0 && sign22 > 0)) &&
        ((sign31 < 0 && sign32 < 0) || (sign31 > 0 && sign32 > 0)) &&
        ((sign41 < 0 && sign42 < 0) || (sign41 > 0 && sign42 > 0)) &&
        ((sign11 < 0 && sign13 < 0) || (sign11 > 0 && sign13 > 0)) &&
        ((sign21 < 0 && sign23 < 0) || (sign21 > 0 && sign23 > 0)) &&
        ((sign31 < 0 && sign33 < 0) || (sign31 > 0 && sign33 > 0)) &&
        ((sign41 < 0 && sign43 < 0) || (sign41 > 0 && sign43 > 0)) &&
        ((sign51 < 0 && sign52 > 0) || (sign51 > 0 && sign52 < 0)) &&
        ((sign61 < 0 && sign62 > 0) || (sign61 > 0 && sign62 < 0)) &&
        ((sign71 < 0 && sign72 > 0) || (sign71 > 0 && sign72 < 0)) &&
        ((sign81 < 0 && sign82 > 0) || (sign81 > 0 && sign82 < 0)))
    {
        return true;
    }
    else
    {
        return false;
    }
}

}

================================================ FILE: camera_model/src/gpl/EigenQuaternionParameterization.cc ================================================

#include "camodocal/gpl/EigenQuaternionParameterization.h"

#include <cmath>

namespace camodocal
{

bool
EigenQuaternionParameterization::Plus(const double* x,
                                      const double* delta,
                                      double* x_plus_delta)
const
{
    const double norm_delta = sqrt(delta[0] * delta[0] +
                                   delta[1] * delta[1] +
                                   delta[2] * delta[2]);
    if (norm_delta > 0.0)
    {
        const double sin_delta_by_delta = (sin(norm_delta) / norm_delta);
        double q_delta[4];
        q_delta[0] = sin_delta_by_delta * delta[0];
        q_delta[1] = sin_delta_by_delta * delta[1];
        q_delta[2] = sin_delta_by_delta * delta[2];
        q_delta[3] = cos(norm_delta);
        EigenQuaternionProduct(q_delta, x, x_plus_delta);
    }
    else
    {
        for (int i = 0; i < 4; ++i)
        {
            x_plus_delta[i] = x[i];
        }
    }
    return true;
}

bool
EigenQuaternionParameterization::ComputeJacobian(const double* x,
                                                 double* jacobian) const
{
    jacobian[0] =  x[3]; jacobian[1]  =  x[2]; jacobian[2]  = -x[1];  // NOLINT
    jacobian[3] = -x[2]; jacobian[4]  =  x[3]; jacobian[5]  =  x[0];  // NOLINT
    jacobian[6] =  x[1]; jacobian[7]  = -x[0]; jacobian[8]  =  x[3];  // NOLINT
    jacobian[9] = -x[0]; jacobian[10] = -x[1]; jacobian[11] = -x[2];  // NOLINT
    return true;
}

}

================================================ FILE: camera_model/src/gpl/gpl.cc ================================================

#include "camodocal/gpl/gpl.h"

#include <cmath>

#ifdef _WIN32
#include <windows.h>
#else
#include <time.h>
#endif

// source: https://stackoverflow.com/questions/5167269/clock-gettime-alternative-in-mac-os-x
#ifdef __APPLE__
#include <mach/mach_time.h>

#define ORWL_NANO (+1.0E-9)
#define ORWL_GIGA UINT64_C(1000000000)

static double orwl_timebase = 0.0;
static uint64_t orwl_timestart = 0;

struct timespec
orwl_gettime(void)
{
    // be more careful in a multithreaded environment
    if (!orwl_timestart)
    {
        mach_timebase_info_data_t tb = { 0 };
        mach_timebase_info(&tb);
        orwl_timebase = tb.numer;
        orwl_timebase /= tb.denom;
        orwl_timestart = mach_absolute_time();
    }
    struct timespec t;
    double diff = (mach_absolute_time() - orwl_timestart) * orwl_timebase;
    t.tv_sec = diff * ORWL_NANO;
    t.tv_nsec = diff - (t.tv_sec * ORWL_GIGA);
    return t;
}
#endif // __APPLE__

const double WGS84_A = 6378137.0;
const double WGS84_ECCSQ = 0.00669437999013;

// Windows lacks fminf
#ifndef fminf
#define fminf(x, y) (((x) < (y)) ?
(x) : (y)) #endif namespace camodocal { double hypot3(double x, double y, double z) { return sqrt(square(x) + square(y) + square(z)); } float hypot3f(float x, float y, float z) { return sqrtf(square(x) + square(y) + square(z)); } double d2r(double deg) { return deg / 180.0 * M_PI; } float d2r(float deg) { return deg / 180.0f * M_PI; } double r2d(double rad) { return rad / M_PI * 180.0; } float r2d(float rad) { return rad / M_PI * 180.0f; } double sinc(double theta) { return sin(theta) / theta; } #ifdef _WIN32 #include #include #include LARGE_INTEGER getFILETIMEoffset() { SYSTEMTIME s; FILETIME f; LARGE_INTEGER t; s.wYear = 1970; s.wMonth = 1; s.wDay = 1; s.wHour = 0; s.wMinute = 0; s.wSecond = 0; s.wMilliseconds = 0; SystemTimeToFileTime(&s, &f); t.QuadPart = f.dwHighDateTime; t.QuadPart <<= 32; t.QuadPart |= f.dwLowDateTime; return (t); } int clock_gettime(int X, struct timespec *tp) { LARGE_INTEGER t; FILETIME f; double microseconds; static LARGE_INTEGER offset; static double frequencyToMicroseconds; static int initialized = 0; static BOOL usePerformanceCounter = 0; if (!initialized) { LARGE_INTEGER performanceFrequency; initialized = 1; usePerformanceCounter = QueryPerformanceFrequency(&performanceFrequency); if (usePerformanceCounter) { QueryPerformanceCounter(&offset); frequencyToMicroseconds = (double)performanceFrequency.QuadPart / 1000000.; } else { offset = getFILETIMEoffset(); frequencyToMicroseconds = 10.; } } if (usePerformanceCounter) QueryPerformanceCounter(&t); else { GetSystemTimeAsFileTime(&f); t.QuadPart = f.dwHighDateTime; t.QuadPart <<= 32; t.QuadPart |= f.dwLowDateTime; } t.QuadPart -= offset.QuadPart; microseconds = (double)t.QuadPart / frequencyToMicroseconds; t.QuadPart = microseconds; tp->tv_sec = t.QuadPart / 1000000; tp->tv_nsec = (t.QuadPart % 1000000) * 1000; return (0); } #endif unsigned long long timeInMicroseconds(void) { struct timespec tp; #ifdef __APPLE__ tp = orwl_gettime(); #else clock_gettime(CLOCK_REALTIME, &tp); #endif return 
tp.tv_sec * 1000000 + tp.tv_nsec / 1000; } double timeInSeconds(void) { struct timespec tp; #ifdef __APPLE__ tp = orwl_gettime(); #else clock_gettime(CLOCK_REALTIME, &tp); #endif return static_cast(tp.tv_sec) + static_cast(tp.tv_nsec) / 1000000000.0; } float colormapAutumn[128][3] = { {1.0f,0.f,0.f}, {1.0f,0.007874f,0.f}, {1.0f,0.015748f,0.f}, {1.0f,0.023622f,0.f}, {1.0f,0.031496f,0.f}, {1.0f,0.03937f,0.f}, {1.0f,0.047244f,0.f}, {1.0f,0.055118f,0.f}, {1.0f,0.062992f,0.f}, {1.0f,0.070866f,0.f}, {1.0f,0.07874f,0.f}, {1.0f,0.086614f,0.f}, {1.0f,0.094488f,0.f}, {1.0f,0.10236f,0.f}, {1.0f,0.11024f,0.f}, {1.0f,0.11811f,0.f}, {1.0f,0.12598f,0.f}, {1.0f,0.13386f,0.f}, {1.0f,0.14173f,0.f}, {1.0f,0.14961f,0.f}, {1.0f,0.15748f,0.f}, {1.0f,0.16535f,0.f}, {1.0f,0.17323f,0.f}, {1.0f,0.1811f,0.f}, {1.0f,0.18898f,0.f}, {1.0f,0.19685f,0.f}, {1.0f,0.20472f,0.f}, {1.0f,0.2126f,0.f}, {1.0f,0.22047f,0.f}, {1.0f,0.22835f,0.f}, {1.0f,0.23622f,0.f}, {1.0f,0.24409f,0.f}, {1.0f,0.25197f,0.f}, {1.0f,0.25984f,0.f}, {1.0f,0.26772f,0.f}, {1.0f,0.27559f,0.f}, {1.0f,0.28346f,0.f}, {1.0f,0.29134f,0.f}, {1.0f,0.29921f,0.f}, {1.0f,0.30709f,0.f}, {1.0f,0.31496f,0.f}, {1.0f,0.32283f,0.f}, {1.0f,0.33071f,0.f}, {1.0f,0.33858f,0.f}, {1.0f,0.34646f,0.f}, {1.0f,0.35433f,0.f}, {1.0f,0.3622f,0.f}, {1.0f,0.37008f,0.f}, {1.0f,0.37795f,0.f}, {1.0f,0.38583f,0.f}, {1.0f,0.3937f,0.f}, {1.0f,0.40157f,0.f}, {1.0f,0.40945f,0.f}, {1.0f,0.41732f,0.f}, {1.0f,0.4252f,0.f}, {1.0f,0.43307f,0.f}, {1.0f,0.44094f,0.f}, {1.0f,0.44882f,0.f}, {1.0f,0.45669f,0.f}, {1.0f,0.46457f,0.f}, {1.0f,0.47244f,0.f}, {1.0f,0.48031f,0.f}, {1.0f,0.48819f,0.f}, {1.0f,0.49606f,0.f}, {1.0f,0.50394f,0.f}, {1.0f,0.51181f,0.f}, {1.0f,0.51969f,0.f}, {1.0f,0.52756f,0.f}, {1.0f,0.53543f,0.f}, {1.0f,0.54331f,0.f}, {1.0f,0.55118f,0.f}, {1.0f,0.55906f,0.f}, {1.0f,0.56693f,0.f}, {1.0f,0.5748f,0.f}, {1.0f,0.58268f,0.f}, {1.0f,0.59055f,0.f}, {1.0f,0.59843f,0.f}, {1.0f,0.6063f,0.f}, {1.0f,0.61417f,0.f}, {1.0f,0.62205f,0.f}, {1.0f,0.62992f,0.f}, 
{1.0f,0.6378f,0.f}, {1.0f,0.64567f,0.f}, {1.0f,0.65354f,0.f}, {1.0f,0.66142f,0.f}, {1.0f,0.66929f,0.f}, {1.0f,0.67717f,0.f}, {1.0f,0.68504f,0.f}, {1.0f,0.69291f,0.f}, {1.0f,0.70079f,0.f}, {1.0f,0.70866f,0.f}, {1.0f,0.71654f,0.f}, {1.0f,0.72441f,0.f}, {1.0f,0.73228f,0.f}, {1.0f,0.74016f,0.f}, {1.0f,0.74803f,0.f}, {1.0f,0.75591f,0.f}, {1.0f,0.76378f,0.f}, {1.0f,0.77165f,0.f}, {1.0f,0.77953f,0.f}, {1.0f,0.7874f,0.f}, {1.0f,0.79528f,0.f}, {1.0f,0.80315f,0.f}, {1.0f,0.81102f,0.f}, {1.0f,0.8189f,0.f}, {1.0f,0.82677f,0.f}, {1.0f,0.83465f,0.f}, {1.0f,0.84252f,0.f}, {1.0f,0.85039f,0.f}, {1.0f,0.85827f,0.f}, {1.0f,0.86614f,0.f}, {1.0f,0.87402f,0.f}, {1.0f,0.88189f,0.f}, {1.0f,0.88976f,0.f}, {1.0f,0.89764f,0.f}, {1.0f,0.90551f,0.f}, {1.0f,0.91339f,0.f}, {1.0f,0.92126f,0.f}, {1.0f,0.92913f,0.f}, {1.0f,0.93701f,0.f}, {1.0f,0.94488f,0.f}, {1.0f,0.95276f,0.f}, {1.0f,0.96063f,0.f}, {1.0f,0.9685f,0.f}, {1.0f,0.97638f,0.f}, {1.0f,0.98425f,0.f}, {1.0f,0.99213f,0.f}, {1.0f,1.0f,0.0f} }; float colormapJet[128][3] = { {0.0f,0.0f,0.53125f}, {0.0f,0.0f,0.5625f}, {0.0f,0.0f,0.59375f}, {0.0f,0.0f,0.625f}, {0.0f,0.0f,0.65625f}, {0.0f,0.0f,0.6875f}, {0.0f,0.0f,0.71875f}, {0.0f,0.0f,0.75f}, {0.0f,0.0f,0.78125f}, {0.0f,0.0f,0.8125f}, {0.0f,0.0f,0.84375f}, {0.0f,0.0f,0.875f}, {0.0f,0.0f,0.90625f}, {0.0f,0.0f,0.9375f}, {0.0f,0.0f,0.96875f}, {0.0f,0.0f,1.0f}, {0.0f,0.03125f,1.0f}, {0.0f,0.0625f,1.0f}, {0.0f,0.09375f,1.0f}, {0.0f,0.125f,1.0f}, {0.0f,0.15625f,1.0f}, {0.0f,0.1875f,1.0f}, {0.0f,0.21875f,1.0f}, {0.0f,0.25f,1.0f}, {0.0f,0.28125f,1.0f}, {0.0f,0.3125f,1.0f}, {0.0f,0.34375f,1.0f}, {0.0f,0.375f,1.0f}, {0.0f,0.40625f,1.0f}, {0.0f,0.4375f,1.0f}, {0.0f,0.46875f,1.0f}, {0.0f,0.5f,1.0f}, {0.0f,0.53125f,1.0f}, {0.0f,0.5625f,1.0f}, {0.0f,0.59375f,1.0f}, {0.0f,0.625f,1.0f}, {0.0f,0.65625f,1.0f}, {0.0f,0.6875f,1.0f}, {0.0f,0.71875f,1.0f}, {0.0f,0.75f,1.0f}, {0.0f,0.78125f,1.0f}, {0.0f,0.8125f,1.0f}, {0.0f,0.84375f,1.0f}, {0.0f,0.875f,1.0f}, {0.0f,0.90625f,1.0f}, {0.0f,0.9375f,1.0f}, 
{0.0f,0.96875f,1.0f}, {0.0f,1.0f,1.0f}, {0.03125f,1.0f,0.96875f}, {0.0625f,1.0f,0.9375f}, {0.09375f,1.0f,0.90625f}, {0.125f,1.0f,0.875f}, {0.15625f,1.0f,0.84375f}, {0.1875f,1.0f,0.8125f}, {0.21875f,1.0f,0.78125f}, {0.25f,1.0f,0.75f}, {0.28125f,1.0f,0.71875f}, {0.3125f,1.0f,0.6875f}, {0.34375f,1.0f,0.65625f}, {0.375f,1.0f,0.625f}, {0.40625f,1.0f,0.59375f}, {0.4375f,1.0f,0.5625f}, {0.46875f,1.0f,0.53125f}, {0.5f,1.0f,0.5f}, {0.53125f,1.0f,0.46875f}, {0.5625f,1.0f,0.4375f}, {0.59375f,1.0f,0.40625f}, {0.625f,1.0f,0.375f}, {0.65625f,1.0f,0.34375f}, {0.6875f,1.0f,0.3125f}, {0.71875f,1.0f,0.28125f}, {0.75f,1.0f,0.25f}, {0.78125f,1.0f,0.21875f}, {0.8125f,1.0f,0.1875f}, {0.84375f,1.0f,0.15625f}, {0.875f,1.0f,0.125f}, {0.90625f,1.0f,0.09375f}, {0.9375f,1.0f,0.0625f}, {0.96875f,1.0f,0.03125f}, {1.0f,1.0f,0.0f}, {1.0f,0.96875f,0.0f}, {1.0f,0.9375f,0.0f}, {1.0f,0.90625f,0.0f}, {1.0f,0.875f,0.0f}, {1.0f,0.84375f,0.0f}, {1.0f,0.8125f,0.0f}, {1.0f,0.78125f,0.0f}, {1.0f,0.75f,0.0f}, {1.0f,0.71875f,0.0f}, {1.0f,0.6875f,0.0f}, {1.0f,0.65625f,0.0f}, {1.0f,0.625f,0.0f}, {1.0f,0.59375f,0.0f}, {1.0f,0.5625f,0.0f}, {1.0f,0.53125f,0.0f}, {1.0f,0.5f,0.0f}, {1.0f,0.46875f,0.0f}, {1.0f,0.4375f,0.0f}, {1.0f,0.40625f,0.0f}, {1.0f,0.375f,0.0f}, {1.0f,0.34375f,0.0f}, {1.0f,0.3125f,0.0f}, {1.0f,0.28125f,0.0f}, {1.0f,0.25f,0.0f}, {1.0f,0.21875f,0.0f}, {1.0f,0.1875f,0.0f}, {1.0f,0.15625f,0.0f}, {1.0f,0.125f,0.0f}, {1.0f,0.09375f,0.0f}, {1.0f,0.0625f,0.0f}, {1.0f,0.03125f,0.0f}, {1.0f,0.0f,0.0f}, {0.96875f,0.0f,0.0f}, {0.9375f,0.0f,0.0f}, {0.90625f,0.0f,0.0f}, {0.875f,0.0f,0.0f}, {0.84375f,0.0f,0.0f}, {0.8125f,0.0f,0.0f}, {0.78125f,0.0f,0.0f}, {0.75f,0.0f,0.0f}, {0.71875f,0.0f,0.0f}, {0.6875f,0.0f,0.0f}, {0.65625f,0.0f,0.0f}, {0.625f,0.0f,0.0f}, {0.59375f,0.0f,0.0f}, {0.5625f,0.0f,0.0f}, {0.53125f,0.0f,0.0f}, {0.5f,0.0f,0.0f} }; void colorDepthImage(cv::Mat& imgDepth, cv::Mat& imgColoredDepth, float minRange, float maxRange) { imgColoredDepth = cv::Mat::zeros(imgDepth.size(), CV_8UC3); for (int i = 
         0; i < imgColoredDepth.rows; ++i)
    {
        const float* depth = imgDepth.ptr<float>(i);
        unsigned char* pixel = imgColoredDepth.ptr<unsigned char>(i);

        for (int j = 0; j < imgColoredDepth.cols; ++j)
        {
            if (depth[j] != 0)
            {
                int idx = fminf(depth[j] - minRange, maxRange - minRange) / (maxRange - minRange) * 127.0f;
                idx = 127 - idx;

                pixel[0] = colormapJet[idx][2] * 255.0f;
                pixel[1] = colormapJet[idx][1] * 255.0f;
                pixel[2] = colormapJet[idx][0] * 255.0f;
            }

            pixel += 3;
        }
    }
}

bool
colormap(const std::string& name, unsigned char idx,
         float& r, float& g, float& b)
{
    if (name.compare("jet") == 0)
    {
        float* color = colormapJet[idx];

        r = color[0];
        g = color[1];
        b = color[2];

        return true;
    }
    else if (name.compare("autumn") == 0)
    {
        float* color = colormapAutumn[idx];

        r = color[0];
        g = color[1];
        b = color[2];

        return true;
    }

    return false;
}

std::vector<cv::Point2i>
bresLine(int x0, int y0, int x1, int y1)
{
    // Bresenham's line algorithm
    // Find cells intersected by line between (x0,y0) and (x1,y1)
    std::vector<cv::Point2i> cells;

    int dx = std::abs(x1 - x0);
    int dy = std::abs(y1 - y0);

    int sx = (x0 < x1) ? 1 : -1;
    int sy = (y0 < y1) ?
                         1 : -1;

    int err = dx - dy;

    while (1)
    {
        cells.push_back(cv::Point2i(x0, y0));

        if (x0 == x1 && y0 == y1)
        {
            break;
        }

        int e2 = 2 * err;
        if (e2 > -dy)
        {
            err -= dy;
            x0 += sx;
        }
        if (e2 < dx)
        {
            err += dx;
            y0 += sy;
        }
    }

    return cells;
}

std::vector<cv::Point2i>
bresCircle(int x0, int y0, int r)
{
    // Bresenham's circle algorithm
    // Find cells intersected by circle with center (x0,y0) and radius r
    std::vector< std::vector<bool> > mask(2 * r + 1);

    for (int i = 0; i < 2 * r + 1; ++i)
    {
        mask[i].resize(2 * r + 1);
        for (int j = 0; j < 2 * r + 1; ++j)
        {
            mask[i][j] = false;
        }
    }

    int f = 1 - r;
    int ddF_x = 1;
    int ddF_y = -2 * r;
    int x = 0;
    int y = r;

    std::vector<cv::Point2i> line;

    line = bresLine(x0, y0 - r, x0, y0 + r);
    for (std::vector<cv::Point2i>::iterator it = line.begin(); it != line.end(); ++it)
    {
        mask[it->x - x0 + r][it->y - y0 + r] = true;
    }

    line = bresLine(x0 - r, y0, x0 + r, y0);
    for (std::vector<cv::Point2i>::iterator it = line.begin(); it != line.end(); ++it)
    {
        mask[it->x - x0 + r][it->y - y0 + r] = true;
    }

    while (x < y)
    {
        if (f >= 0)
        {
            y--;
            ddF_y += 2;
            f += ddF_y;
        }

        x++;
        ddF_x += 2;
        f += ddF_x;

        line = bresLine(x0 - x, y0 + y, x0 + x, y0 + y);
        for (std::vector<cv::Point2i>::iterator it = line.begin(); it != line.end(); ++it)
        {
            mask[it->x - x0 + r][it->y - y0 + r] = true;
        }

        line = bresLine(x0 - x, y0 - y, x0 + x, y0 - y);
        for (std::vector<cv::Point2i>::iterator it = line.begin(); it != line.end(); ++it)
        {
            mask[it->x - x0 + r][it->y - y0 + r] = true;
        }

        line = bresLine(x0 - y, y0 + x, x0 + y, y0 + x);
        for (std::vector<cv::Point2i>::iterator it = line.begin(); it != line.end(); ++it)
        {
            mask[it->x - x0 + r][it->y - y0 + r] = true;
        }

        line = bresLine(x0 - y, y0 - x, x0 + y, y0 - x);
        for (std::vector<cv::Point2i>::iterator it = line.begin(); it != line.end(); ++it)
        {
            mask[it->x - x0 + r][it->y - y0 + r] = true;
        }
    }

    std::vector<cv::Point2i> cells;
    for (int i = 0; i < 2 * r + 1; ++i)
    {
        for (int j = 0; j < 2 * r + 1; ++j)
        {
            if (mask[i][j])
            {
                cells.push_back(cv::Point2i(i - r + x0, j - r + y0));
            }
        }
    }

    return cells;
}

void
fitCircle(const std::vector<cv::Point2d>& points,
          double& centerX, double& centerY, double&
          radius)
{
    // D. Umbach, and K. Jones, A Few Methods for Fitting Circles to Data,
    // IEEE Transactions on Instrumentation and Measurement, 2000
    // We use the modified least squares method.
    double sum_x = 0.0;
    double sum_y = 0.0;
    double sum_xx = 0.0;
    double sum_xy = 0.0;
    double sum_yy = 0.0;
    double sum_xxx = 0.0;
    double sum_xxy = 0.0;
    double sum_xyy = 0.0;
    double sum_yyy = 0.0;

    int n = points.size();
    for (int i = 0; i < n; ++i)
    {
        double x = points.at(i).x;
        double y = points.at(i).y;

        sum_x += x;
        sum_y += y;
        sum_xx += x * x;
        sum_xy += x * y;
        sum_yy += y * y;
        sum_xxx += x * x * x;
        sum_xxy += x * x * y;
        sum_xyy += x * y * y;
        sum_yyy += y * y * y;
    }

    double A = n * sum_xx - square(sum_x);
    double B = n * sum_xy - sum_x * sum_y;
    double C = n * sum_yy - square(sum_y);
    double D = 0.5 * (n * sum_xyy - sum_x * sum_yy + n * sum_xxx - sum_x * sum_xx);
    double E = 0.5 * (n * sum_xxy - sum_y * sum_xx + n * sum_yyy - sum_y * sum_yy);

    centerX = (D * C - B * E) / (A * C - square(B));
    centerY = (A * E - B * D) / (A * C - square(B));

    double sum_r = 0.0;
    for (int i = 0; i < n; ++i)
    {
        double x = points.at(i).x;
        double y = points.at(i).y;

        sum_r += hypot(x - centerX, y - centerY);
    }

    radius = sum_r / n;
}

std::vector<cv::Point2d>
intersectCircles(double x1, double y1, double r1,
                 double x2, double y2, double r2)
{
    std::vector<cv::Point2d> ipts;

    double d = hypot(x1 - x2, y1 - y2);
    if (d > r1 + r2)
    {
        // circles are separate
        return ipts;
    }
    if (d < fabs(r1 - r2))
    {
        // one circle is contained within the other
        return ipts;
    }

    double a = (square(r1) - square(r2) + square(d)) / (2.0 * d);
    double h = sqrt(square(r1) - square(a));

    double x3 = x1 + a * (x2 - x1) / d;
    double y3 = y1 + a * (y2 - y1) / d;

    if (h < 1e-10)
    {
        // two circles touch at one point
        ipts.push_back(cv::Point2d(x3, y3));
        return ipts;
    }

    ipts.push_back(cv::Point2d(x3 + h * (y2 - y1) / d, y3 - h * (x2 - x1) / d));
    ipts.push_back(cv::Point2d(x3 - h * (y2 - y1) / d, y3 + h * (x2 - x1) / d));

    return ipts;
}

char
UTMLetterDesignator(double latitude)
{
    // This routine
determines the correct UTM letter designator for the given latitude // returns 'Z' if latitude is outside the UTM limits of 84N to 80S // Written by Chuck Gantz- chuck.gantz@globalstar.com char letterDesignator; if ((84.0 >= latitude) && (latitude >= 72.0)) letterDesignator = 'X'; else if ((72.0 > latitude) && (latitude >= 64.0)) letterDesignator = 'W'; else if ((64.0 > latitude) && (latitude >= 56.0)) letterDesignator = 'V'; else if ((56.0 > latitude) && (latitude >= 48.0)) letterDesignator = 'U'; else if ((48.0 > latitude) && (latitude >= 40.0)) letterDesignator = 'T'; else if ((40.0 > latitude) && (latitude >= 32.0)) letterDesignator = 'S'; else if ((32.0 > latitude) && (latitude >= 24.0)) letterDesignator = 'R'; else if ((24.0 > latitude) && (latitude >= 16.0)) letterDesignator = 'Q'; else if ((16.0 > latitude) && (latitude >= 8.0)) letterDesignator = 'P'; else if (( 8.0 > latitude) && (latitude >= 0.0)) letterDesignator = 'N'; else if (( 0.0 > latitude) && (latitude >= -8.0)) letterDesignator = 'M'; else if ((-8.0 > latitude) && (latitude >= -16.0)) letterDesignator = 'L'; else if ((-16.0 > latitude) && (latitude >= -24.0)) letterDesignator = 'K'; else if ((-24.0 > latitude) && (latitude >= -32.0)) letterDesignator = 'J'; else if ((-32.0 > latitude) && (latitude >= -40.0)) letterDesignator = 'H'; else if ((-40.0 > latitude) && (latitude >= -48.0)) letterDesignator = 'G'; else if ((-48.0 > latitude) && (latitude >= -56.0)) letterDesignator = 'F'; else if ((-56.0 > latitude) && (latitude >= -64.0)) letterDesignator = 'E'; else if ((-64.0 > latitude) && (latitude >= -72.0)) letterDesignator = 'D'; else if ((-72.0 > latitude) && (latitude >= -80.0)) letterDesignator = 'C'; else letterDesignator = 'Z'; //This is here as an error flag to show that the Latitude is outside the UTM limits return letterDesignator; } void LLtoUTM(double latitude, double longitude, double& utmNorthing, double& utmEasting, std::string& utmZone) { // converts lat/long to UTM coords. 
Equations from USGS Bulletin 1532 // East Longitudes are positive, West longitudes are negative. // North latitudes are positive, South latitudes are negative // Lat and Long are in decimal degrees // Written by Chuck Gantz- chuck.gantz@globalstar.com double k0 = 0.9996; double LongOrigin; double eccPrimeSquared; double N, T, C, A, M; double LatRad = latitude * M_PI / 180.0; double LongRad = longitude * M_PI / 180.0; double LongOriginRad; int ZoneNumber = static_cast<int>((longitude + 180.0) / 6.0) + 1; if (latitude >= 56.0 && latitude < 64.0 && longitude >= 3.0 && longitude < 12.0) { ZoneNumber = 32; } // Special zones for Svalbard if (latitude >= 72.0 && latitude < 84.0) { if ( longitude >= 0.0 && longitude < 9.0) ZoneNumber = 31; else if (longitude >= 9.0 && longitude < 21.0) ZoneNumber = 33; else if (longitude >= 21.0 && longitude < 33.0) ZoneNumber = 35; else if (longitude >= 33.0 && longitude < 42.0) ZoneNumber = 37; } LongOrigin = static_cast<double>((ZoneNumber - 1) * 6 - 180 + 3); //+3 puts origin in middle of zone LongOriginRad = LongOrigin * M_PI / 180.0; // compute the UTM Zone from the latitude and longitude std::ostringstream oss; oss << ZoneNumber << UTMLetterDesignator(latitude); utmZone = oss.str(); eccPrimeSquared = WGS84_ECCSQ / (1.0 - WGS84_ECCSQ); N = WGS84_A / sqrt(1.0 - WGS84_ECCSQ * sin(LatRad) * sin(LatRad)); T = tan(LatRad) * tan(LatRad); C = eccPrimeSquared * cos(LatRad) * cos(LatRad); A = cos(LatRad) * (LongRad - LongOriginRad); M = WGS84_A * ((1.0 - WGS84_ECCSQ / 4.0 - 3.0 * WGS84_ECCSQ * WGS84_ECCSQ / 64.0 - 5.0 * WGS84_ECCSQ * WGS84_ECCSQ * WGS84_ECCSQ / 256.0) * LatRad - (3.0 * WGS84_ECCSQ / 8.0 + 3.0 * WGS84_ECCSQ * WGS84_ECCSQ / 32.0 + 45.0 * WGS84_ECCSQ * WGS84_ECCSQ * WGS84_ECCSQ / 1024.0) * sin(2.0 * LatRad) + (15.0 * WGS84_ECCSQ * WGS84_ECCSQ / 256.0 + 45.0 * WGS84_ECCSQ * WGS84_ECCSQ * WGS84_ECCSQ / 1024.0) * sin(4.0 * LatRad) - (35.0 * WGS84_ECCSQ * WGS84_ECCSQ * WGS84_ECCSQ / 3072.0) * sin(6.0 * LatRad)); utmEasting = k0 * N * (A + (1.0
- T + C) * A * A * A / 6.0 + (5.0 - 18.0 * T + T * T + 72.0 * C - 58.0 * eccPrimeSquared) * A * A * A * A * A / 120.0) + 500000.0; utmNorthing = k0 * (M + N * tan(LatRad) * (A * A / 2.0 + (5.0 - T + 9.0 * C + 4.0 * C * C) * A * A * A * A / 24.0 + (61.0 - 58.0 * T + T * T + 600.0 * C - 330.0 * eccPrimeSquared) * A * A * A * A * A * A / 720.0)); if (latitude < 0.0) { utmNorthing += 10000000.0; //10000000 meter offset for southern hemisphere } } void UTMtoLL(double utmNorthing, double utmEasting, const std::string& utmZone, double& latitude, double& longitude) { // converts UTM coords to lat/long. Equations from USGS Bulletin 1532 // East Longitudes are positive, West longitudes are negative. // North latitudes are positive, South latitudes are negative // Lat and Long are in decimal degrees. // Written by Chuck Gantz- chuck.gantz@globalstar.com double k0 = 0.9996; double eccPrimeSquared; double e1 = (1.0 - sqrt(1.0 - WGS84_ECCSQ)) / (1.0 + sqrt(1.0 - WGS84_ECCSQ)); double N1, T1, C1, R1, D, M; double LongOrigin; double mu, phi1, phi1Rad; double x, y; int ZoneNumber; char ZoneLetter; bool NorthernHemisphere; x = utmEasting - 500000.0; //remove 500,000 meter offset for longitude y = utmNorthing; std::istringstream iss(utmZone); iss >> ZoneNumber >> ZoneLetter; if ((static_cast<int>(ZoneLetter) - static_cast<int>('N')) >= 0) { NorthernHemisphere = true; //point is in northern hemisphere } else { NorthernHemisphere = false; //point is in southern hemisphere y -= 10000000.0; //remove 10,000,000 meter offset used for southern hemisphere } LongOrigin = (ZoneNumber - 1.0) * 6.0 - 180.0 + 3.0; //+3 puts origin in middle of zone eccPrimeSquared = WGS84_ECCSQ / (1.0 - WGS84_ECCSQ); M = y / k0; mu = M / (WGS84_A * (1.0 - WGS84_ECCSQ / 4.0 - 3.0 * WGS84_ECCSQ * WGS84_ECCSQ / 64.0 - 5.0 * WGS84_ECCSQ * WGS84_ECCSQ * WGS84_ECCSQ / 256.0)); phi1Rad = mu + (3.0 * e1 / 2.0 - 27.0 * e1 * e1 * e1 / 32.0) * sin(2.0 * mu) + (21.0 * e1 * e1 / 16.0 - 55.0 * e1 * e1 * e1 * e1 / 32.0) * sin(4.0 * mu) +
(151.0 * e1 * e1 * e1 / 96.0) * sin(6.0 * mu); phi1 = phi1Rad / M_PI * 180.0; N1 = WGS84_A / sqrt(1.0 - WGS84_ECCSQ * sin(phi1Rad) * sin(phi1Rad)); T1 = tan(phi1Rad) * tan(phi1Rad); C1 = eccPrimeSquared * cos(phi1Rad) * cos(phi1Rad); R1 = WGS84_A * (1.0 - WGS84_ECCSQ) / pow(1.0 - WGS84_ECCSQ * sin(phi1Rad) * sin(phi1Rad), 1.5); D = x / (N1 * k0); latitude = phi1Rad - (N1 * tan(phi1Rad) / R1) * (D * D / 2.0 - (5.0 + 3.0 * T1 + 10.0 * C1 - 4.0 * C1 * C1 - 9.0 * eccPrimeSquared) * D * D * D * D / 24.0 + (61.0 + 90.0 * T1 + 298.0 * C1 + 45.0 * T1 * T1 - 252.0 * eccPrimeSquared - 3.0 * C1 * C1) * D * D * D * D * D * D / 720.0); latitude *= 180.0 / M_PI; longitude = (D - (1.0 + 2.0 * T1 + C1) * D * D * D / 6.0 + (5.0 - 2.0 * C1 + 28.0 * T1 - 3.0 * C1 * C1 + 8.0 * eccPrimeSquared + 24.0 * T1 * T1) * D * D * D * D * D / 120.0) / cos(phi1Rad); longitude = LongOrigin + longitude / M_PI * 180.0; } long int timestampDiff(uint64_t t1, uint64_t t2) { if (t2 > t1) { uint64_t d = t2 - t1; if (d > std::numeric_limits<long int>::max()) { return std::numeric_limits<long int>::max(); } else { return d; } } else { uint64_t d = t1 - t2; if (d > std::numeric_limits<long int>::max()) { return std::numeric_limits<long int>::min(); } else { return - static_cast<long int>(d); } } } } ================================================ FILE: camera_model/src/intrinsic_calib.cc ================================================ #include <boost/algorithm/string.hpp> #include <boost/filesystem.hpp> #include <boost/lexical_cast.hpp> #include <boost/program_options.hpp> #include <boost/version.hpp> #include <iomanip> #include <iostream> #include <opencv2/core/core.hpp> #include <opencv2/highgui/highgui.hpp> #include <opencv2/imgproc/imgproc.hpp> #include "camodocal/chessboard/Chessboard.h" #include "camodocal/calib/CameraCalibration.h" #include "camodocal/gpl/gpl.h" int main(int argc, char** argv) { cv::Size boardSize; float squareSize; std::string inputDir; std::string cameraModel; std::string cameraName; std::string prefix; std::string fileExtension; bool useOpenCV; bool viewResults; bool verbose; //========= Handling Program options ========= boost::program_options::options_description desc("Allowed options"); desc.add_options() ("help", "produce help message") ("width,w",
boost::program_options::value<int>(&boardSize.width)->default_value(8), "Number of inner corners on the chessboard pattern in x direction") ("height,h", boost::program_options::value<int>(&boardSize.height)->default_value(12), "Number of inner corners on the chessboard pattern in y direction") ("size,s", boost::program_options::value<float>(&squareSize)->default_value(7.f), "Size of one square in mm") ("input,i", boost::program_options::value<std::string>(&inputDir)->default_value("calibrationdata"), "Input directory containing chessboard images") ("prefix,p", boost::program_options::value<std::string>(&prefix)->default_value("left-"), "Prefix of images") ("file-extension,e", boost::program_options::value<std::string>(&fileExtension)->default_value(".png"), "File extension of images") ("camera-model", boost::program_options::value<std::string>(&cameraModel)->default_value("mei"), "Camera model: kannala-brandt | mei | pinhole") ("camera-name", boost::program_options::value<std::string>(&cameraName)->default_value("camera"), "Name of camera") ("opencv", boost::program_options::bool_switch(&useOpenCV)->default_value(true), "Use OpenCV to detect corners") ("view-results", boost::program_options::bool_switch(&viewResults)->default_value(false), "View results") ("verbose,v", boost::program_options::bool_switch(&verbose)->default_value(true), "Verbose output") ; boost::program_options::positional_options_description pdesc; pdesc.add("input", 1); boost::program_options::variables_map vm; boost::program_options::store(boost::program_options::command_line_parser(argc, argv).options(desc).positional(pdesc).run(), vm); boost::program_options::notify(vm); if (vm.count("help")) { std::cout << desc << std::endl; return 1; } if (!boost::filesystem::exists(inputDir) || !boost::filesystem::is_directory(inputDir)) { std::cerr << "# ERROR: Cannot find input directory " << inputDir << "."
<< std::endl; return 1; } camodocal::Camera::ModelType modelType; if (boost::iequals(cameraModel, "kannala-brandt")) { modelType = camodocal::Camera::KANNALA_BRANDT; } else if (boost::iequals(cameraModel, "mei")) { modelType = camodocal::Camera::MEI; } else if (boost::iequals(cameraModel, "pinhole")) { modelType = camodocal::Camera::PINHOLE; } else if (boost::iequals(cameraModel, "scaramuzza")) { modelType = camodocal::Camera::SCARAMUZZA; } else { std::cerr << "# ERROR: Unknown camera model: " << cameraModel << std::endl; return 1; } switch (modelType) { case camodocal::Camera::KANNALA_BRANDT: std::cout << "# INFO: Camera model: Kannala-Brandt" << std::endl; break; case camodocal::Camera::MEI: std::cout << "# INFO: Camera model: Mei" << std::endl; break; case camodocal::Camera::PINHOLE: std::cout << "# INFO: Camera model: Pinhole" << std::endl; break; case camodocal::Camera::SCARAMUZZA: std::cout << "# INFO: Camera model: Scaramuzza-Omnidirect" << std::endl; break; } // look for images in input directory std::vector<std::string> imageFilenames; for (boost::filesystem::directory_iterator itr(inputDir); itr != boost::filesystem::directory_iterator(); ++itr) { if (!boost::filesystem::is_regular_file(itr->status())) { continue; } std::string filename = itr->path().filename().string(); // check if prefix matches if (!prefix.empty()) { if (filename.compare(0, prefix.length(), prefix) != 0) { continue; } } // check if file extension matches if (filename.compare(filename.length() - fileExtension.length(), fileExtension.length(), fileExtension) != 0) { continue; } imageFilenames.push_back(itr->path().string()); if (verbose) { std::cerr << "# INFO: Adding " << imageFilenames.back() << std::endl; } } if (imageFilenames.empty()) { std::cerr << "# ERROR: No chessboard images found."
<< std::endl; return 1; } if (verbose) { std::cerr << "# INFO: # images: " << imageFilenames.size() << std::endl; } std::sort(imageFilenames.begin(), imageFilenames.end()); cv::Mat image = cv::imread(imageFilenames.front(), -1); const cv::Size frameSize = image.size(); camodocal::CameraCalibration calibration(modelType, cameraName, frameSize, boardSize, squareSize); calibration.setVerbose(verbose); std::vector<bool> chessboardFound(imageFilenames.size(), false); for (size_t i = 0; i < imageFilenames.size(); ++i) { image = cv::imread(imageFilenames.at(i), -1); camodocal::Chessboard chessboard(boardSize, image); chessboard.findCorners(useOpenCV); if (chessboard.cornersFound()) { if (verbose) { std::cerr << "# INFO: Detected chessboard in image " << i + 1 << ", " << imageFilenames.at(i) << std::endl; } calibration.addChessboardData(chessboard.getCorners()); cv::Mat sketch; chessboard.getSketch().copyTo(sketch); cv::imshow("Image", sketch); cv::waitKey(50); } else if (verbose) { std::cerr << "# INFO: Did not detect chessboard in image " << i + 1 << std::endl; } chessboardFound.at(i) = chessboard.cornersFound(); } cv::destroyWindow("Image"); if (calibration.sampleCount() < 10) { std::cerr << "# ERROR: Insufficient number of detected chessboards." << std::endl; return 1; } if (verbose) { std::cerr << "# INFO: Calibrating..."
<< std::endl; } double startTime = camodocal::timeInSeconds(); calibration.calibrate(); calibration.writeParams(cameraName + "_camera_calib.yaml"); calibration.writeChessboardData(cameraName + "_chessboard_data.dat"); if (verbose) { std::cout << "# INFO: Calibration took a total time of " << std::fixed << std::setprecision(3) << camodocal::timeInSeconds() - startTime << " sec.\n"; } if (verbose) { std::cerr << "# INFO: Wrote calibration file to " << cameraName + "_camera_calib.yaml" << std::endl; } if (viewResults) { std::vector<cv::Mat> cbImages; std::vector<std::string> cbImageFilenames; for (size_t i = 0; i < imageFilenames.size(); ++i) { if (!chessboardFound.at(i)) { continue; } cbImages.push_back(cv::imread(imageFilenames.at(i), -1)); cbImageFilenames.push_back(imageFilenames.at(i)); } // visualize observed and reprojected points calibration.drawResults(cbImages); for (size_t i = 0; i < cbImages.size(); ++i) { cv::putText(cbImages.at(i), cbImageFilenames.at(i), cv::Point(10,20), cv::FONT_HERSHEY_COMPLEX, 0.5, cv::Scalar(255, 255, 255), 1, CV_AA); cv::imshow("Image", cbImages.at(i)); cv::waitKey(0); } } return 0; } ================================================ FILE: camera_model/src/sparse_graph/Transform.cc ================================================ #include "camodocal/sparse_graph/Transform.h" namespace camodocal { Transform::Transform() { m_q.setIdentity(); m_t.setZero(); } Transform::Transform(const Eigen::Matrix4d& H) { m_q = Eigen::Quaterniond(H.block<3,3>(0,0)); m_t = H.block<3,1>(0,3); } Eigen::Quaterniond& Transform::rotation(void) { return m_q; } const Eigen::Quaterniond& Transform::rotation(void) const { return m_q; } double* Transform::rotationData(void) { return m_q.coeffs().data(); } const double* const Transform::rotationData(void) const { return m_q.coeffs().data(); } Eigen::Vector3d& Transform::translation(void) { return m_t; } const Eigen::Vector3d& Transform::translation(void) const { return m_t; } double* Transform::translationData(void) { return m_t.data(); } const double* const
Transform::translationData(void) const { return m_t.data(); } Eigen::Matrix4d Transform::toMatrix(void) const { Eigen::Matrix4d H; H.setIdentity(); H.block<3,3>(0,0) = m_q.toRotationMatrix(); H.block<3,1>(0,3) = m_t; return H; } } ================================================ FILE: config/advio_12_config.yaml ================================================ %YAML:1.0 #common parameters imu_topic: "/imu0" image_topic: "/cam0/image_raw" mask_topic: "/cam0/mask" output_path: "~/output/" #camera calibration model_type: PINHOLE camera_name: camera image_width: 720 image_height: 1280 distortion_parameters: k1: 0.0478 k2: 0.0339 p1: -0.00033 p2: -0.00091 projection_parameters: fx: 1077.2 fy: 1079.3 cx: 362.14 cy: 636.39 estimate_extrinsic: 0 #Rotation from camera frame to imu frame, imu^R_cam extrinsicRotation: !!opencv-matrix rows: 3 cols: 3 dt: d data: [0.9999763379093255, -0.004079205042965442, -0.005539287650170447, -0.004066386342107199, -0.9999890330121858, 0.0023234365646622014, -0.00554870467502187, -0.0023008567036498766, -0.9999819588046867] #Translation from camera frame to imu frame, imu^T_cam extrinsicTranslation: !!opencv-matrix rows: 3 cols: 1 dt: d data: [-0.008977668364731128, 0.07557012320238939, -0.005545773942541918] #feature tracker parameters max_cnt: 250 # max feature number in feature tracking min_dist: 30 # min distance between two features freq: 10 # frequency (Hz) of publishing the tracking result. At least 10 Hz for good estimation. If set to 0, the frequency will be the same as the raw image H_threshold: 1.0 # ransac threshold (pixel) show_track: 1 # publish tracking image as topic equalize: 1 # if the image is too dark or too bright, turn on equalize to find enough features fisheye: 0 # if using a fisheye lens, turn it on.
A circle mask will be loaded to remove edge noisy points #optimization parameters max_solver_time: 0.04 # max solver iteration time (s), to guarantee real time max_num_iterations: 8 # max solver iterations, to guarantee real time keyframe_parallax: 10.0 # keyframe selection threshold (pixel) #discrete-time imu noise parameters (as provided in dataset) acc_n: 0.1517 # accelerometer measurement noise standard deviation. gyr_n: 0.0758 # gyroscope measurement noise standard deviation. acc_w: 6.6407e-6 # accelerometer bias random walk noise standard deviation. gyr_w: 1.6127e-6 # gyroscope bias random walk noise standard deviation. g_norm: 9.8067 # gravity magnitude #unsynchronization parameters estimate_td: 0 td: 0.0 # initial value of time offset. unit: s. read image clock + td = real image clock (IMU clock) #rolling shutter parameters rolling_shutter: 0 # 0: global shutter camera, 1: rolling shutter camera rolling_shutter_tr: 0 # unit: s. rolling shutter readout time per frame (from data sheet).
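The `freq` parameter above caps how often the tracker node publishes feature measurements to the estimator, by comparing the running average publish rate against the target. A minimal sketch of this kind of rate gate (the `shouldPublish` helper and its state handling are illustrative, not the actual rpvio_feature_tracker code):

```cpp
#include <cassert>
#include <cmath>

// Publish the current frame only if the average publish rate so far
// (pub_count frames since first_time) stays at or below freq Hz.
// freq <= 0 means "publish every frame", matching the config comment.
bool shouldPublish(int pub_count, double first_time, double cur_time, double freq)
{
    if (freq <= 0.0)
        return true;
    return std::round(pub_count / (cur_time - first_time)) <= freq;
}
```

With `freq: 10` as above, a camera stream arriving at 30 Hz would be published at roughly every third frame.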
#visualization parameters save_image: 1 # save images in the pose graph for visualization purposes; disable by setting this to 0 visualize_imu_forward: 0 # output imu forward propagation to achieve low-latency, high-frequency results visualize_camera_size: 0.4 # size of camera marker in RVIZ ================================================ FILE: config/ol_market1_config.yaml ================================================ %YAML:1.0 #common parameters imu_topic: "/d400/imu0" image_topic: "/d400/color/image_raw" mask_topic: "/planes/segments_crf" output_path: "~/market1" #camera calibration model_type: PINHOLE camera_name: camera image_width: 848 image_height: 480 distortion_parameters: k1: 0.0 k2: 0.0 p1: 0.0 p2: 0.0 projection_parameters: fx: 616.802734375 fy: 616.7510375976562 cx: 435.0341796875 cy: 242.90113830566406 estimate_extrinsic: 0 #Rotation from camera frame to imu frame, imu^R_cam extrinsicRotation: !!opencv-matrix rows: 3 cols: 3 dt: d data: [9.99946615e-01, 5.27582295e-03, 8.88440156e-03, -5.28478975e-03, 9.99985549e-01, 9.86100143e-04, -8.87907068e-03, -1.03299969e-03, 9.99960047e-01] #Translation from camera frame to imu frame, imu^T_cam extrinsicTranslation: !!opencv-matrix rows: 3 cols: 1 dt: d data: [ 0.020096886903,-0.00507855368778,-0.0115051260218] #feature tracker parameters max_cnt: 250 # max feature number in feature tracking min_dist: 30 # min distance between two features freq: 10 # frequency (Hz) of publishing the tracking result. At least 10 Hz for good estimation. If set to 0, the frequency will be the same as the raw image H_threshold: 1.0 # ransac threshold (pixel) show_track: 1 # publish tracking image as topic equalize: 1 # if the image is too dark or too bright, turn on equalize to find enough features fisheye: 0 # if using a fisheye lens, turn it on.
A circle mask will be loaded to remove edge noisy points #optimization parameters max_solver_time: 0.04 # max solver iteration time (s), to guarantee real time max_num_iterations: 8 # max solver iterations, to guarantee real time keyframe_parallax: 10.0 # keyframe selection threshold (pixel) #discrete-time imu noise parameters (as provided in dataset) acc_n: 0.0163 # accelerometer measurement noise standard deviation. gyr_n: 0.0032 # gyroscope measurement noise standard deviation. acc_w: 0.0049 # accelerometer bias random walk noise standard deviation. gyr_w: 0.0004 # gyroscope bias random walk noise standard deviation. g_norm: 9.81007 # gravity magnitude #unsynchronization parameters estimate_td: 0 td: 0.0 # initial value of time offset. unit: s. read image clock + td = real image clock (IMU clock) #rolling shutter parameters rolling_shutter: 0 # 0: global shutter camera, 1: rolling shutter camera rolling_shutter_tr: 0 # unit: s. rolling shutter readout time per frame (from data sheet).
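The four standard deviations in the IMU section above enter the estimator as variances on the diagonal of the per-sample IMU noise covariance. A sketch of that structure, assuming the usual VINS-style grouping of terms; plain arrays are used instead of the Eigen matrices the estimator actually builds, and `imuNoiseVariances` is an illustrative helper, not repository code:

```cpp
#include <array>
#include <cassert>

// Variances for one IMU sample: 3 accel white-noise terms, 3 gyro
// white-noise terms, 3 accel bias random-walk terms, 3 gyro bias
// random-walk terms. Axes are modeled as independent, so the full
// covariance is diagonal with these entries.
std::array<double, 12> imuNoiseVariances(double acc_n, double gyr_n,
                                         double acc_w, double gyr_w)
{
    std::array<double, 12> v{};
    for (int i = 0; i < 3; ++i) {
        v[i]     = acc_n * acc_n; // accelerometer measurement noise
        v[3 + i] = gyr_n * gyr_n; // gyroscope measurement noise
        v[6 + i] = acc_w * acc_w; // accelerometer bias random walk
        v[9 + i] = gyr_w * gyr_w; // gyroscope bias random walk
    }
    return v;
}
```

With the ol_market1 values above, the accelerometer measurement variance is 0.0163^2, roughly 2.7e-4.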
#visualization parameters save_image: 1 # save images in the pose graph for visualization purposes; disable by setting this to 0 visualize_imu_forward: 0 # output imu forward propagation to achieve low-latency, high-frequency results visualize_camera_size: 0.4 # size of camera marker in RVIZ ================================================ FILE: config/rpvio_rviz_config.rviz ================================================ Panels: - Class: rviz/Displays Help Height: 0 Name: Displays Property Tree Widget: Expanded: - /Global Options1 - /VIO1 - /Path1 Splitter Ratio: 0.4651159942150116 Tree Height: 140 - Class: rviz/Selection Name: Selection - Class: rviz/Tool Properties Expanded: - /2D Pose Estimate1 - /2D Nav Goal1 - /Publish Point1 Name: Tool Properties Splitter Ratio: 0.5886790156364441 - Class: rviz/Views Expanded: - /Current View1 Name: Views Splitter Ratio: 0.5 - Class: rviz/Time Experimental: false Name: Time SyncMode: 0 SyncSource: tracked image - Class: rviz/Displays Help Height: 78 Name: Displays Property Tree Widget: Expanded: ~ Splitter Ratio: 0.5 Tree Height: 363 Preferences: PromptSaveOnExit: true Toolbars: toolButtonStyle: 2 Visualization Manager: Class: "" Displays: - Alpha: 0.5 Cell Size: 1 Class: rviz/Grid Color: 130; 130; 130 Enabled: false Line Style: Line Width: 0.029999999329447746 Value: Lines Name: Grid Normal Cell Count: 0 Offset: X: 0 Y: 0 Z: 0 Plane: XY Plane Cell Count: 10 Reference Frame: Value: false - Class: rviz/Axes Enabled: false Length: 1 Name: Axes Radius: 0.10000000149011612 Reference Frame: Value: false - Alpha: 1 Buffer Length: 1 Class: rviz/Path Color: 255; 0; 0 Enabled: true Head Diameter: 0.30000001192092896 Head Length: 0.20000000298023224 Length: 0.30000001192092896 Line Style: Lines Line Width: 0.5 Name: ground_truth_path Offset: X: 0 Y: 0 Z: 0 Pose Color: 255; 85; 255 Pose Style: None Radius: 0.029999999329447746 Shaft Diameter: 0.10000000149011612 Shaft Length: 0.10000000149011612 Topic:
/benchmark_publisher/path Unreliable: false Value: true - Class: rviz/Image Enabled: true Image Topic: /rpvio_feature_tracker/feature_img Max Value: 1 Median window: 5 Min Value: 0 Name: tracked image Normalize Range: true Queue Size: 2 Transport Hint: raw Unreliable: false Value: true - Class: rviz/Image Enabled: false Image Topic: /cam0/image_raw Max Value: 1 Median window: 5 Min Value: 0 Name: raw_image Normalize Range: true Queue Size: 2 Transport Hint: raw Unreliable: false Value: false - Class: rviz/Group Displays: - Alpha: 1 Buffer Length: 1 Class: rviz/Path Color: 0; 255; 0 Enabled: true Head Diameter: 0.30000001192092896 Head Length: 0.20000000298023224 Length: 0.30000001192092896 Line Style: Lines Line Width: 0.029999999329447746 Name: Path Offset: X: 0 Y: 0 Z: 0 Pose Color: 255; 85; 255 Pose Style: None Radius: 0.029999999329447746 Shaft Diameter: 0.10000000149011612 Shaft Length: 0.10000000149011612 Topic: /vins_estimator/path Unreliable: false Value: true - Class: rviz/MarkerArray Enabled: true Marker Topic: /vins_estimator/camera_pose_visual Name: camera_visual Namespaces: CameraPoseVisualization: true Queue Size: 100 Value: true - Alpha: 1 Autocompute Intensity Bounds: true Autocompute Value Bounds: Max Value: 10 Min Value: -10 Value: true Axis: Z Channel Name: intensity Class: rviz/PointCloud Color: 0; 255; 0 Color Transformer: FlatColor Decay Time: 0 Enabled: true Invert Rainbow: false Max Color: 255; 255; 255 Max Intensity: 4096 Min Color: 0; 0; 0 Min Intensity: 0 Name: current_point Position Transformer: XYZ Queue Size: 10 Selectable: true Size (Pixels): 1 Size (m): 0.009999999776482582 Style: Points Topic: /vins_estimator/point_cloud Unreliable: false Use Fixed Frame: true Use rainbow: true Value: true - Alpha: 1 Autocompute Intensity Bounds: false Autocompute Value Bounds: Max Value: 10 Min Value: -10 Value: true Axis: Z Channel Name: intensity Class: rviz/PointCloud Color: 255; 255; 255 Color Transformer: FlatColor Decay Time: 3000 Enabled: 
true Invert Rainbow: true Max Color: 0; 0; 0 Max Intensity: 4096 Min Color: 0; 0; 0 Min Intensity: 0 Name: history_point Position Transformer: XYZ Queue Size: 10 Selectable: true Size (Pixels): 1 Size (m): 0.5 Style: Points Topic: /vins_estimator/history_cloud Unreliable: false Use Fixed Frame: true Use rainbow: false Value: true - Class: rviz/TF Enabled: true Frame Timeout: 15 Frames: All Enabled: true body: Value: true camera: Value: true world: Value: true Marker Scale: 1 Name: TF Show Arrows: true Show Axes: true Show Names: true Tree: world: body: camera: {} Update Interval: 0 Value: true Enabled: true Name: VIO - Class: rviz/Group Displays: - Alpha: 1 Buffer Length: 1 Class: rviz/Path Color: 0; 255; 0 Enabled: true Head Diameter: 0.30000001192092896 Head Length: 0.20000000298023224 Length: 0.30000001192092896 Line Style: Lines Line Width: 0.029999999329447746 Name: pose_graph_path Offset: X: 0 Y: 0 Z: 0 Pose Color: 255; 85; 255 Pose Style: None Radius: 0.029999999329447746 Shaft Diameter: 0.10000000149011612 Shaft Length: 0.10000000149011612 Topic: /pose_graph/pose_graph_path Unreliable: false Value: true - Alpha: 1 Buffer Length: 1 Class: rviz/Path Color: 255; 170; 0 Enabled: true Head Diameter: 0.30000001192092896 Head Length: 0.20000000298023224 Length: 0.30000001192092896 Line Style: Lines Line Width: 0.029999999329447746 Name: base_path Offset: X: 0 Y: 0 Z: 0 Pose Color: 255; 85; 255 Pose Style: None Radius: 0.029999999329447746 Shaft Diameter: 0.10000000149011612 Shaft Length: 0.10000000149011612 Topic: /pose_graph/base_path Unreliable: false Value: true - Class: rviz/MarkerArray Enabled: true Marker Topic: /pose_graph/pose_graph Name: loop_visual Namespaces: {} Queue Size: 100 Value: true - Class: rviz/MarkerArray Enabled: true Marker Topic: /pose_graph/camera_pose_visual Name: camera_visual Namespaces: {} Queue Size: 100 Value: true - Class: rviz/Image Enabled: false Image Topic: /pose_graph/match_image Max Value: 1 Median window: 5 Min Value: 0 Name: 
loop_match_image Normalize Range: true Queue Size: 2 Transport Hint: raw Unreliable: false Value: false - Class: rviz/Marker Enabled: true Marker Topic: /pose_graph/key_odometrys Name: Marker Namespaces: {} Queue Size: 100 Value: true - Alpha: 1 Buffer Length: 1 Class: rviz/Path Color: 25; 255; 0 Enabled: true Head Diameter: 0.30000001192092896 Head Length: 0.20000000298023224 Length: 0.30000001192092896 Line Style: Lines Line Width: 0.30000001192092896 Name: Sequence1 Offset: X: 0 Y: 0 Z: 0 Pose Color: 255; 85; 255 Pose Style: None Radius: 0.029999999329447746 Shaft Diameter: 0.10000000149011612 Shaft Length: 0.10000000149011612 Topic: /pose_graph/path_1 Unreliable: false Value: true - Alpha: 1 Buffer Length: 1 Class: rviz/Path Color: 255; 0; 0 Enabled: true Head Diameter: 0.30000001192092896 Head Length: 0.20000000298023224 Length: 0.30000001192092896 Line Style: Lines Line Width: 0.30000001192092896 Name: Sequence2 Offset: X: 0 Y: 0 Z: 0 Pose Color: 255; 85; 255 Pose Style: None Radius: 0.029999999329447746 Shaft Diameter: 0.10000000149011612 Shaft Length: 0.10000000149011612 Topic: /pose_graph/path_2 Unreliable: false Value: true - Alpha: 1 Buffer Length: 1 Class: rviz/Path Color: 0; 85; 255 Enabled: true Head Diameter: 0.30000001192092896 Head Length: 0.20000000298023224 Length: 0.30000001192092896 Line Style: Lines Line Width: 0.30000001192092896 Name: Sequence3 Offset: X: 0 Y: 0 Z: 0 Pose Color: 255; 85; 255 Pose Style: None Radius: 0.029999999329447746 Shaft Diameter: 0.10000000149011612 Shaft Length: 0.10000000149011612 Topic: /pose_graph/path_3 Unreliable: false Value: true - Alpha: 1 Buffer Length: 1 Class: rviz/Path Color: 255; 170; 0 Enabled: true Head Diameter: 0.30000001192092896 Head Length: 0.20000000298023224 Length: 0.30000001192092896 Line Style: Lines Line Width: 0.30000001192092896 Name: Sequence4 Offset: X: 0 Y: 0 Z: 0 Pose Color: 255; 85; 255 Pose Style: None Radius: 0.029999999329447746 Shaft Diameter: 0.10000000149011612 Shaft Length: 
0.10000000149011612 Topic: /pose_graph/path_4 Unreliable: false Value: true - Alpha: 1 Buffer Length: 1 Class: rviz/Path Color: 255; 170; 255 Enabled: true Head Diameter: 0.30000001192092896 Head Length: 0.20000000298023224 Length: 0.30000001192092896 Line Style: Lines Line Width: 0.029999999329447746 Name: Sequence5 Offset: X: 0 Y: 0 Z: 0 Pose Color: 255; 85; 255 Pose Style: None Radius: 0.029999999329447746 Shaft Diameter: 0.10000000149011612 Shaft Length: 0.10000000149011612 Topic: /pose_graph/path_5 Unreliable: false Value: true - Alpha: 1 Buffer Length: 1 Class: rviz/Path Color: 25; 255; 0 Enabled: true Head Diameter: 0.30000001192092896 Head Length: 0.20000000298023224 Length: 0.30000001192092896 Line Style: Lines Line Width: 0.029999999329447746 Name: no_loop_path Offset: X: 0 Y: 0 Z: 0 Pose Color: 255; 85; 255 Pose Style: None Radius: 0.029999999329447746 Shaft Diameter: 0.10000000149011612 Shaft Length: 0.10000000149011612 Topic: /pose_graph/no_loop_path Unreliable: false Value: true Enabled: true Name: pose_graph - Alpha: 1 Buffer Length: 1 Class: rviz/Path Color: 164; 0; 0 Enabled: true Head Diameter: 0.30000001192092896 Head Length: 0.20000000298023224 Length: 0.30000001192092896 Line Style: Lines Line Width: 0.029999999329447746 Name: Path Offset: X: 0 Y: 0 Z: 0 Pose Color: 255; 85; 255 Pose Style: None Radius: 0.029999999329447746 Shaft Diameter: 0.10000000149011612 Shaft Length: 0.10000000149011612 Topic: /benchmark_publisher/path Unreliable: false Value: true Enabled: true Global Options: Background Color: 255; 255; 255 Default Light: true Fixed Frame: world Frame Rate: 30 Name: root Tools: - Class: rviz/MoveCamera - Class: rviz/Select - Class: rviz/FocusCamera - Class: rviz/Measure - Class: rviz/SetInitialPose Theta std deviation: 0.2617993950843811 Topic: /initialpose X std deviation: 0.5 Y std deviation: 0.5 - Class: rviz/SetGoal Topic: /move_base_simple/goal - Class: rviz/PublishPoint Single click: true Topic: /clicked_point Value: true Views: 
Current: Class: rviz/XYOrbit Distance: 63.901485443115234 Enable Stereo Rendering: Stereo Eye Separation: 0.05999999865889549 Stereo Focal Distance: 1 Swap Stereo Eyes: false Value: false Focal Point: X: 7.038261890411377 Y: -13.610830307006836 Z: 2.141062395821791e-5 Focal Shape Fixed Size: true Focal Shape Size: 0.05000000074505806 Invert Z Axis: false Name: Current View Near Clip Distance: 0.009999999776482582 Pitch: 0.5597963333129883 Target Frame: Value: XYOrbit (rviz) Yaw: 4.052229404449463 Saved: ~ Window Geometry: Displays: collapsed: false Height: 711 Hide Left Dock: false Hide Right Dock: true QMainWindow State: 000000ff00000000fd0000000400000000000001fb0000022cfc0200000010fb0000000a0049006d00610067006501000000280000013d0000000000000000fb0000001200530065006c0065006300740069006f006e00000001e10000009b0000005c00fffffffb0000001e0054006f006f006c002000500072006f007000650072007400690065007302000001ed000001df00000185000000a3fb000000120056006900650077007300200054006f006f02000001df000002110000018500000122fb000000200054006f006f006c002000500072006f0070006500720074006900650073003203000002880000011d000002210000017afb0000002000730065006c0065006300740069006f006e00200062007500660066006500720200000138000000aa0000023a00000294fb00000014005700690064006500530074006500720065006f02000000e6000000d2000003ee0000030bfb0000000c004b0069006e0065006300740200000186000001060000030c00000261fb0000001200720061007700200049006d0061006700650000000028000000300000000000000000fb00000012007200610077005f0069006d0061006700650000000028000000f90000001600fffffffb0000001a0074007200610063006b0065006400200069006d006100670065010000003d0000015d0000001600fffffffb00000020006c006f006f0070005f006d0061007400630068005f0069006d0061006700650000000121000000850000001600fffffffb000000100044006900730070006c00610079007301000001a0000000c9000000c900fffffffc000000280000011e0000000000fffffffa000000000100000002fb0000001200720061007700200049006d0061006700650000000000ffffffff0000000000000000fb0000001a0074007200610063006b00650064
00200069006d0061006700650100000000000002370000000000000000fb0000001000410052005f0069006d0061006700650100000373000000160000000000000000fb0000001200720061007700200069006d006100670065010000038f000000160000000000000000000000010000020800000399fc0200000003fb0000001e0054006f006f006c002000500072006f00700065007200740069006500730100000041000000780000000000000000fc00000028000003990000000000fffffffaffffffff0100000002fb000000100044006900730070006c0061007900730000000000ffffffff0000015600fffffffb0000000a00560069006500770073000000023f0000016a0000010000fffffffb0000001200530065006c0065006300740069006f006e010000025a000000b200000000000000000000000200000490000000a9fc0100000001fb0000000a00560069006500770073030000004e00000080000002e10000019700000003000005130000003bfc0100000002fb0000000800540069006d0065010000000000000513000002eb00fffffffb0000000800540069006d00650100000000000004500000000000000000000003120000022c00000004000000040000000800000008fc0000000100000002000000010000000a0054006f006f006c00730100000000ffffffff0000000000000000 Selection: collapsed: false Time: collapsed: false Tool Properties: collapsed: false Views: collapsed: true Width: 1299 X: 67 Y: 27 loop_match_image: collapsed: false raw_image: collapsed: false tracked image: collapsed: false ================================================ FILE: config/rpvio_sim_config.yaml ================================================ %YAML:1.0 #common parameters imu_topic: "/imu_throttled" image_topic: "/image" mask_topic: "/mask" output_path: "~/output/" #camera calibration model_type: PINHOLE camera_name: camera image_width: 640 image_height: 480 distortion_parameters: k1: 0 k2: 0 p1: 0 p2: 0 projection_parameters: fx: 320 fy: 320 cx: 320 cy: 240 estimate_extrinsic: 0 #Rotation from camera frame to imu frame, imu^R_cam extrinsicRotation: !!opencv-matrix rows: 3 cols: 3 dt: d data: [0,0,1,1,0,0,0,1,0] #Translation from camera frame to imu frame, imu^T_cam extrinsicTranslation: !!opencv-matrix rows: 3 cols: 1 dt: d data: [0.50, 0, 0] 
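The extrinsic calibration above can be sanity-checked numerically. A minimal sketch (plain NumPy, not part of this repo) that verifies the row-major `extrinsicRotation` data is a proper rotation and maps a camera-frame point into the IMU frame via p_imu = imu^R_cam * p_cam + imu^T_cam:

```python
import numpy as np

# Extrinsics from rpvio_sim_config.yaml (imu^R_cam stored row-major).
R_ic = np.array([0, 0, 1,
                 1, 0, 0,
                 0, 1, 0], dtype=float).reshape(3, 3)
t_ic = np.array([0.50, 0.0, 0.0])

# Sanity check: orthonormal with determinant +1, i.e. a proper rotation.
assert np.allclose(R_ic @ R_ic.T, np.eye(3))
assert np.isclose(np.linalg.det(R_ic), 1.0)

# A point 1 m along the camera's optical (z) axis, expressed in the IMU frame.
p_cam = np.array([0.0, 0.0, 1.0])
p_imu = R_ic @ p_cam + t_ic
assert np.allclose(p_imu, [1.5, 0.0, 0.0])  # camera z maps to IMU x
```

The asserts confirm that this particular matrix sends the camera z-axis to the IMU x-axis, the usual forward-facing camera convention.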
#feature tracker parameters max_cnt: 250 # max feature number in feature tracking min_dist: 30 # min distance between two features freq: 10 # frequency (Hz) at which the tracking result is published. At least 10 Hz for good estimation. If set to 0, the frequency will be the same as the raw image H_threshold: 1.0 # ransac threshold (pixel) show_track: 1 # publish tracking image as topic equalize: 1 # if image is too dark or light, turn on equalize to find enough features fisheye: 0 # if using a fisheye camera, turn it on. A circle mask will be loaded to remove noisy edge points #optimization parameters max_solver_time: 0.04 # max solver iteration time (s), to guarantee real time max_num_iterations: 8 # max solver iterations, to guarantee real time keyframe_parallax: 10.0 # keyframe selection threshold (pixel) #discrete-time imu noise parameters acc_n: 0.074427240 # accelerometer measurement noise standard deviation. gyr_n: 0.002759607 # gyroscope measurement noise standard deviation. acc_w: 3.9471004e-7 # accelerometer bias random walk noise standard deviation. gyr_w: 3.1538983e-8 # gyroscope bias random walk noise standard deviation. g_norm: 9.80665 # gravity magnitude #unsynchronization parameters estimate_td: 0 td: -0.03079132514112524 # initial value of time offset. unit: s. image clock as read + td = real image clock (IMU clock) #rolling shutter parameters rolling_shutter: 0 # 0: global shutter camera, 1: rolling shutter camera rolling_shutter_tr: 0 # unit: s. rolling shutter readout time per frame (from data sheet).
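The acc_n/gyr_n/acc_w/gyr_w values above are discrete-time standard deviations. If a datasheet instead gives continuous-time densities (unit/sqrt(Hz)), one common convention for converting them at a given IMU rate is sketched below. The function name and the example numbers are illustrative, not from this repo:

```python
import math

def discretize_imu_noise(noise_density, bias_walk_density, rate_hz):
    """One common convention for turning continuous-time IMU noise
    densities into discrete-time standard deviations:
      sigma_d  = noise_density * sqrt(rate)        # measurement noise
      sigma_bd = bias_walk_density / sqrt(rate)    # bias random walk
    """
    return (noise_density * math.sqrt(rate_hz),
            bias_walk_density / math.sqrt(rate_hz))

# Illustrative datasheet-style numbers only:
acc_n_d, acc_w_d = discretize_imu_noise(0.01, 0.001, 100.0)
```

Conventions differ between tools, so check which form (continuous or discrete) your estimator expects before copying numbers across configs.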
#visualization parameters save_image: 1 # save image in pose graph for visualization purposes; you can disable it by setting 0 visualize_imu_forward: 0 # output imu forward propagation to achieve low-latency and high-frequency results visualize_camera_size: 0.4 # size of camera marker in RVIZ ================================================ FILE: plane_segmentation/RecoverPlane_perpendicular.py ================================================ from __future__ import division import os import time import math from data_loader_new import DataLoader from net import * from utils import * import scipy.misc import random import numpy as np import tensorflow as tf seed = 8964 tf.set_random_seed(seed) np.random.seed(seed) class RecoverPlane(object): def __init__(self): pass # do nothing; note that in Python all self.X attributes are instance members and can be used without prior declaration def build_train_graph(self): opt = self.opt self.num_plane = opt.num_plane loader = DataLoader(opt.dataset_dir, opt.batch_size, opt.img_height, opt.img_width, self.num_scales) # a class defined in data_loader with tf.name_scope("data_loading"): tgt_image, tgt_depth_stack, tgt_label_stack, intrinsics = loader.load_train_batch() #, tgt2src tgt_image = self.preprocess_image(tgt_image) #4*192*320*3 with tf.name_scope("planeMask_and_planeParam_prediction"): pred_param, pred_mask, plane_net_endpoints = \ plane_pred_net(tgt_image, opt.num_plane, is_training=True) new_shape = (3,3,8) with tf.name_scope("mask_color"): # Generate random colors random.seed(seed) colors = random_colors(opt.num_plane) for i in range(opt.num_plane): colors[i] = [(m*255) for m in colors[i]] with tf.name_scope("compute_loss"): plane_loss = 0 depth_loss = 0 perpendicular_loss = 0 tgt_image_all = [] plane_mask_stack_all = [] color_plane_mask_stack_all = [] tgt_depth_all = [] depth_error_all = [] non_plane_mask_stack_all = [] for s in range(self.num_scales): # Scale the target images, depth, and label for computing the loss at the corresponding scale.
curr_tgt_image = tf.image.resize_area(tgt_image, [int(opt.img_height/(2**s)), int(opt.img_width/(2**s))]) curr_depth_stack = tf.image.resize_area(tgt_depth_stack, [int(opt.img_height / (2 ** s)),int(opt.img_width / (2 ** s))]) curr_label_stack = tf.image.resize_area(tgt_label_stack, [int(opt.img_height / (2 ** s)), int(opt.img_width / (2 ** s))]) # calculate the plane_loss using cross entropy # basically, assume all pixels belong to a potential plane (mask = 1) ref_plane_mask = tf.concat([tf.ones(shape=curr_label_stack.get_shape())- curr_label_stack , curr_label_stack], axis=-1) plane_loss += opt.plane_weight * \ self.compute_plane_reg_loss(pred_mask[s], ref_plane_mask) # compare with 'ref_plane_mask' pred_mask_s = tf.nn.softmax(pred_mask[s]) # get the unscaled ray, K^(-1)q in Eq. 3 unscaled_ray = compute_unscaled_ray(curr_tgt_image, intrinsics[:, s, :, :]) for p in range(opt.num_plane): # the left-hand side of Eq. 3 left_plane_eq = compute_plane_equation(curr_tgt_image, pred_param[:,p:p+1,:], unscaled_ray, curr_depth_stack) depth_error = self.compute_depth_error(left_plane_eq, 1.)
# extract each plane_mask curr_plane = tf.slice(pred_mask_s, [0, 0, 0, p], [-1, -1, -1, 1]) # depth_loss computes the variation of depth_error in the predicted plane region depth_loss += tf.reduce_mean(depth_error * curr_plane) if p == 0: depth_error_stack = val2uint8(depth_error,0.3) plane_mask_stack = curr_plane else: depth_error_stack = tf.concat([depth_error_stack, val2uint8(depth_error, 0.3) ], axis=-1) plane_mask_stack = tf.concat([plane_mask_stack, curr_plane], axis=-1) # normalized depth for visualization norm_tgt_depth = ((curr_depth_stack - tf.reduce_min(curr_depth_stack)) / (tf.reduce_max(curr_depth_stack) - tf.reduce_min(curr_depth_stack))) * 255 color_plane_mask = color_mask(self.deprocess_image(curr_tgt_image), pred_mask_s, colors, alpha=0.3) # stack all the different-scale results together tgt_image_all.append(curr_tgt_image) tgt_depth_all.append(norm_tgt_depth) depth_error_all.append(depth_error_stack) plane_mask_stack_all.append(plane_mask_stack) color_plane_mask_stack_all.append(color_plane_mask) non_plane_mask_stack_all.append(pred_mask_s[:,:,:,-1:]) perpendicular_loss = self.compute_perpendicular_error(pred_param , 1) total_loss = depth_loss + plane_loss + (perpendicular_loss*0.1) with tf.name_scope("train_op"): train_vars = [var for var in tf.trainable_variables()] optim = tf.train.AdamOptimizer(opt.learning_rate, opt.beta1, opt.beta2) self.train_op = slim.learning.create_train_op(total_loss, optim) self.global_step = tf.Variable(0, name='global_step', trainable=False) self.incr_global_step = tf.assign(self.global_step, self.global_step+1) # Collect tensors that are useful later (e.g.
tf summary) self.pred_param = pred_param self.steps_per_epoch = loader.steps_per_epoch # how many steps are needed to finish one epoch self.total_loss = total_loss self.plane_loss = plane_loss self.depth_loss = depth_loss self.perpendicular_loss = perpendicular_loss self.tgt_image_all = tgt_image_all self.tgt_depth_all = tgt_depth_all self.depth_error_all = depth_error_all self.plane_mask_stack_all = plane_mask_stack_all self.color_plane_mask = color_plane_mask_stack_all self.non_plane_mask_stack_all = non_plane_mask_stack_all def compute_plane_reg_loss(self, pred_in, ref): # eq 5 in paper # subtract the max so that exp() will not explode to inf pred = pred_in - tf.reduce_max(pred_in, axis=-1,keep_dims=True) pred_plane_only = pred[:, :, :, :-1] # numerically stable implementation of # plane_mask = tf.reduce_logsumexp(pred_plane_only, axis=-1) - tf.reduce_logsumexp(pred, axis=-1) # ensure log() will not explode to -inf pred_plane_only_max = tf.reduce_max(pred_plane_only, axis=-1,keep_dims=True) plane_mask = tf.reduce_logsumexp(pred_plane_only - pred_plane_only_max, axis=-1,keep_dims=True) + \ pred_plane_only_max - tf.reduce_logsumexp(pred, axis=-1,keep_dims=True) # combine the non-plane and plane log(P_pred) together non_plane_mask = pred[:, :, :, -1:] - tf.reduce_logsumexp(pred, axis=-1,keep_dims=True) curr_pred_mask = tf.concat([non_plane_mask, plane_mask], axis=3) # calculate the cross entropy and return return -tf.reduce_mean(tf.reduce_sum(ref * curr_pred_mask, axis=-1) ) def compute_depth_error(self,proj_homo,proj_depth): diff = proj_homo - proj_depth l1_diff = tf.reduce_sum( tf.abs(diff), axis=-1, keep_dims=True ) depth_error = l1_diff return depth_error def compute_perpendicular_error(self, normal_vectors, num): # Computes the vectorized inter-plane (perpendicularity) loss new_normal_vector = tf.transpose(normal_vectors , perm=[0, 2, 1]) result = tf.matmul( new_normal_vector ,normal_vectors ) # pairwise normal dot products; shape of tensors [8, 3, 3]
angle_result = tf.acos(result) zero = tf.zeros([8, 3, 3]) ones = tf.ones([8, 3, 3]) label = tf.where(angle_result > 0.785, zero, ones) # 0.785 rad ~= pi/4 mid_res = tf.squared_difference(result, label) res_error = tf.reduce_sum(mid_res, [0, 1, 2]) perp_loss = res_error/8 return perp_loss def collect_summaries(self): # tf.summary can export model params opt = self.opt tf.summary.scalar("total_loss", self.total_loss) tf.summary.scalar("depth_loss", self.depth_loss) tf.summary.scalar("perpendicular_loss", self.perpendicular_loss) tf.summary.scalar("plane_loss", self.plane_loss) for s in range(self.num_scales): tf.summary.image('scale%d_target_image' % s, \ self.deprocess_image(self.tgt_image_all[s]), max_outputs=opt.batch_size) tf.summary.image('scale%d_norm_depth_image' % s, \ self.tgt_depth_all[s], max_outputs=opt.batch_size) tf.summary.image('scale%d_color_masks' % s, \ self.color_plane_mask[s], max_outputs=opt.batch_size) tf.summary.image('scale%d_non_plane_mask' % s, \ self.non_plane_mask_stack_all[s], max_outputs=opt.batch_size) for p in range(opt.num_plane): tf.summary.image('scale%d_plane_mask_num_%d' % (s, p), self.plane_mask_stack_all[s][:, :, :, p: p + 1], max_outputs=opt.batch_size) tf.summary.image('scale%d_depth_error_%d' % (s, p), self.depth_error_all[s][:, :, :, p: p + 1], max_outputs=opt.batch_size) if s == 0: tf.summary.text("plane_num_%d_n" % (p), tf.as_string(self.pred_param[:, p, :])) def train(self, opt): # TODO: currently fixed to 4 self.num_scales = 4 self.opt = opt self.build_train_graph() self.collect_summaries() # export the results to tensorboard self.i = 0 with tf.name_scope("parameter_count"): # tf.reduce_prod: computes the product of elements across the dimensions of a tensor # parameter_count is the number of params parameter_count = tf.reduce_sum([tf.reduce_prod(tf.shape(v)) \ for v in tf.trainable_variables()]) self.saver = tf.train.Saver([var for var in tf.model_variables()] + \
[self.global_step], max_to_keep=10) # save the variables of the model and keep at most the 10 most recent checkpoint files sv = tf.train.Supervisor(logdir=opt.checkpoint_dir, save_summaries_secs=0, saver=None) os.environ['CUDA_VISIBLE_DEVICES'] = opt.gpu config = tf.ConfigProto() config.gpu_options.allow_growth = True with sv.managed_session(config=config) as sess: print('Trainable variables: ') for var in tf.trainable_variables(): print(var.name) print("parameter_count =", sess.run(parameter_count)) if opt.continue_train: if opt.init_checkpoint_file is None: checkpoint = tf.train.latest_checkpoint(opt.checkpoint_dir) else: checkpoint = opt.init_checkpoint_file print("Resume training from previous checkpoint: %s" % checkpoint) self.saver.restore(sess, checkpoint) start_time = time.time() for step in range(1, opt.max_steps): fetches = { "train": self.train_op, "global_step": self.global_step, "incr_global_step": self.incr_global_step, } print(str(self.i) +"/800000 ** " +" code name = train_perpendicular.py" + " **perpendicular_loss weight =0.1" + " ** logdir =new_check2" ) self.i +=1 if step % opt.summary_freq == 0: fetches["total_loss"] = self.total_loss fetches["summary"] = sv.summary_op if step >= opt.max_steps - 100: fetches["masked_res"] = self.color_plane_mask results = sess.run(fetches) gs = results["global_step"] if step >= opt.max_steps - 100: last_res = results["masked_res"] for i in range(opt.batch_size): scipy.misc.imsave(opt.checkpoint_dir + "/res_" + str(step) +"_" + str(i) + ".jpg", last_res[0][i,:,:,:]) if step % opt.summary_freq == 0: sv.summary_writer.add_summary(results["summary"], gs) train_epoch = math.ceil(gs / self.steps_per_epoch) # one epoch means all the data in the training set has been visited once # steps_per_epoch is the length of the data batch # gs is the number of times the fetches have been run train_step = gs - (train_epoch - 1) * self.steps_per_epoch # print the progress every summary_freq iterations print("Epoch: [%2d] [%5d/%5d] time: %4.4f/it total_loss: %.3f" \ %
(train_epoch, train_step, self.steps_per_epoch, \ (time.time() - start_time)/opt.summary_freq,\ results["total_loss"]\ )) start_time = time.time() if step % opt.save_latest_freq == 0: self.save(sess, opt.checkpoint_dir, 'latest') if step % self.steps_per_epoch == 0: self.save(sess, opt.checkpoint_dir, gs) # build network for testing def build_plane_test_graph(self): input_uint8 = tf.placeholder(tf.uint8, [self.batch_size, self.img_height, self.img_width, 3], name='raw_input') input_mc = self.preprocess_image(input_uint8) with tf.name_scope("plane_prediction"): pred_param, pred_masks, plane_net_endpoints = plane_pred_net( input_mc, num_plane=self.num_plane, is_training=False) pred_mask_0 = tf.nn.softmax(pred_masks[0]) pred_mask = pred_mask_0 self.inputs = input_uint8 self.pred_mask = pred_mask self.pred_param = pred_param self.plane_epts = plane_net_endpoints def preprocess_image(self, image): # Assuming input image is uint8 image = tf.image.convert_image_dtype(image, dtype=tf.float32) return image * 2. -1. # center to [-1, 1] def deprocess_image(self, image): # Assuming input image is float32 image = (image + 1.)/2. return tf.image.convert_image_dtype(image, dtype=tf.uint8) def setup_inference(self, img_height, img_width, num_plane, batch_size=1): self.img_height = img_height self.img_width = img_width self.num_plane = num_plane self.batch_size = batch_size self.build_plane_test_graph() def inference(self, inputs, sess): #, mode='depth' fetches = {'pred_param':self.pred_param, 'pred_mask':self.pred_mask} results = sess.run(fetches, feed_dict={self.inputs:inputs}) return results def save(self, sess, checkpoint_dir, step): model_name = 'model' print(" [*] Saving checkpoint to %s..."
% checkpoint_dir) if step == 'latest': self.saver.save(sess, os.path.join(checkpoint_dir, model_name + '.latest')) else: self.saver.save(sess, os.path.join(checkpoint_dir, model_name), global_step=step) ================================================ FILE: plane_segmentation/crf_inference.py ================================================ # Author: Sudarshan import sys import numpy as np import pydensecrf.densecrf as dcrf import matplotlib.pyplot as plt import matplotlib.image as mpimg import cv2 import os import argparse try: from cv2 import imread, imwrite except ImportError: from skimage.io import imread, imsave imwrite = imsave from pydensecrf.utils import unary_from_labels, create_pairwise_bilateral, create_pairwise_gaussian def CRF_act( fn_im ,fn_anno , fn_output , fn_output2 ): anno_rgb = imread(fn_anno).astype(np.uint32) # uint32 so the channel bit-shifts below do not overflow anno_lbl = anno_rgb[:,:,0] + (anno_rgb[:,:,1] << 8) + (anno_rgb[:,:,2] << 16) colors_not_used, labels = np.unique(anno_lbl, return_inverse=True) img = imread(fn_im) img = cv2.resize(img , (320,192), interpolation =cv2.INTER_AREA) n_labels = 4 # unpack the R + (G << 8) + (B << 16) packed labels back into RGB channels colorize = np.empty((len(colors_not_used), 3), np.uint8) colorize[:,0] = (colors_not_used & 0x0000FF) colorize[:,1] = (colors_not_used & 0x00FF00) >> 8 colorize[:,2] = (colors_not_used & 0xFF0000) >> 16 d = dcrf.DenseCRF2D(img.shape[1], img.shape[0], n_labels) U = unary_from_labels(labels, n_labels, gt_prob=0.8, zero_unsure=False) d.setUnaryEnergy(U) feats = create_pairwise_gaussian(sdims=(5, 5), shape=img.shape[:2]) d.addPairwiseEnergy(feats, compat=2, kernel=dcrf.DIAG_KERNEL, normalization=dcrf.NORMALIZE_SYMMETRIC) feats = create_pairwise_bilateral(sdims=(100, 100), schan=(13, 13, 13), img=img, chdim=2) d.addPairwiseEnergy(feats, compat=7, kernel=dcrf.DIAG_KERNEL, normalization=dcrf.NORMALIZE_SYMMETRIC) Q, tmp1, tmp2 = d.startInference() for i in range(6): print("KL-divergence at {}: {}".format(i, d.klDivergence(Q))) d.stepInference(Q, tmp1, tmp2) MAP = np.argmax(Q, axis=0) MAP = colorize[MAP,:] MAP
= MAP.reshape(img.shape) MAP = cv2.morphologyEx(MAP, cv2.MORPH_CLOSE, kernel) print(np.shape(MAP)) imwrite(fn_output, MAP) for rows in range(np.shape(MAP)[0]): for cols in range(np.shape(MAP)[1]): if (MAP[rows][cols][0] > 0): MAP[rows][cols] =255 if (MAP[rows][cols][0] == 0): MAP[rows][cols] =0 imwrite(fn_output2, MAP) #imwrite(fn_output, MAP.reshape(img.shape)) kernel = np.ones((10, 10), np.uint8) # structuring element for the morphological closing; cv2.morphologyEx expects an ndarray, not a tuple parser = argparse.ArgumentParser() parser.add_argument('rgb_image_dir', help='path to original rgb images') parser.add_argument('labels_dir', help='path to plane_sgmts_modified output dir from planerecover') parser.add_argument('output_dir', help='path to output refined masks') args = parser.parse_args() RGB_image_dir = args.rgb_image_dir labels_dir = args.labels_dir output_dir = args.output_dir output_dir2= "/home/sudarshan/planerecover/CRF/Results_advio_single" images = sorted(os.listdir(RGB_image_dir)) labels = sorted(os.listdir(labels_dir)) k = 0 for i, j in zip(images , labels): k +=1 if ( k >8948): image_name = i #"img" + str(i) +".jpg" image_path = os.path.join(RGB_image_dir, image_name) labels_name = j #"OpenLORIS_images_img" + str(i) +".png" labels_path = os.path.join(labels_dir, labels_name) output_name = "img" + str(i) + "_crf_res" +".png" output_path = os.path.join(output_dir, output_name) output_path2 = os.path.join(output_dir2, output_name) CRF_act(image_path , labels_path, output_path, output_path2) print( "***********" + str(image_name) +"/" + "****** " + str(labels_name) + "*****" ) ================================================ FILE: plane_segmentation/data_loader_new.py ================================================ from __future__ import division import os import random import tensorflow as tf class DataLoader(object): def __init__(self, dataset_dir=None, batch_size=None, img_height=None, img_width=None, num_scales=None): self.dataset_dir = dataset_dir self.batch_size = batch_size self.img_height = img_height self.img_width = img_width self.num_scales = num_scales
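DataLoader rescales the pinhole intrinsics once per pyramid level (see get_multi_scale_intrinsics further below): at scale s the image is downsampled by 2**s, so fx, fy, cx, cy shrink by the same factor while the bottom row stays [0, 0, 1]. A minimal NumPy sketch of that rule, standalone and outside the TF graph:

```python
import numpy as np

def scale_intrinsics(K, s):
    """Return the 3x3 pinhole intrinsics for pyramid scale s,
    where the image has been downsampled by a factor of 2**s."""
    Ks = np.array(K, dtype=float)
    Ks[0, 0] /= 2 ** s  # fx
    Ks[1, 1] /= 2 ** s  # fy
    Ks[0, 2] /= 2 ** s  # cx
    Ks[1, 2] /= 2 ** s  # cy
    return Ks

# e.g. the simulated pinhole camera from config/rpvio_sim_config.yaml
K0 = np.array([[320., 0., 320.],
               [0., 320., 240.],
               [0., 0., 1.]])
K1 = scale_intrinsics(K0, 1)  # intrinsics for the half-resolution image
```

Because only the first two rows are scaled, projecting a point with K1 into the half-resolution image lands at exactly half the pixel coordinates it would have under K0.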
def load_train_batch(self): """Load a batch of training instances. """ seed = random.randint(0, 2**31 - 1) # Load the list of training files into queues and shuffle it file_list = self.format_file_list(self.dataset_dir, 'train_8000_recent_working') print("**************" + "LOADED NEW DATA train_8000_recent_working" + "***********") image_paths_queue = tf.train.string_input_producer( file_list['image_file_list'], seed=seed, shuffle=True) cam_paths_queue = tf.train.string_input_producer( file_list['cam_file_list'], seed=seed, shuffle=True) depth_paths_queue = tf.train.string_input_producer( file_list['depth_file_list'], seed=seed, shuffle=True) label_paths_queue = tf.train.string_input_producer( file_list['label_file_list'], seed=seed, shuffle=True) self.steps_per_epoch = int( len(file_list['image_file_list'])//self.batch_size) # Load images img_reader = tf.WholeFileReader() _, image_contents = img_reader.read(image_paths_queue) image_seq = tf.image.decode_image(image_contents, channels=3) tgt_image = tf.reshape(image_seq, [self.img_height, self.img_width, 3]) # tgt_image is consumed by tf.train.batch below #tgt_image= tf.cast(tgt_image, tf.float32) # Load labels label_reader = tf.WholeFileReader() _, label_contents = label_reader.read(label_paths_queue) label_seq = tf.image.decode_png(label_contents) tgt_label = tf.reshape(label_seq, [self.img_height, self.img_width, 1]) # Load depths depth_reader = tf.WholeFileReader() _, depth_contents = depth_reader.read(depth_paths_queue) tgt_image_detph = tf.image.decode_png(depth_contents,dtype=tf.uint16 ,channels=1)[:,:, 0] tgt_detph = tf.cast(tgt_image_detph, dtype=tf.float32) tgt_detph = tf.reshape(tgt_detph, [ self.img_height, self.img_width, 1]) \ / tf.constant(100., dtype=tf.float32,shape=[self.img_height, self.img_width, 1]) # scale the raw depth values by 1/100 # Load camera intrinsics cam_reader =
tf.TextLineReader() _, raw_cam_contents = cam_reader.read(cam_paths_queue) rec_def = [] for i in range(9): rec_def.append([1.]) raw_cam_vec = tf.decode_csv(raw_cam_contents,record_defaults=rec_def) raw_cam_vec = tf.stack(raw_cam_vec) intrinsics = tf.reshape(raw_cam_vec, [3, 3]) #intrinsics = [ [1169.621094 , 0.000000, 646.295044] , [ 0.000000, 1167.105103 ,489.927032 ] , [ 0.000000 ,0.000000, 1.000000] ] # Form training batches tgt_image, tgt_detph, tgt_label, intrinsics = \ tf.train.batch([tgt_image, tgt_detph,tgt_label, intrinsics], batch_size=self.batch_size) #it will upload 4 batch from the dataset # Data augmentation tgt_image, tgt_detph, tgt_label, intrinsics = self.data_augmentation( tgt_image, tgt_detph, tgt_label,intrinsics, self.img_height, self.img_width) intrinsics = self.get_multi_scale_intrinsics( intrinsics, self.num_scales) return tgt_image, tgt_detph, tgt_label, intrinsics def make_intrinsics_matrix(self, fx, fy, cx, cy): # Assumes batch input batch_size = fx.get_shape().as_list()[0] zeros = tf.zeros_like(fx) r1 = tf.stack([fx, zeros, cx], axis=1) r2 = tf.stack([zeros, fy, cy], axis=1) r3 = tf.constant([0.,0.,1.], shape=[1, 3]) r3 = tf.tile(r3, [batch_size, 1]) intrinsics = tf.stack([r1, r2, r3], axis=1) return intrinsics def data_augmentation(self, im, depth, label, intrinsics, out_h, out_w): # Random scaling def random_scaling(im, depth, label, intrinsics): batch_size, in_h, in_w, _ = im.get_shape().as_list() scaling = tf.random_uniform([2], 1, 1.15) x_scaling = scaling[0] y_scaling = scaling[1] out_h = tf.cast(in_h * y_scaling, dtype=tf.int32) out_w = tf.cast(in_w * x_scaling, dtype=tf.int32) im = tf.image.resize_area(im, [out_h, out_w]) depth = tf.image.resize_area(depth, [out_h, out_w]) label = tf.image.resize_area(label, [out_h, out_w]) fx = intrinsics[:,0,0] * x_scaling fy = intrinsics[:,1,1] * y_scaling cx = intrinsics[:,0,2] * x_scaling cy = intrinsics[:,1,2] * y_scaling intrinsics = self.make_intrinsics_matrix(fx, fy, cx, cy) return im, 
depth, label, intrinsics # Random cropping def random_cropping(im, depth, label, intrinsics, out_h, out_w): # batch_size, in_h, in_w, _ = im.get_shape().as_list() batch_size, in_h, in_w, _ = tf.unstack(tf.shape(im)) offset_y = tf.random_uniform([1], 0, in_h - out_h + 1, dtype=tf.int32)[0] #the scale of in_h and out_h can be different offset_x = tf.random_uniform([1], 0, in_w - out_w + 1, dtype=tf.int32)[0] # because of the scaling process runs before it im = tf.image.crop_to_bounding_box(im, offset_y, offset_x, out_h, out_w) depth = tf.image.crop_to_bounding_box(depth, offset_y, offset_x, out_h, out_w) label = tf.image.crop_to_bounding_box(label, offset_y, offset_x, out_h, out_w) fx = intrinsics[:,0,0] fy = intrinsics[:,1,1] cx = intrinsics[:,0,2] - tf.cast(offset_x, dtype=tf.float32) cy = intrinsics[:,1,2] - tf.cast(offset_y, dtype=tf.float32) intrinsics = self.make_intrinsics_matrix(fx, fy, cx, cy) return im, depth, label, intrinsics im, depth, label, intrinsics = random_scaling(im, depth,label, intrinsics) im, depth, label, intrinsics = random_cropping(im, depth, label, intrinsics, out_h, out_w) im = tf.cast(im, dtype=tf.uint8) return im,depth,label, intrinsics def format_file_list(self, data_root, split): with open(data_root + '/%s.txt' % split, 'r') as f: frames = f.readlines() subfolders = [x.split(' ')[0] for x in frames] frame_ids = [x.split(' ')[1][:-1] for x in frames] image_file_list = [os.path.join(data_root, subfolders[i], frame_ids[i] + '.jpg') for i in range(len(frames))] #print(image_file_list) cam_file_list = [os.path.join(data_root, subfolders[i], frame_ids[i] + '_cam.txt') for i in range(len(frames))] depth_file_list = [os.path.join(data_root, subfolders[i], frame_ids[i] + '_depth.png') for i in range(len(frames))] label_file_list = [os.path.join(data_root, subfolders[i], frame_ids[i] + '_label.png') for i in range(len(frames))] all_list = {} all_list['image_file_list'] = image_file_list all_list['cam_file_list'] = cam_file_list 
all_list['depth_file_list'] = depth_file_list all_list['label_file_list'] = label_file_list return all_list def get_multi_scale_intrinsics(self, intrinsics, num_scales): intrinsics_mscale = [] # Scale the intrinsics accordingly for each scale for s in range(num_scales): fx = intrinsics[:,0,0]/(2 ** s) fy = intrinsics[:,1,1]/(2 ** s) cx = intrinsics[:,0,2]/(2 ** s) cy = intrinsics[:,1,2]/(2 ** s) intrinsics_mscale.append( self.make_intrinsics_matrix(fx, fy, cx, cy)) intrinsics_mscale = tf.stack(intrinsics_mscale, axis=1) return intrinsics_mscale ================================================ FILE: plane_segmentation/inference.py ================================================ from __future__ import division import tensorflow as tf import numpy as np import os import random import colorsys import time import scipy.misc import PIL.Image as pil import cv2 # from utils_depth_only import * from RecoverPlane_perpendicular import RecoverPlane ''' Test the trained result on SYNTHIA (for params) @author -- Fengting Yang @modified by Sudarshan @usage: test the trained result (params) with the depth prediction metric. @Output: 1. plane masks and their visualization, 2. pred_depth maps 3. the statistical metrics of the depth prediction (shown on the terminal) @parameters: 1. the main parameters could be seen in the FLAGS 2.
intrinsics: The camera intrinsics; note that if the image is resized, please rescale the intrinsics correspondingly ''' flags = tf.app.flags flags.DEFINE_integer("batch_size", 4, "The size of a sample batch") flags.DEFINE_integer("img_height", 192, "Image height") flags.DEFINE_integer("img_width", 320, "Image width") flags.DEFINE_integer("num_plane",3, "plane number") flags.DEFINE_boolean("use_preprocessed", True, 'whether to use the preprocessed data we provide for testing' ) flags.DEFINE_string("dataset_dir", '', "Filtered Dataset directory") flags.DEFINE_string("output_dir", '', "Output directory") flags.DEFINE_string("gpu", "0", "GPU ID") flags.DEFINE_string("test_list", 'data_pre_processing/SYNTHIA/tst_100.txt', "Test list") flags.DEFINE_string("ckpt_file", 'pre_trained_model/synthia_498000', "checkpoint file") FLAGS = flags.FLAGS os.environ['CUDA_VISIBLE_DEVICES'] = FLAGS.gpu print(FLAGS.gpu) #intrinsics = np.array(([[133.185088,0.,160.000000], [ 0., 134.587036,96.000000], [0., 0., 1.]])) focalLength_x =133.185075 focalLength_y = 134.587036 centerX = 160.000000 centerY = 96.000000 TEST_LIST = str(FLAGS.test_list) num_test = 100 MAX_DEPTH = 100. MIN_DEPTH = 0.1 seed = 999 # ****************************colorful mask part************************************* # Usage: apply a different color to each plane # plane determination is currently based on plane_threshold = 0.5 # and areas without additional color belong to the non-plane class # def random_colors(N, bright=True): """ Generate random colors. To get visually distinct colors, generate them in HSV space then convert to RGB. """ brightness = 1.0 if bright else 0.7 hsv = [(i / N, 1, brightness) for i in range(N)] colors = list(map(lambda c: colorsys.hsv_to_rgb(*c), hsv)) random.seed(seed) random.shuffle(colors) return colors def apply_mask(image, mask, max_mask, color, alpha=0.5): """Apply the given mask to the image.
""" for c in range(3): image[:, :, c] = np.where(mask == max_mask, image[:, :, c] * (1 - alpha) + alpha * color[c] * 255, image[:, :, c]) return image def color_mask(image, pred_masks, colors, alpha=0.5 ): '''do iteration to assign the color to the corrosponding mask Based on experiment, the first plane will be red, and the second one will be green-blue, when num=2 and drak blue ,red ,green for num=3 ''' N = FLAGS.num_plane masked_image = np.copy(image) #new_colors = [ [ 255,125,255 ] , [255,255,255] , [125,255,0] ] # masked_image = image # change the original color as well, so the model could have the plane color when we visualize it max_mask = np.max(pred_masks, axis=-1) for i in range(N): color = colors[i] mask = pred_masks[:,:,i] masked_image = apply_mask(masked_image, mask, max_mask, color, alpha) return masked_image #**************************************get the predicted plane mask ******************* def thres_mask(pred_masks, num_plane): ''' find each plane's mask using the argmax(prob) approach ''' thres_mask = np.zeros(pred_masks.shape) #num_plane + 1 non-plane max_mask = np.max(pred_masks, axis=-1) # here for depth prediction, only plane possiblity is used for p in range(num_plane + 1): plane_mask = pred_masks[:, :, p] # in each channel the region with value 1 corresponding to the plane area in this plane thres_mask[:,:,p] = np.where(plane_mask == max_mask, thres_mask[:,:,p]+1., thres_mask[:,:,p]) return thres_mask #************************************get depth from plane parameters**************************** def meshgrid(height, width, is_homogeneous=True): """Construct a 2D meshgrid. 
Args: height: height of the grid width: width of the grid is_homogeneous: whether to return in homogeneous coordinates Returns: x,y grid coordinates [2 (3 if homogeneous), height, width] """ x_t = np.matmul(np.ones(shape=[height,1]),np.expand_dims(np.linspace(-1.,1,width),1).T) y_t = np.matmul(np.expand_dims(np.linspace(-1.0, 1.0, height), 1),np.ones(shape=np.stack([1, width]))) x_t = (x_t + 1.0) * 0.5 * (width - 1) y_t = (y_t + 1.0) * 0.5 * (height - 1) if is_homogeneous: ones = np.ones_like(x_t) coords =np.stack([x_t, y_t, ones], axis=0) else: coords = np.stack([x_t, y_t], axis=0) return coords def compute_depth(img, pred_param, num_plane, intrinsics): height = img.shape[0] width = img.shape[1] # Construct pixel grid coordinates pixel_coords = meshgrid(height, width) # 3*128*416 cam_coords = np.reshape(pixel_coords, [3, -1]) unscaled_ray_Q = np.matmul(np.linalg.inv(intrinsics), cam_coords) for k in range(num_plane): n_div_d = np.expand_dims(pred_param[ k, :], axis=0) scale = 1./ (np.matmul(n_div_d, np.matmul(np.linalg.inv(intrinsics), cam_coords)) + 1e-10) plane_based_Q = scale * (unscaled_ray_Q ) plane_based_Q = np.reshape(plane_based_Q, [3, height, width]) plane_based_Q = np.transpose(plane_based_Q, [1, 2, 0]) if k == 0: plane_depth_stack = plane_based_Q[:,:,-1:] else: plane_depth_stack = np.concatenate([plane_depth_stack, plane_based_Q[ :,:, -1:]], axis=-1) return plane_depth_stack #*********************************compute depth error************************************** def compute_errors(gt, pred): b_empty = (gt.size == 0) # if there is no groundtruth available b_non_zero = np.all(gt * pred) # if any element is 0 this will return False if b_empty or not b_non_zero: return [-100, -100, -100, -100, -100, -100, -100] # this test image will be ignored thresh = np.maximum((gt / pred), (pred / gt)) a1 = (thresh < 1.25 ).mean() a2 = (thresh < 1.25 ** 2).mean() a3 = (thresh < 1.25 ** 3).mean() rmse = (gt - pred) ** 2 rmse =
np.sqrt(rmse.mean()) rmse_log = (np.log(gt) - np.log(pred)) ** 2 rmse_log = np.sqrt(rmse_log.mean()) abs_rel = np.mean(np.abs(gt - pred) / gt) sq_rel = np.mean(((gt - pred)**2) / gt) return abs_rel, sq_rel, rmse, rmse_log, a1, a2, a3 #***************************************************************************************************** def main(_): with open(TEST_LIST, 'r') as f: test_files_list = [] #depth_file_list = [] test_files = f.readlines() for t in test_files: t_split = t[:-1].split() if not FLAGS.use_preprocessed: # use these two lines only if you preprocessed the dataset from scratch test_files_list.append(FLAGS.dataset_dir + '/' + t_split[-1] ) #depth_file_list.append(FLAGS.dataset_dir + '/' + t_split[0] +'/'+ t_split[-1] + '_depth.png') else: # use these two lines if you use our preprocessed dataset if t_split[0] == '22': # seq 22 is not available in our preprocessed dataset, see README for more details continue test_files_list.append(FLAGS.dataset_dir + '/' + t_split[0] +'/'+ t_split[-1] ) #depth_file_list.append(FLAGS.dataset_dir + '/' + t_split[0] +'/'+ t_split[-1] + '_depth.png') if not os.path.exists(FLAGS.output_dir): os.makedirs(FLAGS.output_dir) basename = os.path.basename(FLAGS.ckpt_file) # to ensure the consistant color map default_top_five_colors = [(0.8, 0.0, 1.0), (0.8, 1.0, 0.0), (0.0, 1.0, 0.4), (1.0, 0.0, 0.0), (0.0, 0.4, 1.0)] if FLAGS.num_plane <= 5: colors = default_top_five_colors else: # Generate random colors colors = random_colors( FLAGS.num_plane) cnt = 0 for i in default_top_five_colors: if i not in colors: colors[cnt] = i cnt += 1 planeRecover = RecoverPlane() planeRecover.setup_inference(img_height=FLAGS.img_height, img_width=FLAGS.img_width, batch_size=FLAGS.batch_size, num_plane=FLAGS.num_plane ) rms = np.zeros(num_test, np.float32) log_rms = np.zeros(num_test, np.float32) abs_rel = np.zeros(num_test, np.float32) sq_rel = np.zeros(num_test, np.float32) a1 = np.zeros(num_test, np.float32) a2 = np.zeros(num_test, np.float32) 
    a3 = np.zeros(num_test, np.float32)
    avg_time = 0.

    saver = tf.train.Saver([var for var in tf.model_variables()])
    config = tf.ConfigProto()
    config.gpu_options.allow_growth = True
    with tf.Session(config=config) as sess:
        saver.restore(sess, FLAGS.ckpt_file)
        pred_all_masks = []
        pred_all_param = []
        for t in range(0, len(test_files_list), FLAGS.batch_size):
            if t % 100 == 0:
                print('processing %s: %d/%d' % (basename, t, len(test_files_list)))
            inputs = np.zeros((FLAGS.batch_size, FLAGS.img_height, FLAGS.img_width, 3),
                              dtype=np.uint8)
            for b in range(FLAGS.batch_size):
                idx = t + b
                if idx >= len(test_files_list):
                    break
                #fh = open(test_files_list[idx], 'r')  # previously read via pil.open(fh)
                raw_im = cv2.imread(test_files_list[idx])
                try:
                    scaled_im = cv2.resize(raw_im, (FLAGS.img_width, FLAGS.img_height),
                                           interpolation=cv2.INTER_AREA)
                    inputs[b] = np.array(scaled_im)
                except:
                    print(str(test_files_list[idx]) + " is not read")

            start_time = time.time()
            pred = planeRecover.inference(inputs, sess)
            cost_time = time.time() - start_time
            avg_time += cost_time
            print("No.%d batch cost_time: %f/img" % (int(t / FLAGS.batch_size),
                                                     cost_time / FLAGS.batch_size))

            for b in range(FLAGS.batch_size):
                idx = t + b
                if idx >= len(test_files_list):
                    break
                color_plane_mask = color_mask(inputs[b], pred['pred_mask'][b, :, :, :],
                                              colors, alpha=0.6)
                # includes the non-plane mask in the last channel, used for depth
                thres_masks = thres_mask(pred['pred_mask'][b, :, :, :], FLAGS.num_plane)
                #pred_depth = compute_depth(inputs[b], pred['pred_param'][b, :, :], FLAGS.num_plane, intrinsics)
                #masked_pred_depth = np.zeros([FLAGS.img_height, FLAGS.img_width, 1])
                combined_mask = np.zeros([FLAGS.img_height, FLAGS.img_width, 1])
                for p in range(FLAGS.num_plane):
                    #masked_pred_depth += pred_depth[:, :, p: p + 1] * thres_masks[:, :, p: p + 1]
                    # for the matlab evaluation all plane labels should start from 1,
                    # with 0 in the last channel marking non-plane
                    combined_mask += (p + 1) * thres_masks[:, :, p: p + 1]

                name = test_files_list[idx].split('/')
                if not FLAGS.use_preprocessed:
                    pic_name = name[-3] + '_' + name[-1]
                else:
                    pic_name = name[-2] + '_' + name[-1].replace('.jpg', '.png')

                visual_path = FLAGS.output_dir + '/plane_sgmts_vis/'
                eval_mask_path = FLAGS.output_dir + '/plane_sgmts/'
                modified_eval_mask_path = FLAGS.output_dir + '/plane_sgmts_modfied/'
                if not os.path.exists(visual_path):
                    os.makedirs(visual_path)
                if not os.path.exists(eval_mask_path):
                    os.makedirs(eval_mask_path)
                    os.makedirs(modified_eval_mask_path)

                scipy.misc.imsave(visual_path + pic_name, color_plane_mask)
                # scipy.misc would normalize the values to 255, so use cv2.imwrite instead
                cv2.imwrite(eval_mask_path + pic_name, combined_mask)
                modified_combined_mask = combined_mask * 60
                cv2.imwrite(modified_eval_mask_path + pic_name, modified_combined_mask)
                '''
                #***************************** depth eval ************************
                #print(str(depth_file_list[idx]))
                gt_depth = cv2.imread(depth_file_list[idx],
                                      cv2.IMREAD_ANYCOLOR | cv2.IMREAD_ANYDEPTH)[:, :, 0] / 100.
                gt_height, gt_width = gt_depth.shape
                masked_pred_depth = cv2.resize(masked_pred_depth, dsize=(gt_width, gt_height),
                                               interpolation=cv2.INTER_NEAREST)
                plane_area = cv2.resize(combined_mask, dsize=(gt_width, gt_height),
                                        interpolation=cv2.INTER_NEAREST)
                pred_depth_save = masked_pred_depth

                mask = np.logical_and(gt_depth > MIN_DEPTH, gt_depth < MAX_DEPTH)
                mask = np.logical_and(mask, plane_area > 0)  # only evaluate the plane area
                pred_depth_save = pred_depth_save * mask * 100

                seq = depth_file_list[idx].split('/')
                if not FLAGS.use_preprocessed:
                    seq_id = seq[-3]
                    name_id = seq[-1]
                else:
                    seq_id = seq[-2]
                    name_id = seq[-1].replace('_depth.png', '.png')
                if not os.path.exists(FLAGS.output_dir + '/depth/'):
                    os.makedirs(FLAGS.output_dir + '/depth/')
                save_pred_path = FLAGS.output_dir + '/depth/' + seq_id + '_' + name_id[:-4] + '_pred.png'
                # save pred_depth and error
                cv2.imwrite(save_pred_path, pred_depth_save.astype(np.uint16))

                masked_pred_depth[masked_pred_depth < MIN_DEPTH] = MIN_DEPTH
                masked_pred_depth[masked_pred_depth > MAX_DEPTH] = MAX_DEPTH
                gt_depth_validate = gt_depth[mask]
                pred_depth_validate = masked_pred_depth[mask]
                abs_rel[idx], sq_rel[idx], rms[idx], log_rms[idx], a1[idx], a2[idx], a3[idx] = \
                    compute_errors(gt_depth_validate, pred_depth_validate)

                # test if the thres is correct
                # for p in range(FLAGS.num_plane):
                #     thres_name = name[-3] + '_' + name[-1][:-4] + '_' + str(p) + '.png'
                #     scipy.misc.imsave(visual_path + thres_name, thres_masks[:, :, p])

                pred_all_masks.append(pred['pred_mask'][b, :, :, :])
                pred_all_param.append(pred['pred_param'][b, :, :])

        abs_rel = abs_rel[abs_rel != -100]
        sq_rel = sq_rel[sq_rel != -100]
        rms = rms[rms != -100]
        log_rms = log_rms[log_rms != -100]
        a1 = a1[a1 != -100]
        a2 = a2[a2 != -100]
        a3 = a3[a3 != -100]
        print("{:>10}, {:>10}, {:>10}, {:>10}, {:>10}, {:>10}, {:>10}".format(
            'abs_rel', 'sq_rel', 'rms', 'log_rms', 'a1', 'a2', 'a3'))
        print("{:10.4f}, {:10.4f}, {:10.4f}, {:10.4f}, {:10.4f}, {:10.4f}, {:10.4f}".format(
            abs_rel.mean(), sq_rel.mean(), rms.mean(), log_rms.mean(),
            a1.mean(), a2.mean(), a3.mean()))
        print('avg_time %.4f/img' % (avg_time / num_test))
        '''


if __name__ == '__main__':
    tf.app.run()


================================================
FILE: plane_segmentation/net.py
================================================
from __future__ import division
import tensorflow as tf
import tensorflow.contrib.slim as slim
from tensorflow.contrib.layers.python.layers import utils
import numpy as np

SCALING = 1.0
BIAS = 0.
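The depth recovery in `compute_depth` above intersects each pixel's viewing ray with a plane described by the single 3-vector n/d the network predicts: for a pixel p, the ray is K⁻¹p, the scale is 1/((n/d)·K⁻¹p), and the z component of the scaled ray is the depth. A minimal standalone check of that formula (the intrinsics and plane below are made-up values, not taken from the repository):

```python
# Illustrative numerical check of plane-based depth recovery (not a repo file).
import numpy as np

# hypothetical pinhole intrinsics
K = np.array([[320., 0., 160.],
              [0., 320., 96.],
              [0., 0., 1.]])

# a plane with unit normal n at distance d from the camera center: n.X = d
n = np.array([0., 0., 1.])
d = 2.0
n_div_d = n / d  # the network predicts n/d as one 3-vector per plane

# back-project one pixel into a viewing ray and intersect it with the plane
pixel = np.array([200., 100., 1.])   # homogeneous pixel coordinates
ray = np.linalg.inv(K) @ pixel       # unscaled ray through the pixel
scale = 1. / (n_div_d @ ray)         # same scaling as in compute_depth
X = scale * ray                      # 3D point on the plane

print(n @ X)   # ~2.0: the recovered point satisfies n.X = d
print(X[-1])   # the plane-induced depth (z component), 2.0 for this fronto-parallel plane
```

The same scale applied to every pixel's ray yields the dense per-plane depth map that `compute_depth` stacks over the k planes.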
def resize_like(inputs, ref):
    iH, iW = inputs.get_shape()[1], inputs.get_shape()[2]
    rH, rW = ref.get_shape()[1], ref.get_shape()[2]
    if iH == rH and iW == rW:
        return inputs
    return tf.image.resize_nearest_neighbor(inputs, [rH.value, rW.value])


def plane_pred_net(tgt_image, num_plane, is_training=True):
    H = tgt_image.get_shape()[1].value
    W = tgt_image.get_shape()[2].value
    # encoder-decoder architecture for the plane prediction network
    with tf.variable_scope('depth_net') as sc:
        end_points_collection = sc.original_name_scope + '_end_points'
        # conv2d / conv2d_transpose with shared defaults: l2 regularization
        # with weight 0.05, ReLU activations, no normalization
        with slim.arg_scope([slim.conv2d, slim.conv2d_transpose],
                            normalizer_fn=None,
                            weights_regularizer=slim.l2_regularizer(0.05),
                            activation_fn=tf.nn.relu,
                            outputs_collections=end_points_collection):
            # for slim.conv2d the default padding mode is 'SAME'
            cnv1 = slim.conv2d(tgt_image, 32, [7, 7], stride=2, scope='cnv1')  # 4*96*160*32
            cnv1b = slim.conv2d(cnv1, 32, [7, 7], stride=1, scope='cnv1b')
            cnv2 = slim.conv2d(cnv1b, 64, [5, 5], stride=2, scope='cnv2')  # 4*48*80*64
            cnv2b = slim.conv2d(cnv2, 64, [5, 5], stride=1, scope='cnv2b')
            cnv3 = slim.conv2d(cnv2b, 128, [3, 3], stride=2, scope='cnv3')  # 4*24*40*128
            cnv3b = slim.conv2d(cnv3, 128, [3, 3], stride=1, scope='cnv3b')
            cnv4 = slim.conv2d(cnv3b, 256, [3, 3], stride=2, scope='cnv4')  # 4*12*20*256
            cnv4b = slim.conv2d(cnv4, 256, [3, 3], stride=1, scope='cnv4b')
            cnv5 = slim.conv2d(cnv4b, 512, [3, 3], stride=2, scope='cnv5')
            cnv5b = slim.conv2d(cnv5, 512, [3, 3], stride=1, scope='cnv5b')  # 4*6*10*512

            with tf.variable_scope('param'):
                cnv6_plane = slim.conv2d(cnv5b, 512, [3, 3], stride=2, scope='cnv6_plane')  # 4*3*5*512
                cnv7_plane = slim.conv2d(cnv6_plane, 512, [3, 3], stride=2, scope='cnv7_plane')  # 4*2*3*512
                param_pred = slim.conv2d(cnv7_plane, 3 * num_plane, [1, 1], stride=1,  # 4*2*3*3n
                                         normalizer_fn=None, activation_fn=None, scope='param')
                param_avg = tf.reduce_mean(param_pred, [1, 2])  # 4*3n
                # Empirically, scaling by a small constant facilitates training.
                param_final = 0.01 * tf.reshape(param_avg, [-1, num_plane, 3])  # B*n*3 plane parameters

            with tf.variable_scope('mask'):
                upcnv5 = slim.conv2d_transpose(cnv5b, 256, [3, 3], stride=2, scope='upcnv5')
                i5_in = tf.concat([upcnv5, cnv4b], axis=3)
                icnv5 = slim.conv2d(i5_in, 256, [3, 3], stride=1, scope='icnv5')  # 4*12*20*256

                upcnv4 = slim.conv2d_transpose(icnv5, 128, [3, 3], stride=2, scope='upcnv4')  # 4*24*40*128
                i4_in = tf.concat([upcnv4, cnv3b], axis=3)
                icnv4 = slim.conv2d(i4_in, 128, [3, 3], stride=1, scope='icnv4')
                segm4 = SCALING * slim.conv2d(icnv4, num_plane + 1, [3, 3], stride=1,  # 4*24*40*(n+1)
                                              activation_fn=None, normalizer_fn=None,
                                              scope='disp4') + BIAS
                segm4_up = tf.image.resize_bilinear(segm4, [np.int(H / 4), np.int(W / 4)])

                upcnv3 = slim.conv2d_transpose(icnv4, 64, [3, 3], stride=2, scope='upcnv3')  # 4*48*80*64
                i3_in = tf.concat([upcnv3, cnv2b, segm4_up], axis=3)
                icnv3 = slim.conv2d(i3_in, 64, [3, 3], stride=1, scope='icnv3')
                segm3 = SCALING * slim.conv2d(icnv3, num_plane + 1, [3, 3], stride=1,  # 4*48*80*(n+1)
                                              activation_fn=None, normalizer_fn=None,
                                              scope='disp3') + BIAS
                segm3_up = tf.image.resize_bilinear(segm3, [np.int(H / 2), np.int(W / 2)])

                upcnv2 = slim.conv2d_transpose(icnv3, 32, [3, 3], stride=2, scope='upcnv2')  # 4*96*160*32
                i2_in = tf.concat([upcnv2, cnv1b, segm3_up], axis=3)
                icnv2 = slim.conv2d(i2_in, 32, [3, 3], stride=1, scope='icnv2')
                segm2 = SCALING * slim.conv2d(icnv2, num_plane + 1, [3, 3], stride=1,  # 4*96*160*(n+1)
                                              activation_fn=None, normalizer_fn=None,
                                              scope='disp2') + BIAS
                segm2_up = tf.image.resize_bilinear(segm2, [H, W])

                upcnv1 = slim.conv2d_transpose(icnv2, 16, [3, 3], stride=2, scope='upcnv1')  # 4*192*320*16
                i1_in = tf.concat([upcnv1, segm2_up], axis=3)
                icnv1 = slim.conv2d(i1_in, 16, [3, 3], stride=1, scope='icnv1')
                segm1 = SCALING * slim.conv2d(icnv1, num_plane + 1, [3, 3], stride=1,  # 4*192*320*(n+1)
                                              activation_fn=None, normalizer_fn=None,
                                              scope='disp1') + BIAS

            end_points = utils.convert_collection_to_dict(end_points_collection)
            return param_final, [segm1, segm2, segm3, segm4], end_points


================================================
FILE: plane_segmentation/openloris.txt
================================================
OpenLORIS_images img5314.jpg OpenLORIS_images img96.jpg OpenLORIS_images img268.jpg OpenLORIS_images img4489.jpg OpenLORIS_images img4539.jpg OpenLORIS_images img870.jpg OpenLORIS_images img197.jpg OpenLORIS_images img5397.jpg OpenLORIS_images img666.jpg OpenLORIS_images img1147.jpg OpenLORIS_images img113.jpg OpenLORIS_images img2986.jpg OpenLORIS_images img1138.jpg OpenLORIS_images img2658.jpg OpenLORIS_images img2920.jpg OpenLORIS_images img360.jpg OpenLORIS_images img1074.jpg OpenLORIS_images img5622.jpg OpenLORIS_images img219.jpg OpenLORIS_images img4272.jpg OpenLORIS_images img1308.jpg OpenLORIS_images img5827.jpg OpenLORIS_images img3532.jpg OpenLORIS_images img140.jpg OpenLORIS_images img3453.jpg OpenLORIS_images img2149.jpg OpenLORIS_images img891.jpg OpenLORIS_images img4264.jpg OpenLORIS_images img6088.jpg OpenLORIS_images img1665.jpg OpenLORIS_images img3305.jpg OpenLORIS_images img2530.jpg OpenLORIS_images img878.jpg OpenLORIS_images img1559.jpg OpenLORIS_images img4105.jpg OpenLORIS_images img4566.jpg OpenLORIS_images img942.jpg OpenLORIS_images img1382.jpg OpenLORIS_images img480.jpg OpenLORIS_images img167.jpg OpenLORIS_images img3691.jpg OpenLORIS_images img4288.jpg OpenLORIS_images img5014.jpg OpenLORIS_images img1798.jpg OpenLORIS_images img1436.jpg OpenLORIS_images img5788.jpg OpenLORIS_images img1045.jpg OpenLORIS_images img3274.jpg OpenLORIS_images img4418.jpg OpenLORIS_images img4196.jpg OpenLORIS_images img3359.jpg OpenLORIS_images img3111.jpg OpenLORIS_images img657.jpg OpenLORIS_images img4777.jpg OpenLORIS_images img807.jpg OpenLORIS_images img2580.jpg OpenLORIS_images img2800.jpg OpenLORIS_images img4918.jpg OpenLORIS_images img642.jpg OpenLORIS_images img2153.jpg OpenLORIS_images
img6023.jpg OpenLORIS_images img1506.jpg OpenLORIS_images img439.jpg OpenLORIS_images img5945.jpg OpenLORIS_images img2272.jpg OpenLORIS_images img3606.jpg OpenLORIS_images img1723.jpg OpenLORIS_images img3041.jpg OpenLORIS_images img2644.jpg OpenLORIS_images img2870.jpg OpenLORIS_images img1160.jpg OpenLORIS_images img1268.jpg OpenLORIS_images img1175.jpg OpenLORIS_images img998.jpg OpenLORIS_images img5307.jpg OpenLORIS_images img554.jpg OpenLORIS_images img5395.jpg OpenLORIS_images img4286.jpg OpenLORIS_images img4500.jpg OpenLORIS_images img6026.jpg OpenLORIS_images img6097.jpg OpenLORIS_images img969.jpg OpenLORIS_images img810.jpg OpenLORIS_images img1377.jpg OpenLORIS_images img329.jpg OpenLORIS_images img1272.jpg OpenLORIS_images img3459.jpg OpenLORIS_images img5401.jpg OpenLORIS_images img4271.jpg OpenLORIS_images img2087.jpg OpenLORIS_images img4952.jpg OpenLORIS_images img4445.jpg OpenLORIS_images img2290.jpg OpenLORIS_images img3419.jpg OpenLORIS_images img3525.jpg OpenLORIS_images img5643.jpg OpenLORIS_images img463.jpg OpenLORIS_images img5576.jpg OpenLORIS_images img3045.jpg OpenLORIS_images img6138.jpg OpenLORIS_images img4001.jpg OpenLORIS_images img6078.jpg OpenLORIS_images img3962.jpg OpenLORIS_images img3235.jpg OpenLORIS_images img4744.jpg OpenLORIS_images img66.jpg OpenLORIS_images img3429.jpg OpenLORIS_images img1444.jpg OpenLORIS_images img2902.jpg OpenLORIS_images img3772.jpg OpenLORIS_images img367.jpg OpenLORIS_images img5527.jpg OpenLORIS_images img1445.jpg OpenLORIS_images img4532.jpg OpenLORIS_images img1530.jpg OpenLORIS_images img3421.jpg OpenLORIS_images img4354.jpg OpenLORIS_images img4753.jpg OpenLORIS_images img3572.jpg OpenLORIS_images img57.jpg OpenLORIS_images img412.jpg OpenLORIS_images img4799.jpg OpenLORIS_images img3783.jpg OpenLORIS_images img2500.jpg OpenLORIS_images img3537.jpg OpenLORIS_images img1705.jpg OpenLORIS_images img5201.jpg OpenLORIS_images img5507.jpg OpenLORIS_images img5960.jpg OpenLORIS_images img4234.jpg 
OpenLORIS_images img5849.jpg OpenLORIS_images img2685.jpg OpenLORIS_images img3275.jpg OpenLORIS_images img4832.jpg OpenLORIS_images img5173.jpg OpenLORIS_images img1985.jpg OpenLORIS_images img3257.jpg OpenLORIS_images img266.jpg OpenLORIS_images img5211.jpg OpenLORIS_images img601.jpg OpenLORIS_images img2547.jpg OpenLORIS_images img5853.jpg OpenLORIS_images img1402.jpg OpenLORIS_images img1821.jpg OpenLORIS_images img1936.jpg OpenLORIS_images img2472.jpg OpenLORIS_images img4168.jpg OpenLORIS_images img3473.jpg OpenLORIS_images img4290.jpg OpenLORIS_images img1536.jpg OpenLORIS_images img1021.jpg OpenLORIS_images img2907.jpg OpenLORIS_images img3838.jpg OpenLORIS_images img1527.jpg OpenLORIS_images img5499.jpg OpenLORIS_images img5105.jpg OpenLORIS_images img5226.jpg OpenLORIS_images img4507.jpg OpenLORIS_images img372.jpg OpenLORIS_images img3671.jpg OpenLORIS_images img911.jpg OpenLORIS_images img4130.jpg OpenLORIS_images img1385.jpg OpenLORIS_images img5342.jpg OpenLORIS_images img761.jpg OpenLORIS_images img819.jpg OpenLORIS_images img586.jpg OpenLORIS_images img3547.jpg OpenLORIS_images img882.jpg OpenLORIS_images img1637.jpg OpenLORIS_images img3583.jpg OpenLORIS_images img3776.jpg OpenLORIS_images img481.jpg OpenLORIS_images img1575.jpg OpenLORIS_images img5620.jpg OpenLORIS_images img2332.jpg OpenLORIS_images img704.jpg OpenLORIS_images img5991.jpg OpenLORIS_images img5510.jpg OpenLORIS_images img2985.jpg OpenLORIS_images img4894.jpg OpenLORIS_images img2311.jpg OpenLORIS_images img38.jpg OpenLORIS_images img3103.jpg OpenLORIS_images img4201.jpg OpenLORIS_images img4411.jpg OpenLORIS_images img3593.jpg OpenLORIS_images img5807.jpg OpenLORIS_images img2731.jpg OpenLORIS_images img3754.jpg OpenLORIS_images img4645.jpg OpenLORIS_images img3941.jpg OpenLORIS_images img3734.jpg OpenLORIS_images img1466.jpg OpenLORIS_images img2343.jpg OpenLORIS_images img5974.jpg OpenLORIS_images img2539.jpg OpenLORIS_images img3032.jpg OpenLORIS_images img4928.jpg 
OpenLORIS_images img4031.jpg OpenLORIS_images img445.jpg OpenLORIS_images img1831.jpg OpenLORIS_images img5585.jpg OpenLORIS_images img5791.jpg OpenLORIS_images img3983.jpg OpenLORIS_images img3637.jpg OpenLORIS_images img1778.jpg OpenLORIS_images img2952.jpg OpenLORIS_images img582.jpg OpenLORIS_images img1199.jpg OpenLORIS_images img4658.jpg OpenLORIS_images img3607.jpg OpenLORIS_images img2399.jpg OpenLORIS_images img2329.jpg OpenLORIS_images img2026.jpg OpenLORIS_images img5570.jpg OpenLORIS_images img5126.jpg OpenLORIS_images img1468.jpg OpenLORIS_images img4819.jpg OpenLORIS_images img3628.jpg OpenLORIS_images img4217.jpg OpenLORIS_images img654.jpg OpenLORIS_images img860.jpg OpenLORIS_images img410.jpg OpenLORIS_images img2691.jpg OpenLORIS_images img5119.jpg OpenLORIS_images img680.jpg OpenLORIS_images img3914.jpg OpenLORIS_images img5645.jpg OpenLORIS_images img1406.jpg OpenLORIS_images img5323.jpg OpenLORIS_images img418.jpg OpenLORIS_images img4270.jpg OpenLORIS_images img5487.jpg OpenLORIS_images img3969.jpg OpenLORIS_images img6048.jpg OpenLORIS_images img3324.jpg OpenLORIS_images img3599.jpg OpenLORIS_images img4811.jpg OpenLORIS_images img2327.jpg OpenLORIS_images img2361.jpg OpenLORIS_images img505.jpg OpenLORIS_images img2548.jpg OpenLORIS_images img2098.jpg OpenLORIS_images img2181.jpg OpenLORIS_images img4737.jpg OpenLORIS_images img1396.jpg OpenLORIS_images img2876.jpg OpenLORIS_images img5363.jpg OpenLORIS_images img2080.jpg OpenLORIS_images img228.jpg OpenLORIS_images img1718.jpg OpenLORIS_images img4027.jpg OpenLORIS_images img3787.jpg OpenLORIS_images img2879.jpg OpenLORIS_images img1650.jpg OpenLORIS_images img5752.jpg OpenLORIS_images img5721.jpg OpenLORIS_images img1603.jpg OpenLORIS_images img632.jpg OpenLORIS_images img2968.jpg OpenLORIS_images img2219.jpg OpenLORIS_images img4451.jpg OpenLORIS_images img4429.jpg OpenLORIS_images img907.jpg OpenLORIS_images img4885.jpg OpenLORIS_images img1866.jpg OpenLORIS_images img5477.jpg 
OpenLORIS_images img1515.jpg OpenLORIS_images img3812.jpg OpenLORIS_images img5930.jpg OpenLORIS_images img798.jpg OpenLORIS_images img4200.jpg OpenLORIS_images img2247.jpg OpenLORIS_images img2606.jpg OpenLORIS_images img598.jpg OpenLORIS_images img3574.jpg OpenLORIS_images img2498.jpg OpenLORIS_images img1302.jpg OpenLORIS_images img4736.jpg OpenLORIS_images img4881.jpg OpenLORIS_images img1817.jpg OpenLORIS_images img4559.jpg OpenLORIS_images img1130.jpg OpenLORIS_images img861.jpg OpenLORIS_images img1471.jpg OpenLORIS_images img3594.jpg OpenLORIS_images img4542.jpg OpenLORIS_images img4564.jpg OpenLORIS_images img3095.jpg OpenLORIS_images img2549.jpg OpenLORIS_images img568.jpg OpenLORIS_images img2167.jpg OpenLORIS_images img2948.jpg OpenLORIS_images img3677.jpg OpenLORIS_images img5521.jpg OpenLORIS_images img2944.jpg OpenLORIS_images img550.jpg OpenLORIS_images img1340.jpg OpenLORIS_images img5346.jpg OpenLORIS_images img3524.jpg OpenLORIS_images img1314.jpg OpenLORIS_images img3864.jpg OpenLORIS_images img1967.jpg OpenLORIS_images img4497.jpg OpenLORIS_images img212.jpg OpenLORIS_images img3480.jpg OpenLORIS_images img2161.jpg OpenLORIS_images img4584.jpg OpenLORIS_images img5770.jpg OpenLORIS_images img4471.jpg OpenLORIS_images img2286.jpg OpenLORIS_images img6067.jpg OpenLORIS_images img4640.jpg OpenLORIS_images img1706.jpg OpenLORIS_images img2335.jpg OpenLORIS_images img4654.jpg OpenLORIS_images img3877.jpg OpenLORIS_images img2425.jpg OpenLORIS_images img4941.jpg OpenLORIS_images img4828.jpg OpenLORIS_images img6089.jpg OpenLORIS_images img4009.jpg OpenLORIS_images img2524.jpg OpenLORIS_images img2645.jpg OpenLORIS_images img5890.jpg OpenLORIS_images img618.jpg OpenLORIS_images img5149.jpg OpenLORIS_images img4842.jpg OpenLORIS_images img472.jpg OpenLORIS_images img2018.jpg OpenLORIS_images img4848.jpg OpenLORIS_images img5804.jpg OpenLORIS_images img1484.jpg OpenLORIS_images img5528.jpg OpenLORIS_images img4227.jpg OpenLORIS_images img5864.jpg 
OpenLORIS_images img4981.jpg OpenLORIS_images img672.jpg OpenLORIS_images img3160.jpg OpenLORIS_images img2519.jpg OpenLORIS_images img2626.jpg OpenLORIS_images img3082.jpg OpenLORIS_images img5843.jpg OpenLORIS_images img3260.jpg OpenLORIS_images img296.jpg OpenLORIS_images img3620.jpg OpenLORIS_images img3369.jpg OpenLORIS_images img5172.jpg OpenLORIS_images img5413.jpg OpenLORIS_images img3503.jpg OpenLORIS_images img5114.jpg OpenLORIS_images img2947.jpg OpenLORIS_images img2997.jpg OpenLORIS_images img6112.jpg OpenLORIS_images img1880.jpg OpenLORIS_images img2430.jpg OpenLORIS_images img417.jpg OpenLORIS_images img1085.jpg OpenLORIS_images img5188.jpg OpenLORIS_images img4249.jpg OpenLORIS_images img3775.jpg OpenLORIS_images img4207.jpg OpenLORIS_images img4275.jpg OpenLORIS_images img3999.jpg OpenLORIS_images img3793.jpg OpenLORIS_images img817.jpg OpenLORIS_images img1940.jpg OpenLORIS_images img4661.jpg OpenLORIS_images img3885.jpg OpenLORIS_images img4754.jpg OpenLORIS_images img5468.jpg OpenLORIS_images img4649.jpg OpenLORIS_images img5744.jpg OpenLORIS_images img5706.jpg OpenLORIS_images img1965.jpg OpenLORIS_images img3826.jpg OpenLORIS_images img5618.jpg OpenLORIS_images img46.jpg OpenLORIS_images img800.jpg OpenLORIS_images img5666.jpg OpenLORIS_images img5580.jpg OpenLORIS_images img5443.jpg OpenLORIS_images img1494.jpg OpenLORIS_images img658.jpg OpenLORIS_images img4515.jpg OpenLORIS_images img2256.jpg OpenLORIS_images img3167.jpg OpenLORIS_images img5942.jpg OpenLORIS_images img2512.jpg OpenLORIS_images img3135.jpg OpenLORIS_images img5596.jpg OpenLORIS_images img4895.jpg OpenLORIS_images img1150.jpg OpenLORIS_images img4746.jpg OpenLORIS_images img793.jpg OpenLORIS_images img3522.jpg OpenLORIS_images img264.jpg OpenLORIS_images img1740.jpg OpenLORIS_images img985.jpg OpenLORIS_images img786.jpg OpenLORIS_images img249.jpg OpenLORIS_images img3785.jpg OpenLORIS_images img5391.jpg OpenLORIS_images img2196.jpg OpenLORIS_images img1295.jpg 
OpenLORIS_images img1561.jpg OpenLORIS_images img220.jpg OpenLORIS_images img4040.jpg OpenLORIS_images img1025.jpg OpenLORIS_images img5992.jpg OpenLORIS_images img2801.jpg OpenLORIS_images img2410.jpg OpenLORIS_images img1422.jpg OpenLORIS_images img5777.jpg OpenLORIS_images img5099.jpg OpenLORIS_images img5179.jpg OpenLORIS_images img287.jpg OpenLORIS_images img2700.jpg OpenLORIS_images img3489.jpg OpenLORIS_images img3986.jpg OpenLORIS_images img5980.jpg OpenLORIS_images img150.jpg OpenLORIS_images img5293.jpg OpenLORIS_images img597.jpg OpenLORIS_images img636.jpg OpenLORIS_images img3151.jpg OpenLORIS_images img4898.jpg OpenLORIS_images img3212.jpg OpenLORIS_images img4137.jpg OpenLORIS_images img715.jpg OpenLORIS_images img4607.jpg OpenLORIS_images img1545.jpg OpenLORIS_images img4548.jpg OpenLORIS_images img76.jpg OpenLORIS_images img1984.jpg OpenLORIS_images img58.jpg OpenLORIS_images img5880.jpg OpenLORIS_images img1301.jpg OpenLORIS_images img5322.jpg OpenLORIS_images img4969.jpg OpenLORIS_images img4339.jpg OpenLORIS_images img1262.jpg OpenLORIS_images img627.jpg OpenLORIS_images img2825.jpg OpenLORIS_images img4120.jpg OpenLORIS_images img145.jpg OpenLORIS_images img5602.jpg OpenLORIS_images img5200.jpg OpenLORIS_images img2897.jpg OpenLORIS_images img5202.jpg OpenLORIS_images img4015.jpg OpenLORIS_images img2415.jpg OpenLORIS_images img2291.jpg OpenLORIS_images img1048.jpg OpenLORIS_images img1572.jpg OpenLORIS_images img4395.jpg OpenLORIS_images img3371.jpg OpenLORIS_images img4006.jpg OpenLORIS_images img750.jpg OpenLORIS_images img1946.jpg OpenLORIS_images img5012.jpg OpenLORIS_images img1079.jpg OpenLORIS_images img5967.jpg OpenLORIS_images img3645.jpg OpenLORIS_images img4947.jpg OpenLORIS_images img974.jpg OpenLORIS_images img1814.jpg OpenLORIS_images img448.jpg OpenLORIS_images img2827.jpg OpenLORIS_images img2132.jpg OpenLORIS_images img5539.jpg OpenLORIS_images img4259.jpg OpenLORIS_images img5525.jpg OpenLORIS_images img1139.jpg 
OpenLORIS_images img4315.jpg OpenLORIS_images img1294.jpg OpenLORIS_images img5124.jpg OpenLORIS_images img4318.jpg OpenLORIS_images img2831.jpg OpenLORIS_images img379.jpg OpenLORIS_images img1037.jpg OpenLORIS_images img2484.jpg OpenLORIS_images img3216.jpg OpenLORIS_images img3472.jpg OpenLORIS_images img602.jpg OpenLORIS_images img4123.jpg OpenLORIS_images img5497.jpg OpenLORIS_images img2354.jpg OpenLORIS_images img1989.jpg OpenLORIS_images img5356.jpg OpenLORIS_images img5533.jpg OpenLORIS_images img1057.jpg OpenLORIS_images img853.jpg OpenLORIS_images img2518.jpg OpenLORIS_images img1780.jpg OpenLORIS_images img888.jpg OpenLORIS_images img3786.jpg OpenLORIS_images img2486.jpg OpenLORIS_images img223.jpg OpenLORIS_images img1350.jpg OpenLORIS_images img4362.jpg OpenLORIS_images img3273.jpg OpenLORIS_images img3742.jpg OpenLORIS_images img449.jpg OpenLORIS_images img1977.jpg OpenLORIS_images img899.jpg OpenLORIS_images img3299.jpg OpenLORIS_images img4177.jpg OpenLORIS_images img5359.jpg OpenLORIS_images img3562.jpg OpenLORIS_images img5613.jpg OpenLORIS_images img422.jpg OpenLORIS_images img3121.jpg OpenLORIS_images img2793.jpg OpenLORIS_images img2762.jpg OpenLORIS_images img6125.jpg OpenLORIS_images img813.jpg OpenLORIS_images img936.jpg OpenLORIS_images img1801.jpg OpenLORIS_images img1846.jpg OpenLORIS_images img5476.jpg OpenLORIS_images img5837.jpg OpenLORIS_images img4707.jpg OpenLORIS_images img365.jpg OpenLORIS_images img826.jpg OpenLORIS_images img4254.jpg OpenLORIS_images img5892.jpg OpenLORIS_images img2684.jpg OpenLORIS_images img4066.jpg OpenLORIS_images img1775.jpg OpenLORIS_images img2506.jpg OpenLORIS_images img3469.jpg OpenLORIS_images img573.jpg OpenLORIS_images img386.jpg OpenLORIS_images img2072.jpg OpenLORIS_images img5517.jpg OpenLORIS_images img62.jpg OpenLORIS_images img5883.jpg OpenLORIS_images img3973.jpg OpenLORIS_images img3505.jpg OpenLORIS_images img3224.jpg OpenLORIS_images img3700.jpg OpenLORIS_images img4624.jpg 
OpenLORIS_images img5268.jpg OpenLORIS_images img5233.jpg OpenLORIS_images img124.jpg OpenLORIS_images img2422.jpg OpenLORIS_images img4398.jpg OpenLORIS_images img3470.jpg OpenLORIS_images img1334.jpg OpenLORIS_images img2511.jpg OpenLORIS_images img965.jpg OpenLORIS_images img4750.jpg OpenLORIS_images img1945.jpg OpenLORIS_images img2651.jpg OpenLORIS_images img1349.jpg OpenLORIS_images img4678.jpg OpenLORIS_images img5327.jpg OpenLORIS_images img2611.jpg OpenLORIS_images img1426.jpg OpenLORIS_images img2744.jpg OpenLORIS_images img4237.jpg OpenLORIS_images img2244.jpg OpenLORIS_images img2582.jpg OpenLORIS_images img4487.jpg OpenLORIS_images img3022.jpg OpenLORIS_images img199.jpg OpenLORIS_images img5370.jpg OpenLORIS_images img5831.jpg OpenLORIS_images img68.jpg OpenLORIS_images img1472.jpg OpenLORIS_images img4492.jpg OpenLORIS_images img5496.jpg OpenLORIS_images img1619.jpg OpenLORIS_images img2287.jpg OpenLORIS_images img758.jpg OpenLORIS_images img1476.jpg OpenLORIS_images img4148.jpg OpenLORIS_images img1848.jpg OpenLORIS_images img1276.jpg OpenLORIS_images img788.jpg OpenLORIS_images img2192.jpg OpenLORIS_images img3567.jpg OpenLORIS_images img5367.jpg OpenLORIS_images img5313.jpg OpenLORIS_images img3394.jpg OpenLORIS_images img958.jpg OpenLORIS_images img237.jpg OpenLORIS_images img1644.jpg OpenLORIS_images img1832.jpg OpenLORIS_images img1787.jpg OpenLORIS_images img5351.jpg OpenLORIS_images img2302.jpg OpenLORIS_images img2980.jpg OpenLORIS_images img3071.jpg OpenLORIS_images img5915.jpg OpenLORIS_images img2101.jpg OpenLORIS_images img5402.jpg OpenLORIS_images img383.jpg OpenLORIS_images img191.jpg OpenLORIS_images img6091.jpg OpenLORIS_images img3831.jpg OpenLORIS_images img4195.jpg OpenLORIS_images img4547.jpg OpenLORIS_images img2806.jpg OpenLORIS_images img3579.jpg OpenLORIS_images img2303.jpg OpenLORIS_images img3884.jpg OpenLORIS_images img4034.jpg OpenLORIS_images img3902.jpg OpenLORIS_images img168.jpg OpenLORIS_images img4024.jpg 
OpenLORIS_images img1087.jpg OpenLORIS_images img4767.jpg OpenLORIS_images img4953.jpg OpenLORIS_images img3454.jpg OpenLORIS_images img6056.jpg OpenLORIS_images img2996.jpg OpenLORIS_images img3017.jpg OpenLORIS_images img1080.jpg OpenLORIS_images img4555.jpg OpenLORIS_images img442.jpg OpenLORIS_images img5720.jpg OpenLORIS_images img949.jpg OpenLORIS_images img5844.jpg OpenLORIS_images img4977.jpg OpenLORIS_images img2924.jpg OpenLORIS_images img3277.jpg OpenLORIS_images img2189.jpg OpenLORIS_images img101.jpg OpenLORIS_images img4454.jpg OpenLORIS_images img5485.jpg OpenLORIS_images img2151.jpg OpenLORIS_images img5054.jpg OpenLORIS_images img4038.jpg OpenLORIS_images img3254.jpg OpenLORIS_images img5414.jpg OpenLORIS_images img2250.jpg OpenLORIS_images img3733.jpg OpenLORIS_images img4477.jpg OpenLORIS_images img1157.jpg OpenLORIS_images img192.jpg OpenLORIS_images img1208.jpg OpenLORIS_images img1697.jpg OpenLORIS_images img3449.jpg OpenLORIS_images img4365.jpg OpenLORIS_images img208.jpg OpenLORIS_images img321.jpg OpenLORIS_images img5338.jpg OpenLORIS_images img5100.jpg OpenLORIS_images img4216.jpg OpenLORIS_images img4091.jpg OpenLORIS_images img1239.jpg OpenLORIS_images img2673.jpg OpenLORIS_images img898.jpg OpenLORIS_images img2239.jpg OpenLORIS_images img348.jpg OpenLORIS_images img3431.jpg OpenLORIS_images img664.jpg OpenLORIS_images img1763.jpg OpenLORIS_images img2843.jpg OpenLORIS_images img4541.jpg OpenLORIS_images img5652.jpg OpenLORIS_images img617.jpg OpenLORIS_images img1904.jpg OpenLORIS_images img794.jpg OpenLORIS_images img3332.jpg OpenLORIS_images img4021.jpg OpenLORIS_images img631.jpg OpenLORIS_images img5852.jpg OpenLORIS_images img4738.jpg OpenLORIS_images img5558.jpg OpenLORIS_images img5065.jpg OpenLORIS_images img613.jpg OpenLORIS_images img1997.jpg OpenLORIS_images img1235.jpg OpenLORIS_images img5747.jpg OpenLORIS_images img530.jpg OpenLORIS_images img5020.jpg OpenLORIS_images img3891.jpg OpenLORIS_images img3866.jpg 
OpenLORIS_images img1870.jpg OpenLORIS_images img2496.jpg OpenLORIS_images img5912.jpg OpenLORIS_images img895.jpg OpenLORIS_images img4887.jpg OpenLORIS_images img358.jpg OpenLORIS_images img984.jpg OpenLORIS_images img6100.jpg OpenLORIS_images img1890.jpg OpenLORIS_images img310.jpg OpenLORIS_images img4745.jpg OpenLORIS_images img4505.jpg OpenLORIS_images img341.jpg OpenLORIS_images img5899.jpg OpenLORIS_images img4709.jpg OpenLORIS_images img3183.jpg OpenLORIS_images img5855.jpg OpenLORIS_images img3443.jpg OpenLORIS_images img4229.jpg OpenLORIS_images img250.jpg OpenLORIS_images img4699.jpg OpenLORIS_images img4713.jpg OpenLORIS_images img3303.jpg OpenLORIS_images img5425.jpg OpenLORIS_images img4396.jpg OpenLORIS_images img1304.jpg OpenLORIS_images img2574.jpg OpenLORIS_images img3361.jpg OpenLORIS_images img3344.jpg OpenLORIS_images img5285.jpg OpenLORIS_images img5198.jpg OpenLORIS_images img4861.jpg OpenLORIS_images img5735.jpg OpenLORIS_images img5937.jpg OpenLORIS_images img4636.jpg OpenLORIS_images img285.jpg OpenLORIS_images img1852.jpg OpenLORIS_images img1253.jpg OpenLORIS_images img3368.jpg OpenLORIS_images img4221.jpg OpenLORIS_images img236.jpg OpenLORIS_images img4138.jpg OpenLORIS_images img2315.jpg OpenLORIS_images img1613.jpg OpenLORIS_images img3501.jpg OpenLORIS_images img5897.jpg OpenLORIS_images img1423.jpg OpenLORIS_images img564.jpg OpenLORIS_images img98.jpg OpenLORIS_images img4870.jpg OpenLORIS_images img3389.jpg OpenLORIS_images img4440.jpg OpenLORIS_images img5832.jpg OpenLORIS_images img5495.jpg OpenLORIS_images img849.jpg OpenLORIS_images img323.jpg OpenLORIS_images img5361.jpg OpenLORIS_images img3638.jpg OpenLORIS_images img3373.jpg OpenLORIS_images img1730.jpg OpenLORIS_images img78.jpg OpenLORIS_images img4206.jpg OpenLORIS_images img4533.jpg OpenLORIS_images img4638.jpg OpenLORIS_images img2245.jpg OpenLORIS_images img32.jpg OpenLORIS_images img3323.jpg OpenLORIS_images img5702.jpg OpenLORIS_images img6140.jpg 
OpenLORIS_images img2892.jpg OpenLORIS_images img2185.jpg OpenLORIS_images img5315.jpg OpenLORIS_images img6057.jpg OpenLORIS_images img120.jpg OpenLORIS_images img1456.jpg OpenLORIS_images img4470.jpg OpenLORIS_images img1693.jpg OpenLORIS_images img1062.jpg OpenLORIS_images img4344.jpg OpenLORIS_images img343.jpg OpenLORIS_images img1621.jpg OpenLORIS_images img4248.jpg OpenLORIS_images img4436.jpg OpenLORIS_images img3291.jpg OpenLORIS_images img2465.jpg OpenLORIS_images img6042.jpg OpenLORIS_images img451.jpg OpenLORIS_images img2717.jpg OpenLORIS_images img1512.jpg OpenLORIS_images img322.jpg OpenLORIS_images img698.jpg OpenLORIS_images img2918.jpg OpenLORIS_images img5362.jpg OpenLORIS_images img131.jpg OpenLORIS_images img933.jpg OpenLORIS_images img291.jpg OpenLORIS_images img910.jpg OpenLORIS_images img1361.jpg OpenLORIS_images img2783.jpg OpenLORIS_images img4762.jpg OpenLORIS_images img5500.jpg OpenLORIS_images img1546.jpg OpenLORIS_images img5155.jpg OpenLORIS_images img5934.jpg OpenLORIS_images img4734.jpg OpenLORIS_images img4037.jpg OpenLORIS_images img4159.jpg OpenLORIS_images img3426.jpg OpenLORIS_images img411.jpg OpenLORIS_images img5708.jpg OpenLORIS_images img692.jpg OpenLORIS_images img4121.jpg OpenLORIS_images img1920.jpg OpenLORIS_images img163.jpg OpenLORIS_images img1297.jpg OpenLORIS_images img461.jpg OpenLORIS_images img5369.jpg OpenLORIS_images img2434.jpg OpenLORIS_images img4796.jpg OpenLORIS_images img2602.jpg OpenLORIS_images img5219.jpg OpenLORIS_images img4018.jpg OpenLORIS_images img3585.jpg OpenLORIS_images img4166.jpg OpenLORIS_images img4957.jpg OpenLORIS_images img580.jpg OpenLORIS_images img1091.jpg OpenLORIS_images img3413.jpg OpenLORIS_images img5080.jpg OpenLORIS_images img1585.jpg OpenLORIS_images img3977.jpg OpenLORIS_images img479.jpg OpenLORIS_images img6064.jpg OpenLORIS_images img5096.jpg OpenLORIS_images img36.jpg OpenLORIS_images img4257.jpg OpenLORIS_images img2358.jpg OpenLORIS_images img2395.jpg 
OpenLORIS_images img2270.jpg OpenLORIS_images img3127.jpg OpenLORIS_images img3790.jpg OpenLORIS_images img4023.jpg OpenLORIS_images img3780.jpg OpenLORIS_images img1252.jpg OpenLORIS_images img4020.jpg OpenLORIS_images img1975.jpg OpenLORIS_images img2817.jpg OpenLORIS_images img2865.jpg OpenLORIS_images img5898.jpg OpenLORIS_images img3391.jpg OpenLORIS_images img1044.jpg OpenLORIS_images img5465.jpg OpenLORIS_images img2310.jpg OpenLORIS_images img634.jpg OpenLORIS_images img5834.jpg OpenLORIS_images img4698.jpg OpenLORIS_images img105.jpg OpenLORIS_images img1653.jpg OpenLORIS_images img1128.jpg OpenLORIS_images img5344.jpg OpenLORIS_images img5297.jpg OpenLORIS_images img3670.jpg OpenLORIS_images img1479.jpg OpenLORIS_images img2653.jpg OpenLORIS_images img2176.jpg OpenLORIS_images img5779.jpg OpenLORIS_images img873.jpg OpenLORIS_images img2118.jpg OpenLORIS_images img304.jpg OpenLORIS_images img926.jpg OpenLORIS_images img1356.jpg OpenLORIS_images img6039.jpg OpenLORIS_images img2635.jpg OpenLORIS_images img4550.jpg OpenLORIS_images img143.jpg OpenLORIS_images img3157.jpg OpenLORIS_images img486.jpg OpenLORIS_images img419.jpg OpenLORIS_images img3679.jpg OpenLORIS_images img3407.jpg OpenLORIS_images img626.jpg OpenLORIS_images img3630.jpg OpenLORIS_images img3695.jpg OpenLORIS_images img587.jpg OpenLORIS_images img6120.jpg OpenLORIS_images img5385.jpg OpenLORIS_images img5197.jpg OpenLORIS_images img1368.jpg OpenLORIS_images img4163.jpg OpenLORIS_images img3072.jpg OpenLORIS_images img490.jpg OpenLORIS_images img4152.jpg OpenLORIS_images img5464.jpg OpenLORIS_images img2333.jpg OpenLORIS_images img241.jpg OpenLORIS_images img4213.jpg OpenLORIS_images img2394.jpg OpenLORIS_images img3242.jpg OpenLORIS_images img1155.jpg OpenLORIS_images img2646.jpg OpenLORIS_images img5358.jpg OpenLORIS_images img5773.jpg OpenLORIS_images img4578.jpg OpenLORIS_images img4999.jpg OpenLORIS_images img1833.jpg OpenLORIS_images img366.jpg OpenLORIS_images img915.jpg 
OpenLORIS_images img2234.jpg OpenLORIS_images img3420.jpg OpenLORIS_images img2160.jpg OpenLORIS_images img3769.jpg OpenLORIS_images img524.jpg OpenLORIS_images img5364.jpg OpenLORIS_images img1198.jpg OpenLORIS_images img842.jpg OpenLORIS_images img540.jpg OpenLORIS_images img211.jpg OpenLORIS_images img3251.jpg OpenLORIS_images img6047.jpg OpenLORIS_images img6060.jpg OpenLORIS_images img2295.jpg OpenLORIS_images img2872.jpg OpenLORIS_images img1407.jpg OpenLORIS_images img1112.jpg OpenLORIS_images img493.jpg OpenLORIS_images img659.jpg OpenLORIS_images img4154.jpg OpenLORIS_images img4715.jpg OpenLORIS_images img777.jpg OpenLORIS_images img2759.jpg OpenLORIS_images img181.jpg OpenLORIS_images img3314.jpg OpenLORIS_images img88.jpg OpenLORIS_images img2382.jpg OpenLORIS_images img3278.jpg OpenLORIS_images img1012.jpg OpenLORIS_images img2491.jpg OpenLORIS_images img4549.jpg OpenLORIS_images img5006.jpg OpenLORIS_images img248.jpg OpenLORIS_images img1854.jpg OpenLORIS_images img3719.jpg OpenLORIS_images img5848.jpg OpenLORIS_images img5066.jpg OpenLORIS_images img2170.jpg OpenLORIS_images img968.jpg OpenLORIS_images img5787.jpg OpenLORIS_images img4510.jpg OpenLORIS_images img72.jpg OpenLORIS_images img3494.jpg OpenLORIS_images img543.jpg OpenLORIS_images img28.jpg OpenLORIS_images img606.jpg OpenLORIS_images img3057.jpg OpenLORIS_images img2241.jpg OpenLORIS_images img2941.jpg OpenLORIS_images img1687.jpg OpenLORIS_images img2397.jpg OpenLORIS_images img1109.jpg OpenLORIS_images img1229.jpg OpenLORIS_images img5384.jpg OpenLORIS_images img1196.jpg OpenLORIS_images img4891.jpg OpenLORIS_images img5116.jpg OpenLORIS_images img5625.jpg OpenLORIS_images img5933.jpg OpenLORIS_images img1813.jpg OpenLORIS_images img1939.jpg OpenLORIS_images img4169.jpg OpenLORIS_images img3106.jpg OpenLORIS_images img4534.jpg OpenLORIS_images img3711.jpg OpenLORIS_images img1072.jpg OpenLORIS_images img2747.jpg OpenLORIS_images img2963.jpg OpenLORIS_images img3339.jpg OpenLORIS_images 
img1086.jpg OpenLORIS_images img2741.jpg OpenLORIS_images img835.jpg OpenLORIS_images img2757.jpg OpenLORIS_images img3448.jpg OpenLORIS_images img5208.jpg OpenLORIS_images img5512.jpg OpenLORIS_images img4983.jpg OpenLORIS_images img4448.jpg OpenLORIS_images img5597.jpg OpenLORIS_images img977.jpg OpenLORIS_images img4732.jpg OpenLORIS_images img363.jpg OpenLORIS_images img1188.jpg OpenLORIS_images img3357.jpg OpenLORIS_images img4373.jpg OpenLORIS_images img5559.jpg OpenLORIS_images img416.jpg OpenLORIS_images img144.jpg OpenLORIS_images img4774.jpg OpenLORIS_images img6117.jpg OpenLORIS_images img1906.jpg OpenLORIS_images img4901.jpg OpenLORIS_images img4464.jpg OpenLORIS_images img1222.jpg OpenLORIS_images img663.jpg OpenLORIS_images img1376.jpg OpenLORIS_images img4727.jpg OpenLORIS_images img4079.jpg OpenLORIS_images img3887.jpg OpenLORIS_images img4089.jpg OpenLORIS_images img4496.jpg OpenLORIS_images img2450.jpg OpenLORIS_images img2112.jpg OpenLORIS_images img3416.jpg OpenLORIS_images img5475.jpg OpenLORIS_images img4112.jpg OpenLORIS_images img1748.jpg OpenLORIS_images img1815.jpg OpenLORIS_images img16.jpg OpenLORIS_images img2862.jpg OpenLORIS_images img1971.jpg OpenLORIS_images img3298.jpg OpenLORIS_images img4972.jpg OpenLORIS_images img1449.jpg OpenLORIS_images img2095.jpg OpenLORIS_images img1145.jpg OpenLORIS_images img2228.jpg OpenLORIS_images img17.jpg OpenLORIS_images img4665.jpg OpenLORIS_images img2324.jpg OpenLORIS_images img10.jpg OpenLORIS_images img2068.jpg OpenLORIS_images img2976.jpg OpenLORIS_images img4740.jpg OpenLORIS_images img2293.jpg OpenLORIS_images img2585.jpg OpenLORIS_images img2128.jpg OpenLORIS_images img4453.jpg OpenLORIS_images img2378.jpg OpenLORIS_images img5568.jpg OpenLORIS_images img1234.jpg OpenLORIS_images img4333.jpg OpenLORIS_images img1627.jpg OpenLORIS_images img4370.jpg OpenLORIS_images img1026.jpg OpenLORIS_images img5858.jpg OpenLORIS_images img271.jpg OpenLORIS_images img1065.jpg OpenLORIS_images img5588.jpg 
OpenLORIS_images img608.jpg OpenLORIS_images img5802.jpg OpenLORIS_images img1887.jpg OpenLORIS_images img3478.jpg OpenLORIS_images img1544.jpg OpenLORIS_images img3846.jpg OpenLORIS_images img3346.jpg OpenLORIS_images img1244.jpg OpenLORIS_images img4342.jpg OpenLORIS_images img6122.jpg OpenLORIS_images img1076.jpg OpenLORIS_images img4184.jpg OpenLORIS_images img3644.jpg OpenLORIS_images img3325.jpg OpenLORIS_images img204.jpg OpenLORIS_images img1336.jpg OpenLORIS_images img4922.jpg OpenLORIS_images img3109.jpg OpenLORIS_images img5311.jpg OpenLORIS_images img1010.jpg OpenLORIS_images img5365.jpg OpenLORIS_images img1893.jpg OpenLORIS_images img2819.jpg OpenLORIS_images img1467.jpg OpenLORIS_images img693.jpg OpenLORIS_images img4338.jpg OpenLORIS_images img6043.jpg OpenLORIS_images img3725.jpg OpenLORIS_images img4632.jpg OpenLORIS_images img4818.jpg OpenLORIS_images img67.jpg OpenLORIS_images img2786.jpg OpenLORIS_images img3622.jpg OpenLORIS_images img1285.jpg OpenLORIS_images img1383.jpg OpenLORIS_images img5282.jpg OpenLORIS_images img1991.jpg OpenLORIS_images img5203.jpg OpenLORIS_images img2435.jpg OpenLORIS_images img3993.jpg OpenLORIS_images img5216.jpg OpenLORIS_images img764.jpg OpenLORIS_images img3263.jpg OpenLORIS_images img5916.jpg OpenLORIS_images img3980.jpg OpenLORIS_images img5811.jpg OpenLORIS_images img2761.jpg OpenLORIS_images img5986.jpg OpenLORIS_images img973.jpg OpenLORIS_images img869.jpg OpenLORIS_images img1905.jpg OpenLORIS_images img4378.jpg OpenLORIS_images img1192.jpg OpenLORIS_images img3805.jpg OpenLORIS_images img3873.jpg OpenLORIS_images img5064.jpg OpenLORIS_images img1795.jpg OpenLORIS_images img4979.jpg OpenLORIS_images img660.jpg OpenLORIS_images img5348.jpg OpenLORIS_images img2225.jpg OpenLORIS_images img2476.jpg OpenLORIS_images img3687.jpg OpenLORIS_images img2453.jpg OpenLORIS_images img1500.jpg OpenLORIS_images img1878.jpg OpenLORIS_images img1363.jpg OpenLORIS_images img1127.jpg OpenLORIS_images img4087.jpg 
OpenLORIS_images img4045.jpg OpenLORIS_images img2106.jpg OpenLORIS_images img61.jpg OpenLORIS_images img2221.jpg OpenLORIS_images img1678.jpg OpenLORIS_images img5256.jpg OpenLORIS_images img1264.jpg OpenLORIS_images img4804.jpg OpenLORIS_images img4760.jpg OpenLORIS_images img4516.jpg OpenLORIS_images img1824.jpg OpenLORIS_images img2693.jpg OpenLORIS_images img1800.jpg OpenLORIS_images img2557.jpg OpenLORIS_images img1154.jpg OpenLORIS_images img948.jpg OpenLORIS_images img177.jpg OpenLORIS_images img912.jpg OpenLORIS_images img2141.jpg OpenLORIS_images img4663.jpg OpenLORIS_images img4993.jpg OpenLORIS_images img2150.jpg OpenLORIS_images img1418.jpg OpenLORIS_images img3618.jpg OpenLORIS_images img5727.jpg OpenLORIS_images img1416.jpg OpenLORIS_images img5783.jpg OpenLORIS_images img1766.jpg OpenLORIS_images img1452.jpg OpenLORIS_images img2433.jpg OpenLORIS_images img2901.jpg OpenLORIS_images img443.jpg OpenLORIS_images img5136.jpg OpenLORIS_images img1243.jpg OpenLORIS_images img2283.jpg OpenLORIS_images img5519.jpg OpenLORIS_images img1116.jpg OpenLORIS_images img1177.jpg OpenLORIS_images img3699.jpg OpenLORIS_images img2526.jpg OpenLORIS_images img3834.jpg OpenLORIS_images img3531.jpg OpenLORIS_images img5554.jpg OpenLORIS_images img2735.jpg OpenLORIS_images img5630.jpg OpenLORIS_images img4392.jpg OpenLORIS_images img4942.jpg OpenLORIS_images img1306.jpg OpenLORIS_images img4414.jpg OpenLORIS_images img374.jpg OpenLORIS_images img1871.jpg OpenLORIS_images img5594.jpg OpenLORIS_images img5686.jpg OpenLORIS_images img3475.jpg OpenLORIS_images img4835.jpg OpenLORIS_images img3034.jpg OpenLORIS_images img3129.jpg OpenLORIS_images img749.jpg OpenLORIS_images img396.jpg OpenLORIS_images img180.jpg OpenLORIS_images img5529.jpg OpenLORIS_images img5053.jpg OpenLORIS_images img3410.jpg OpenLORIS_images img217.jpg OpenLORIS_images img4757.jpg OpenLORIS_images img258.jpg OpenLORIS_images img5851.jpg OpenLORIS_images img2401.jpg OpenLORIS_images img5372.jpg 
OpenLORIS_images img4484.jpg OpenLORIS_images img3358.jpg OpenLORIS_images img5252.jpg OpenLORIS_images img4681.jpg OpenLORIS_images img5766.jpg OpenLORIS_images img992.jpg OpenLORIS_images img5685.jpg OpenLORIS_images img2888.jpg OpenLORIS_images img3661.jpg OpenLORIS_images img2974.jpg OpenLORIS_images img190.jpg OpenLORIS_images img5147.jpg OpenLORIS_images img4307.jpg OpenLORIS_images img3571.jpg OpenLORIS_images img3485.jpg OpenLORIS_images img1845.jpg OpenLORIS_images img2822.jpg OpenLORIS_images img2054.jpg OpenLORIS_images img2043.jpg OpenLORIS_images img2044.jpg OpenLORIS_images img4126.jpg OpenLORIS_images img5876.jpg OpenLORIS_images img3249.jpg OpenLORIS_images img3513.jpg OpenLORIS_images img4551.jpg OpenLORIS_images img1309.jpg OpenLORIS_images img4801.jpg OpenLORIS_images img3086.jpg OpenLORIS_images img1053.jpg OpenLORIS_images img3860.jpg OpenLORIS_images img225.jpg OpenLORIS_images img1437.jpg OpenLORIS_images img763.jpg OpenLORIS_images img2909.jpg OpenLORIS_images img164.jpg OpenLORIS_images img3065.jpg OpenLORIS_images img1166.jpg OpenLORIS_images img2381.jpg OpenLORIS_images img6141.jpg OpenLORIS_images img2660.jpg OpenLORIS_images img5410.jpg OpenLORIS_images img5423.jpg OpenLORIS_images img3948.jpg OpenLORIS_images img4735.jpg OpenLORIS_images img2038.jpg OpenLORIS_images img5177.jpg OpenLORIS_images img1570.jpg OpenLORIS_images img3631.jpg OpenLORIS_images img3600.jpg OpenLORIS_images img6015.jpg OpenLORIS_images img2490.jpg OpenLORIS_images img5753.jpg OpenLORIS_images img4616.jpg OpenLORIS_images img1562.jpg OpenLORIS_images img5798.jpg OpenLORIS_images img2593.jpg OpenLORIS_images img2047.jpg OpenLORIS_images img5310.jpg OpenLORIS_images img56.jpg OpenLORIS_images img2428.jpg OpenLORIS_images img5491.jpg OpenLORIS_images img3457.jpg OpenLORIS_images img1590.jpg OpenLORIS_images img4404.jpg OpenLORIS_images img3543.jpg OpenLORIS_images img3715.jpg OpenLORIS_images img2147.jpg OpenLORIS_images img5341.jpg OpenLORIS_images img1159.jpg 
OpenLORIS_images img3718.jpg OpenLORIS_images img1659.jpg OpenLORIS_images img4840.jpg OpenLORIS_images img2598.jpg OpenLORIS_images img5083.jpg OpenLORIS_images img5722.jpg OpenLORIS_images img364.jpg OpenLORIS_images img4476.jpg OpenLORIS_images img584.jpg OpenLORIS_images img3374.jpg OpenLORIS_images img1020.jpg OpenLORIS_images img5867.jpg OpenLORIS_images img2889.jpg OpenLORIS_images img4810.jpg OpenLORIS_images img5456.jpg OpenLORIS_images img4466.jpg OpenLORIS_images img4765.jpg OpenLORIS_images img5697.jpg OpenLORIS_images img578.jpg OpenLORIS_images img1929.jpg OpenLORIS_images img1387.jpg OpenLORIS_images img1560.jpg OpenLORIS_images img6.jpg OpenLORIS_images img4046.jpg OpenLORIS_images img1003.jpg OpenLORIS_images img4849.jpg OpenLORIS_images img6065.jpg OpenLORIS_images img628.jpg OpenLORIS_images img1428.jpg OpenLORIS_images img1333.jpg OpenLORIS_images img1776.jpg OpenLORIS_images img1370.jpg OpenLORIS_images img3960.jpg OpenLORIS_images img2483.jpg OpenLORIS_images img119.jpg OpenLORIS_images img1664.jpg OpenLORIS_images img4421.jpg OpenLORIS_images img2319.jpg OpenLORIS_images img5174.jpg OpenLORIS_images img2654.jpg OpenLORIS_images img4186.jpg OpenLORIS_images img5140.jpg OpenLORIS_images img468.jpg OpenLORIS_images img3586.jpg OpenLORIS_images img5090.jpg OpenLORIS_images img1529.jpg OpenLORIS_images img1704.jpg OpenLORIS_images img491.jpg OpenLORIS_images img6105.jpg OpenLORIS_images img1772.jpg OpenLORIS_images img4621.jpg OpenLORIS_images img4293.jpg OpenLORIS_images img2370.jpg OpenLORIS_images img5415.jpg OpenLORIS_images img5814.jpg OpenLORIS_images img3404.jpg OpenLORIS_images img5040.jpg OpenLORIS_images img863.jpg OpenLORIS_images img3692.jpg OpenLORIS_images img5029.jpg OpenLORIS_images img4531.jpg OpenLORIS_images img2016.jpg OpenLORIS_images img791.jpg OpenLORIS_images img1789.jpg OpenLORIS_images img3659.jpg OpenLORIS_images img795.jpg OpenLORIS_images img1933.jpg OpenLORIS_images img5566.jpg OpenLORIS_images img2084.jpg 
OpenLORIS_images img3716.jpg OpenLORIS_images img6084.jpg OpenLORIS_images img679.jpg OpenLORIS_images img1631.jpg OpenLORIS_images img4866.jpg OpenLORIS_images img897.jpg OpenLORIS_images img3182.jpg OpenLORIS_images img1236.jpg OpenLORIS_images img5840.jpg OpenLORIS_images img4529.jpg OpenLORIS_images img5070.jpg OpenLORIS_images img3432.jpg OpenLORIS_images img4830.jpg OpenLORIS_images img4383.jpg OpenLORIS_images img6119.jpg OpenLORIS_images img2749.jpg OpenLORIS_images img166.jpg OpenLORIS_images img2396.jpg OpenLORIS_images img3747.jpg OpenLORIS_images img4986.jpg OpenLORIS_images img3516.jpg OpenLORIS_images img2108.jpg OpenLORIS_images img2605.jpg OpenLORIS_images img4954.jpg OpenLORIS_images img2567.jpg OpenLORIS_images img1522.jpg OpenLORIS_images img5584.jpg OpenLORIS_images img3452.jpg OpenLORIS_images img4287.jpg OpenLORIS_images img6114.jpg OpenLORIS_images img69.jpg OpenLORIS_images img1580.jpg OpenLORIS_images img4439.jpg OpenLORIS_images img1646.jpg OpenLORIS_images img1429.jpg OpenLORIS_images img5416.jpg OpenLORIS_images img5176.jpg OpenLORIS_images img2254.jpg OpenLORIS_images img4987.jpg OpenLORIS_images img744.jpg OpenLORIS_images img3982.jpg OpenLORIS_images img5067.jpg OpenLORIS_images img5320.jpg OpenLORIS_images img5653.jpg OpenLORIS_images img2277.jpg OpenLORIS_images img3309.jpg OpenLORIS_images img344.jpg OpenLORIS_images img2178.jpg OpenLORIS_images img1720.jpg OpenLORIS_images img3493.jpg OpenLORIS_images img3995.jpg OpenLORIS_images img1864.jpg OpenLORIS_images img792.jpg OpenLORIS_images img4651.jpg OpenLORIS_images img751.jpg OpenLORIS_images img3229.jpg OpenLORIS_images img1565.jpg OpenLORIS_images img1543.jpg OpenLORIS_images img2586.jpg OpenLORIS_images img4581.jpg OpenLORIS_images img5870.jpg OpenLORIS_images img4730.jpg OpenLORIS_images img4770.jpg OpenLORIS_images img5008.jpg OpenLORIS_images img5145.jpg OpenLORIS_images img1141.jpg OpenLORIS_images img2734.jpg OpenLORIS_images img5265.jpg OpenLORIS_images img3981.jpg 
OpenLORIS_images img3759.jpg OpenLORIS_images img286.jpg OpenLORIS_images img340.jpg OpenLORIS_images img5039.jpg OpenLORIS_images img4565.jpg OpenLORIS_images img954.jpg OpenLORIS_images img5170.jpg OpenLORIS_images img94.jpg OpenLORIS_images img319.jpg OpenLORIS_images img4049.jpg OpenLORIS_images img691.jpg OpenLORIS_images img3791.jpg OpenLORIS_images img1008.jpg OpenLORIS_images img1018.jpg OpenLORIS_images img104.jpg OpenLORIS_images img2268.jpg OpenLORIS_images img3029.jpg OpenLORIS_images img1980.jpg OpenLORIS_images img1751.jpg OpenLORIS_images img4710.jpg OpenLORIS_images img5762.jpg OpenLORIS_images img885.jpg OpenLORIS_images img187.jpg OpenLORIS_images img2919.jpg OpenLORIS_images img126.jpg OpenLORIS_images img532.jpg OpenLORIS_images img1818.jpg OpenLORIS_images img3558.jpg OpenLORIS_images img4684.jpg OpenLORIS_images img1270.jpg OpenLORIS_images img2803.jpg OpenLORIS_images img1379.jpg OpenLORIS_images img5031.jpg OpenLORIS_images img4173.jpg OpenLORIS_images img3581.jpg OpenLORIS_images img3651.jpg OpenLORIS_images img4176.jpg OpenLORIS_images img6118.jpg OpenLORIS_images img4330.jpg OpenLORIS_images img2236.jpg OpenLORIS_images img2402.jpg OpenLORIS_images img5642.jpg OpenLORIS_images img5923.jpg OpenLORIS_images img2155.jpg OpenLORIS_images img5872.jpg OpenLORIS_images img5259.jpg OpenLORIS_images img5286.jpg OpenLORIS_images img5590.jpg OpenLORIS_images img1915.jpg OpenLORIS_images img3172.jpg OpenLORIS_images img5741.jpg OpenLORIS_images img518.jpg OpenLORIS_images img5705.jpg OpenLORIS_images img1081.jpg OpenLORIS_images img3285.jpg OpenLORIS_images img576.jpg OpenLORIS_images img3788.jpg OpenLORIS_images img3740.jpg OpenLORIS_images img3855.jpg OpenLORIS_images img2763.jpg OpenLORIS_images img3025.jpg OpenLORIS_images img3012.jpg OpenLORIS_images img980.jpg OpenLORIS_images img6121.jpg OpenLORIS_images img4352.jpg OpenLORIS_images img5106.jpg OpenLORIS_images img1695.jpg OpenLORIS_images img1577.jpg OpenLORIS_images img389.jpg 
OpenLORIS_images img4701.jpg OpenLORIS_images img3612.jpg OpenLORIS_images img5925.jpg OpenLORIS_images img1401.jpg OpenLORIS_images img5571.jpg OpenLORIS_images img19.jpg OpenLORIS_images img5400.jpg OpenLORIS_images img2847.jpg OpenLORIS_images img4482.jpg OpenLORIS_images img4110.jpg OpenLORIS_images img3149.jpg OpenLORIS_images img1979.jpg OpenLORIS_images img5724.jpg OpenLORIS_images img1326.jpg OpenLORIS_images img916.jpg OpenLORIS_images img4834.jpg OpenLORIS_images img2505.jpg OpenLORIS_images img198.jpg OpenLORIS_images img4291.jpg OpenLORIS_images img5237.jpg OpenLORIS_images img2531.jpg OpenLORIS_images img3213.jpg OpenLORIS_images img2917.jpg OpenLORIS_images img5085.jpg OpenLORIS_images img2366.jpg OpenLORIS_images img5754.jpg OpenLORIS_images img2107.jpg OpenLORIS_images img1287.jpg OpenLORIS_images img3997.jpg OpenLORIS_images img5778.jpg OpenLORIS_images img1715.jpg OpenLORIS_images img3667.jpg OpenLORIS_images img2316.jpg OpenLORIS_images img2233.jpg OpenLORIS_images img2854.jpg OpenLORIS_images img6028.jpg OpenLORIS_images img194.jpg OpenLORIS_images img2398.jpg OpenLORIS_images img4268.jpg OpenLORIS_images img2714.jpg OpenLORIS_images img2922.jpg OpenLORIS_images img5427.jpg OpenLORIS_images img5772.jpg OpenLORIS_images img5962.jpg OpenLORIS_images img4965.jpg OpenLORIS_images img4108.jpg OpenLORIS_images img1371.jpg OpenLORIS_images img774.jpg OpenLORIS_images img3313.jpg OpenLORIS_images img5676.jpg OpenLORIS_images img4313.jpg OpenLORIS_images img1969.jpg OpenLORIS_images img5016.jpg OpenLORIS_images img2960.jpg OpenLORIS_images img1925.jpg OpenLORIS_images img5659.jpg OpenLORIS_images img4587.jpg OpenLORIS_images img5205.jpg OpenLORIS_images img2652.jpg OpenLORIS_images img3861.jpg OpenLORIS_images img1844.jpg OpenLORIS_images img2711.jpg OpenLORIS_images img3221.jpg OpenLORIS_images img829.jpg OpenLORIS_images img2818.jpg OpenLORIS_images img2950.jpg OpenLORIS_images img1782.jpg OpenLORIS_images img4252.jpg OpenLORIS_images img4081.jpg 
OpenLORIS_images img4620.jpg OpenLORIS_images img650.jpg OpenLORIS_images img4530.jpg OpenLORIS_images img498.jpg OpenLORIS_images img2367.jpg OpenLORIS_images img2939.jpg OpenLORIS_images img3577.jpg OpenLORIS_images img41.jpg OpenLORIS_images img5312.jpg OpenLORIS_images img5612.jpg OpenLORIS_images img5928.jpg OpenLORIS_images img5143.jpg OpenLORIS_images img4923.jpg OpenLORIS_images img514.jpg OpenLORIS_images img2687.jpg OpenLORIS_images img4325.jpg OpenLORIS_images img2861.jpg OpenLORIS_images img63.jpg OpenLORIS_images img1129.jpg OpenLORIS_images img1218.jpg OpenLORIS_images img4927.jpg OpenLORIS_images img3904.jpg OpenLORIS_images img1729.jpg OpenLORIS_images img1872.jpg OpenLORIS_images img2391.jpg OpenLORIS_images img5032.jpg OpenLORIS_images img431.jpg OpenLORIS_images img5408.jpg OpenLORIS_images img6033.jpg OpenLORIS_images img5328.jpg OpenLORIS_images img5887.jpg OpenLORIS_images img4068.jpg OpenLORIS_images img5437.jpg OpenLORIS_images img769.jpg OpenLORIS_images img5371.jpg OpenLORIS_images img1413.jpg OpenLORIS_images img938.jpg OpenLORIS_images img3610.jpg OpenLORIS_images img2408.jpg OpenLORIS_images img3741.jpg OpenLORIS_images img6052.jpg OpenLORIS_images img4569.jpg OpenLORIS_images img4407.jpg OpenLORIS_images img2377.jpg OpenLORIS_images img6104.jpg OpenLORIS_images img2742.jpg OpenLORIS_images img3840.jpg OpenLORIS_images img1549.jpg OpenLORIS_images img1791.jpg OpenLORIS_images img2823.jpg OpenLORIS_images img1736.jpg OpenLORIS_images img4673.jpg OpenLORIS_images img3233.jpg OpenLORIS_images img1764.jpg OpenLORIS_images img3031.jpg OpenLORIS_images img2575.jpg OpenLORIS_images img3591.jpg OpenLORIS_images img1884.jpg OpenLORIS_images img2452.jpg OpenLORIS_images img370.jpg OpenLORIS_images img5157.jpg OpenLORIS_images img2727.jpg OpenLORIS_images img5884.jpg OpenLORIS_images img5447.jpg OpenLORIS_images img4393.jpg OpenLORIS_images img4459.jpg OpenLORIS_images img6133.jpg OpenLORIS_images img494.jpg OpenLORIS_images img5508.jpg 
OpenLORIS_images img1574.jpg OpenLORIS_images img2503.jpg OpenLORIS_images img3686.jpg OpenLORIS_images img1408.jpg OpenLORIS_images img5036.jpg OpenLORIS_images img3239.jpg OpenLORIS_images img4856.jpg OpenLORIS_images img1226.jpg OpenLORIS_images img4859.jpg OpenLORIS_images img5739.jpg OpenLORIS_images img4400.jpg OpenLORIS_images img3027.jpg OpenLORIS_images img1757.jpg OpenLORIS_images img721.jpg OpenLORIS_images img4441.jpg OpenLORIS_images img4490.jpg OpenLORIS_images img5920.jpg OpenLORIS_images img4962.jpg OpenLORIS_images img2011.jpg OpenLORIS_images img1271.jpg OpenLORIS_images img4179.jpg OpenLORIS_images img1934.jpg OpenLORIS_images img333.jpg OpenLORIS_images img1125.jpg OpenLORIS_images img2716.jpg OpenLORIS_images img2674.jpg OpenLORIS_images img130.jpg OpenLORIS_images img1779.jpg OpenLORIS_images img1944.jpg OpenLORIS_images img3582.jpg OpenLORIS_images img3910.jpg OpenLORIS_images img993.jpg OpenLORIS_images img3744.jpg OpenLORIS_images img4479.jpg OpenLORIS_images img53.jpg OpenLORIS_images img4611.jpg OpenLORIS_images img4662.jpg OpenLORIS_images img2364.jpg OpenLORIS_images img1011.jpg OpenLORIS_images img5538.jpg OpenLORIS_images img5056.jpg OpenLORIS_images img877.jpg OpenLORIS_images img5037.jpg OpenLORIS_images img4764.jpg OpenLORIS_images img2797.jpg OpenLORIS_images img2104.jpg OpenLORIS_images img80.jpg OpenLORIS_images img2551.jpg OpenLORIS_images img2159.jpg OpenLORIS_images img2752.jpg OpenLORIS_images img1534.jpg OpenLORIS_images img2029.jpg OpenLORIS_images img743.jpg OpenLORIS_images img1851.jpg OpenLORIS_images img1912.jpg OpenLORIS_images img2954.jpg OpenLORIS_images img5919.jpg OpenLORIS_images img1029.jpg OpenLORIS_images img700.jpg OpenLORIS_images img2058.jpg OpenLORIS_images img3918.jpg OpenLORIS_images img5885.jpg OpenLORIS_images img2834.jpg OpenLORIS_images img2156.jpg OpenLORIS_images img2927.jpg OpenLORIS_images img1734.jpg OpenLORIS_images img5552.jpg OpenLORIS_images img3748.jpg OpenLORIS_images img3445.jpg 
OpenLORIS_images img5560.jpg OpenLORIS_images img3192.jpg OpenLORIS_images img5835.jpg OpenLORIS_images img4706.jpg OpenLORIS_images img4759.jpg OpenLORIS_images img1803.jpg OpenLORIS_images img5236.jpg OpenLORIS_images img3849.jpg OpenLORIS_images img6018.jpg OpenLORIS_images img881.jpg OpenLORIS_images img4787.jpg OpenLORIS_images img4050.jpg OpenLORIS_images img2821.jpg OpenLORIS_images img2973.jpg OpenLORIS_images img1352.jpg OpenLORIS_images img736.jpg OpenLORIS_images img1679.jpg OpenLORIS_images img2090.jpg OpenLORIS_images img1947.jpg OpenLORIS_images img5646.jpg OpenLORIS_images img2134.jpg OpenLORIS_images img648.jpg OpenLORIS_images img3322.jpg OpenLORIS_images img1296.jpg OpenLORIS_images img1405.jpg OpenLORIS_images img1042.jpg OpenLORIS_images img5904.jpg OpenLORIS_images img2965.jpg OpenLORIS_images img2129.jpg OpenLORIS_images img5079.jpg OpenLORIS_images img97.jpg OpenLORIS_images img4.jpg OpenLORIS_images img1654.jpg OpenLORIS_images img4422.jpg OpenLORIS_images img5335.jpg OpenLORIS_images img5042.jpg OpenLORIS_images img2119.jpg OpenLORIS_images img730.jpg OpenLORIS_images img884.jpg OpenLORIS_images img2376.jpg OpenLORIS_images img2588.jpg OpenLORIS_images img827.jpg OpenLORIS_images img5658.jpg OpenLORIS_images img506.jpg OpenLORIS_images img4538.jpg OpenLORIS_images img2014.jpg OpenLORIS_images img1223.jpg OpenLORIS_images img2683.jpg OpenLORIS_images img702.jpg OpenLORIS_images img1414.jpg OpenLORIS_images img3252.jpg OpenLORIS_images img2926.jpg OpenLORIS_images img1438.jpg OpenLORIS_images img2163.jpg OpenLORIS_images img1178.jpg OpenLORIS_images img5586.jpg OpenLORIS_images img951.jpg OpenLORIS_images img3142.jpg OpenLORIS_images img4266.jpg OpenLORIS_images img1015.jpg OpenLORIS_images img5664.jpg OpenLORIS_images img2609.jpg OpenLORIS_images img4708.jpg OpenLORIS_images img5063.jpg OpenLORIS_images img4824.jpg OpenLORIS_images img1760.jpg OpenLORIS_images img5592.jpg OpenLORIS_images img4317.jpg OpenLORIS_images img2074.jpg 
OpenLORIS_images img4224.jpg OpenLORIS_images img2298.jpg OpenLORIS_images img1374.jpg OpenLORIS_images img4808.jpg OpenLORIS_images img2373.jpg OpenLORIS_images img5956.jpg OpenLORIS_images img2488.jpg OpenLORIS_images img4907.jpg OpenLORIS_images img3047.jpg OpenLORIS_images img2632.jpg OpenLORIS_images img1883.jpg OpenLORIS_images img3830.jpg OpenLORIS_images img4158.jpg OpenLORIS_images img3161.jpg OpenLORIS_images img1752.jpg OpenLORIS_images img3474.jpg OpenLORIS_images img2671.jpg OpenLORIS_images img535.jpg OpenLORIS_images img1743.jpg OpenLORIS_images img3402.jpg OpenLORIS_images img3345.jpg OpenLORIS_images img2755.jpg OpenLORIS_images img1171.jpg OpenLORIS_images img3376.jpg OpenLORIS_images img4462.jpg OpenLORIS_images img5343.jpg OpenLORIS_images img4308.jpg OpenLORIS_images img1922.jpg OpenLORIS_images img1283.jpg OpenLORIS_images img5865.jpg OpenLORIS_images img4084.jpg OpenLORIS_images img2883.jpg OpenLORIS_images img5123.jpg OpenLORIS_images img4520.jpg OpenLORIS_images img4285.jpg OpenLORIS_images img4310.jpg OpenLORIS_images img4653.jpg OpenLORIS_images img85.jpg OpenLORIS_images img4687.jpg OpenLORIS_images img6031.jpg OpenLORIS_images img4951.jpg OpenLORIS_images img6073.jpg OpenLORIS_images img4452.jpg OpenLORIS_images img5663.jpg OpenLORIS_images img5021.jpg OpenLORIS_images img178.jpg OpenLORIS_images img812.jpg OpenLORIS_images img787.jpg OpenLORIS_images img4129.jpg OpenLORIS_images img2925.jpg OpenLORIS_images img4643.jpg OpenLORIS_images img4614.jpg OpenLORIS_images img3968.jpg OpenLORIS_images img3295.jpg OpenLORIS_images img3580.jpg OpenLORIS_images img2784.jpg OpenLORIS_images img2978.jpg OpenLORIS_images img3749.jpg OpenLORIS_images img1503.jpg OpenLORIS_images img1683.jpg OpenLORIS_images img2923.jpg OpenLORIS_images img2046.jpg OpenLORIS_images img6055.jpg OpenLORIS_images img1993.jpg OpenLORIS_images img4780.jpg OpenLORIS_images img2001.jpg OpenLORIS_images img5988.jpg OpenLORIS_images img3929.jpg OpenLORIS_images img4172.jpg 
OpenLORIS_images img5003.jpg OpenLORIS_images img2565.jpg OpenLORIS_images img4423.jpg OpenLORIS_images img4609.jpg OpenLORIS_images img3271.jpg OpenLORIS_images img1935.jpg OpenLORIS_images img1640.jpg OpenLORIS_images img1446.jpg OpenLORIS_images img5101.jpg OpenLORIS_images img1960.jpg OpenLORIS_images img841.jpg OpenLORIS_images img4702.jpg OpenLORIS_images img6130.jpg OpenLORIS_images img4592.jpg OpenLORIS_images img1001.jpg OpenLORIS_images img82.jpg OpenLORIS_images img5010.jpg OpenLORIS_images img1733.jpg OpenLORIS_images img1699.jpg OpenLORIS_images img2507.jpg OpenLORIS_images img1762.jpg OpenLORIS_images img2120.jpg OpenLORIS_images img1328.jpg OpenLORIS_images img3611.jpg OpenLORIS_images img3335.jpg OpenLORIS_images img3755.jpg OpenLORIS_images img6012.jpg OpenLORIS_images img3238.jpg OpenLORIS_images img1242.jpg OpenLORIS_images img87.jpg OpenLORIS_images img1897.jpg OpenLORIS_images img5820.jpg OpenLORIS_images img4474.jpg OpenLORIS_images img2647.jpg OpenLORIS_images img4676.jpg OpenLORIS_images img3145.jpg OpenLORIS_images img3603.jpg OpenLORIS_images img3085.jpg OpenLORIS_images img1686.jpg OpenLORIS_images img3091.jpg OpenLORIS_images img1313.jpg OpenLORIS_images img4625.jpg OpenLORIS_images img5616.jpg OpenLORIS_images img4070.jpg OpenLORIS_images img4245.jpg OpenLORIS_images img3803.jpg OpenLORIS_images img5567.jpg OpenLORIS_images img5279.jpg OpenLORIS_images img3155.jpg OpenLORIS_images img3136.jpg OpenLORIS_images img2812.jpg OpenLORIS_images img5637.jpg OpenLORIS_images img279.jpg OpenLORIS_images img567.jpg OpenLORIS_images img2400.jpg OpenLORIS_images img2855.jpg OpenLORIS_images img2942.jpg OpenLORIS_images img4743.jpg OpenLORIS_images img946.jpg OpenLORIS_images img176.jpg OpenLORIS_images img353.jpg OpenLORIS_images img4491.jpg OpenLORIS_images img5326.jpg OpenLORIS_images img18.jpg OpenLORIS_images img3520.jpg OpenLORIS_images img420.jpg OpenLORIS_images img3892.jpg OpenLORIS_images img3743.jpg OpenLORIS_images img1630.jpg 
OpenLORIS_images img1605.jpg OpenLORIS_images img4784.jpg OpenLORIS_images img1677.jpg OpenLORIS_images img4855.jpg OpenLORIS_images img4341.jpg OpenLORIS_images img607.jpg OpenLORIS_images img3767.jpg OpenLORIS_images img3223.jpg OpenLORIS_images img1606.jpg OpenLORIS_images img3854.jpg OpenLORIS_images img2751.jpg OpenLORIS_images img3392.jpg OpenLORIS_images img3267.jpg OpenLORIS_images img4949.jpg OpenLORIS_images img3481.jpg OpenLORIS_images img1937.jpg OpenLORIS_images img3427.jpg OpenLORIS_images img3266.jpg OpenLORIS_images img1521.jpg OpenLORIS_images img3761.jpg OpenLORIS_images img1636.jpg OpenLORIS_images img3397.jpg OpenLORIS_images img3798.jpg OpenLORIS_images img1594.jpg OpenLORIS_images img6001.jpg OpenLORIS_images img5631.jpg OpenLORIS_images img4332.jpg OpenLORIS_images img5183.jpg OpenLORIS_images img1440.jpg OpenLORIS_images img2226.jpg OpenLORIS_images img5454.jpg OpenLORIS_images img3365.jpg OpenLORIS_images img3590.jpg OpenLORIS_images img4694.jpg OpenLORIS_images img5953.jpg OpenLORIS_images img5634.jpg OpenLORIS_images img2269.jpg OpenLORIS_images img5137.jpg OpenLORIS_images img6072.jpg OpenLORIS_images img1820.jpg OpenLORIS_images img1415.jpg OpenLORIS_images img6041.jpg OpenLORIS_images img4910.jpg OpenLORIS_images img3190.jpg OpenLORIS_images img5838.jpg OpenLORIS_images img4432.jpg OpenLORIS_images img2237.jpg OpenLORIS_images img5562.jpg OpenLORIS_images img3890.jpg OpenLORIS_images img5394.jpg OpenLORIS_images img1369.jpg OpenLORIS_images img5057.jpg OpenLORIS_images img1311.jpg OpenLORIS_images img460.jpg OpenLORIS_images img2666.jpg OpenLORIS_images img3828.jpg OpenLORIS_images img1451.jpg OpenLORIS_images img801.jpg OpenLORIS_images img4692.jpg OpenLORIS_images img1496.jpg OpenLORIS_images img1114.jpg OpenLORIS_images img252.jpg OpenLORIS_images img5561.jpg OpenLORIS_images img1094.jpg OpenLORIS_images img5393.jpg OpenLORIS_images img1463.jpg OpenLORIS_images img963.jpg OpenLORIS_images img5058.jpg OpenLORIS_images img720.jpg 
OpenLORIS_images img4630.jpg OpenLORIS_images img5321.jpg OpenLORIS_images img4729.jpg OpenLORIS_images img3336.jpg OpenLORIS_images img52.jpg OpenLORIS_images img2624.jpg OpenLORIS_images img2212.jpg OpenLORIS_images img31.jpg OpenLORIS_images img3839.jpg OpenLORIS_images img4262.jpg OpenLORIS_images img260.jpg OpenLORIS_images img3364.jpg OpenLORIS_images img132.jpg OpenLORIS_images img3959.jpg OpenLORIS_images img3996.jpg OpenLORIS_images img4890.jpg OpenLORIS_images img5492.jpg OpenLORIS_images img5710.jpg OpenLORIS_images img3848.jpg OpenLORIS_images img2197.jpg OpenLORIS_images img5650.jpg OpenLORIS_images img625.jpg OpenLORIS_images img1903.jpg OpenLORIS_images img2839.jpg OpenLORIS_images img3664.jpg OpenLORIS_images img2955.jpg OpenLORIS_images img139.jpg OpenLORIS_images img5972.jpg OpenLORIS_images img2750.jpg OpenLORIS_images img5810.jpg OpenLORIS_images img1700.jpg OpenLORIS_images img93.jpg OpenLORIS_images img473.jpg OpenLORIS_images img3310.jpg OpenLORIS_images img464.jpg OpenLORIS_images img600.jpg OpenLORIS_images img3061.jpg OpenLORIS_images img5127.jpg OpenLORIS_images img2535.jpg OpenLORIS_images img1520.jpg OpenLORIS_images img4772.jpg OpenLORIS_images img3613.jpg OpenLORIS_images img4078.jpg OpenLORIS_images img2372.jpg OpenLORIS_images img3512.jpg OpenLORIS_images img1320.jpg OpenLORIS_images img4475.jpg OpenLORIS_images img688.jpg OpenLORIS_images img925.jpg OpenLORIS_images img4528.jpg OpenLORIS_images img3944.jpg OpenLORIS_images img8.jpg OpenLORIS_images img1464.jpg OpenLORIS_images img976.jpg OpenLORIS_images img1203.jpg OpenLORIS_images img722.jpg OpenLORIS_images img1739.jpg OpenLORIS_images img1796.jpg OpenLORIS_images img2057.jpg OpenLORIS_images img3912.jpg OpenLORIS_images img1620.jpg OpenLORIS_images img4447.jpg OpenLORIS_images img2578.jpg OpenLORIS_images img3455.jpg OpenLORIS_images img4149.jpg OpenLORIS_images img2419.jpg OpenLORIS_images img3627.jpg OpenLORIS_images img678.jpg OpenLORIS_images img768.jpg OpenLORIS_images 
img3465.jpg OpenLORIS_images img2993.jpg OpenLORIS_images img2516.jpg OpenLORIS_images img3568.jpg OpenLORIS_images img2146.jpg OpenLORIS_images img2092.jpg OpenLORIS_images img5267.jpg OpenLORIS_images img3675.jpg OpenLORIS_images img5511.jpg OpenLORIS_images img294.jpg OpenLORIS_images img5589.jpg OpenLORIS_images img1525.jpg OpenLORIS_images img1582.jpg OpenLORIS_images img6102.jpg OpenLORIS_images img1569.jpg OpenLORIS_images img1504.jpg OpenLORIS_images img2770.jpg OpenLORIS_images img3736.jpg OpenLORIS_images img3438.jpg OpenLORIS_images img2726.jpg OpenLORIS_images img4394.jpg OpenLORIS_images img4622.jpg OpenLORIS_images img1773.jpg OpenLORIS_images img324.jpg OpenLORIS_images img5026.jpg OpenLORIS_images img2282.jpg OpenLORIS_images img1316.jpg OpenLORIS_images img5224.jpg OpenLORIS_images img3853.jpg OpenLORIS_images img4187.jpg OpenLORIS_images img709.jpg OpenLORIS_images img1528.jpg OpenLORIS_images img4691.jpg OpenLORIS_images img5309.jpg OpenLORIS_images img1675.jpg OpenLORIS_images img719.jpg OpenLORIS_images img361.jpg OpenLORIS_images img1508.jpg OpenLORIS_images img3218.jpg OpenLORIS_images img5532.jpg OpenLORIS_images img1354.jpg OpenLORIS_images img2130.jpg OpenLORIS_images img1000.jpg OpenLORIS_images img205.jpg OpenLORIS_images img6054.jpg OpenLORIS_images img1224.jpg OpenLORIS_images img6046.jpg OpenLORIS_images img1923.jpg OpenLORIS_images img1784.jpg OpenLORIS_images img3589.jpg OpenLORIS_images img3110.jpg OpenLORIS_images img3370.jpg OpenLORIS_images img3518.jpg OpenLORIS_images img920.jpg OpenLORIS_images img5349.jpg OpenLORIS_images img3847.jpg OpenLORIS_images img4157.jpg OpenLORIS_images img2667.jpg OpenLORIS_images img4135.jpg OpenLORIS_images img394.jpg OpenLORIS_images img4872.jpg OpenLORIS_images img5654.jpg OpenLORIS_images img5218.jpg OpenLORIS_images img3262.jpg OpenLORIS_images img5287.jpg OpenLORIS_images img4008.jpg OpenLORIS_images img3070.jpg OpenLORIS_images img55.jpg OpenLORIS_images img3560.jpg OpenLORIS_images 
img1918.jpg OpenLORIS_images img3113.jpg OpenLORIS_images img2957.jpg OpenLORIS_images img3833.jpg OpenLORIS_images img2934.jpg OpenLORIS_images img4375.jpg OpenLORIS_images img4030.jpg OpenLORIS_images img159.jpg OpenLORIS_images img6129.jpg OpenLORIS_images img4679.jpg OpenLORIS_images img1847.jpg OpenLORIS_images img871.jpg OpenLORIS_images img4806.jpg OpenLORIS_images img3334.jpg OpenLORIS_images img2110.jpg OpenLORIS_images img4600.jpg OpenLORIS_images img453.jpg OpenLORIS_images img5574.jpg OpenLORIS_images img6126.jpg OpenLORIS_images img2307.jpg OpenLORIS_images img5672.jpg OpenLORIS_images img5520.jpg OpenLORIS_images img2171.jpg OpenLORIS_images img2480.jpg OpenLORIS_images img3852.jpg OpenLORIS_images img815.jpg OpenLORIS_images img2051.jpg OpenLORIS_images img561.jpg OpenLORIS_images img3814.jpg OpenLORIS_images img5452.jpg OpenLORIS_images img1865.jpg OpenLORIS_images img4019.jpg OpenLORIS_images img413.jpg OpenLORIS_images img5167.jpg OpenLORIS_images img4919.jpg OpenLORIS_images img5220.jpg OpenLORIS_images img2114.jpg OpenLORIS_images img5248.jpg OpenLORIS_images img816.jpg OpenLORIS_images img1460.jpg OpenLORIS_images img1838.jpg OpenLORIS_images img1427.jpg OpenLORIS_images img4238.jpg OpenLORIS_images img3137.jpg OpenLORIS_images img4864.jpg OpenLORIS_images img3490.jpg OpenLORIS_images img5823.jpg OpenLORIS_images img5678.jpg OpenLORIS_images img1550.jpg OpenLORIS_images img5043.jpg OpenLORIS_images img2690.jpg OpenLORIS_images img2183.jpg OpenLORIS_images img557.jpg OpenLORIS_images img3746.jpg OpenLORIS_images img2613.jpg OpenLORIS_images img879.jpg OpenLORIS_images img1513.jpg OpenLORIS_images img4067.jpg OpenLORIS_images img2374.jpg OpenLORIS_images img3991.jpg OpenLORIS_images img805.jpg OpenLORIS_images img6139.jpg OpenLORIS_images img3804.jpg OpenLORIS_images img1325.jpg OpenLORIS_images img923.jpg OpenLORIS_images img3802.jpg OpenLORIS_images img3060.jpg OpenLORIS_images img917.jpg OpenLORIS_images img5420.jpg OpenLORIS_images img784.jpg 
OpenLORIS_images img1373.jpg OpenLORIS_images img3510.jpg OpenLORIS_images img170.jpg OpenLORIS_images img640.jpg OpenLORIS_images img5822.jpg OpenLORIS_images img1708.jpg OpenLORIS_images img2172.jpg OpenLORIS_images img5516.jpg OpenLORIS_images img5164.jpg OpenLORIS_images img3412.jpg OpenLORIS_images img2882.jpg OpenLORIS_images img703.jpg OpenLORIS_images img369.jpg OpenLORIS_images img1986.jpg OpenLORIS_images img5290.jpg OpenLORIS_images img4631.jpg OpenLORIS_images img3857.jpg OpenLORIS_images img504.jpg OpenLORIS_images img11.jpg OpenLORIS_images img154.jpg OpenLORIS_images img5940.jpg OpenLORIS_images img919.jpg OpenLORIS_images img6127.jpg OpenLORIS_images img316.jpg OpenLORIS_images img706.jpg OpenLORIS_images img4463.jpg OpenLORIS_images img2805.jpg OpenLORIS_images img5757.jpg OpenLORIS_images img1395.jpg OpenLORIS_images img2679.jpg OpenLORIS_images img746.jpg OpenLORIS_images img979.jpg OpenLORIS_images img6049.jpg OpenLORIS_images img2349.jpg OpenLORIS_images img3990.jpg OpenLORIS_images img4655.jpg OpenLORIS_images img3681.jpg OpenLORIS_images img5555.jpg OpenLORIS_images img4690.jpg OpenLORIS_images img1359.jpg OpenLORIS_images img3689.jpg OpenLORIS_images img1254.jpg OpenLORIS_images img3966.jpg OpenLORIS_images img5984.jpg OpenLORIS_images img3909.jpg OpenLORIS_images img4428.jpg OpenLORIS_images img2048.jpg OpenLORIS_images img1310.jpg OpenLORIS_images img5373.jpg OpenLORIS_images img1070.jpg OpenLORIS_images img2033.jpg OpenLORIS_images img1671.jpg OpenLORIS_images img1491.jpg OpenLORIS_images img950.jpg OpenLORIS_images img5151.jpg OpenLORIS_images img5731.jpg OpenLORIS_images img2808.jpg OpenLORIS_images img5.jpg OpenLORIS_images img1781.jpg OpenLORIS_images img1187.jpg OpenLORIS_images img5061.jpg OpenLORIS_images img5979.jpg OpenLORIS_images img156.jpg OpenLORIS_images img4719.jpg OpenLORIS_images img4851.jpg OpenLORIS_images img1698.jpg OpenLORIS_images img5502.jpg OpenLORIS_images img526.jpg OpenLORIS_images img1538.jpg OpenLORIS_images 
img5522.jpg OpenLORIS_images img4170.jpg OpenLORIS_images img5330.jpg OpenLORIS_images img4956.jpg OpenLORIS_images img2062.jpg OpenLORIS_images img5769.jpg OpenLORIS_images img4813.jpg OpenLORIS_images img1737.jpg OpenLORIS_images img1120.jpg OpenLORIS_images img327.jpg OpenLORIS_images img5603.jpg OpenLORIS_images img5981.jpg OpenLORIS_images img5729.jpg OpenLORIS_images img2055.jpg OpenLORIS_images img4230.jpg OpenLORIS_images img3152.jpg OpenLORIS_images img2758.jpg OpenLORIS_images img1597.jpg OpenLORIS_images img6006.jpg OpenLORIS_images img3604.jpg OpenLORIS_images img4716.jpg OpenLORIS_images img1389.jpg OpenLORIS_images img3851.jpg OpenLORIS_images img4667.jpg OpenLORIS_images img3669.jpg OpenLORIS_images img3265.jpg OpenLORIS_images img1458.jpg OpenLORIS_images img4402.jpg OpenLORIS_images img3561.jpg OpenLORIS_images img2990.jpg OpenLORIS_images img141.jpg OpenLORIS_images img2860.jpg OpenLORIS_images img1526.jpg OpenLORIS_images img4633.jpg OpenLORIS_images img1043.jpg OpenLORIS_images img852.jpg OpenLORIS_images img641.jpg OpenLORIS_images img1842.jpg OpenLORIS_images img3964.jpg OpenLORIS_images img4236.jpg OpenLORIS_images img224.jpg OpenLORIS_images img2544.jpg OpenLORIS_images img203.jpg OpenLORIS_images img3311.jpg OpenLORIS_images img4151.jpg OpenLORIS_images img982.jpg OpenLORIS_images img3506.jpg OpenLORIS_images img2796.jpg OpenLORIS_images img1667.jpg OpenLORIS_images img2089.jpg OpenLORIS_images img1439.jpg OpenLORIS_images img5019.jpg OpenLORIS_images img3942.jpg OpenLORIS_images img3958.jpg OpenLORIS_images img2930.jpg OpenLORIS_images img1995.jpg OpenLORIS_images img2995.jpg OpenLORIS_images img834.jpg OpenLORIS_images img404.jpg OpenLORIS_images img4558.jpg OpenLORIS_images img4219.jpg OpenLORIS_images img3845.jpg OpenLORIS_images img3458.jpg OpenLORIS_images img1900.jpg OpenLORIS_images img5891.jpg OpenLORIS_images img5103.jpg OpenLORIS_images img3935.jpg OpenLORIS_images img5905.jpg OpenLORIS_images img2656.jpg OpenLORIS_images 
img4000.jpg OpenLORIS_images img4345.jpg OpenLORIS_images img4016.jpg OpenLORIS_images img4299.jpg OpenLORIS_images img1111.jpg OpenLORIS_images img5985.jpg OpenLORIS_images img5869.jpg OpenLORIS_images img3782.jpg OpenLORIS_images img25.jpg OpenLORIS_images img2649.jpg OpenLORIS_images img776.jpg OpenLORIS_images img3673.jpg OpenLORIS_images img2698.jpg OpenLORIS_images img2597.jpg OpenLORIS_images img4502.jpg OpenLORIS_images img502.jpg OpenLORIS_images img971.jpg OpenLORIS_images img2105.jpg OpenLORIS_images img2633.jpg OpenLORIS_images img1531.jpg OpenLORIS_images img1221.jpg OpenLORIS_images img3317.jpg OpenLORIS_images img4888.jpg OpenLORIS_images img2991.jpg OpenLORIS_images img476.jpg OpenLORIS_images img149.jpg OpenLORIS_images img2131.jpg OpenLORIS_images img401.jpg OpenLORIS_images img4988.jpg OpenLORIS_images img2875.jpg OpenLORIS_images img2732.jpg OpenLORIS_images img4902.jpg OpenLORIS_images img5007.jpg OpenLORIS_images img5524.jpg OpenLORIS_images img3597.jpg OpenLORIS_images img4779.jpg OpenLORIS_images img2203.jpg OpenLORIS_images img4175.jpg OpenLORIS_images img943.jpg OpenLORIS_images img3717.jpg OpenLORIS_images img922.jpg OpenLORIS_images img313.jpg OpenLORIS_images img5742.jpg OpenLORIS_images img5745.jpg OpenLORIS_images img3016.jpg OpenLORIS_images img2662.jpg OpenLORIS_images img2030.jpg OpenLORIS_images img3608.jpg OpenLORIS_images img3201.jpg OpenLORIS_images img3495.jpg OpenLORIS_images img1505.jpg OpenLORIS_images img2916.jpg OpenLORIS_images img5969.jpg OpenLORIS_images img5895.jpg OpenLORIS_images img3546.jpg OpenLORIS_images img22.jpg OpenLORIS_images img4494.jpg OpenLORIS_images img2082.jpg OpenLORIS_images img5977.jpg OpenLORIS_images img2579.jpg OpenLORIS_images img3081.jpg OpenLORIS_images img4127.jpg OpenLORIS_images img4356.jpg OpenLORIS_images img4789.jpg OpenLORIS_images img2807.jpg OpenLORIS_images img4839.jpg OpenLORIS_images img5129.jpg OpenLORIS_images img1424.jpg OpenLORIS_images img2021.jpg OpenLORIS_images img2385.jpg 
OpenLORIS_images img4950.jpg OpenLORIS_images img1981.jpg OpenLORIS_images img3425.jpg OpenLORIS_images img2515.jpg OpenLORIS_images img5353.jpg OpenLORIS_images img3946.jpg OpenLORIS_images img1282.jpg OpenLORIS_images img821.jpg OpenLORIS_images img525.jpg OpenLORIS_images img4827.jpg OpenLORIS_images img499.jpg OpenLORIS_images img482.jpg OpenLORIS_images img3138.jpg OpenLORIS_images img90.jpg OpenLORIS_images img4147.jpg OpenLORIS_images img3674.jpg OpenLORIS_images img336.jpg OpenLORIS_images img3987.jpg OpenLORIS_images img4896.jpg OpenLORIS_images img4415.jpg OpenLORIS_images img4669.jpg OpenLORIS_images img5929.jpg OpenLORIS_images img5601.jpg OpenLORIS_images img4795.jpg OpenLORIS_images img1201.jpg OpenLORIS_images img5550.jpg OpenLORIS_images img838.jpg OpenLORIS_images img4258.jpg OpenLORIS_images img1263.jpg OpenLORIS_images img347.jpg OpenLORIS_images img712.jpg OpenLORIS_images img3724.jpg OpenLORIS_images img785.jpg OpenLORIS_images img5148.jpg OpenLORIS_images img184.jpg OpenLORIS_images img1966.jpg OpenLORIS_images img3067.jpg OpenLORIS_images img3653.jpg OpenLORIS_images img444.jpg OpenLORIS_images img4944.jpg OpenLORIS_images img4897.jpg OpenLORIS_images img1859.jpg OpenLORIS_images img4989.jpg OpenLORIS_images img5225.jpg OpenLORIS_images img3905.jpg OpenLORIS_images img1394.jpg OpenLORIS_images img5263.jpg OpenLORIS_images img4425.jpg OpenLORIS_images img4936.jpg OpenLORIS_images img4155.jpg OpenLORIS_images img5196.jpg OpenLORIS_images img3329.jpg OpenLORIS_images img91.jpg OpenLORIS_images img2495.jpg OpenLORIS_images img857.jpg OpenLORIS_images img1786.jpg OpenLORIS_images img782.jpg OpenLORIS_images img4508.jpg OpenLORIS_images img3395.jpg OpenLORIS_images img5186.jpg OpenLORIS_images img1658.jpg OpenLORIS_images img3656.jpg OpenLORIS_images img2009.jpg OpenLORIS_images img40.jpg OpenLORIS_images img5579.jpg OpenLORIS_images img4156.jpg OpenLORIS_images img727.jpg OpenLORIS_images img477.jpg OpenLORIS_images img5354.jpg OpenLORIS_images 
img2777.jpg OpenLORIS_images img2175.jpg OpenLORIS_images img2133.jpg OpenLORIS_images img3259.jpg OpenLORIS_images img3214.jpg OpenLORIS_images img2330.jpg OpenLORIS_images img2297.jpg OpenLORIS_images img685.jpg OpenLORIS_images img1122.jpg OpenLORIS_images img2802.jpg OpenLORIS_images img931.jpg OpenLORIS_images img5435.jpg OpenLORIS_images img3550.jpg OpenLORIS_images img5175.jpg OpenLORIS_images img3337.jpg OpenLORIS_images img3203.jpg OpenLORIS_images img1317.jpg OpenLORIS_images img5565.jpg OpenLORIS_images img5661.jpg OpenLORIS_images img5378.jpg OpenLORIS_images img4460.jpg OpenLORIS_images img3643.jpg OpenLORIS_images img3342.jpg OpenLORIS_images img2989.jpg OpenLORIS_images img1344.jpg OpenLORIS_images img3244.jpg OpenLORIS_images img3751.jpg OpenLORIS_images img5995.jpg OpenLORIS_images img5514.jpg OpenLORIS_images img1181.jpg OpenLORIS_images img978.jpg OpenLORIS_images img864.jpg OpenLORIS_images img3101.jpg OpenLORIS_images img2086.jpg OpenLORIS_images img4253.jpg OpenLORIS_images img4133.jpg OpenLORIS_images img5463.jpg OpenLORIS_images img2195.jpg OpenLORIS_images img5111.jpg OpenLORIS_images img4946.jpg OpenLORIS_images img1331.jpg OpenLORIS_images img4247.jpg OpenLORIS_images img399.jpg OpenLORIS_images img4825.jpg OpenLORIS_images img2070.jpg OpenLORIS_images img2514.jpg OpenLORIS_images img5340.jpg OpenLORIS_images img6036.jpg OpenLORIS_images img3048.jpg OpenLORIS_images img406.jpg OpenLORIS_images img3936.jpg OpenLORIS_images img246.jpg OpenLORIS_images img2768.jpg OpenLORIS_images img4593.jpg OpenLORIS_images img4320.jpg OpenLORIS_images img4359.jpg OpenLORIS_images img4590.jpg OpenLORIS_images img5633.jpg OpenLORIS_images img3781.jpg OpenLORIS_images img2280.jpg OpenLORIS_images img5483.jpg OpenLORIS_images img4629.jpg OpenLORIS_images img3033.jpg OpenLORIS_images img5958.jpg OpenLORIS_images img4430.jpg OpenLORIS_images img1881.jpg OpenLORIS_images img5331.jpg OpenLORIS_images img4568.jpg OpenLORIS_images img513.jpg OpenLORIS_images 
img4517.jpg OpenLORIS_images img2525.jpg OpenLORIS_images img2877.jpg OpenLORIS_images img5806.jpg OpenLORIS_images img2368.jpg OpenLORIS_images img2351.jpg OpenLORIS_images img596.jpg OpenLORIS_images img808.jpg OpenLORIS_images img4090.jpg OpenLORIS_images img3408.jpg OpenLORIS_images img4557.jpg OpenLORIS_images img4961.jpg OpenLORIS_images img1735.jpg OpenLORIS_images img5619.jpg OpenLORIS_images img4647.jpg OpenLORIS_images img1063.jpg OpenLORIS_images img4688.jpg OpenLORIS_images img3043.jpg OpenLORIS_images img2462.jpg OpenLORIS_images img5734.jpg OpenLORIS_images img6044.jpg OpenLORIS_images img1106.jpg OpenLORIS_images img3605.jpg OpenLORIS_images img5569.jpg OpenLORIS_images img5376.jpg OpenLORIS_images img2874.jpg OpenLORIS_images img4438.jpg OpenLORIS_images img409.jpg OpenLORIS_images img2460.jpg OpenLORIS_images img1137.jpg OpenLORIS_images img2265.jpg OpenLORIS_images img1421.jpg OpenLORIS_images img3949.jpg OpenLORIS_images img929.jpg OpenLORIS_images img5626.jpg OpenLORIS_images img2257.jpg OpenLORIS_images img3957.jpg OpenLORIS_images img3040.jpg OpenLORIS_images img4444.jpg OpenLORIS_images img1384.jpg OpenLORIS_images img6009.jpg OpenLORIS_images img2059.jpg OpenLORIS_images img677.jpg OpenLORIS_images img4948.jpg OpenLORIS_images img3466.jpg OpenLORIS_images img4775.jpg OpenLORIS_images img3207.jpg OpenLORIS_images img1746.jpg OpenLORIS_images img3286.jpg OpenLORIS_images img1692.jpg OpenLORIS_images img3165.jpg OpenLORIS_images img3050.jpg OpenLORIS_images img991.jpg OpenLORIS_images img6136.jpg OpenLORIS_images img1956.jpg OpenLORIS_images img3122.jpg OpenLORIS_images img4057.jpg OpenLORIS_images img732.jpg OpenLORIS_images img4125.jpg OpenLORIS_images img240.jpg OpenLORIS_images img4975.jpg OpenLORIS_images img5563.jpg OpenLORIS_images img5484.jpg OpenLORIS_images img1805.jpg OpenLORIS_images img5469.jpg OpenLORIS_images img5771.jpg OpenLORIS_images img408.jpg OpenLORIS_images img5299.jpg OpenLORIS_images img99.jpg OpenLORIS_images 
img4305.jpg OpenLORIS_images img5453.jpg OpenLORIS_images img956.jpg OpenLORIS_images img339.jpg OpenLORIS_images img2404.jpg OpenLORIS_images img5122.jpg OpenLORIS_images img3194.jpg OpenLORIS_images img3665.jpg OpenLORIS_images img4792.jpg OpenLORIS_images img4334.jpg OpenLORIS_images img781.jpg OpenLORIS_images img102.jpg OpenLORIS_images img1622.jpg OpenLORIS_images img4061.jpg OpenLORIS_images img1098.jpg OpenLORIS_images img5449.jpg OpenLORIS_images img4377.jpg OpenLORIS_images img5882.jpg OpenLORIS_images img4675.jpg OpenLORIS_images img4863.jpg OpenLORIS_images img2894.jpg OpenLORIS_images img3823.jpg OpenLORIS_images img1899.jpg OpenLORIS_images img2621.jpg OpenLORIS_images img2253.jpg OpenLORIS_images img6037.jpg OpenLORIS_images img5135.jpg OpenLORIS_images img1602.jpg OpenLORIS_images img4626.jpg OpenLORIS_images img2251.jpg OpenLORIS_images img2006.jpg OpenLORIS_images img579.jpg OpenLORIS_images img1480.jpg OpenLORIS_images img3237.jpg OpenLORIS_images img3955.jpg OpenLORIS_images img3529.jpg OpenLORIS_images img4413.jpg OpenLORIS_images img2765.jpg OpenLORIS_images img4666.jpg OpenLORIS_images img5715.jpg OpenLORIS_images img3384.jpg OpenLORIS_images img5537.jpg OpenLORIS_images img5381.jpg OpenLORIS_images img3319.jpg OpenLORIS_images img2464.jpg OpenLORIS_images img4442.jpg OpenLORIS_images img2359.jpg OpenLORIS_images img371.jpg OpenLORIS_images img2591.jpg OpenLORIS_images img5623.jpg OpenLORIS_images img3953.jpg OpenLORIS_images img1398.jpg OpenLORIS_images img1507.jpg OpenLORIS_images img4483.jpg OpenLORIS_images img757.jpg OpenLORIS_images img5227.jpg OpenLORIS_images img297.jpg OpenLORIS_images img2184.jpg OpenLORIS_images img566.jpg OpenLORIS_images img5192.jpg OpenLORIS_images img1604.jpg OpenLORIS_images img3255.jpg OpenLORIS_images img3134.jpg OpenLORIS_images img3343.jpg OpenLORIS_images img5818.jpg OpenLORIS_images img2473.jpg OpenLORIS_images img1616.jpg OpenLORIS_images img4124.jpg OpenLORIS_images img3302.jpg OpenLORIS_images 
img300.jpg OpenLORIS_images img3174.jpg OpenLORIS_images img3820.jpg OpenLORIS_images img288.jpg OpenLORIS_images img2610.jpg OpenLORIS_images img1586.jpg OpenLORIS_images img325.jpg OpenLORIS_images img331.jpg OpenLORIS_images img4563.jpg OpenLORIS_images img3069.jpg OpenLORIS_images img2201.jpg OpenLORIS_images img2529.jpg OpenLORIS_images img1694.jpg OpenLORIS_images img5479.jpg OpenLORIS_images img376.jpg OpenLORIS_images img2615.jpg OpenLORIS_images img1645.jpg OpenLORIS_images img3883.jpg OpenLORIS_images img4095.jpg OpenLORIS_images img4174.jpg OpenLORIS_images img934.jpg OpenLORIS_images img635.jpg OpenLORIS_images img638.jpg OpenLORIS_images img2100.jpg OpenLORIS_images img5478.jpg OpenLORIS_images img3171.jpg OpenLORIS_images img42.jpg OpenLORIS_images img620.jpg OpenLORIS_images img2977.jpg OpenLORIS_images img649.jpg OpenLORIS_images img5911.jpg OpenLORIS_images img5171.jpg OpenLORIS_images img1614.jpg OpenLORIS_images img2556.jpg OpenLORIS_images img5736.jpg OpenLORIS_images img5210.jpg OpenLORIS_images img5023.jpg OpenLORIS_images img3502.jpg OpenLORIS_images img5829.jpg OpenLORIS_images img3886.jpg OpenLORIS_images img318.jpg OpenLORIS_images img896.jpg OpenLORIS_images img1792.jpg OpenLORIS_images img4573.jpg OpenLORIS_images img6143.jpg OpenLORIS_images img4865.jpg OpenLORIS_images img4278.jpg OpenLORIS_images img4718.jpg OpenLORIS_images img2536.jpg OpenLORIS_images img5599.jpg OpenLORIS_images img2828.jpg OpenLORIS_images img2641.jpg OpenLORIS_images img2458.jpg OpenLORIS_images img1016.jpg OpenLORIS_images img147.jpg OpenLORIS_images img1576.jpg OpenLORIS_images img1535.jpg OpenLORIS_images img3415.jpg OpenLORIS_images img2260.jpg OpenLORIS_images img4094.jpg OpenLORIS_images img1633.jpg OpenLORIS_images img37.jpg OpenLORIS_images img3760.jpg OpenLORIS_images img2572.jpg OpenLORIS_images img2463.jpg OpenLORIS_images img5448.jpg OpenLORIS_images img3433.jpg OpenLORIS_images img5812.jpg OpenLORIS_images img745.jpg OpenLORIS_images img4674.jpg 
OpenLORIS_images img1703.jpg OpenLORIS_images img4363.jpg OpenLORIS_images img2951.jpg OpenLORIS_images img2792.jpg OpenLORIS_images img6076.jpg OpenLORIS_images img1194.jpg OpenLORIS_images img2121.jpg OpenLORIS_images img5639.jpg OpenLORIS_images img1674.jpg OpenLORIS_images img2903.jpg OpenLORIS_images img1165.jpg OpenLORIS_images img4798.jpg OpenLORIS_images img3778.jpg OpenLORIS_images img4223.jpg OpenLORIS_images img242.jpg OpenLORIS_images img4660.jpg OpenLORIS_images img45.jpg OpenLORIS_images img883.jpg OpenLORIS_images img3898.jpg OpenLORIS_images img1110.jpg OpenLORIS_images img2493.jpg OpenLORIS_images img160.jpg OpenLORIS_images img3000.jpg OpenLORIS_images img4644.jpg OpenLORIS_images img4461.jpg OpenLORIS_images img1835.jpg OpenLORIS_images img5556.jpg OpenLORIS_images img3170.jpg OpenLORIS_images img1047.jpg OpenLORIS_images img277.jpg OpenLORIS_images img3542.jpg OpenLORIS_images img5433.jpg OpenLORIS_images img811.jpg OpenLORIS_images img2064.jpg OpenLORIS_images img299.jpg OpenLORIS_images img1180.jpg OpenLORIS_images img3094.jpg OpenLORIS_images img1195.jpg OpenLORIS_images img4146.jpg OpenLORIS_images img1176.jpg OpenLORIS_images img3943.jpg OpenLORIS_images img274.jpg OpenLORIS_images img3463.jpg OpenLORIS_images img3829.jpg OpenLORIS_images img4899.jpg OpenLORIS_images img972.jpg OpenLORIS_images img5387.jpg OpenLORIS_images img503.jpg OpenLORIS_images img3042.jpg OpenLORIS_images img1245.jpg OpenLORIS_images img4980.jpg OpenLORIS_images img1691.jpg OpenLORIS_images img512.jpg OpenLORIS_images img3710.jpg OpenLORIS_images img5276.jpg OpenLORIS_images img5573.jpg OpenLORIS_images img3001.jpg OpenLORIS_images img3595.jpg OpenLORIS_images img4472.jpg OpenLORIS_images img1230.jpg OpenLORIS_images img4495.jpg OpenLORIS_images img3387.jpg OpenLORIS_images img1794.jpg OpenLORIS_images img2103.jpg OpenLORIS_images img2028.jpg OpenLORIS_images img593.jpg OpenLORIS_images img2705.jpg OpenLORIS_images img5235.jpg OpenLORIS_images img3159.jpg 
OpenLORIS_images img1322.jpg OpenLORIS_images img2390.jpg OpenLORIS_images img3975.jpg OpenLORIS_images img1002.jpg OpenLORIS_images img2837.jpg OpenLORIS_images img2352.jpg OpenLORIS_images img2618.jpg OpenLORIS_images img952.jpg OpenLORIS_images img2670.jpg OpenLORIS_images img1511.jpg OpenLORIS_images img1711.jpg OpenLORIS_images img1742.jpg OpenLORIS_images img4553.jpg OpenLORIS_images img4873.jpg OpenLORIS_images img3283.jpg OpenLORIS_images img1225.jpg OpenLORIS_images img3523.jpg OpenLORIS_images img471.jpg OpenLORIS_images img5089.jpg OpenLORIS_images img5215.jpg OpenLORIS_images img4276.jpg OpenLORIS_images img6108.jpg OpenLORIS_images img1179.jpg OpenLORIS_images img4838.jpg OpenLORIS_images img5489.jpg OpenLORIS_images img5366.jpg OpenLORIS_images img2933.jpg OpenLORIS_images img5139.jpg OpenLORIS_images img1709.jpg OpenLORIS_images img5581.jpg OpenLORIS_images img6092.jpg OpenLORIS_images img5583.jpg OpenLORIS_images img317.jpg OpenLORIS_images img5765.jpg OpenLORIS_images img414.jpg OpenLORIS_images img4615.jpg OpenLORIS_images img345.jpg OpenLORIS_images img385.jpg OpenLORIS_images img5431.jpg OpenLORIS_images img3698.jpg OpenLORIS_images img391.jpg OpenLORIS_images img3534.jpg OpenLORIS_images img4554.jpg OpenLORIS_images img2760.jpg OpenLORIS_images img2209.jpg OpenLORIS_images img3930.jpg OpenLORIS_images img1485.jpg OpenLORIS_images img2304.jpg OpenLORIS_images img5388.jpg OpenLORIS_images img4945.jpg OpenLORIS_images img133.jpg OpenLORIS_images img395.jpg OpenLORIS_images img1681.jpg OpenLORIS_images img4552.jpg OpenLORIS_images img4793.jpg OpenLORIS_images img2878.jpg OpenLORIS_images img5352.jpg OpenLORIS_images img349.jpg OpenLORIS_images img1541.jpg OpenLORIS_images img1107.jpg OpenLORIS_images img818.jpg OpenLORIS_images img4153.jpg OpenLORIS_images img1894.jpg OpenLORIS_images img3623.jpg OpenLORIS_images img3390.jpg OpenLORIS_images img4973.jpg OpenLORIS_images img1324.jpg OpenLORIS_images img2012.jpg OpenLORIS_images img5906.jpg 
OpenLORIS_images img3279.jpg OpenLORIS_images img282.jpg OpenLORIS_images img2442.jpg OpenLORIS_images img1078.jpg OpenLORIS_images img2736.jpg OpenLORIS_images img1927.jpg OpenLORIS_images img880.jpg OpenLORIS_images img290.jpg OpenLORIS_images img1808.jpg OpenLORIS_images img755.jpg OpenLORIS_images img5582.jpg OpenLORIS_images img4971.jpg OpenLORIS_images img1517.jpg OpenLORIS_images img3680.jpg OpenLORIS_images img424.jpg OpenLORIS_images img569.jpg OpenLORIS_images img3158.jpg OpenLORIS_images img2739.jpg OpenLORIS_images img1793.jpg OpenLORIS_images img3538.jpg OpenLORIS_images img4401.jpg OpenLORIS_images img3876.jpg OpenLORIS_images img668.jpg OpenLORIS_images img1554.jpg OpenLORIS_images img1839.jpg OpenLORIS_images img1680.jpg OpenLORIS_images img4140.jpg OpenLORIS_images img2010.jpg OpenLORIS_images img3551.jpg OpenLORIS_images img4595.jpg OpenLORIS_images img3601.jpg OpenLORIS_images img5389.jpg OpenLORIS_images img2868.jpg OpenLORIS_images img3922.jpg OpenLORIS_images img1133.jpg OpenLORIS_images img3906.jpg OpenLORIS_images img5474.jpg OpenLORIS_images img4978.jpg OpenLORIS_images img2162.jpg OpenLORIS_images img3967.jpg OpenLORIS_images img4943.jpg OpenLORIS_images img5458.jpg OpenLORIS_images img740.jpg OpenLORIS_images img5038.jpg OpenLORIS_images img2467.jpg OpenLORIS_images img4499.jpg OpenLORIS_images img3988.jpg OpenLORIS_images img4619.jpg OpenLORIS_images img4403.jpg OpenLORIS_images img5548.jpg OpenLORIS_images img729.jpg OpenLORIS_images img354.jpg OpenLORIS_images img5258.jpg OpenLORIS_images img1910.jpg OpenLORIS_images img2238.jpg OpenLORIS_images img330.jpg OpenLORIS_images img5842.jpg OpenLORIS_images img1957.jpg OpenLORIS_images img1608.jpg OpenLORIS_images img5799.jpg OpenLORIS_images img3075.jpg OpenLORIS_images img5966.jpg OpenLORIS_images img4518.jpg OpenLORIS_images img4467.jpg OpenLORIS_images img5450.jpg OpenLORIS_images img4976.jpg OpenLORIS_images img3400.jpg OpenLORIS_images img2521.jpg OpenLORIS_images img1822.jpg 
OpenLORIS_images img6059.jpg OpenLORIS_images img3092.jpg OpenLORIS_images img1261.jpg OpenLORIS_images img437.jpg OpenLORIS_images img4458.jpg OpenLORIS_images img454.jpg OpenLORIS_images img5669.jpg OpenLORIS_images img2826.jpg OpenLORIS_images img4102.jpg OpenLORIS_images img775.jpg OpenLORIS_images img4171.jpg OpenLORIS_images img5544.jpg OpenLORIS_images img2384.jpg OpenLORIS_images img2501.jpg OpenLORIS_images img3508.jpg OpenLORIS_images img639.jpg OpenLORIS_images img5635.jpg OpenLORIS_images img3331.jpg OpenLORIS_images img334.jpg OpenLORIS_images img4880.jpg OpenLORIS_images img6017.jpg OpenLORIS_images img1417.jpg OpenLORIS_images img1269.jpg OpenLORIS_images img356.jpg OpenLORIS_images img4051.jpg OpenLORIS_images img1191.jpg OpenLORIS_images img2689.jpg OpenLORIS_images img5572.jpg OpenLORIS_images img4803.jpg OpenLORIS_images img5526.jpg OpenLORIS_images img487.jpg OpenLORIS_images img5417.jpg OpenLORIS_images img5159.jpg OpenLORIS_images img213.jpg OpenLORIS_images img3446.jpg OpenLORIS_images img5386.jpg OpenLORIS_images img243.jpg OpenLORIS_images img2620.jpg OpenLORIS_images img270.jpg OpenLORIS_images img5922.jpg OpenLORIS_images img5027.jpg OpenLORIS_images img4349.jpg OpenLORIS_images img5336.jpg OpenLORIS_images img1533.jpg OpenLORIS_images img694.jpg OpenLORIS_images img2866.jpg OpenLORIS_images img2004.jpg OpenLORIS_images img3062.jpg OpenLORIS_images img3166.jpg OpenLORIS_images img2053.jpg OpenLORIS_images img1853.jpg OpenLORIS_images img3297.jpg OpenLORIS_images img5604.jpg OpenLORIS_images img2431.jpg OpenLORIS_images img2109.jpg OpenLORIS_images img2891.jpg OpenLORIS_images img6070.jpg OpenLORIS_images img64.jpg OpenLORIS_images img5970.jpg OpenLORIS_images img5684.jpg OpenLORIS_images img4908.jpg OpenLORIS_images img4963.jpg OpenLORIS_images img4420.jpg OpenLORIS_images img2619.jpg OpenLORIS_images img3131.jpg OpenLORIS_images img1260.jpg OpenLORIS_images img1731.jpg OpenLORIS_images img2426.jpg OpenLORIS_images img623.jpg 
OpenLORIS_images img1435.jpg OpenLORIS_images img1970.jpg OpenLORIS_images img1982.jpg OpenLORIS_images img4501.jpg OpenLORIS_images img4915.jpg OpenLORIS_images img2451.jpg OpenLORIS_images img3763.jpg OpenLORIS_images img1217.jpg OpenLORIS_images img6111.jpg OpenLORIS_images img843.jpg OpenLORIS_images img4433.jpg OpenLORIS_images img2350.jpg OpenLORIS_images img1747.jpg OpenLORIS_images img3210.jpg OpenLORIS_images img1347.jpg OpenLORIS_images img195.jpg OpenLORIS_images img3087.jpg OpenLORIS_images img1721.jpg OpenLORIS_images img2027.jpg OpenLORIS_images img5780.jpg OpenLORIS_images img4982.jpg OpenLORIS_images img2799.jpg OpenLORIS_images img1804.jpg OpenLORIS_images img4693.jpg OpenLORIS_images img5746.jpg OpenLORIS_images img1228.jpg OpenLORIS_images img1232.jpg OpenLORIS_images img4652.jpg OpenLORIS_images img4455.jpg OpenLORIS_images img3730.jpg OpenLORIS_images img2850.jpg OpenLORIS_images img5909.jpg OpenLORIS_images img1327.jpg OpenLORIS_images img562.jpg OpenLORIS_images img1038.jpg OpenLORIS_images img3530.jpg OpenLORIS_images img741.jpg OpenLORIS_images img5078.jpg OpenLORIS_images img3559.jpg OpenLORIS_images img2369.jpg OpenLORIS_images img488.jpg OpenLORIS_images img2296.jpg OpenLORIS_images img2914.jpg OpenLORIS_images img5924.jpg OpenLORIS_images img5760.jpg OpenLORIS_images img4817.jpg OpenLORIS_images img551.jpg OpenLORIS_images img1990.jpg OpenLORIS_images img3566.jpg OpenLORIS_images img1913.jpg OpenLORIS_images img4938.jpg OpenLORIS_images img3720.jpg OpenLORIS_images img3843.jpg OpenLORIS_images img1364.jpg OpenLORIS_images img5918.jpg OpenLORIS_images img81.jpg OpenLORIS_images img6003.jpg OpenLORIS_images img4575.jpg OpenLORIS_images img3202.jpg OpenLORIS_images img1168.jpg OpenLORIS_images img2695.jpg OpenLORIS_images img3901.jpg OpenLORIS_images img4119.jpg OpenLORIS_images img1032.jpg OpenLORIS_images img4933.jpg OpenLORIS_images img4883.jpg OpenLORIS_images img840.jpg OpenLORIS_images img3130.jpg OpenLORIS_images img4869.jpg 
OpenLORIS_images img2007.jpg OpenLORIS_images img3193.jpg OpenLORIS_images img2022.jpg OpenLORIS_images img5350.jpg OpenLORIS_images img3818.jpg OpenLORIS_images img2308.jpg OpenLORIS_images img1093.jpg OpenLORIS_images img3250.jpg OpenLORIS_images img1442.jpg OpenLORIS_images img4399.jpg OpenLORIS_images img2455.jpg OpenLORIS_images img3098.jpg OpenLORIS_images img3347.jpg OpenLORIS_images img5015.jpg OpenLORIS_images img2833.jpg OpenLORIS_images img247.jpg OpenLORIS_images img3222.jpg OpenLORIS_images img3184.jpg OpenLORIS_images img3979.jpg OpenLORIS_images img4273.jpg OpenLORIS_images img1388.jpg OpenLORIS_images img1172.jpg OpenLORIS_images img1089.jpg OpenLORIS_images img4837.jpg OpenLORIS_images img162.jpg OpenLORIS_images img4065.jpg OpenLORIS_images img3126.jpg OpenLORIS_images img1459.jpg OpenLORIS_images img2859.jpg OpenLORIS_images img4251.jpg OpenLORIS_images img6128.jpg OpenLORIS_images img5118.jpg OpenLORIS_images img5001.jpg OpenLORIS_images img3703.jpg OpenLORIS_images img5629.jpg OpenLORIS_images img1612.jpg OpenLORIS_images img4481.jpg OpenLORIS_images img1004.jpg OpenLORIS_images img1964.jpg OpenLORIS_images img73.jpg OpenLORIS_images img281.jpg OpenLORIS_images img3511.jpg OpenLORIS_images img4386.jpg OpenLORIS_images img5438.jpg OpenLORIS_images img2657.jpg OpenLORIS_images img2215.jpg OpenLORIS_images img1611.jpg OpenLORIS_images img1280.jpg OpenLORIS_images img1840.jpg OpenLORIS_images img2008.jpg OpenLORIS_images img778.jpg OpenLORIS_images img2835.jpg OpenLORIS_images img1135.jpg OpenLORIS_images img2754.jpg OpenLORIS_images img4668.jpg OpenLORIS_images img5445.jpg OpenLORIS_images img174.jpg OpenLORIS_images img4306.jpg OpenLORIS_images img5606.jpg OpenLORIS_images img515.jpg OpenLORIS_images img2243.jpg OpenLORIS_images img3321.jpg OpenLORIS_images img5914.jpg OpenLORIS_images img1335.jpg OpenLORIS_images img3668.jpg OpenLORIS_images img2804.jpg OpenLORIS_images img1701.jpg OpenLORIS_images img1744.jpg OpenLORIS_images img1509.jpg 
OpenLORIS_images img3588.jpg OpenLORIS_images img1750.jpg OpenLORIS_images img2669.jpg OpenLORIS_images img3937.jpg OpenLORIS_images img3028.jpg OpenLORIS_images img4426.jpg OpenLORIS_images img5002.jpg OpenLORIS_images img5605.jpg OpenLORIS_images img5195.jpg OpenLORIS_images img5150.jpg OpenLORIS_images img6035.jpg OpenLORIS_images img4931.jpg OpenLORIS_images img5850.jpg OpenLORIS_images img4610.jpg OpenLORIS_images img2905.jpg OpenLORIS_images img4556.jpg OpenLORIS_images img3178.jpg OpenLORIS_images img673.jpg OpenLORIS_images img1788.jpg OpenLORIS_images img4316.jpg OpenLORIS_images img1450.jpg OpenLORIS_images img2569.jpg OpenLORIS_images img2881.jpg OpenLORIS_images img2625.jpg OpenLORIS_images img847.jpg OpenLORIS_images img696.jpg OpenLORIS_images img5789.jpg OpenLORIS_images img3868.jpg OpenLORIS_images img4256.jpg OpenLORIS_images img134.jpg OpenLORIS_images img2627.jpg OpenLORIS_images img1202.jpg OpenLORIS_images img1465.jpg OpenLORIS_images img5726.jpg OpenLORIS_images img5275.jpg OpenLORIS_images img2347.jpg OpenLORIS_images img314.jpg OpenLORIS_images img3354.jpg OpenLORIS_images img6061.jpg OpenLORIS_images img6109.jpg OpenLORIS_images img583.jpg OpenLORIS_images img5239.jpg OpenLORIS_images img1607.jpg OpenLORIS_images img2124.jpg OpenLORIS_images img2719.jpg OpenLORIS_images img4618.jpg OpenLORIS_images img4190.jpg OpenLORIS_images img5204.jpg OpenLORIS_images img3484.jpg OpenLORIS_images img2541.jpg OpenLORIS_images img5598.jpg OpenLORIS_images img3498.jpg OpenLORIS_images img3125.jpg OpenLORIS_images img434.jpg OpenLORIS_images img2589.jpg OpenLORIS_images img5077.jpg OpenLORIS_images img1928.jpg OpenLORIS_images img5301.jpg OpenLORIS_images img1676.jpg OpenLORIS_images img5229.jpg OpenLORIS_images img3168.jpg OpenLORIS_images img4228.jpg OpenLORIS_images img3141.jpg OpenLORIS_images img3998.jpg OpenLORIS_images img6062.jpg OpenLORIS_images img5379.jpg OpenLORIS_images img3306.jpg OpenLORIS_images img2815.jpg OpenLORIS_images img202.jpg 
OpenLORIS_images img2910.jpg OpenLORIS_images img4059.jpg OpenLORIS_images img4343.jpg OpenLORIS_images img3088.jpg OpenLORIS_images img1710.jpg OpenLORIS_images img1126.jpg OpenLORIS_images img5436.jpg OpenLORIS_images img4585.jpg OpenLORIS_images img2482.jpg OpenLORIS_images img4893.jpg OpenLORIS_images img4109.jpg OpenLORIS_images img1386.jpg OpenLORIS_images img4302.jpg OpenLORIS_images img4836.jpg OpenLORIS_images img4850.jpg OpenLORIS_images img244.jpg OpenLORIS_images img767.jpg OpenLORIS_images img5024.jpg OpenLORIS_images img2139.jpg OpenLORIS_images img4443.jpg OpenLORIS_images img4167.jpg OpenLORIS_images img2013.jpg OpenLORIS_images img6040.jpg OpenLORIS_images img2738.jpg OpenLORIS_images img3156.jpg OpenLORIS_images img3146.jpg OpenLORIS_images img2527.jpg OpenLORIS_images img4527.jpg OpenLORIS_images img1891.jpg OpenLORIS_images img2775.jpg OpenLORIS_images img275.jpg OpenLORIS_images img4143.jpg OpenLORIS_images img3163.jpg OpenLORIS_images img5180.jpg OpenLORIS_images img3696.jpg OpenLORIS_images img5668.jpg OpenLORIS_images img1907.jpg OpenLORIS_images img5973.jpg OpenLORIS_images img2956.jpg OpenLORIS_images img900.jpg OpenLORIS_images img1321.jpg OpenLORIS_images img5515.jpg OpenLORIS_images img4212.jpg OpenLORIS_images img4506.jpg OpenLORIS_images img1441.jpg OpenLORIS_images img4351.jpg OpenLORIS_images img5936.jpg OpenLORIS_images img6124.jpg OpenLORIS_images img3209.jpg OpenLORIS_images img2481.jpg OpenLORIS_images img3270.jpg OpenLORIS_images img2263.jpg OpenLORIS_images img1583.jpg OpenLORIS_images img5621.jpg OpenLORIS_images img3713.jpg OpenLORIS_images img4498.jpg OpenLORIS_images img4924.jpg OpenLORIS_images img2542.jpg OpenLORIS_images img3312.jpg OpenLORIS_images img6075.jpg OpenLORIS_images img4243.jpg OpenLORIS_images img500.jpg OpenLORIS_images img4860.jpg OpenLORIS_images img1410.jpg OpenLORIS_images img2125.jpg OpenLORIS_images img4043.jpg OpenLORIS_images img3889.jpg OpenLORIS_images img2756.jpg OpenLORIS_images img1481.jpg 
OpenLORIS_images img5498.jpg OpenLORIS_images img3813.jpg OpenLORIS_images img3971.jpg OpenLORIS_images img3816.jpg OpenLORIS_images img3762.jpg OpenLORIS_images img127.jpg OpenLORIS_images img1077.jpg OpenLORIS_images img2276.jpg OpenLORIS_images img5270.jpg OpenLORIS_images img4369.jpg OpenLORIS_images img4511.jpg OpenLORIS_images img4298.jpg OpenLORIS_images img4656.jpg OpenLORIS_images img483.jpg OpenLORIS_images img2470.jpg OpenLORIS_images img868.jpg OpenLORIS_images img2573.jpg OpenLORIS_images img5951.jpg OpenLORIS_images img3896.jpg OpenLORIS_images img4996.jpg OpenLORIS_images img3617.jpg OpenLORIS_images img3372.jpg OpenLORIS_images img4284.jpg OpenLORIS_images img5254.jpg OpenLORIS_images img616.jpg OpenLORIS_images img3066.jpg OpenLORIS_images img1169.jpg OpenLORIS_images img3393.jpg OpenLORIS_images img6068.jpg OpenLORIS_images img1886.jpg OpenLORIS_images img4782.jpg OpenLORIS_images img5222.jpg OpenLORIS_images img6134.jpg OpenLORIS_images img708.jpg OpenLORIS_images img3690.jpg OpenLORIS_images img592.jpg OpenLORIS_images img5805.jpg OpenLORIS_images img814.jpg OpenLORIS_images img3913.jpg OpenLORIS_images img5045.jpg OpenLORIS_images img5617.jpg OpenLORIS_images img5217.jpg OpenLORIS_images img3403.jpg OpenLORIS_images img3246.jpg OpenLORIS_images img4391.jpg OpenLORIS_images img5683.jpg OpenLORIS_images img6083.jpg OpenLORIS_images img4769.jpg OpenLORIS_images img1618.jpg OpenLORIS_images img3491.jpg OpenLORIS_images img1542.jpg OpenLORIS_images img407.jpg OpenLORIS_images img2560.jpg OpenLORIS_images img2173.jpg OpenLORIS_images img3758.jpg OpenLORIS_images img3819.jpg OpenLORIS_images img5660.jpg OpenLORIS_images img3800.jpg OpenLORIS_images img6101.jpg OpenLORIS_images img1587.jpg OpenLORIS_images img651.jpg OpenLORIS_images img4097.jpg OpenLORIS_images img2748.jpg OpenLORIS_images img4337.jpg OpenLORIS_images img3117.jpg OpenLORIS_images img947.jpg OpenLORIS_images img3073.jpg OpenLORIS_images img3507.jpg OpenLORIS_images img4309.jpg 
OpenLORIS_images img546.jpg OpenLORIS_images img3068.jpg OpenLORIS_images img4597.jpg OpenLORIS_images img5424.jpg OpenLORIS_images img5730.jpg OpenLORIS_images img3035.jpg OpenLORIS_images img4773.jpg OpenLORIS_images img765.jpg OpenLORIS_images img3147.jpg OpenLORIS_images img1829.jpg OpenLORIS_images img2066.jpg OpenLORIS_images img311.jpg OpenLORIS_images img5781.jpg OpenLORIS_images img4882.jpg OpenLORIS_images img1240.jpg OpenLORIS_images img2622.jpg OpenLORIS_images img3920.jpg OpenLORIS_images img1702.jpg OpenLORIS_images img4012.jpg OpenLORIS_images img3545.jpg OpenLORIS_images img714.jpg OpenLORIS_images img2664.jpg OpenLORIS_images img3023.jpg OpenLORIS_images img4807.jpg OpenLORIS_images img1662.jpg OpenLORIS_images img1023.jpg OpenLORIS_images img3676.jpg OpenLORIS_images img4800.jpg OpenLORIS_images img4240.jpg OpenLORIS_images img234.jpg OpenLORIS_images img2222.jpg OpenLORIS_images img1246.jpg OpenLORIS_images img4267.jpg OpenLORIS_images img1601.jpg OpenLORIS_images img5300.jpg OpenLORIS_images img2558.jpg OpenLORIS_images img5578.jpg OpenLORIS_images img2179.jpg OpenLORIS_images img1185.jpg OpenLORIS_images img5178.jpg OpenLORIS_images img5627.jpg OpenLORIS_images img3.jpg OpenLORIS_images img5688.jpg OpenLORIS_images img5185.jpg OpenLORIS_images img2023.jpg OpenLORIS_images img4723.jpg OpenLORIS_images img2085.jpg OpenLORIS_images img533.jpg OpenLORIS_images img4930.jpg OpenLORIS_images img1867.jpg OpenLORIS_images img5421.jpg OpenLORIS_images img5467.jpg OpenLORIS_images img1489.jpg OpenLORIS_images img2320.jpg OpenLORIS_images img1774.jpg OpenLORIS_images img2824.jpg OpenLORIS_images img1256.jpg OpenLORIS_images img2148.jpg OpenLORIS_images img1998.jpg OpenLORIS_images img5187.jpg OpenLORIS_images img4544.jpg OpenLORIS_images img3634.jpg OpenLORIS_images img1443.jpg OpenLORIS_images img5648.jpg OpenLORIS_images img4353.jpg OpenLORIS_images img2782.jpg OpenLORIS_images img3315.jpg OpenLORIS_images img245.jpg OpenLORIS_images img438.jpg 
OpenLORIS_images img1490.jpg OpenLORIS_images img4903.jpg OpenLORIS_images img83.jpg OpenLORIS_images img5790.jpg OpenLORIS_images img4726.jpg OpenLORIS_images img1227.jpg OpenLORIS_images img4437.jpg OpenLORIS_images img2816.jpg OpenLORIS_images img3187.jpg OpenLORIS_images img1257.jpg OpenLORIS_images img6087.jpg OpenLORIS_images img1039.jpg OpenLORIS_images img1682.jpg OpenLORIS_images img2885.jpg OpenLORIS_images img684.jpg OpenLORIS_images img828.jpg OpenLORIS_images img6079.jpg OpenLORIS_images img862.jpg OpenLORIS_images img2166.jpg OpenLORIS_images img2533.jpg OpenLORIS_images img1834.jpg OpenLORIS_images img233.jpg OpenLORIS_images img3206.jpg OpenLORIS_images img2412.jpg OpenLORIS_images img432.jpg OpenLORIS_images img5607.jpg OpenLORIS_images img2456.jpg OpenLORIS_images img975.jpg OpenLORIS_images img5251.jpg OpenLORIS_images img6107.jpg OpenLORIS_images img5978.jpg OpenLORIS_images img3467.jpg OpenLORIS_images img4231.jpg OpenLORIS_images img2418.jpg OpenLORIS_images img605.jpg OpenLORIS_images img5161.jpg OpenLORIS_images img4141.jpg OpenLORIS_images img4929.jpg OpenLORIS_images img1067.jpg OpenLORIS_images img3418.jpg OpenLORIS_images img3994.jpg OpenLORIS_images img216.jpg OpenLORIS_images img4758.jpg OpenLORIS_images img4503.jpg OpenLORIS_images img5614.jpg OpenLORIS_images img903.jpg OpenLORIS_images img5163.jpg OpenLORIS_images img2380.jpg OpenLORIS_images img1959.jpg OpenLORIS_images img2448.jpg OpenLORIS_images img3080.jpg OpenLORIS_images img4431.jpg OpenLORIS_images img3533.jpg OpenLORIS_images img4717.jpg OpenLORIS_images img5360.jpg OpenLORIS_images img1041.jpg OpenLORIS_images img797.jpg OpenLORIS_images img5044.jpg OpenLORIS_images img2971.jpg OpenLORIS_images img6053.jpg OpenLORIS_images img2406.jpg OpenLORIS_images img4327.jpg OpenLORIS_images img1749.jpg OpenLORIS_images img3737.jpg OpenLORIS_images img2753.jpg OpenLORIS_images img2737.jpg OpenLORIS_images img4180.jpg OpenLORIS_images img3837.jpg OpenLORIS_images img2169.jpg 
OpenLORIS_images img4768.jpg OpenLORIS_images img1031.jpg OpenLORIS_images img2122.jpg OpenLORIS_images img646.jpg OpenLORIS_images img5959.jpg OpenLORIS_images img2893.jpg OpenLORIS_images img3479.jpg OpenLORIS_images img2655.jpg OpenLORIS_images img4427.jpg OpenLORIS_images img3241.jpg OpenLORIS_images img3592.jpg OpenLORIS_images img5462.jpg OpenLORIS_images img3836.jpg OpenLORIS_images img4766.jpg OpenLORIS_images img4998.jpg OpenLORIS_images img1434.jpg OpenLORIS_images img5941.jpg OpenLORIS_images img2780.jpg OpenLORIS_images img2969.jpg OpenLORIS_images img603.jpg OpenLORIS_images img1204.jpg OpenLORIS_images img2145.jpg OpenLORIS_images img5409.jpg OpenLORIS_images img7.jpg OpenLORIS_images img5246.jpg OpenLORIS_images img1578.jpg OpenLORIS_images img1193.jpg OpenLORIS_images img3002.jpg OpenLORIS_images img1362.jpg OpenLORIS_images img3596.jpg OpenLORIS_images img2035.jpg OpenLORIS_images img865.jpg OpenLORIS_images img44.jpg OpenLORIS_images img707.jpg OpenLORIS_images img3154.jpg OpenLORIS_images img876.jpg OpenLORIS_images img1251.jpg OpenLORIS_images img2407.jpg OpenLORIS_images img5782.jpg OpenLORIS_images img1136.jpg OpenLORIS_images img1951.jpg OpenLORIS_images img3893.jpg OpenLORIS_images img328.jpg OpenLORIS_images img5824.jpg OpenLORIS_images img1547.jpg OpenLORIS_images img1013.jpg OpenLORIS_images img1493.jpg OpenLORIS_images img5950.jpg OpenLORIS_images img5856.jpg OpenLORIS_images img429.jpg OpenLORIS_images img3924.jpg OpenLORIS_images img2928.jpg OpenLORIS_images img5874.jpg OpenLORIS_images img3642.jpg OpenLORIS_images img3951.jpg OpenLORIS_images img2117.jpg OpenLORIS_images img5406.jpg OpenLORIS_images img1917.jpg OpenLORIS_images img935.jpg OpenLORIS_images img4852.jpg OpenLORIS_images img3428.jpg OpenLORIS_images img645.jpg OpenLORIS_images img3774.jpg OpenLORIS_images img5277.jpg OpenLORIS_images img3683.jpg OpenLORIS_images img4802.jpg OpenLORIS_images img866.jpg OpenLORIS_images img3822.jpg OpenLORIS_images img3499.jpg 
OpenLORIS_images img930.jpg OpenLORIS_images img3269.jpg OpenLORIS_images img872.jpg OpenLORIS_images img3806.jpg OpenLORIS_images img2143.jpg OpenLORIS_images img521.jpg OpenLORIS_images img725.jpg OpenLORIS_images img851.jpg OpenLORIS_images img789.jpg OpenLORIS_images img1174.jpg OpenLORIS_images img1868.jpg OpenLORIS_images img5907.jpg OpenLORIS_images img4546.jpg OpenLORIS_images img1663.jpg OpenLORIS_images img1962.jpg OpenLORIS_images img2540.jpg OpenLORIS_images img5368.jpg OpenLORIS_images img5399.jpg OpenLORIS_images img4608.jpg OpenLORIS_images img999.jpg OpenLORIS_images img4524.jpg OpenLORIS_images img207.jpg OpenLORIS_images img2060.jpg OpenLORIS_images img3875.jpg OpenLORIS_images img3383.jpg OpenLORIS_images img2743.jpg OpenLORIS_images img1499.jpg OpenLORIS_images img5418.jpg OpenLORIS_images img5518.jpg OpenLORIS_images img5691.jpg OpenLORIS_images img893.jpg OpenLORIS_images img5398.jpg OpenLORIS_images img953.jpg OpenLORIS_images img5971.jpg OpenLORIS_images img1241.jpg OpenLORIS_images img2607.jpg OpenLORIS_images img2049.jpg OpenLORIS_images img5041.jpg OpenLORIS_images img1056.jpg OpenLORIS_images img5737.jpg OpenLORIS_images img4160.jpg OpenLORIS_images img2487.jpg OpenLORIS_images img4847.jpg OpenLORIS_images img106.jpg OpenLORIS_images img257.jpg OpenLORIS_images img1457.jpg OpenLORIS_images img724.jpg OpenLORIS_images img5333.jpg OpenLORIS_images img2915.jpg OpenLORIS_images img152.jpg OpenLORIS_images img2446.jpg OpenLORIS_images img2981.jpg OpenLORIS_images img3019.jpg OpenLORIS_images img4794.jpg OpenLORIS_images img3648.jpg OpenLORIS_images img2079.jpg OpenLORIS_images img3693.jpg OpenLORIS_images img3940.jpg OpenLORIS_images img2628.jpg OpenLORIS_images img2281.jpg OpenLORIS_images img3871.jpg OpenLORIS_images img3276.jpg OpenLORIS_images img2421.jpg OpenLORIS_images img5049.jpg OpenLORIS_images img196.jpg OpenLORIS_images img4053.jpg OpenLORIS_images img3789.jpg OpenLORIS_images img457.jpg OpenLORIS_images img4056.jpg 
OpenLORIS_images img3925.jpg OpenLORIS_images img2248.jpg OpenLORIS_images img265.jpg OpenLORIS_images img5845.jpg OpenLORIS_images img3197.jpg OpenLORIS_images img2564.jpg OpenLORIS_images img5751.jpg OpenLORIS_images img3817.jpg OpenLORIS_images img3519.jpg OpenLORIS_images img2623.jpg OpenLORIS_images img2764.jpg OpenLORIS_images img1291.jpg OpenLORIS_images img1265.jpg OpenLORIS_images img3646.jpg OpenLORIS_images img5963.jpg OpenLORIS_images img4047.jpg OpenLORIS_images img508.jpg OpenLORIS_images img2411.jpg OpenLORIS_images img12.jpg OpenLORIS_images img5879.jpg OpenLORIS_images img6014.jpg OpenLORIS_images img1345.jpg OpenLORIS_images img802.jpg OpenLORIS_images img4397.jpg OpenLORIS_images img2988.jpg OpenLORIS_images img4955.jpg OpenLORIS_images img5240.jpg OpenLORIS_images img3417.jpg OpenLORIS_images img5748.jpg OpenLORIS_images img571.jpg OpenLORIS_images img2365.jpg OpenLORIS_images img4250.jpg OpenLORIS_images img4514.jpg OpenLORIS_images img239.jpg OpenLORIS_images img5182.jpg OpenLORIS_images img226.jpg OpenLORIS_images img1523.jpg OpenLORIS_images img1809.jpg OpenLORIS_images img3381.jpg OpenLORIS_images img4122.jpg OpenLORIS_images img2791.jpg OpenLORIS_images img4940.jpg OpenLORIS_images img1399.jpg OpenLORIS_images img4191.jpg OpenLORIS_images img1200.jpg OpenLORIS_images img3672.jpg OpenLORIS_images img1968.jpg OpenLORIS_images img3650.jpg OpenLORIS_images img4372.jpg OpenLORIS_images img24.jpg OpenLORIS_images img1119.jpg OpenLORIS_images img5131.jpg OpenLORIS_images img4289.jpg OpenLORIS_images img1028.jpg OpenLORIS_images img3228.jpg OpenLORIS_images img5881.jpg OpenLORIS_images img4408.jpg OpenLORIS_images img4086.jpg OpenLORIS_images img2357.jpg OpenLORIS_images img3544.jpg OpenLORIS_images img5795.jpg OpenLORIS_images img39.jpg OpenLORIS_images img3180.jpg OpenLORIS_images img5403.jpg OpenLORIS_images img3077.jpg OpenLORIS_images img2829.jpg OpenLORIS_images img3189.jpg OpenLORIS_images img1548.jpg OpenLORIS_images img5278.jpg 
OpenLORIS_images img5952.jpg OpenLORIS_images img153.jpg OpenLORIS_images img5231.jpg OpenLORIS_images img5242.jpg OpenLORIS_images img2517.jpg OpenLORIS_images img3097.jpg OpenLORIS_images img5990.jpg OpenLORIS_images img2853.jpg OpenLORIS_images img4188.jpg OpenLORIS_images img4002.jpg OpenLORIS_images img3464.jpg OpenLORIS_images img3264.jpg OpenLORIS_images img4118.jpg OpenLORIS_images img1501.jpg OpenLORIS_images img3105.jpg OpenLORIS_images img887.jpg OpenLORIS_images img6080.jpg OpenLORIS_images img1863.jpg OpenLORIS_images img2204.jpg OpenLORIS_images img3880.jpg OpenLORIS_images img1495.jpg OpenLORIS_images img875.jpg OpenLORIS_images img229.jpg OpenLORIS_images img2475.jpg OpenLORIS_images img565.jpg OpenLORIS_images img1292.jpg OpenLORIS_images img5442.jpg OpenLORIS_images img436.jpg OpenLORIS_images img2091.jpg OpenLORIS_images img3662.jpg OpenLORIS_images img5640.jpg OpenLORIS_images img1131.jpg OpenLORIS_images img3685.jpg OpenLORIS_images img6066.jpg OpenLORIS_images img2345.jpg OpenLORIS_images img790.jpg OpenLORIS_images img3587.jpg OpenLORIS_images img210.jpg OpenLORIS_images img6050.jpg OpenLORIS_images img3284.jpg OpenLORIS_images img293.jpg OpenLORIS_images img874.jpg OpenLORIS_images img402.jpg OpenLORIS_images img235.jpg OpenLORIS_images img2938.jpg OpenLORIS_images img4347.jpg OpenLORIS_images img5298.jpg OpenLORIS_images img2252.jpg OpenLORIS_images img1140.jpg OpenLORIS_images img4904.jpg OpenLORIS_images img1100.jpg OpenLORIS_images img5615.jpg OpenLORIS_images img4388.jpg OpenLORIS_images img2207.jpg OpenLORIS_images img1598.jpg OpenLORIS_images img2672.jpg OpenLORIS_images img594.jpg OpenLORIS_images img4703.jpg OpenLORIS_images img3471.jpg OpenLORIS_images img15.jpg OpenLORIS_images img3441.jpg OpenLORIS_images img4277.jpg OpenLORIS_images img2255.jpg OpenLORIS_images img5432.jpg OpenLORIS_images img3771.jpg OpenLORIS_images img2864.jpg OpenLORIS_images img799.jpg OpenLORIS_images img538.jpg OpenLORIS_images img4379.jpg 
OpenLORIS_images img2982.jpg OpenLORIS_images img1275.jpg OpenLORIS_images img699.jpg OpenLORIS_images img5534.jpg OpenLORIS_images img5130.jpg OpenLORIS_images img115.jpg OpenLORIS_images img4246.jpg OpenLORIS_images img2534.jpg OpenLORIS_images img1728.jpg OpenLORIS_images img5207.jpg OpenLORIS_images img2246.jpg OpenLORIS_images img1117.jpg OpenLORIS_images img735.jpg OpenLORIS_images img2715.jpg OpenLORIS_images img5112.jpg OpenLORIS_images img1092.jpg OpenLORIS_images img1987.jpg OpenLORIS_images img3750.jpg OpenLORIS_images img1248.jpg OpenLORIS_images img5466.jpg OpenLORIS_images img1666.jpg OpenLORIS_images img5375.jpg OpenLORIS_images img1474.jpg OpenLORIS_images img890.jpg OpenLORIS_images img5889.jpg OpenLORIS_images img3010.jpg OpenLORIS_images img3049.jpg OpenLORIS_images img1672.jpg OpenLORIS_images img3460.jpg OpenLORIS_images img517.jpg OpenLORIS_images img4920.jpg OpenLORIS_images img528.jpg OpenLORIS_images img3096.jpg OpenLORIS_images img4705.jpg OpenLORIS_images img4960.jpg OpenLORIS_images img4791.jpg OpenLORIS_images img1046.jpg OpenLORIS_images img822.jpg OpenLORIS_images img4525.jpg OpenLORIS_images img151.jpg OpenLORIS_images img4934.jpg OpenLORIS_images img5214.jpg OpenLORIS_images img5082.jpg OpenLORIS_images img2523.jpg OpenLORIS_images img2468.jpg OpenLORIS_images img5074.jpg OpenLORIS_images img5888.jpg OpenLORIS_images img309.jpg OpenLORIS_images img2144.jpg OpenLORIS_images img2778.jpg OpenLORIS_images img4560.jpg OpenLORIS_images img3903.jpg OpenLORIS_images img3486.jpg OpenLORIS_images img988.jpg OpenLORIS_images img5451.jpg OpenLORIS_images img681.jpg OpenLORIS_images img5446.jpg OpenLORIS_images img4406.jpg OpenLORIS_images img806.jpg OpenLORIS_images img4921.jpg OpenLORIS_images img4889.jpg OpenLORIS_images img3056.jpg OpenLORIS_images img1323.jpg OpenLORIS_images img489.jpg OpenLORIS_images img2005.jpg OpenLORIS_images img5289.jpg OpenLORIS_images img4790.jpg OpenLORIS_images img2413.jpg OpenLORIS_images img1161.jpg 
OpenLORIS_images img2867.jpg OpenLORIS_images img981.jpg OpenLORIS_images img2454.jpg OpenLORIS_images img2249.jpg OpenLORIS_images img3729.jpg OpenLORIS_images img6096.jpg OpenLORIS_images img2445.jpg OpenLORIS_images img3456.jpg OpenLORIS_images img3811.jpg OpenLORIS_images img1231.jpg OpenLORIS_images img1753.jpg OpenLORIS_images img1624.jpg OpenLORIS_images img1.jpg OpenLORIS_images img1121.jpg OpenLORIS_images img5281.jpg OpenLORIS_images img3554.jpg OpenLORIS_images img4473.jpg OpenLORIS_images img3292.jpg OpenLORIS_images img766.jpg OpenLORIS_images img2301.jpg OpenLORIS_images img148.jpg OpenLORIS_images img2264.jpg OpenLORIS_images img4771.jpg OpenLORIS_images img2890.jpg OpenLORIS_images img3261.jpg OpenLORIS_images img3328.jpg OpenLORIS_images img2342.jpg OpenLORIS_images img1419.jpg OpenLORIS_images img4107.jpg OpenLORIS_images img5873.jpg OpenLORIS_images img5878.jpg OpenLORIS_images img6098.jpg OpenLORIS_images img2836.jpg OpenLORIS_images img5221.jpg OpenLORIS_images img3945.jpg OpenLORIS_images img2733.jpg OpenLORIS_images img4282.jpg OpenLORIS_images img2438.jpg OpenLORIS_images img4696.jpg OpenLORIS_images img1152.jpg OpenLORIS_images img2629.jpg OpenLORIS_images img2706.jpg OpenLORIS_images img5181.jpg OpenLORIS_images img905.jpg OpenLORIS_images img2275.jpg OpenLORIS_images img4997.jpg OpenLORIS_images img1963.jpg OpenLORIS_images img889.jpg OpenLORIS_images img772.jpg OpenLORIS_images img3514.jpg OpenLORIS_images img5034.jpg OpenLORIS_images img1588.jpg OpenLORIS_images img5917.jpg OpenLORIS_images img1690.jpg OpenLORIS_images img558.jpg OpenLORIS_images img4269.jpg OpenLORIS_images img1914.jpg OpenLORIS_images img5968.jpg OpenLORIS_images img3198.jpg OpenLORIS_images img5943.jpg OpenLORIS_images img4964.jpg OpenLORIS_images img128.jpg OpenLORIS_images img1567.jpg OpenLORIS_images img4131.jpg OpenLORIS_images img4689.jpg OpenLORIS_images img4279.jpg OpenLORIS_images img3917.jpg OpenLORIS_images img4877.jpg OpenLORIS_images img3104.jpg 
OpenLORIS_images img1431.jpg OpenLORIS_images img1355.jpg OpenLORIS_images img5169.jpg OpenLORIS_images img1584.jpg OpenLORIS_images img1278.jpg OpenLORIS_images img5274.jpg OpenLORIS_images img430.jpg OpenLORIS_images img5644.jpg OpenLORIS_images img1219.jpg OpenLORIS_images img1183.jpg OpenLORIS_images img5382.jpg OpenLORIS_images img1447.jpg OpenLORIS_images img3247.jpg OpenLORIS_images img3735.jpg OpenLORIS_images img5494.jpg OpenLORIS_images img5230.jpg OpenLORIS_images img5725.jpg OpenLORIS_images img3231.jpg OpenLORIS_images img4905.jpg OpenLORIS_images img4513.jpg OpenLORIS_images img3797.jpg OpenLORIS_images img1420.jpg OpenLORIS_images img3923.jpg OpenLORIS_images img320.jpg OpenLORIS_images img621.jpg OpenLORIS_images img3461.jpg OpenLORIS_images img6058.jpg OpenLORIS_images img5470.jpg OpenLORIS_images img2852.jpg OpenLORIS_images img5190.jpg OpenLORIS_images img1299.jpg OpenLORIS_images img2288.jpg OpenLORIS_images img2432.jpg OpenLORIS_images img771.jpg OpenLORIS_images img2424.jpg OpenLORIS_images img5206.jpg OpenLORIS_images img697.jpg OpenLORIS_images img1638.jpg OpenLORIS_images img5816.jpg OpenLORIS_images img214.jpg OpenLORIS_images img380.jpg OpenLORIS_images img689.jpg OpenLORIS_images img2088.jpg OpenLORIS_images img1783.jpg OpenLORIS_images img1095.jpg OpenLORIS_images img710.jpg OpenLORIS_images img2165.jpg OpenLORIS_images img5305.jpg OpenLORIS_images img5825.jpg OpenLORIS_images img4350.jpg OpenLORIS_images img3083.jpg OpenLORIS_images img1101.jpg OpenLORIS_images img3598.jpg OpenLORIS_images img2949.jpg OpenLORIS_images img1284.jpg OpenLORIS_images img1453.jpg OpenLORIS_images img5786.jpg OpenLORIS_images img3280.jpg OpenLORIS_images img3351.jpg OpenLORIS_images img5999.jpg OpenLORIS_images img426.jpg OpenLORIS_images img1050.jpg OpenLORIS_images img4100.jpg OpenLORIS_images img5826.jpg OpenLORIS_images img5156.jpg OpenLORIS_images img809.jpg OpenLORIS_images img2682.jpg OpenLORIS_images img1163.jpg OpenLORIS_images img5743.jpg 
OpenLORIS_images img4274.jpg OpenLORIS_images img1066.jpg OpenLORIS_images img2.jpg OpenLORIS_images img3179.jpg OpenLORIS_images img1591.jpg OpenLORIS_images img5871.jpg OpenLORIS_images img3878.jpg OpenLORIS_images img4571.jpg OpenLORIS_images img662.jpg OpenLORIS_images img6095.jpg OpenLORIS_images img2137.jpg OpenLORIS_images img2078.jpg OpenLORIS_images img4833.jpg OpenLORIS_images img3712.jpg OpenLORIS_images img2707.jpg OpenLORIS_images img3862.jpg OpenLORIS_images img3316.jpg OpenLORIS_images img165.jpg OpenLORIS_images img5005.jpg OpenLORIS_images img4366.jpg OpenLORIS_images img726.jpg OpenLORIS_images img5109.jpg OpenLORIS_images img695.jpg OpenLORIS_images img633.jpg OpenLORIS_images img3078.jpg OpenLORIS_images img4304.jpg OpenLORIS_images img3099.jpg OpenLORIS_images img5767.jpg OpenLORIS_images img961.jpg OpenLORIS_images img2537.jpg OpenLORIS_images img1930.jpg OpenLORIS_images img34.jpg OpenLORIS_images img5022.jpg OpenLORIS_images img1855.jpg OpenLORIS_images img4323.jpg OpenLORIS_images img2713.jpg OpenLORIS_images img332.jpg OpenLORIS_images img5444.jpg OpenLORIS_images img4319.jpg OpenLORIS_images img836.jpg OpenLORIS_images img4007.jpg OpenLORIS_images img5430.jpg OpenLORIS_images img2123.jpg OpenLORIS_images img4562.jpg OpenLORIS_images img4103.jpg OpenLORIS_images img470.jpg OpenLORIS_images img825.jpg OpenLORIS_images img5047.jpg OpenLORIS_images img5846.jpg OpenLORIS_images img1592.jpg OpenLORIS_images img5755.jpg OpenLORIS_images img2492.jpg OpenLORIS_images img4361.jpg OpenLORIS_images img3308.jpg OpenLORIS_images img2071.jpg OpenLORIS_images img4853.jpg OpenLORIS_images img4076.jpg OpenLORIS_images img1054.jpg OpenLORIS_images img3424.jpg OpenLORIS_images img4013.jpg OpenLORIS_images img4450.jpg OpenLORIS_images img5549.jpg OpenLORIS_images img158.jpg OpenLORIS_images img3205.jpg OpenLORIS_images img2205.jpg OpenLORIS_images img1367.jpg OpenLORIS_images img4974.jpg OpenLORIS_images img4312.jpg OpenLORIS_images img3639.jpg 
OpenLORIS_images img1487.jpg OpenLORIS_images img5060.jpg OpenLORIS_images img1182.jpg OpenLORIS_images img2279.jpg OpenLORIS_images img1670.jpg OpenLORIS_images img2300.jpg OpenLORIS_images img3434.jpg OpenLORIS_images img5717.jpg OpenLORIS_images img3199.jpg OpenLORIS_images img400.jpg OpenLORIS_images img393.jpg OpenLORIS_images img2267.jpg OpenLORIS_images img2174.jpg OpenLORIS_images img2392.jpg OpenLORIS_images img6051.jpg OpenLORIS_images img4235.jpg OpenLORIS_images img2543.jpg OpenLORIS_images img2216.jpg OpenLORIS_images img3143.jpg OpenLORIS_images img75.jpg OpenLORIS_images img5334.jpg OpenLORIS_images img3462.jpg OpenLORIS_images img4139.jpg OpenLORIS_images img5536.jpg OpenLORIS_images img3658.jpg OpenLORIS_images img1726.jpg OpenLORIS_images img107.jpg OpenLORIS_images img450.jpg OpenLORIS_images img5332.jpg OpenLORIS_images img108.jpg OpenLORIS_images img3850.jpg OpenLORIS_images img4543.jpg OpenLORIS_images img5839.jpg OpenLORIS_images img1994.jpg OpenLORIS_images img3556.jpg OpenLORIS_images img1115.jpg OpenLORIS_images img5713.jpg OpenLORIS_images img1807.jpg OpenLORIS_images img4845.jpg OpenLORIS_images img5405.jpg OpenLORIS_images img5768.jpg OpenLORIS_images img4785.jpg OpenLORIS_images img941.jpg OpenLORIS_images img4991.jpg OpenLORIS_images img390.jpg OpenLORIS_images img5797.jpg OpenLORIS_images img5455.jpg OpenLORIS_images img5793.jpg OpenLORIS_images img2318.jpg OpenLORIS_images img2461.jpg OpenLORIS_images img2024.jpg OpenLORIS_images img2135.jpg OpenLORIS_images img4208.jpg OpenLORIS_images img3243.jpg OpenLORIS_images img5441.jpg OpenLORIS_images img5709.jpg OpenLORIS_images img4788.jpg OpenLORIS_images img4055.jpg OpenLORIS_images img3835.jpg OpenLORIS_images img1862.jpg OpenLORIS_images img60.jpg OpenLORIS_images img637.jpg OpenLORIS_images img1916.jpg OpenLORIS_images img2728.jpg OpenLORIS_images img3795.jpg OpenLORIS_images img3355.jpg OpenLORIS_images img4642.jpg OpenLORIS_images img1931.jpg OpenLORIS_images img574.jpg 
OpenLORIS_images img4265.jpg OpenLORIS_images img3688.jpg OpenLORIS_images img1329.jpg OpenLORIS_images img381.jpg OpenLORIS_images img171.jpg OpenLORIS_images img70.jpg OpenLORIS_images img5674.jpg OpenLORIS_images img544.jpg OpenLORIS_images img1210.jpg OpenLORIS_images img355.jpg OpenLORIS_images img4685.jpg OpenLORIS_images img4783.jpg OpenLORIS_images img4054.jpg OpenLORIS_images img35.jpg OpenLORIS_images img269.jpg OpenLORIS_images img4205.jpg OpenLORIS_images img3005.jpg OpenLORIS_images img756.jpg OpenLORIS_images img5655.jpg OpenLORIS_images img5886.jpg OpenLORIS_images img3024.jpg OpenLORIS_images img459.jpg OpenLORIS_images img3535.jpg OpenLORIS_images img6081.jpg OpenLORIS_images img5506.jpg OpenLORIS_images img886.jpg OpenLORIS_images img3006.jpg OpenLORIS_images img1288.jpg OpenLORIS_images img5490.jpg OpenLORIS_images img4664.jpg OpenLORIS_images img4314.jpg OpenLORIS_images img2210.jpg OpenLORIS_images img2223.jpg OpenLORIS_images img6021.jpg OpenLORIS_images img970.jpg OpenLORIS_images img604.jpg OpenLORIS_images img1358.jpg OpenLORIS_images img5068.jpg OpenLORIS_images img6020.jpg OpenLORIS_images img47.jpg OpenLORIS_images img5859.jpg OpenLORIS_images img3655.jpg OpenLORIS_images img5682.jpg OpenLORIS_images img2039.jpg OpenLORIS_images img2202.jpg OpenLORIS_images img5996.jpg OpenLORIS_images img4417.jpg OpenLORIS_images img892.jpg OpenLORIS_images img3380.jpg OpenLORIS_images img4099.jpg OpenLORIS_images img5976.jpg OpenLORIS_images img591.jpg OpenLORIS_images img3211.jpg OpenLORIS_images img1628.jpg OpenLORIS_images img1341.jpg OpenLORIS_images img6074.jpg OpenLORIS_images img122.jpg OpenLORIS_images img2583.jpg OpenLORIS_images img4136.jpg OpenLORIS_images img4671.jpg OpenLORIS_images img3399.jpg OpenLORIS_images img5030.jpg OpenLORIS_images img6027.jpg OpenLORIS_images img1516.jpg OpenLORIS_images img200.jpg OpenLORIS_images img3258.jpg OpenLORIS_images img3932.jpg OpenLORIS_images img5071.jpg OpenLORIS_images img3320.jpg OpenLORIS_images 
img1392.jpg OpenLORIS_images img1353.jpg OpenLORIS_images img4846.jpg OpenLORIS_images img2694.jpg OpenLORIS_images img263.jpg OpenLORIS_images img2975.jpg OpenLORIS_images img4721.jpg OpenLORIS_images img2229.jpg OpenLORIS_images img1635.jpg OpenLORIS_images img2334.jpg OpenLORIS_images img2405.jpg OpenLORIS_images img5716.jpg OpenLORIS_images img3564.jpg OpenLORIS_images img5347.jpg OpenLORIS_images img4062.jpg OpenLORIS_images img4925.jpg OpenLORIS_images img2697.jpg OpenLORIS_images img117.jpg OpenLORIS_images img4733.jpg OpenLORIS_images img1339.jpg OpenLORIS_images img4797.jpg OpenLORIS_images img2880.jpg OpenLORIS_images img824.jpg OpenLORIS_images img4111.jpg OpenLORIS_images img5193.jpg OpenLORIS_images img4935.jpg OpenLORIS_images img2961.jpg OpenLORIS_images img4283.jpg OpenLORIS_images img752.jpg OpenLORIS_images img3753.jpg OpenLORIS_images img1298.jpg OpenLORIS_images img643.jpg OpenLORIS_images img5407.jpg OpenLORIS_images img2779.jpg OpenLORIS_images img4335.jpg OpenLORIS_images img1071.jpg OpenLORIS_images img3003.jpg OpenLORIS_images img5997.jpg OpenLORIS_images img3841.jpg OpenLORIS_images img4029.jpg OpenLORIS_images img2479.jpg OpenLORIS_images img4805.jpg OpenLORIS_images img3021.jpg OpenLORIS_images img4992.jpg OpenLORIS_images img2708.jpg OpenLORIS_images img4449.jpg OpenLORIS_images img1409.jpg OpenLORIS_images img1212.jpg OpenLORIS_images img303.jpg OpenLORIS_images img5480.jpg OpenLORIS_images img1220.jpg OpenLORIS_images img4297.jpg OpenLORIS_images img5245.jpg OpenLORIS_images img1274.jpg OpenLORIS_images img5318.jpg OpenLORIS_images img2634.jpg OpenLORIS_images img1896.jpg OpenLORIS_images img3629.jpg OpenLORIS_images img5504.jpg OpenLORIS_images img796.jpg OpenLORIS_images img1247.jpg OpenLORIS_images img368.jpg OpenLORIS_images img4064.jpg OpenLORIS_images img1259.jpg OpenLORIS_images img4959.jpg OpenLORIS_images img1330.jpg OpenLORIS_images img231.jpg OpenLORIS_images img3253.jpg OpenLORIS_images img4822.jpg OpenLORIS_images 
img5785.jpg OpenLORIS_images img1850.jpg OpenLORIS_images img3030.jpg OpenLORIS_images img2873.jpg OpenLORIS_images img2208.jpg OpenLORIS_images img4322.jpg OpenLORIS_images img3406.jpg OpenLORIS_images img820.jpg OpenLORIS_images img3013.jpg OpenLORIS_images img2510.jpg OpenLORIS_images img1876.jpg OpenLORIS_images img4405.jpg OpenLORIS_images img2186.jpg OpenLORIS_images img539.jpg OpenLORIS_images img2699.jpg OpenLORIS_images img484.jpg OpenLORIS_images img609.jpg OpenLORIS_images img6132.jpg OpenLORIS_images img5699.jpg OpenLORIS_images img5964.jpg OpenLORIS_images img5530.jpg OpenLORIS_images img2887.jpg OpenLORIS_images img5087.jpg OpenLORIS_images img3573.jpg OpenLORIS_images img3340.jpg OpenLORIS_images img2417.jpg OpenLORIS_images img4572.jpg OpenLORIS_images img4858.jpg OpenLORIS_images img5505.jpg OpenLORIS_images img4060.jpg OpenLORIS_images img335.jpg OpenLORIS_images img1461.jpg OpenLORIS_images img5523.jpg OpenLORIS_images img1064.jpg OpenLORIS_images img3102.jpg OpenLORIS_images img3705.jpg OpenLORIS_images img3366.jpg OpenLORIS_images img2712.jpg OpenLORIS_images img5665.jpg OpenLORIS_images img1105.jpg OpenLORIS_images img2414.jpg OpenLORIS_images img4739.jpg OpenLORIS_images img2849.jpg OpenLORIS_images img86.jpg OpenLORIS_images img3810.jpg OpenLORIS_images img3879.jpg OpenLORIS_images img927.jpg OpenLORIS_images img4747.jpg OpenLORIS_images img3869.jpg OpenLORIS_images img5680.jpg OpenLORIS_images img516.jpg OpenLORIS_images img23.jpg OpenLORIS_images img4025.jpg OpenLORIS_images img2346.jpg OpenLORIS_images img5809.jpg OpenLORIS_images img4970.jpg OpenLORIS_images img3697.jpg OpenLORIS_images img1768.jpg OpenLORIS_images img3227.jpg OpenLORIS_images img4003.jpg OpenLORIS_images img2931.jpg OpenLORIS_images img4686.jpg OpenLORIS_images img3825.jpg OpenLORIS_images img1411.jpg OpenLORIS_images img4916.jpg OpenLORIS_images img701.jpg OpenLORIS_images img5896.jpg OpenLORIS_images img1919.jpg OpenLORIS_images img5033.jpg OpenLORIS_images img478.jpg 
OpenLORIS_images img1189.jpg OpenLORIS_images img4074.jpg OpenLORIS_images img1668.jpg OpenLORIS_images img3036.jpg OpenLORIS_images img2830.jpg OpenLORIS_images img3225.jpg OpenLORIS_images img4178.jpg OpenLORIS_images img738.jpg OpenLORIS_images img71.jpg OpenLORIS_images img6022.jpg OpenLORIS_images img3052.jpg OpenLORIS_images img1312.jpg OpenLORIS_images img146.jpg OpenLORIS_images img2570.jpg OpenLORIS_images img747.jpg OpenLORIS_images img4879.jpg OpenLORIS_images img1948.jpg OpenLORIS_images img4346.jpg OpenLORIS_images img5707.jpg OpenLORIS_images img2353.jpg OpenLORIS_images img3282.jpg OpenLORIS_images img3614.jpg OpenLORIS_images img1684.jpg OpenLORIS_images img1976.jpg OpenLORIS_images img2509.jpg OpenLORIS_images img3039.jpg OpenLORIS_images img5808.jpg OpenLORIS_images img5095.jpg OpenLORIS_images img5213.jpg OpenLORIS_images img5543.jpg OpenLORIS_images img4165.jpg OpenLORIS_images img4117.jpg OpenLORIS_images img833.jpg OpenLORIS_images img5813.jpg OpenLORIS_images img3008.jpg OpenLORIS_images img3954.jpg OpenLORIS_images img5830.jpg OpenLORIS_images img742.jpg OpenLORIS_images img1908.jpg OpenLORIS_images img2213.jpg OpenLORIS_images img1564.jpg OpenLORIS_images img1258.jpg OpenLORIS_images img4364.jpg OpenLORIS_images img4700.jpg OpenLORIS_images img4603.jpg OpenLORIS_images img850.jpg OpenLORIS_images img215.jpg OpenLORIS_images img4303.jpg OpenLORIS_images img2314.jpg OpenLORIS_images img1164.jpg OpenLORIS_images img2067.jpg OpenLORIS_images img2138.jpg OpenLORIS_images img3872.jpg OpenLORIS_images img5926.jpg OpenLORIS_images img4014.jpg OpenLORIS_images img1717.jpg OpenLORIS_images img2034.jpg OpenLORIS_images img2077.jpg OpenLORIS_images img1955.jpg OpenLORIS_images img4868.jpg OpenLORIS_images img3756.jpg OpenLORIS_images img92.jpg OpenLORIS_images img2584.jpg OpenLORIS_images img1790.jpg OpenLORIS_images img227.jpg OpenLORIS_images img2386.jpg OpenLORIS_images img1068.jpg OpenLORIS_images img5657.jpg OpenLORIS_images img3899.jpg 
OpenLORIS_images img1553.jpg OpenLORIS_images img2499.jpg OpenLORIS_images img157.jpg OpenLORIS_images img5232.jpg OpenLORIS_images img1343.jpg OpenLORIS_images img2094.jpg OpenLORIS_images img2305.jpg OpenLORIS_images img4841.jpg OpenLORIS_images img3894.jpg OpenLORIS_images img2563.jpg OpenLORIS_images img1088.jpg OpenLORIS_images img4966.jpg OpenLORIS_images img2388.jpg OpenLORIS_images img1770.jpg OpenLORIS_images img4197.jpg OpenLORIS_images img4435.jpg OpenLORIS_images img839.jpg OpenLORIS_images img1860.jpg OpenLORIS_images img4536.jpg OpenLORIS_images img5854.jpg OpenLORIS_images img2242.jpg OpenLORIS_images img4077.jpg OpenLORIS_images img2387.jpg OpenLORIS_images img1144.jpg OpenLORIS_images img4844.jpg OpenLORIS_images img4711.jpg OpenLORIS_images img3727.jpg OpenLORIS_images img2313.jpg OpenLORIS_images img1573.jpg OpenLORIS_images img867.jpg OpenLORIS_images img1551.jpg OpenLORIS_images img1978.jpg OpenLORIS_images img4294.jpg OpenLORIS_images img1826.jpg OpenLORIS_images img5296.jpg OpenLORIS_images img5700.jpg OpenLORIS_images img6099.jpg OpenLORIS_images img3965.jpg OpenLORIS_images img4512.jpg OpenLORIS_images img3555.jpg OpenLORIS_images img2809.jpg OpenLORIS_images img4468.jpg OpenLORIS_images img3895.jpg OpenLORIS_images img27.jpg OpenLORIS_images img1898.jpg OpenLORIS_images img3230.jpg OpenLORIS_images img435.jpg OpenLORIS_images img350.jpg OpenLORIS_images img1879.jpg OpenLORIS_images img1888.jpg OpenLORIS_images img2020.jpg OpenLORIS_images img6090.jpg OpenLORIS_images img4194.jpg OpenLORIS_images img2701.jpg OpenLORIS_images img4509.jpg OpenLORIS_images img1869.jpg OpenLORIS_images img2127.jpg OpenLORIS_images img4161.jpg OpenLORIS_images img59.jpg OpenLORIS_images img5000.jpg OpenLORIS_images img5681.jpg OpenLORIS_images img3684.jpg OpenLORIS_images img5460.jpg OpenLORIS_images img1238.jpg OpenLORIS_images img111.jpg OpenLORIS_images img2041.jpg OpenLORIS_images img3272.jpg OpenLORIS_images img542.jpg OpenLORIS_images img5647.jpg 
OpenLORIS_images img2869.jpg OpenLORIS_images img5481.jpg OpenLORIS_images img3398.jpg OpenLORIS_images img5459.jpg OpenLORIS_images img2841.jpg OpenLORIS_images img5817.jpg OpenLORIS_images img2331.jpg OpenLORIS_images img137.jpg OpenLORIS_images img4042.jpg OpenLORIS_images img1799.jpg OpenLORIS_images img1754.jpg OpenLORIS_images img256.jpg OpenLORIS_images img522.jpg OpenLORIS_images img4412.jpg OpenLORIS_images img4816.jpg OpenLORIS_images img3289.jpg OpenLORIS_images img1118.jpg OpenLORIS_images img3497.jpg OpenLORIS_images img2722.jpg OpenLORIS_images img5429.jpg OpenLORIS_images img737.jpg OpenLORIS_images img1642.jpg OpenLORIS_images img5989.jpg OpenLORIS_images img2520.jpg OpenLORIS_images img3064.jpg OpenLORIS_images img136.jpg OpenLORIS_images img276.jpg OpenLORIS_images img2663.jpg OpenLORIS_images img308.jpg OpenLORIS_images img1689.jpg OpenLORIS_images img342.jpg OpenLORIS_images img3933.jpg OpenLORIS_images img4387.jpg OpenLORIS_images img5546.jpg OpenLORIS_images img545.jpg OpenLORIS_images img2056.jpg OpenLORIS_images img3176.jpg OpenLORIS_images img4968.jpg OpenLORIS_images img4300.jpg OpenLORIS_images img1950.jpg OpenLORIS_images img4382.jpg OpenLORIS_images img6115.jpg OpenLORIS_images img2546.jpg OpenLORIS_images img3874.jpg OpenLORIS_images img3360.jpg OpenLORIS_images img1273.jpg OpenLORIS_images img1599.jpg OpenLORIS_images img77.jpg OpenLORIS_images img65.jpg OpenLORIS_images img254.jpg OpenLORIS_images img6016.jpg OpenLORIS_images img2983.jpg OpenLORIS_images img2900.jpg OpenLORIS_images img1623.jpg OpenLORIS_images img1300.jpg OpenLORIS_images img2811.jpg OpenLORIS_images img3632.jpg OpenLORIS_images img1812.jpg OpenLORIS_images img3739.jpg OpenLORIS_images img5028.jpg OpenLORIS_images img3004.jpg OpenLORIS_images img2851.jpg OpenLORIS_images img5690.jpg OpenLORIS_images img2096.jpg OpenLORIS_images img155.jpg OpenLORIS_images img49.jpg OpenLORIS_images img1014.jpg OpenLORIS_images img5764.jpg OpenLORIS_images img5160.jpg 
OpenLORIS_images img5396.jpg OpenLORIS_images img5302.jpg OpenLORIS_images img2840.jpg OpenLORIS_images img5009.jpg OpenLORIS_images img3701.jpg OpenLORIS_images img5696.jpg OpenLORIS_images img3245.jpg OpenLORIS_images img3093.jpg OpenLORIS_images img1556.jpg OpenLORIS_images img1696.jpg OpenLORIS_images img3921.jpg OpenLORIS_images img2643.jpg OpenLORIS_images img759.jpg OpenLORIS_images img549.jpg OpenLORIS_images img3527.jpg OpenLORIS_images img4132.jpg OpenLORIS_images img315.jpg OpenLORIS_images img5863.jpg OpenLORIS_images img6069.jpg OpenLORIS_images img5704.jpg OpenLORIS_images img4032.jpg OpenLORIS_images img734.jpg OpenLORIS_images img2168.jpg OpenLORIS_images img1348.jpg OpenLORIS_images img1338.jpg OpenLORIS_images img1497.jpg OpenLORIS_images img387.jpg OpenLORIS_images img1293.jpg OpenLORIS_images img1988.jpg OpenLORIS_images img1403.jpg OpenLORIS_images img1785.jpg OpenLORIS_images img4162.jpg OpenLORIS_images img4480.jpg OpenLORIS_images img4752.jpg OpenLORIS_images img1557.jpg OpenLORIS_images img964.jpg OpenLORIS_images img2595.jpg OpenLORIS_images img2126.jpg OpenLORIS_images img2967.jpg OpenLORIS_images img3362.jpg OpenLORIS_images img172.jpg OpenLORIS_images img4181.jpg OpenLORIS_images img3046.jpg OpenLORIS_images img4211.jpg OpenLORIS_images img1069.jpg OpenLORIS_images img2494.jpg OpenLORIS_images img4069.jpg OpenLORIS_images img1566.jpg OpenLORIS_images img4606.jpg OpenLORIS_images img5473.jpg OpenLORIS_images img2423.jpg OpenLORIS_images img2702.jpg OpenLORIS_images img1049.jpg OpenLORIS_images img3509.jpg OpenLORIS_images img458.jpg OpenLORIS_images img5774.jpg OpenLORIS_images img5841.jpg OpenLORIS_images img2508.jpg OpenLORIS_images img378.jpg OpenLORIS_images img5733.jpg OpenLORIS_images img2188.jpg OpenLORIS_images img2217.jpg OpenLORIS_images img3757.jpg OpenLORIS_images img3570.jpg OpenLORIS_images img2193.jpg OpenLORIS_images img5667.jpg OpenLORIS_images img2200.jpg OpenLORIS_images img705.jpg OpenLORIS_images img5608.jpg 
OpenLORIS_images img1875.jpg OpenLORIS_images img3517.jpg OpenLORIS_images img4582.jpg OpenLORIS_images img1250.jpg OpenLORIS_images img6013.jpg OpenLORIS_images img3451.jpg OpenLORIS_images img1716.jpg OpenLORIS_images img5673.jpg OpenLORIS_images img255.jpg OpenLORIS_images img5422.jpg OpenLORIS_images img2581.jpg OpenLORIS_images img1290.jpg OpenLORIS_images img928.jpg OpenLORIS_images img5649.jpg OpenLORIS_images img84.jpg OpenLORIS_images img230.jpg OpenLORIS_images img2449.jpg OpenLORIS_images img5503.jpg OpenLORIS_images img5142.jpg OpenLORIS_images img4637.jpg OpenLORIS_images img4098.jpg OpenLORIS_images img3549.jpg OpenLORIS_images img4329.jpg OpenLORIS_images img5050.jpg OpenLORIS_images img2289.jpg OpenLORIS_images img1170.jpg OpenLORIS_images img3363.jpg OpenLORIS_images img5138.jpg OpenLORIS_images img5901.jpg OpenLORIS_images img3442.jpg OpenLORIS_images img2845.jpg OpenLORIS_images img3666.jpg OpenLORIS_images img2273.jpg OpenLORIS_images img1589.jpg OpenLORIS_images img1055.jpg OpenLORIS_images img4909.jpg OpenLORIS_images img906.jpg OpenLORIS_images img359.jpg OpenLORIS_images img4198.jpg OpenLORIS_images img3217.jpg OpenLORIS_images img804.jpg OpenLORIS_images img2604.jpg OpenLORIS_images img6110.jpg OpenLORIS_images img5689.jpg OpenLORIS_images img2230.jpg OpenLORIS_images img3144.jpg OpenLORIS_images img3500.jpg OpenLORIS_images img4567.jpg OpenLORIS_images img29.jpg OpenLORIS_images img3768.jpg OpenLORIS_images img5675.jpg OpenLORIS_images img4220.jpg OpenLORIS_images img3150.jpg OpenLORIS_images img3414.jpg OpenLORIS_images img555.jpg OpenLORIS_images img1540.jpg OpenLORIS_images img4336.jpg OpenLORIS_images img3619.jpg OpenLORIS_images img3234.jpg OpenLORIS_images img3621.jpg OpenLORIS_images img2608.jpg OpenLORIS_images img5750.jpg OpenLORIS_images img5069.jpg OpenLORIS_images img3722.jpg OpenLORIS_images img1318.jpg OpenLORIS_images img1707.jpg OpenLORIS_images img1030.jpg OpenLORIS_images img4680.jpg OpenLORIS_images img5894.jpg 
OpenLORIS_images img622.jpg OpenLORIS_images img2725.jpg OpenLORIS_images img1475.jpg OpenLORIS_images img2590.jpg OpenLORIS_images img4704.jpg OpenLORIS_images img4504.jpg OpenLORIS_images img3488.jpg OpenLORIS_images img1992.jpg OpenLORIS_images img3386.jpg OpenLORIS_images img312.jpg OpenLORIS_images img48.jpg OpenLORIS_images img357.jpg OpenLORIS_images img1305.jpg OpenLORIS_images img1492.jpg OpenLORIS_images img3268.jpg OpenLORIS_images img209.jpg OpenLORIS_images img5337.jpg OpenLORIS_images img337.jpg OpenLORIS_images img2730.jpg OpenLORIS_images img253.jpg OpenLORIS_images img4017.jpg OpenLORIS_images img5486.jpg OpenLORIS_images img2832.jpg OpenLORIS_images img2592.jpg OpenLORIS_images img3897.jpg OpenLORIS_images img2321.jpg OpenLORIS_images img4820.jpg OpenLORIS_images img5283.jpg OpenLORIS_images img5932.jpg OpenLORIS_images img2820.jpg OpenLORIS_images img5317.jpg OpenLORIS_images img1996.jpg OpenLORIS_images img4583.jpg OpenLORIS_images img2429.jpg OpenLORIS_images img3584.jpg OpenLORIS_images img4189.jpg OpenLORIS_images img4011.jpg OpenLORIS_images img2994.jpg OpenLORIS_images img753.jpg OpenLORIS_images img6010.jpg OpenLORIS_images img2206.jpg OpenLORIS_images img2371.jpg OpenLORIS_images img1942.jpg OpenLORIS_images img1477.jpg OpenLORIS_images img4355.jpg OpenLORIS_images img2278.jpg OpenLORIS_images img610.jpg OpenLORIS_images img548.jpg OpenLORIS_images img5392.jpg OpenLORIS_images img5993.jpg OpenLORIS_images img5011.jpg OpenLORIS_images img2466.jpg OpenLORIS_images img1190.jpg OpenLORIS_images img2596.jpg OpenLORIS_images img4381.jpg OpenLORIS_images img2704.jpg OpenLORIS_images img1856.jpg OpenLORIS_images img2362.jpg OpenLORIS_images img3794.jpg OpenLORIS_images img3647.jpg OpenLORIS_images img447.jpg OpenLORIS_images img5540.jpg OpenLORIS_images img4617.jpg OpenLORIS_images img1400.jpg OpenLORIS_images img4714.jpg OpenLORIS_images img2306.jpg OpenLORIS_images img5091.jpg OpenLORIS_images img3521.jpg OpenLORIS_images img4561.jpg 
OpenLORIS_images img2767.jpg OpenLORIS_images img5244.jpg OpenLORIS_images img2142.jpg OpenLORIS_images img1895.jpg OpenLORIS_images img2964.jpg OpenLORIS_images img3015.jpg OpenLORIS_images img1184.jpg OpenLORIS_images img1958.jpg OpenLORIS_images img3972.jpg OpenLORIS_images img455.jpg OpenLORIS_images img832.jpg OpenLORIS_images img2899.jpg OpenLORIS_images img3576.jpg OpenLORIS_images img4368.jpg OpenLORIS_images img179.jpg OpenLORIS_images img2906.jpg OpenLORIS_images img4416.jpg OpenLORIS_images img2846.jpg OpenLORIS_images img770.jpg OpenLORIS_images img421.jpg OpenLORIS_images img2911.jpg OpenLORIS_images img4225.jpg OpenLORIS_images img5821.jpg OpenLORIS_images img5357.jpg OpenLORIS_images img5531.jpg OpenLORIS_images img1075.jpg OpenLORIS_images img5319.jpg OpenLORIS_images img2789.jpg OpenLORIS_images img5600.jpg OpenLORIS_images img3801.jpg OpenLORIS_images img2863.jpg OpenLORIS_images img682.jpg OpenLORIS_images img4226.jpg OpenLORIS_images img2312.jpg OpenLORIS_images img3133.jpg OpenLORIS_images img2896.jpg OpenLORIS_images img501.jpg OpenLORIS_images img1378.jpg OpenLORIS_images img1595.jpg OpenLORIS_images img475.jpg OpenLORIS_images img3777.jpg OpenLORIS_images img5711.jpg OpenLORIS_images img5238.jpg OpenLORIS_images img5493.jpg OpenLORIS_images img6007.jpg OpenLORIS_images img3423.jpg OpenLORIS_images img994.jpg OpenLORIS_images img5076.jpg OpenLORIS_images img1593.jpg OpenLORIS_images img570.jpg OpenLORIS_images img5316.jpg OpenLORIS_images img990.jpg OpenLORIS_images img5509.jpg OpenLORIS_images img5551.jpg OpenLORIS_images img760.jpg OpenLORIS_images img238.jpg OpenLORIS_images img2550.jpg OpenLORIS_images img3422.jpg OpenLORIS_images img4048.jpg OpenLORIS_images img103.jpg OpenLORIS_images img5144.jpg OpenLORIS_images img79.jpg OpenLORIS_images img2231.jpg OpenLORIS_images img2810.jpg OpenLORIS_images img4874.jpg OpenLORIS_images img3792.jpg OpenLORIS_images img1819.jpg OpenLORIS_images img1083.jpg OpenLORIS_images img3300.jpg 
OpenLORIS_images img3888.jpg OpenLORIS_images img3281.jpg OpenLORIS_images img142.jpg OpenLORIS_images img5075.jpg OpenLORIS_images img3908.jpg OpenLORIS_images img3468.jpg OpenLORIS_images img4241.jpg OpenLORIS_images img717.jpg OpenLORIS_images img4044.jpg OpenLORIS_images img4239.jpg OpenLORIS_images img5611.jpg OpenLORIS_images img2795.jpg OpenLORIS_images img2292.jpg OpenLORIS_images img2447.jpg OpenLORIS_images img615.jpg OpenLORIS_images img5280.jpg OpenLORIS_images img2003.jpg OpenLORIS_images img996.jpg OpenLORIS_images img4419.jpg OpenLORIS_images img1167.jpg OpenLORIS_images img5701.jpg OpenLORIS_images img6032.jpg OpenLORIS_images img4292.jpg OpenLORIS_images img1470.jpg OpenLORIS_images img2998.jpg OpenLORIS_images img6034.jpg OpenLORIS_images img5723.jpg OpenLORIS_images img647.jpg OpenLORIS_images img2718.jpg OpenLORIS_images img1024.jpg OpenLORIS_images img5902.jpg OpenLORIS_images img1777.jpg OpenLORIS_images img30.jpg OpenLORIS_images img733.jpg OpenLORIS_images img5983.jpg OpenLORIS_images img653.jpg OpenLORIS_images img1286.jpg OpenLORIS_images img2164.jpg OpenLORIS_images img5434.jpg OpenLORIS_images img4321.jpg OpenLORIS_images img2958.jpg OpenLORIS_images img5860.jpg OpenLORIS_images img2904.jpg OpenLORIS_images img5987.jpg OpenLORIS_images img5046.jpg OpenLORIS_images img3865.jpg OpenLORIS_images img5900.jpg OpenLORIS_images img4199.jpg OpenLORIS_images img2984.jpg OpenLORIS_images img904.jpg OpenLORIS_images img4884.jpg OpenLORIS_images img206.jpg OpenLORIS_images img4371.jpg OpenLORIS_images img4478.jpg OpenLORIS_images img4823.jpg OpenLORIS_images img960.jpg OpenLORIS_images img3859.jpg OpenLORIS_images img844.jpg OpenLORIS_images img1641.jpg OpenLORIS_images img3539.jpg OpenLORIS_images img2992.jpg OpenLORIS_images img5146.jpg OpenLORIS_images img6029.jpg OpenLORIS_images img4682.jpg OpenLORIS_images img1332.jpg OpenLORIS_images img1727.jpg OpenLORIS_images img3330.jpg OpenLORIS_images img3009.jpg OpenLORIS_images img2017.jpg 
OpenLORIS_images img1877.jpg OpenLORIS_images img5194.jpg OpenLORIS_images img1099.jpg OpenLORIS_images img3952.jpg OpenLORIS_images img3240.jpg OpenLORIS_images img1149.jpg OpenLORIS_images img3950.jpg OpenLORIS_images img1381.jpg OpenLORIS_images img959.jpg OpenLORIS_images img3430.jpg OpenLORIS_images img446.jpg OpenLORIS_images img4932.jpg OpenLORIS_images img5158.jpg OpenLORIS_images img2284.jpg OpenLORIS_images img283.jpg OpenLORIS_images img983.jpg OpenLORIS_images img382.jpg OpenLORIS_images img4995.jpg OpenLORIS_images img674.jpg OpenLORIS_images img1391.jpg OpenLORIS_images img4301.jpg OpenLORIS_images img3079.jpg OpenLORIS_images img5694.jpg OpenLORIS_images img1724.jpg OpenLORIS_images img403.jpg OpenLORIS_images img3916.jpg OpenLORIS_images img4389.jpg OpenLORIS_images img2436.jpg OpenLORIS_images img5412.jpg OpenLORIS_images img1858.jpg OpenLORIS_images img5098.jpg OpenLORIS_images img9.jpg OpenLORIS_images img5866.jpg OpenLORIS_images img3294.jpg OpenLORIS_images img4052.jpg OpenLORIS_images img2328.jpg OpenLORIS_images img523.jpg OpenLORIS_images img5288.jpg OpenLORIS_images img5120.jpg OpenLORIS_images img541.jpg OpenLORIS_images img2987.jpg OpenLORIS_images img3014.jpg OpenLORIS_images img3938.jpg OpenLORIS_images img2180.jpg OpenLORIS_images img4134.jpg OpenLORIS_images img831.jpg OpenLORIS_images img6038.jpg OpenLORIS_images img4115.jpg OpenLORIS_images img1035.jpg OpenLORIS_images img5728.jpg OpenLORIS_images img2898.jpg OpenLORIS_images img302.jpg OpenLORIS_images img5687.jpg OpenLORIS_images img683.jpg OpenLORIS_images img5428.jpg OpenLORIS_images img3220.jpg OpenLORIS_images img2559.jpg OpenLORIS_images img5291.jpg OpenLORIS_images img3333.jpg OpenLORIS_images img2929.jpg OpenLORIS_images img2063.jpg OpenLORIS_images img1889.jpg OpenLORIS_images img4831.jpg OpenLORIS_images img4080.jpg OpenLORIS_images img1634.jpg OpenLORIS_images img1745.jpg OpenLORIS_images img6085.jpg OpenLORIS_images img1143.jpg OpenLORIS_images img3327.jpg 
OpenLORIS_images img723.jpg OpenLORIS_images img2776.jpg OpenLORIS_images img5698.jpg OpenLORIS_images img1357.jpg OpenLORIS_images img5284.jpg OpenLORIS_images img4360.jpg OpenLORIS_images img4185.jpg OpenLORIS_images img537.jpg OpenLORIS_images img1938.jpg OpenLORIS_images img4164.jpg OpenLORIS_images img1214.jpg OpenLORIS_images img2781.jpg OpenLORIS_images img114.jpg OpenLORIS_images img2665.jpg OpenLORIS_images img1486.jpg OpenLORIS_images img3444.jpg OpenLORIS_images img4628.jpg OpenLORIS_images img671.jpg OpenLORIS_images img5692.jpg OpenLORIS_images img1267.jpg OpenLORIS_images img4570.jpg OpenLORIS_images img4634.jpg OpenLORIS_images img629.jpg OpenLORIS_images img716.jpg OpenLORIS_images img3738.jpg OpenLORIS_images img5913.jpg OpenLORIS_images img1433.jpg OpenLORIS_images img1722.jpg OpenLORIS_images img4233.jpg OpenLORIS_images img967.jpg OpenLORIS_images img109.jpg OpenLORIS_images img2720.jpg OpenLORIS_images img4192.jpg OpenLORIS_images img3196.jpg OpenLORIS_images img89.jpg OpenLORIS_images img1712.jpg OpenLORIS_images img2218.jpg OpenLORIS_images img4876.jpg OpenLORIS_images img1949.jpg OpenLORIS_images img467.jpg OpenLORIS_images img572.jpg OpenLORIS_images img2025.jpg OpenLORIS_images img4677.jpg OpenLORIS_images img2471.jpg OpenLORIS_images img4878.jpg OpenLORIS_images img5088.jpg OpenLORIS_images img433.jpg OpenLORIS_images img3169.jpg OpenLORIS_images img5223.jpg OpenLORIS_images img3649.jpg OpenLORIS_images img2214.jpg OpenLORIS_images img3288.jpg OpenLORIS_images img552.jpg OpenLORIS_images img1454.jpg OpenLORIS_images img4540.jpg OpenLORIS_images img1019.jpg OpenLORIS_images img4150.jpg OpenLORIS_images img3195.jpg OpenLORIS_images img5927.jpg OpenLORIS_images img762.jpg OpenLORIS_images img5628.jpg OpenLORIS_images img2259.jpg OpenLORIS_images img1685.jpg OpenLORIS_images img3963.jpg OpenLORIS_images img280.jpg OpenLORIS_images img5759.jpg OpenLORIS_images img5671.jpg OpenLORIS_images img138.jpg OpenLORIS_images img2696.jpg 
OpenLORIS_images img4083.jpg OpenLORIS_images img1233.jpg OpenLORIS_images img2182.jpg OpenLORIS_images img2545.jpg OpenLORIS_images img1941.jpg OpenLORIS_images img456.jpg OpenLORIS_images img4357.jpg OpenLORIS_images img5260.jpg OpenLORIS_images img4659.jpg OpenLORIS_images img5732.jpg OpenLORIS_images img427.jpg OpenLORIS_images img5636.jpg OpenLORIS_images img4026.jpg OpenLORIS_images img3076.jpg OpenLORIS_images img5946.jpg OpenLORIS_images img4004.jpg OpenLORIS_images img2227.jpg OpenLORIS_images img2587.jpg OpenLORIS_images img1926.jpg OpenLORIS_images img3483.jpg OpenLORIS_images img3148.jpg OpenLORIS_images img4720.jpg OpenLORIS_images img100.jpg OpenLORIS_images img1802.jpg OpenLORIS_images img2383.jpg OpenLORIS_images img5380.jpg OpenLORIS_images img856.jpg OpenLORIS_images img1901.jpg OpenLORIS_images img2111.jpg OpenLORIS_images img6082.jpg OpenLORIS_images img4756.jpg OpenLORIS_images img614.jpg OpenLORIS_images img2477.jpg OpenLORIS_images img3625.jpg OpenLORIS_images img4912.jpg OpenLORIS_images img2299.jpg OpenLORIS_images img3116.jpg OpenLORIS_images img3090.jpg OpenLORIS_images img3435.jpg OpenLORIS_images img1102.jpg OpenLORIS_images img2857.jpg OpenLORIS_images img4646.jpg OpenLORIS_images img469.jpg OpenLORIS_images img2788.jpg OpenLORIS_images img5575.jpg OpenLORIS_images img5141.jpg OpenLORIS_images img1007.jpg OpenLORIS_images img1769.jpg OpenLORIS_images img1346.jpg OpenLORIS_images img4829.jpg OpenLORIS_images img5975.jpg OpenLORIS_images img3287.jpg OpenLORIS_images img3496.jpg OpenLORIS_images img4579.jpg OpenLORIS_images img1830.jpg OpenLORIS_images img485.jpg OpenLORIS_images img4809.jpg OpenLORIS_images img2650.jpg OpenLORIS_images img3970.jpg OpenLORIS_images img3985.jpg OpenLORIS_images img2224.jpg OpenLORIS_images img1455.jpg OpenLORIS_images img3602.jpg OpenLORIS_images img5847.jpg OpenLORIS_images img854.jpg OpenLORIS_images img556.jpg OpenLORIS_images img4724.jpg OpenLORIS_images img5513.jpg OpenLORIS_images img2187.jpg 
OpenLORIS_images img1197.jpg OpenLORIS_images img384.jpg OpenLORIS_images img5595.jpg OpenLORIS_images img4623.jpg OpenLORIS_images img5153.jpg OpenLORIS_images img118.jpg OpenLORIS_images img1375.jpg OpenLORIS_images img1974.jpg OpenLORIS_images img5055.jpg OpenLORIS_images img1425.jpg OpenLORIS_images img3378.jpg OpenLORIS_images img718.jpg OpenLORIS_images img4434.jpg OpenLORIS_images img1921.jpg OpenLORIS_images img5426.jpg OpenLORIS_images img3731.jpg OpenLORIS_images img4535.jpg OpenLORIS_images img1759.jpg OpenLORIS_images img1518.jpg OpenLORIS_images img534.jpg OpenLORIS_images img1849.jpg OpenLORIS_images img4058.jpg OpenLORIS_images img123.jpg OpenLORIS_images img5234.jpg OpenLORIS_images img4295.jpg OpenLORIS_images img497.jpg OpenLORIS_images img3215.jpg OpenLORIS_images img2485.jpg OpenLORIS_images img3796.jpg OpenLORIS_images img5862.jpg OpenLORIS_images img3926.jpg OpenLORIS_images img4545.jpg OpenLORIS_images img1132.jpg OpenLORIS_images img4742.jpg OpenLORIS_images img4104.jpg OpenLORIS_images img3256.jpg OpenLORIS_images img3931.jpg OpenLORIS_images img2631.jpg OpenLORIS_images img4280.jpg OpenLORIS_images img5693.jpg OpenLORIS_images img4892.jpg OpenLORIS_images img232.jpg OpenLORIS_images img5243.jpg OpenLORIS_images img5938.jpg OpenLORIS_images img2489.jpg OpenLORIS_images img748.jpg OpenLORIS_images img4036.jpg OpenLORIS_images img1652.jpg OpenLORIS_images img1319.jpg OpenLORIS_images img2962.jpg OpenLORIS_images img3388.jpg OpenLORIS_images img4815.jpg OpenLORIS_images img2769.jpg OpenLORIS_images img5998.jpg OpenLORIS_images img2681.jpg OpenLORIS_images img589.jpg OpenLORIS_images img3186.jpg OpenLORIS_images img1999.jpg OpenLORIS_images img908.jpg OpenLORIS_images img2042.jpg OpenLORIS_images img2946.jpg OpenLORIS_images img5738.jpg OpenLORIS_images img4072.jpg OpenLORIS_images img581.jpg OpenLORIS_images img1661.jpg OpenLORIS_images img1390.jpg OpenLORIS_images img2648.jpg OpenLORIS_images img3624.jpg OpenLORIS_images img4786.jpg 
OpenLORIS_images img563.jpg OpenLORIS_images img2504.jpg OpenLORIS_images img1725.jpg OpenLORIS_images img2617.jpg OpenLORIS_images img26.jpg OpenLORIS_images img5792.jpg OpenLORIS_images img1806.jpg OpenLORIS_images img2972.jpg OpenLORIS_images img780.jpg OpenLORIS_images img2668.jpg OpenLORIS_images img4580.jpg OpenLORIS_images img5374.jpg OpenLORIS_images img14.jpg OpenLORIS_images img1207.jpg OpenLORIS_images img2037.jpg OpenLORIS_images img2416.jpg OpenLORIS_images img5801.jpg OpenLORIS_images img3401.jpg OpenLORIS_images img1432.jpg OpenLORIS_images img1816.jpg OpenLORIS_images img1765.jpg OpenLORIS_images img3773.jpg OpenLORIS_images img121.jpg OpenLORIS_images img377.jpg OpenLORIS_images img3779.jpg OpenLORIS_images img284.jpg OpenLORIS_images img326.jpg OpenLORIS_images img1281.jpg OpenLORIS_images img5714.jpg OpenLORIS_images img4082.jpg OpenLORIS_images img2409.jpg OpenLORIS_images img5800.jpg OpenLORIS_images img3900.jpg OpenLORIS_images img2616.jpg OpenLORIS_images img3108.jpg OpenLORIS_images img1902.jpg OpenLORIS_images img1810.jpg OpenLORIS_images img3140.jpg OpenLORIS_images img4712.jpg OpenLORIS_images img5306.jpg OpenLORIS_images img1841.jpg OpenLORIS_images img221.jpg OpenLORIS_images img1249.jpg OpenLORIS_images img3020.jpg OpenLORIS_images img676.jpg OpenLORIS_images img1761.jpg OpenLORIS_images img3476.jpg OpenLORIS_images img3120.jpg OpenLORIS_images img5609.jpg OpenLORIS_images img2766.jpg OpenLORIS_images img1005.jpg OpenLORIS_images img2856.jpg OpenLORIS_images img4588.jpg OpenLORIS_images img5110.jpg OpenLORIS_images img5542.jpg OpenLORIS_images img4088.jpg OpenLORIS_images img5784.jpg OpenLORIS_images img5947.jpg OpenLORIS_images img1596.jpg OpenLORIS_images img3807.jpg OpenLORIS_images img1146.jpg OpenLORIS_images img585.jpg OpenLORIS_images img273.jpg OpenLORIS_images img3318.jpg OpenLORIS_images img3732.jpg OpenLORIS_images img5982.jpg OpenLORIS_images img5957.jpg OpenLORIS_images img2102.jpg OpenLORIS_images img1380.jpg 
OpenLORIS_images img837.jpg OpenLORIS_images img913.jpg OpenLORIS_images img116.jpg OpenLORIS_images img1825.jpg OpenLORIS_images img4096.jpg OpenLORIS_images img3708.jpg OpenLORIS_images img2045.jpg OpenLORIS_images img3863.jpg OpenLORIS_images img3326.jpg OpenLORIS_images img940.jpg OpenLORIS_images img3059.jpg OpenLORIS_images img267.jpg OpenLORIS_images img2554.jpg OpenLORIS_images img5257.jpg OpenLORIS_images img595.jpg OpenLORIS_images img507.jpg OpenLORIS_images img1124.jpg OpenLORIS_images img5419.jpg OpenLORIS_images img173.jpg OpenLORIS_images img5304.jpg OpenLORIS_images img2136.jpg OpenLORIS_images img5557.jpg OpenLORIS_images img3842.jpg OpenLORIS_images img3054.jpg OpenLORIS_images img902.jpg OpenLORIS_images img5051.jpg OpenLORIS_images img4990.jpg OpenLORIS_images img4209.jpg OpenLORIS_images img1488.jpg OpenLORIS_images img529.jpg OpenLORIS_images img4218.jpg OpenLORIS_images img1571.jpg OpenLORIS_images img2970.jpg OpenLORIS_images img2199.jpg OpenLORIS_images img665.jpg OpenLORIS_images img575.jpg OpenLORIS_images img5994.jpg OpenLORIS_images img6086.jpg OpenLORIS_images img5703.jpg OpenLORIS_images img4639.jpg OpenLORIS_images img3188.jpg OpenLORIS_images img298.jpg OpenLORIS_images img2069.jpg OpenLORIS_images img4202.jpg OpenLORIS_images img675.jpg OpenLORIS_images img5154.jpg OpenLORIS_images img670.jpg OpenLORIS_images img3301.jpg OpenLORIS_images img3482.jpg OpenLORIS_images img1017.jpg OpenLORIS_images img5128.jpg OpenLORIS_images img1279.jpg OpenLORIS_images img2677.jpg OpenLORIS_images img4331.jpg OpenLORIS_images img3307.jpg OpenLORIS_images img5910.jpg OpenLORIS_images img527.jpg OpenLORIS_images img6008.jpg OpenLORIS_images img2356.jpg OpenLORIS_images img3341.jpg OpenLORIS_images img2966.jpg OpenLORIS_images img773.jpg OpenLORIS_images img5390.jpg OpenLORIS_images img1009.jpg OpenLORIS_images img3911.jpg OpenLORIS_images img1430.jpg OpenLORIS_images img3928.jpg OpenLORIS_images img3487.jpg OpenLORIS_images img1741.jpg 
OpenLORIS_images img4116.jpg OpenLORIS_images img5577.jpg OpenLORIS_images img1216.jpg OpenLORIS_images img6071.jpg OpenLORIS_images img4598.jpg OpenLORIS_images img2639.jpg OpenLORIS_images img2528.jpg OpenLORIS_images img859.jpg OpenLORIS_images img4650.jpg OpenLORIS_images img4871.jpg OpenLORIS_images img3934.jpg OpenLORIS_images img4781.jpg OpenLORIS_images img186.jpg OpenLORIS_images img362.jpg OpenLORIS_images img405.jpg OpenLORIS_images img1874.jpg OpenLORIS_images img4683.jpg OpenLORIS_images img2389.jpg OpenLORIS_images img5651.jpg OpenLORIS_images img2322.jpg OpenLORIS_images img3745.jpg OpenLORIS_images img3439.jpg OpenLORIS_images img669.jpg OpenLORIS_images img3821.jpg OpenLORIS_images img4348.jpg OpenLORIS_images img2555.jpg OpenLORIS_images img5191.jpg OpenLORIS_images img1342.jpg OpenLORIS_images img2036.jpg OpenLORIS_images img3349.jpg OpenLORIS_images img3799.jpg OpenLORIS_images img731.jpg OpenLORIS_images img306.jpg OpenLORIS_images img3563.jpg OpenLORIS_images img2294.jpg OpenLORIS_images img611.jpg OpenLORIS_images img222.jpg OpenLORIS_images img6025.jpg OpenLORIS_images img1827.jpg OpenLORIS_images img2945.jpg OpenLORIS_images img5656.jpg OpenLORIS_images img2338.jpg OpenLORIS_images img1714.jpg OpenLORIS_images img2190.jpg OpenLORIS_images img2232.jpg OpenLORIS_images img1657.jpg OpenLORIS_images img3447.jpg OpenLORIS_images img2113.jpg OpenLORIS_images img5113.jpg OpenLORIS_images img3764.jpg OpenLORIS_images img3569.jpg OpenLORIS_images img5325.jpg OpenLORIS_images img6024.jpg OpenLORIS_images img3641.jpg OpenLORIS_images img559.jpg OpenLORIS_images img5949.jpg OpenLORIS_images img939.jpg OpenLORIS_images img2075.jpg OpenLORIS_images img1961.jpg OpenLORIS_images img2680.jpg OpenLORIS_images img352.jpg OpenLORIS_images img2019.jpg OpenLORIS_images img218.jpg OpenLORIS_images img338.jpg OpenLORIS_images img4826.jpg OpenLORIS_images img5212.jpg OpenLORIS_images img2339.jpg OpenLORIS_images img5209.jpg OpenLORIS_images img1073.jpg 
OpenLORIS_images img547.jpg OpenLORIS_images img894.jpg OpenLORIS_images img2661.jpg OpenLORIS_images img830.jpg OpenLORIS_images img5228.jpg OpenLORIS_images img2375.jpg OpenLORIS_images img375.jpg OpenLORIS_images img2838.jpg OpenLORIS_images img262.jpg OpenLORIS_images img1462.jpg OpenLORIS_images img4984.jpg OpenLORIS_images img3296.jpg OpenLORIS_images img4093.jpg OpenLORIS_images img2721.jpg OpenLORIS_images img560.jpg OpenLORIS_images img5241.jpg OpenLORIS_images img135.jpg OpenLORIS_images img3162.jpg OpenLORIS_images img4193.jpg OpenLORIS_images img440.jpg OpenLORIS_images img739.jpg OpenLORIS_images img4670.jpg OpenLORIS_images img2002.jpg OpenLORIS_images img474.jpg OpenLORIS_images img845.jpg OpenLORIS_images img4409.jpg OpenLORIS_images img1885.jpg OpenLORIS_images img3578.jpg OpenLORIS_images img1412.jpg OpenLORIS_images img2191.jpg OpenLORIS_images img1519.jpg OpenLORIS_images img4821.jpg OpenLORIS_images img43.jpg OpenLORIS_images img3704.jpg OpenLORIS_images img161.jpg OpenLORIS_images img1360.jpg OpenLORIS_images img5266.jpg OpenLORIS_images img2061.jpg OpenLORIS_images img2336.jpg OpenLORIS_images img3808.jpg OpenLORIS_images img5679.jpg OpenLORIS_images img6142.jpg OpenLORIS_images img2497.jpg OpenLORIS_images img1655.jpg OpenLORIS_images img3352.jpg OpenLORIS_images img4731.jpg OpenLORIS_images img2979.jpg OpenLORIS_images img1555.jpg OpenLORIS_images img3984.jpg OpenLORIS_images img3557.jpg OpenLORIS_images img1632.jpg OpenLORIS_images img590.jpg OpenLORIS_images img1397.jpg OpenLORIS_images img3018.jpg OpenLORIS_images img5107.jpg OpenLORIS_images img5488.jpg OpenLORIS_images img3881.jpg OpenLORIS_images img3051.jpg OpenLORIS_images img1211.jpg OpenLORIS_images img5553.jpg OpenLORIS_images img5383.jpg OpenLORIS_images img2959.jpg OpenLORIS_images img2703.jpg OpenLORIS_images img3132.jpg OpenLORIS_images img5035.jpg OpenLORIS_images img3396.jpg OpenLORIS_images img1289.jpg OpenLORIS_images img3536.jpg OpenLORIS_images img4311.jpg 
OpenLORIS_images img5189.jpg OpenLORIS_images img2513.jpg OpenLORIS_images img2794.jpg OpenLORIS_images img4469.jpg OpenLORIS_images img3007.jpg OpenLORIS_images img1568.jpg OpenLORIS_images img2478.jpg OpenLORIS_images img5954.jpg OpenLORIS_images img2116.jpg OpenLORIS_images img2522.jpg OpenLORIS_images img1151.jpg OpenLORIS_images img1892.jpg OpenLORIS_images img2908.jpg OpenLORIS_images img4376.jpg OpenLORIS_images img5092.jpg OpenLORIS_images img3702.jpg OpenLORIS_images img3526.jpg OpenLORIS_images img2355.jpg OpenLORIS_images img754.jpg OpenLORIS_images img3682.jpg OpenLORIS_images img5718.jpg OpenLORIS_images img3348.jpg OpenLORIS_images img5264.jpg OpenLORIS_images img4812.jpg OpenLORIS_images img1719.jpg OpenLORIS_images img4755.jpg OpenLORIS_images img5763.jpg OpenLORIS_images img2420.jpg OpenLORIS_images img5102.jpg OpenLORIS_images img2065.jpg OpenLORIS_images img2771.jpg OpenLORIS_images img5803.jpg OpenLORIS_images img3350.jpg OpenLORIS_images img4741.jpg OpenLORIS_images img292.jpg OpenLORIS_images img4911.jpg OpenLORIS_images img182.jpg OpenLORIS_images img5261.jpg OpenLORIS_images img4039.jpg OpenLORIS_images img2198.jpg OpenLORIS_images img5632.jpg OpenLORIS_images img1639.jpg OpenLORIS_images img5271.jpg OpenLORIS_images img5345.jpg OpenLORIS_images img921.jpg OpenLORIS_images img5253.jpg OpenLORIS_images img4776.jpg OpenLORIS_images img95.jpg OpenLORIS_images img612.jpg OpenLORIS_images img20.jpg OpenLORIS_images img3726.jpg OpenLORIS_images img3409.jpg OpenLORIS_images img3714.jpg OpenLORIS_images img2441.jpg OpenLORIS_images img4204.jpg OpenLORIS_images img2921.jpg OpenLORIS_images img909.jpg OpenLORIS_images img2630.jpg OpenLORIS_images img1610.jpg OpenLORIS_images img2813.jpg OpenLORIS_images img1052.jpg OpenLORIS_images img4210.jpg OpenLORIS_images img5294.jpg OpenLORIS_images img5695.jpg OpenLORIS_images img1797.jpg OpenLORIS_images img5329.jpg OpenLORIS_images img5262.jpg OpenLORIS_images img3356.jpg OpenLORIS_images img6123.jpg 
OpenLORIS_images img2469.jpg OpenLORIS_images img1473.jpg OpenLORIS_images img3338.jpg OpenLORIS_images img599.jpg OpenLORIS_images img4886.jpg OpenLORIS_images img169.jpg OpenLORIS_images img1033.jpg OpenLORIS_images img4063.jpg OpenLORIS_images img5591.jpg OpenLORIS_images img5535.jpg OpenLORIS_images img6011.jpg OpenLORIS_images img1148.jpg OpenLORIS_images img4574.jpg OpenLORIS_images img6030.jpg OpenLORIS_images img4906.jpg OpenLORIS_images img1404.jpg OpenLORIS_images img1615.jpg OpenLORIS_images img6131.jpg OpenLORIS_images img1059.jpg OpenLORIS_images img553.jpg OpenLORIS_images img4586.jpg OpenLORIS_images img3553.jpg OpenLORIS_images img1478.jpg OpenLORIS_images img6004.jpg OpenLORIS_images img4657.jpg OpenLORIS_images img2612.jpg OpenLORIS_images img1882.jpg OpenLORIS_images img5308.jpg OpenLORIS_images img3615.jpg OpenLORIS_images img2844.jpg OpenLORIS_images img6019.jpg OpenLORIS_images img1755.jpg OpenLORIS_images img644.jpg OpenLORIS_images img462.jpg OpenLORIS_images img4577.jpg OpenLORIS_images img2285.jpg OpenLORIS_images img3011.jpg OpenLORIS_images img21.jpg OpenLORIS_images img1954.jpg OpenLORIS_images img2032.jpg OpenLORIS_images img5948.jpg OpenLORIS_images img4457.jpg OpenLORIS_images img5440.jpg OpenLORIS_images img3809.jpg OpenLORIS_images img2858.jpg OpenLORIS_images img4939.jpg OpenLORIS_images img986.jpg OpenLORIS_images img0.jpg OpenLORIS_images img3232.jpg OpenLORIS_images img4456.jpg OpenLORIS_images img4182.jpg OpenLORIS_images img5471.jpg OpenLORIS_images img2600.jpg OpenLORIS_images img711.jpg OpenLORIS_images img3248.jpg OpenLORIS_images img4867.jpg OpenLORIS_images img1669.jpg OpenLORIS_images img713.jpg OpenLORIS_images img3528.jpg OpenLORIS_images img1738.jpg OpenLORIS_images img924.jpg OpenLORIS_images img4214.jpg OpenLORIS_images img5794.jpg OpenLORIS_images img2637.jpg OpenLORIS_images img5025.jpg OpenLORIS_images img1973.jpg OpenLORIS_images img5955.jpg OpenLORIS_images img1629.jpg OpenLORIS_images img4242.jpg 
OpenLORIS_images img2340.jpg OpenLORIS_images img2790.jpg OpenLORIS_images img251.jpg OpenLORIS_images img2772.jpg OpenLORIS_images img5295.jpg OpenLORIS_images img3636.jpg OpenLORIS_images img4493.jpg OpenLORIS_images img2081.jpg OpenLORIS_images img4522.jpg OpenLORIS_images img918.jpg OpenLORIS_images img3552.jpg OpenLORIS_images img295.jpg OpenLORIS_images img1600.jpg OpenLORIS_images img3089.jpg OpenLORIS_images img3037.jpg OpenLORIS_images img2097.jpg OpenLORIS_images img1469.jpg OpenLORIS_images img1953.jpg OpenLORIS_images img4612.jpg OpenLORIS_images img624.jpg OpenLORIS_images img3411.jpg OpenLORIS_images img2348.jpg OpenLORIS_images img2814.jpg OpenLORIS_images img5740.jpg OpenLORIS_images img1097.jpg OpenLORIS_images img5255.jpg OpenLORIS_images img4073.jpg OpenLORIS_images img944.jpg OpenLORIS_images img3635.jpg OpenLORIS_images img3515.jpg OpenLORIS_images img2642.jpg OpenLORIS_images img2194.jpg OpenLORIS_images img2337.jpg OpenLORIS_images img989.jpg OpenLORIS_images img2443.jpg OpenLORIS_images img3565.jpg OpenLORIS_images img4485.jpg OpenLORIS_images img397.jpg OpenLORIS_images img3353.jpg OpenLORIS_images img185.jpg OpenLORIS_images img5062.jpg OpenLORIS_images img6094.jpg OpenLORIS_images img2235.jpg OpenLORIS_images img1514.jpg OpenLORIS_images img4937.jpg OpenLORIS_images img4263.jpg OpenLORIS_images img1483.jpg OpenLORIS_images img1113.jpg OpenLORIS_images img3541.jpg OpenLORIS_images img1498.jpg OpenLORIS_images img3657.jpg OpenLORIS_images img1756.jpg OpenLORIS_images img2502.jpg OpenLORIS_images img5018.jpg OpenLORIS_images img3100.jpg OpenLORIS_images img2798.jpg OpenLORIS_images img3377.jpg OpenLORIS_images img4761.jpg OpenLORIS_images img5877.jpg OpenLORIS_images img5013.jpg OpenLORIS_images img4101.jpg OpenLORIS_images img4085.jpg OpenLORIS_images img346.jpg OpenLORIS_images img1372.jpg OpenLORIS_images img4914.jpg OpenLORIS_images img3947.jpg OpenLORIS_images img5303.jpg OpenLORIS_images img2787.jpg OpenLORIS_images img4967.jpg 
OpenLORIS_images img3989.jpg OpenLORIS_images img2594.jpg OpenLORIS_images img619.jpg OpenLORIS_images img4917.jpg OpenLORIS_images img3956.jpg OpenLORIS_images img2937.jpg OpenLORIS_images img1673.jpg OpenLORIS_images img2886.jpg OpenLORIS_images img1811.jpg OpenLORIS_images img5861.jpg OpenLORIS_images img2052.jpg OpenLORIS_images img1351.jpg OpenLORIS_images img577.jpg OpenLORIS_images img2932.jpg OpenLORIS_images img2724.jpg OpenLORIS_images img2258.jpg OpenLORIS_images img3118.jpg OpenLORIS_images img3200.jpg OpenLORIS_images img656.jpg OpenLORIS_images img4601.jpg OpenLORIS_images img3827.jpg OpenLORIS_images img3053.jpg OpenLORIS_images img3044.jpg OpenLORIS_images img1103.jpg OpenLORIS_images img4672.jpg OpenLORIS_images img4296.jpg OpenLORIS_images img2676.jpg OpenLORIS_images img2999.jpg OpenLORIS_images img1255.jpg OpenLORIS_images img1532.jpg OpenLORIS_images img2785.jpg OpenLORIS_images img3477.jpg OpenLORIS_images img2427.jpg OpenLORIS_images img2538.jpg OpenLORIS_images img4748.jpg OpenLORIS_images img3124.jpg OpenLORIS_images img5547.jpg OpenLORIS_images img1943.jpg OpenLORIS_images img2083.jpg OpenLORIS_images img3026.jpg OpenLORIS_images img3115.jpg OpenLORIS_images img3164.jpg OpenLORIS_images img588.jpg OpenLORIS_images img2940.jpg OpenLORIS_images img5355.jpg OpenLORIS_images img2678.jpg OpenLORIS_images img6063.jpg OpenLORIS_images img5084.jpg OpenLORIS_images img6000.jpg OpenLORIS_images img5457.jpg OpenLORIS_images img3128.jpg OpenLORIS_images img4215.jpg OpenLORIS_images img4749.jpg OpenLORIS_images img3139.jpg OpenLORIS_images img5833.jpg OpenLORIS_images img4641.jpg OpenLORIS_images img1502.jpg OpenLORIS_images img3437.jpg OpenLORIS_images img5545.jpg OpenLORIS_images img5541.jpg OpenLORIS_images img4486.jpg OpenLORIS_images img5104.jpg OpenLORIS_images img2015.jpg OpenLORIS_images img2912.jpg OpenLORIS_images img465.jpg OpenLORIS_images img112.jpg OpenLORIS_images img5081.jpg OpenLORIS_images img4255.jpg OpenLORIS_images img50.jpg 
OpenLORIS_images img2240.jpg OpenLORIS_images img5339.jpg OpenLORIS_images img4521.jpg OpenLORIS_images img4144.jpg OpenLORIS_images img846.jpg OpenLORIS_images img5117.jpg OpenLORIS_images img4613.jpg OpenLORIS_images img3824.jpg OpenLORIS_images img1082.jpg OpenLORIS_images img6113.jpg OpenLORIS_images img5712.jpg OpenLORIS_images img3784.jpg OpenLORIS_images img3177.jpg OpenLORIS_images img5132.jpg OpenLORIS_images img3304.jpg OpenLORIS_images img1090.jpg OpenLORIS_images img3575.jpg OpenLORIS_images img5670.jpg OpenLORIS_images img3626.jpg OpenLORIS_images img3660.jpg OpenLORIS_images img4778.jpg OpenLORIS_images img2638.jpg OpenLORIS_images img1552.jpg OpenLORIS_images img4281.jpg OpenLORIS_images img1537.jpg OpenLORIS_images img2746.jpg OpenLORIS_images img1625.jpg OpenLORIS_images img4410.jpg OpenLORIS_images img5250.jpg OpenLORIS_images img3678.jpg OpenLORIS_images img4203.jpg OpenLORIS_images img5610.jpg OpenLORIS_images img33.jpg OpenLORIS_images img5662.jpg OpenLORIS_images img4222.jpg OpenLORIS_images img5093.jpg OpenLORIS_images img2274.jpg OpenLORIS_images img2266.jpg OpenLORIS_images img1909.jpg OpenLORIS_images img2437.jpg OpenLORIS_images img2360.jpg OpenLORIS_images img4465.jpg OpenLORIS_images img5439.jpg OpenLORIS_images img1337.jpg OpenLORIS_images img4751.jpg OpenLORIS_images img428.jpg OpenLORIS_images img2601.jpg OpenLORIS_images img3663.jpg OpenLORIS_images img4367.jpg OpenLORIS_images img1022.jpg OpenLORIS_images img5908.jpg OpenLORIS_images img2740.jpg OpenLORIS_images img4763.jpg OpenLORIS_images img2636.jpg OpenLORIS_images img652.jpg OpenLORIS_images img4958.jpg OpenLORIS_images img3153.jpg OpenLORIS_images img1158.jpg OpenLORIS_images img6106.jpg OpenLORIS_images img1153.jpg OpenLORIS_images img175.jpg OpenLORIS_images img189.jpg OpenLORIS_images img5059.jpg OpenLORIS_images img5134.jpg OpenLORIS_images img3375.jpg OpenLORIS_images img511.jpg OpenLORIS_images img2686.jpg OpenLORIS_images img2344.jpg OpenLORIS_images img301.jpg 
OpenLORIS_images img2325.jpg OpenLORIS_images img3616.jpg OpenLORIS_images img3382.jpg OpenLORIS_images img3293.jpg OpenLORIS_images img3752.jpg OpenLORIS_images img5857.jpg OpenLORIS_images img5965.jpg OpenLORIS_images img5461.jpg OpenLORIS_images img937.jpg OpenLORIS_images img520.jpg OpenLORIS_images img278.jpg OpenLORIS_images img1581.jpg OpenLORIS_images img3961.jpg OpenLORIS_images img3219.jpg OpenLORIS_images img2774.jpg OpenLORIS_images img54.jpg OpenLORIS_images img4926.jpg OpenLORIS_images img1036.jpg OpenLORIS_images img5796.jpg OpenLORIS_images img2552.jpg OpenLORIS_images img4635.jpg OpenLORIS_images img3084.jpg OpenLORIS_images img4913.jpg OpenLORIS_images img4605.jpg OpenLORIS_images img4602.jpg OpenLORIS_images img630.jpg OpenLORIS_images img51.jpg OpenLORIS_images img1643.jpg OpenLORIS_images img1034.jpg OpenLORIS_images img4384.jpg OpenLORIS_images img4523.jpg OpenLORIS_images img4092.jpg OpenLORIS_images img3974.jpg OpenLORIS_images img5893.jpg OpenLORIS_images img728.jpg OpenLORIS_images img4526.jpg OpenLORIS_images img2444.jpg OpenLORIS_images img957.jpg OpenLORIS_images img4599.jpg OpenLORIS_images img5377.jpg OpenLORIS_images img1134.jpg OpenLORIS_images img441.jpg OpenLORIS_images img351.jpg OpenLORIS_images img6093.jpg OpenLORIS_images img2393.jpg OpenLORIS_images img1563.jpg OpenLORIS_images img4232.jpg OpenLORIS_images img3208.jpg OpenLORIS_images img3815.jpg OpenLORIS_images img5199.jpg OpenLORIS_images img2177.jpg OpenLORIS_images img2403.jpg OpenLORIS_images img5115.jpg OpenLORIS_images img3856.jpg OpenLORIS_images img5052.jpg OpenLORIS_images img3709.jpg OpenLORIS_images img4128.jpg OpenLORIS_images img4145.jpg OpenLORIS_images img686.jpg OpenLORIS_images img945.jpg OpenLORIS_images img3609.jpg OpenLORIS_images img858.jpg OpenLORIS_images img5921.jpg OpenLORIS_images img823.jpg OpenLORIS_images img3721.jpg OpenLORIS_images img5776.jpg OpenLORIS_images img4260.jpg OpenLORIS_images img2211.jpg OpenLORIS_images img1767.jpg 
OpenLORIS_images img3858.jpg OpenLORIS_images img5292.jpg OpenLORIS_images img1061.jpg OpenLORIS_images img2577.jpg OpenLORIS_images img4071.jpg OpenLORIS_images img2152.jpg OpenLORIS_images img3978.jpg OpenLORIS_images img5638.jpg OpenLORIS_images img1104.jpg OpenLORIS_images img1448.jpg OpenLORIS_images img3385.jpg OpenLORIS_images img2745.jpg OpenLORIS_images img966.jpg OpenLORIS_images img2076.jpg OpenLORIS_images img2140.jpg OpenLORIS_images img2895.jpg OpenLORIS_images img6002.jpg OpenLORIS_images img74.jpg OpenLORIS_images img2614.jpg OpenLORIS_images img5961.jpg OpenLORIS_images img5247.jpg OpenLORIS_images img6005.jpg OpenLORIS_images img5828.jpg OpenLORIS_images img3770.jpg OpenLORIS_images img3654.jpg OpenLORIS_images img5775.jpg OpenLORIS_images img3290.jpg OpenLORIS_images img1096.jpg OpenLORIS_images img5125.jpg OpenLORIS_images img4142.jpg OpenLORIS_images img783.jpg OpenLORIS_images img3379.jpg OpenLORIS_images img392.jpg OpenLORIS_images img2692.jpg OpenLORIS_images img4340.jpg OpenLORIS_images img5815.jpg OpenLORIS_images img5121.jpg OpenLORIS_images img289.jpg OpenLORIS_images img3728.jpg OpenLORIS_images img2262.jpg OpenLORIS_images img6103.jpg OpenLORIS_images img4648.jpg OpenLORIS_images img5017.jpg OpenLORIS_images img2317.jpg OpenLORIS_images img5641.jpg OpenLORIS_images img1027.jpg OpenLORIS_images img661.jpg OpenLORIS_images img5072.jpg OpenLORIS_images img1237.jpg OpenLORIS_images img1843.jpg OpenLORIS_images img5097.jpg OpenLORIS_images img2936.jpg OpenLORIS_images img2848.jpg OpenLORIS_images img779.jpg OpenLORIS_images img3652.jpg OpenLORIS_images img1366.jpg OpenLORIS_images img4075.jpg OpenLORIS_images img1972.jpg OpenLORIS_images img4358.jpg OpenLORIS_images img4390.jpg OpenLORIS_images img2659.jpg OpenLORIS_images img4725.jpg OpenLORIS_images img4022.jpg OpenLORIS_images img510.jpg OpenLORIS_images img3766.jpg OpenLORIS_images img855.jpg OpenLORIS_images img655.jpg OpenLORIS_images img4261.jpg OpenLORIS_images img3832.jpg 
OpenLORIS_images img997.jpg OpenLORIS_images img5324.jpg OpenLORIS_images img4722.jpg OpenLORIS_images img307.jpg OpenLORIS_images img3640.jpg OpenLORIS_images img536.jpg OpenLORIS_images img1040.jpg OpenLORIS_images img4589.jpg OpenLORIS_images img4594.jpg OpenLORIS_images img4035.jpg OpenLORIS_images img2363.jpg OpenLORIS_images img1058.jpg OpenLORIS_images img5048.jpg OpenLORIS_images img987.jpg OpenLORIS_images img4537.jpg OpenLORIS_images img3919.jpg OpenLORIS_images img1084.jpg OpenLORIS_images img5761.jpg OpenLORIS_images img259.jpg OpenLORIS_images img5758.jpg OpenLORIS_images img1861.jpg OpenLORIS_images img2562.jpg OpenLORIS_images img183.jpg OpenLORIS_images img3112.jpg OpenLORIS_images img3504.jpg OpenLORIS_images img3181.jpg OpenLORIS_images img5168.jpg OpenLORIS_images img1713.jpg OpenLORIS_images img4695.jpg OpenLORIS_images img3867.jpg OpenLORIS_images img4385.jpg OpenLORIS_images img1539.jpg OpenLORIS_images img5482.jpg OpenLORIS_images img3236.jpg OpenLORIS_images img2073.jpg OpenLORIS_images img3492.jpg OpenLORIS_images img1482.jpg OpenLORIS_images img2439.jpg OpenLORIS_images img1660.jpg OpenLORIS_images img5836.jpg OpenLORIS_images img2688.jpg OpenLORIS_images img1206.jpg OpenLORIS_images img519.jpg OpenLORIS_images img6137.jpg OpenLORIS_images img3226.jpg OpenLORIS_images img2561.jpg OpenLORIS_images img4328.jpg OpenLORIS_images img995.jpg OpenLORIS_images img4576.jpg OpenLORIS_images img4994.jpg OpenLORIS_images img3058.jpg OpenLORIS_images img1873.jpg OpenLORIS_images img2773.jpg OpenLORIS_images img914.jpg OpenLORIS_images img1108.jpg OpenLORIS_images img1209.jpg OpenLORIS_images img201.jpg OpenLORIS_images img5162.jpg OpenLORIS_images img3992.jpg OpenLORIS_images img1315.jpg OpenLORIS_images img2040.jpg OpenLORIS_images img3882.jpg OpenLORIS_images img1924.jpg OpenLORIS_images img1758.jpg OpenLORIS_images img2379.jpg OpenLORIS_images img4028.jpg OpenLORIS_images img5719.jpg OpenLORIS_images img3123.jpg OpenLORIS_images img5184.jpg 
OpenLORIS_images img1823.jpg OpenLORIS_images img5624.jpg OpenLORIS_images img5875.jpg OpenLORIS_images img2553.jpg OpenLORIS_images img3450.jpg OpenLORIS_images img110.jpg OpenLORIS_images img3844.jpg OpenLORIS_images img690.jpg OpenLORIS_images img531.jpg OpenLORIS_images img2842.jpg OpenLORIS_images img3976.jpg OpenLORIS_images img3436.jpg OpenLORIS_images img509.jpg OpenLORIS_images img5564.jpg OpenLORIS_images img5086.jpg OpenLORIS_images img1186.jpg OpenLORIS_images img261.jpg OpenLORIS_images img5501.jpg OpenLORIS_images img1156.jpg OpenLORIS_images img6135.jpg OpenLORIS_images img3175.jpg OpenLORIS_images img1656.jpg OpenLORIS_images img2943.jpg OpenLORIS_images img5404.jpg OpenLORIS_images img129.jpg OpenLORIS_images img2599.jpg OpenLORIS_images img3927.jpg OpenLORIS_images img398.jpg OpenLORIS_images img1006.jpg OpenLORIS_images img2154.jpg OpenLORIS_images img5472.jpg OpenLORIS_images img962.jpg OpenLORIS_images img2884.jpg OpenLORIS_images img3038.jpg OpenLORIS_images img4488.jpg OpenLORIS_images img3204.jpg OpenLORIS_images img1051.jpg OpenLORIS_images img1307.jpg OpenLORIS_images img2459.jpg OpenLORIS_images img1932.jpg OpenLORIS_images img3633.jpg OpenLORIS_images img5868.jpg OpenLORIS_images img5108.jpg OpenLORIS_images img4862.jpg OpenLORIS_images img2115.jpg OpenLORIS_images img4106.jpg OpenLORIS_images img2913.jpg OpenLORIS_images img3107.jpg OpenLORIS_images img1732.jpg OpenLORIS_images img5593.jpg OpenLORIS_images img4114.jpg OpenLORIS_images img2323.jpg OpenLORIS_images img1579.jpg OpenLORIS_images img2000.jpg OpenLORIS_images img5819.jpg OpenLORIS_images img1365.jpg OpenLORIS_images img2309.jpg OpenLORIS_images img4985.jpg OpenLORIS_images img6077.jpg OpenLORIS_images img4596.jpg OpenLORIS_images img6116.jpg OpenLORIS_images img5931.jpg OpenLORIS_images img495.jpg OpenLORIS_images img803.jpg OpenLORIS_images img3694.jpg OpenLORIS_images img4900.jpg OpenLORIS_images img3440.jpg OpenLORIS_images img1524.jpg OpenLORIS_images img3765.jpg 
OpenLORIS_images img4591.jpg OpenLORIS_images img4374.jpg OpenLORIS_images img425.jpg OpenLORIS_images img4519.jpg OpenLORIS_images img5249.jpg OpenLORIS_images img2723.jpg OpenLORIS_images img4424.jpg OpenLORIS_images img3723.jpg OpenLORIS_images img3063.jpg OpenLORIS_images img2571.jpg OpenLORIS_images img5411.jpg OpenLORIS_images img1123.jpg OpenLORIS_images img901.jpg OpenLORIS_images img415.jpg OpenLORIS_images img2710.jpg OpenLORIS_images img3191.jpg OpenLORIS_images img3119.jpg OpenLORIS_images img1266.jpg OpenLORIS_images img2031.jpg OpenLORIS_images img4854.jpg OpenLORIS_images img452.jpg OpenLORIS_images img1303.jpg OpenLORIS_images img3114.jpg OpenLORIS_images img4446.jpg OpenLORIS_images img1857.jpg OpenLORIS_images img3915.jpg OpenLORIS_images img687.jpg OpenLORIS_images img1205.jpg OpenLORIS_images img1215.jpg OpenLORIS_images img4627.jpg OpenLORIS_images img5677.jpg OpenLORIS_images img2093.jpg OpenLORIS_images img2953.jpg OpenLORIS_images img5152.jpg OpenLORIS_images img1952.jpg OpenLORIS_images img305.jpg OpenLORIS_images img1837.jpg OpenLORIS_images img1688.jpg OpenLORIS_images img5094.jpg OpenLORIS_images img2099.jpg OpenLORIS_images img2729.jpg OpenLORIS_images img3548.jpg OpenLORIS_images img5269.jpg OpenLORIS_images img2326.jpg OpenLORIS_images img1617.jpg OpenLORIS_images img4041.jpg OpenLORIS_images img5939.jpg OpenLORIS_images img2603.jpg OpenLORIS_images img4005.jpg OpenLORIS_images img4113.jpg OpenLORIS_images img2576.jpg OpenLORIS_images img2568.jpg OpenLORIS_images img1558.jpg OpenLORIS_images img2341.jpg OpenLORIS_images img4010.jpg OpenLORIS_images img1393.jpg OpenLORIS_images img1060.jpg OpenLORIS_images img3706.jpg OpenLORIS_images img272.jpg OpenLORIS_images img1213.jpg OpenLORIS_images img2566.jpg OpenLORIS_images img4183.jpg OpenLORIS_images img188.jpg OpenLORIS_images img5166.jpg OpenLORIS_images img388.jpg OpenLORIS_images img6045.jpg OpenLORIS_images img1911.jpg OpenLORIS_images img4814.jpg OpenLORIS_images img4326.jpg 
OpenLORIS_images img423.jpg OpenLORIS_images img5073.jpg OpenLORIS_images img2050.jpg OpenLORIS_images img5133.jpg OpenLORIS_images img3367.jpg OpenLORIS_images img3870.jpg OpenLORIS_images img4728.jpg OpenLORIS_images img5935.jpg OpenLORIS_images img5749.jpg OpenLORIS_images img1771.jpg OpenLORIS_images img932.jpg OpenLORIS_images img373.jpg OpenLORIS_images img1626.jpg OpenLORIS_images img4875.jpg OpenLORIS_images img2220.jpg OpenLORIS_images img4033.jpg OpenLORIS_images img3074.jpg OpenLORIS_images img1162.jpg OpenLORIS_images img667.jpg OpenLORIS_images img3055.jpg OpenLORIS_images img496.jpg OpenLORIS_images img5004.jpg OpenLORIS_images img1277.jpg OpenLORIS_images img4857.jpg OpenLORIS_images img3907.jpg OpenLORIS_images img1649.jpg OpenLORIS_images img3540.jpg OpenLORIS_images img5944.jpg OpenLORIS_images img1828.jpg OpenLORIS_images img5165.jpg OpenLORIS_images img2474.jpg OpenLORIS_images img1647.jpg OpenLORIS_images img1142.jpg OpenLORIS_images img4697.jpg OpenLORIS_images img5756.jpg OpenLORIS_images img3405.jpg OpenLORIS_images img4324.jpg OpenLORIS_images img1836.jpg OpenLORIS_images img4244.jpg OpenLORIS_images img2271.jpg OpenLORIS_images img2158.jpg OpenLORIS_images img4604.jpg OpenLORIS_images img955.jpg OpenLORIS_images img2709.jpg OpenLORIS_images img1648.jpg OpenLORIS_images img5273.jpg OpenLORIS_images img1510.jpg OpenLORIS_images img466.jpg OpenLORIS_images img5587.jpg OpenLORIS_images img3185.jpg OpenLORIS_images img1651.jpg OpenLORIS_images img5272.jpg OpenLORIS_images img1173.jpg OpenLORIS_images img125.jpg OpenLORIS_images img2261.jpg OpenLORIS_images img4380.jpg OpenLORIS_images img848.jpg OpenLORIS_images img2640.jpg OpenLORIS_images img492.jpg OpenLORIS_images img2871.jpg OpenLORIS_images img2457.jpg OpenLORIS_images img2440.jpg OpenLORIS_images img4843.jpg OpenLORIS_images img2675.jpg OpenLORIS_images img1609.jpg OpenLORIS_images img2157.jpg OpenLORIS_images img3173.jpg OpenLORIS_images img1983.jpg OpenLORIS_images img5903.jpg 
OpenLORIS_images img2532.jpg OpenLORIS_images img3939.jpg OpenLORIS_images img3707.jpg OpenLORIS_images img2935.jpg OpenLORIS_images img13.jpg OpenLORIS_images img193.jpg ================================================ FILE: plane_segmentation/pretrained_model/model.data-00000-of-00001 ================================================ [File too large to display: 48.6 MB] ================================================ FILE: plane_segmentation/pretrained_model/model.meta ================================================ [File too large to display: 11.2 MB] ================================================ FILE: plane_segmentation/requirements.txt ================================================ # This file may be used to create an environment using: # $ conda create --name --file # platform: linux-64 _libgcc_mutex=0.1=conda_forge _openmp_mutex=4.5=1_gnu addict=2.3.0=py36h9f0ad1d_2 async_generator=1.10=py_0 attrs=20.3.0=pyhd3eb1b0_0 backcall=0.2.0=py_0 blas=2.14=openblas bleach=1.5.0=py36_0 bzip2=1.0.8=h7f98852_4 c-ares=1.17.1=h36c2ea0_0 ca-certificates=2020.10.14=0 cairo=1.16.0=h9f066cc_1006 certifi=2020.6.20=py36_0 cycler=0.10.0=py_2 cython=0.29.21=py36he6710b0_0 dbus=1.13.6=hfdff14a_1 decorator=4.4.2=py_0 defusedxml=0.6.0=py_0 entrypoints=0.3=py36_0 expat=2.2.9=he1b5a44_2 ffmpeg=4.3.1=h3215721_1 fontconfig=2.13.1=h7e3eb15_1002 freetype=2.10.4=h7ca028e_0 gettext=0.19.8.1=hf34092f_1004 glib=2.66.3=h58526e2_0 gmp=6.2.1=h58526e2_0 gnutls=3.6.13=h85f3911_1 graphite2=1.3.13=h58526e2_1001 gst-plugins-base=1.14.5=h0935bb2_2 gstreamer=1.14.5=h36ae1b5_2 harfbuzz=2.7.2=ha5b49bf_1 hdf5=1.10.6=nompi_h3c11f04_101 html5lib=0.9999999=py36_0 icu=67.1=he1b5a44_0 importlib-metadata=3.3.0=py36h5fab9bb_2 intel-openmp=2020.2=254 ipykernel=5.3.4=py36h5ca1d4c_0 ipython=7.16.1=py36h5ca1d4c_0 ipython_genutils=0.2.0=pyhd3eb1b0_1 ipywidgets=7.6.0=pyhd3eb1b0_1 jasper=1.900.1=h07fcdf6_1006 jedi=0.18.0=py36h06a4308_0 jinja2=2.11.2=py_0 joblib=1.0.0=pyhd8ed1ab_0 jpeg=9d=h36c2ea0_0 
jsonschema=3.0.2=py36_0 jupyter_client=6.1.7=py_0 jupyter_core=4.7.0=py36h06a4308_0 jupyterlab_pygments=0.1.2=py_0 kiwisolver=1.3.1=py36h51d7077_0 krb5=1.17.2=h926e7f8_0 lame=3.100=h14c3975_1001 lcms2=2.11=hcbb858e_1 ld_impl_linux-64=2.33.1=h53a641e_7 libblas=3.8.0=14_openblas libcblas=3.8.0=14_openblas libclang=10.0.1=default_hde54327_1 libcurl=7.71.1=hcdd3856_8 libedit=3.1.20191231=h14c3975_1 libev=4.33=h516909a_1 libevent=2.1.10=hcdb4288_3 libffi=3.2.1=he1b5a44_1007 libgcc-ng=9.3.0=h5dbcf3e_17 libgfortran-ng=7.3.0=hdf63c60_0 libglib=2.66.3=hbe7bbb4_0 libgomp=9.3.0=h5dbcf3e_17 libiconv=1.16=h516909a_0 liblapack=3.8.0=14_openblas liblapacke=3.8.0=14_openblas libllvm10=10.0.1=he513fc3_3 libnghttp2=1.41.0=h8cfc5f6_2 libopenblas=0.3.7=h5ec1e0e_6 libopencv=4.5.0=py36_4 libpng=1.6.37=h21135ba_2 libpq=12.3=h255efa7_3 libprotobuf=3.13.0.1=h8b12597_0 libsodium=1.0.18=h7b6447c_0 libssh2=1.9.0=hab1572f_5 libstdcxx-ng=9.3.0=h2ae2ef3_17 libtiff=4.1.0=h4f3a223_6 libuuid=2.32.1=h7f98852_1000 libwebp-base=1.1.0=h36c2ea0_3 libxcb=1.13=h14c3975_1002 libxkbcommon=0.10.0=he1b5a44_0 libxml2=2.9.10=h68273f3_2 llvmlite=0.34.0=py36h269e1b5_4 lz4-c=1.9.2=he1b5a44_3 markdown=3.3.3=pyh9f0ad1d_0 markupsafe=1.1.1=py36h7b6447c_0 matplotlib=3.3.3=py36h5fab9bb_0 matplotlib-base=3.3.3=py36he12231b_0 mistune=0.8.4=py36h7b6447c_0 mkl=2019.4=243 mkl-service=2.3.0=py36he8ac12f_0 mkl_fft=1.1.0=py36hc1659b7_1 mkl_random=1.1.0=py36hb3f55d8_0 mysql-common=8.0.21=2 mysql-libs=8.0.21=hf3661c5_2 nbclient=0.5.1=py_0 nbconvert=6.0.7=py36_0 nbformat=5.0.8=py_0 ncurses=6.2=he6710b0_1 nest-asyncio=1.4.3=pyhd3eb1b0_0 nettle=3.6=he412f7d_0 notebook=6.0.3=py36_0 nspr=4.29=he1b5a44_1 nss=3.59=h2c00c37_0 numba=0.51.2=py36h0573a6f_1 numpy=1.17.0=py36h95a1406_0 numpy-base=1.18.5=py36h2f8d375_0 olefile=0.46=pyh9f0ad1d_1 open3d=0.11.2=py36_0 opencv=4.5.0=py36_4 openh264=2.1.1=h8b12597_0 openssl=1.1.1h=h7b6447c_0 pandas=1.1.3=py36he6710b0_0 pandoc=2.11=hb0f4dca_0 pandocfilters=1.4.3=py36h06a4308_1 parso=0.7.0=py_0 
pcre=8.44=he1b5a44_0 pexpect=4.8.0=pyhd3eb1b0_3 pickleshare=0.7.5=pyhd3eb1b0_1003 pillow=8.0.1=py36h10ecd5c_0 pip=20.3.3=py36h06a4308_0 pixman=0.40.0=h36c2ea0_0 plyfile=0.7.2=pyh9f0ad1d_0 prometheus_client=0.9.0=pyhd3eb1b0_0 prompt-toolkit=3.0.8=py_0 protobuf=3.13.0.1=py36h201607c_1 pthread-stubs=0.4=h36c2ea0_1001 ptyprocess=0.6.0=pyhd3eb1b0_2 py-opencv=4.5.0=py36he448a4c_4 pydensecrf=1.0rc2=pypi_0 pygments=2.7.3=pyhd3eb1b0_0 pyparsing=2.4.7=pyh9f0ad1d_0 pyqt=5.12.3=py36h5fab9bb_6 pyqt-impl=5.12.3=py36h7ec31b9_6 pyqt5-sip=4.19.18=py36hc4f0c31_6 pyqtchart=5.12=py36h7ec31b9_6 pyqtwebengine=5.12.1=py36h7ec31b9_6 pyrsistent=0.17.3=py36h7b6447c_0 python=3.6.11=h425cb1d_0_cpython python-dateutil=2.8.1=py_0 python_abi=3.6=1_cp36m pytz=2020.1=py_0 pyyaml=5.3.1=py36he6145b8_1 pyzmq=20.0.0=py36h2531618_1 qt=5.12.9=h1f2b2cb_0 readline=8.0=h7b6447c_0 scikit-learn=0.23.2=py36h0573a6f_0 scipy=0.19.1=py36h9976243_3 send2trash=1.5.0=pyhd3eb1b0_1 setuptools=51.0.0=py36h06a4308_2 six=1.15.0=pyh9f0ad1d_0 sqlite=3.33.0=h62c20be_0 tensorflow=1.4.1=0 tensorflow-base=1.4.1=py36hd00c003_2 tensorflow-gpu=1.4.1=0 tensorflow-gpu-base=1.4.1=py36h01caf0a_0 tensorflow-tensorboard=1.5.1=py36hf484d3e_1 terminado=0.9.1=py36_0 testpath=0.4.4=py_0 threadpoolctl=2.1.0=pyh5ca1d4c_0 tk=8.6.10=hbc83047_0 tornado=6.1=py36h1d69622_0 tqdm=4.55.0=pyhd8ed1ab_0 traitlets=4.3.3=py36_0 typing_extensions=3.7.4.3=py_0 wcwidth=0.2.5=py_0 webencodings=0.5.1=py_1 werkzeug=1.0.1=pyh9f0ad1d_0 wheel=0.36.2=pyhd3eb1b0_0 widgetsnbextension=3.5.1=py36_0 x264=1!152.20180806=h14c3975_0 xorg-kbproto=1.0.7=h14c3975_1002 xorg-libice=1.0.10=h516909a_0 xorg-libsm=1.2.3=h84519dc_1000 xorg-libx11=1.6.12=h516909a_0 xorg-libxau=1.0.9=h14c3975_0 xorg-libxdmcp=1.1.3=h516909a_0 xorg-libxext=1.3.4=h516909a_0 xorg-libxrender=0.9.10=h516909a_1002 xorg-renderproto=0.11.1=h14c3975_1002 xorg-xextproto=7.3.0=h14c3975_1002 xorg-xproto=7.0.31=h7f98852_1007 xz=5.2.5=h7b6447c_0 yaml=0.2.5=h7b6447c_0 zeromq=4.3.3=he6710b0_3 zipp=3.4.0=py_0 
zlib=1.2.11=h7b6447c_3
zstd=1.4.5=h6597ccf_2
================================================ FILE: plane_segmentation/train.py ================================================
from __future__ import division
import tensorflow as tf
import pprint
import random
import numpy as np
from RecoverPlane_perpendicular import RecoverPlane
import os

flags = tf.app.flags
flags.DEFINE_string("dataset_dir", "", "Dataset directory")
flags.DEFINE_string("log_dir2", "", "log directory")
flags.DEFINE_string("init_checkpoint_file", '', "Specific checkpoint file to initialize from")
flags.DEFINE_float("learning_rate", 0.0001, "Learning rate for adam")
flags.DEFINE_float("beta1", 0.99, "Momentum term of adam")
flags.DEFINE_float("beta2", 0.9999, "Momentum term of adam")
flags.DEFINE_float("plane_weight", 0.1, "Weight for plane regularization")
flags.DEFINE_integer("batch_size", 8, "The size of a sample batch")
flags.DEFINE_integer("img_height", 192, "Image height")
flags.DEFINE_integer("img_width", 320, "Image width")
flags.DEFINE_integer("max_steps", 800000, "Maximum number of training iterations")
flags.DEFINE_integer("summary_freq", 1000, "Logging every log_freq iterations")
flags.DEFINE_integer("num_plane", 3, "The estimated number of planes in the scenario")
flags.DEFINE_integer("save_latest_freq", 5000, \
    "Save the latest model every save_latest_freq iterations (overwrites the previous latest model)")
flags.DEFINE_boolean("continue_train", False, "Continue training from previous checkpoint")
flags.DEFINE_boolean("debug", True, "debug mode?")
flags.DEFINE_string("gpu", "1", "GPU ID")

check_file = flags.FLAGS.log_dir2 + '/' + flags.FLAGS.dataset_dir.split("/")[-1] +\
             '_lr=' + str(flags.FLAGS.learning_rate) +\
             '_b1=' + str(flags.FLAGS.beta1) + '_b2=' + str(flags.FLAGS.beta2) +\
             '_weight=' + str(flags.FLAGS.plane_weight) +\
             '_n_plane=' + str(flags.FLAGS.num_plane)
flags.DEFINE_string("checkpoint_dir", check_file, "Directory name to save the checkpoints")

FLAGS = flags.FLAGS  # this is used to transfer all the params needed during app.run


def main(_):
    seed = 8964
    tf.set_random_seed(seed)
    np.random.seed(seed)
    random.seed(seed)
    pp = pprint.PrettyPrinter()
    # change this to pp.pprint(tf.app.flags.FLAGS.flag_values_dict()) for tensorflow 1.5 or higher
    pp.pprint(flags.FLAGS.__flags)
    if not os.path.exists(FLAGS.checkpoint_dir):
        os.mkdir(FLAGS.checkpoint_dir)
    planeRecover = RecoverPlane()
    planeRecover.train(FLAGS)


if __name__ == '__main__':
    tf.app.run()
================================================ FILE: plane_segmentation/utils.py ================================================
from __future__ import division
import matplotlib.pyplot as plt
import numpy as np
import tensorflow as tf
import random
import colorsys

# ****************************colorful mask part*************************************
# Usage: apply a different color to each plane.
# Plane determination is currently based on plane_threshold = 0.5,
# and areas without an additional color belong to the non-plane region.
#
def random_colors(N, bright=True):
    """
    Generate random colors.
    To get visually distinct colors, generate them in HSV space then
    convert to RGB.
    """
    brightness = 1.0 if bright else 0.7
    hsv = [(float(i) / N, 1, brightness) for i in range(N)]
    colors = list(map(lambda c: colorsys.hsv_to_rgb(*c), hsv))
    random.shuffle(colors)
    return colors

def apply_mask(image, mask, max_mask, color, alpha=0.4):
    """Apply the given mask to the image.
    alpha here means the alpha channel value
    """
    b, h, w, _ = image.get_shape().as_list()
    res_mask = tf.concat([mask, mask, mask], axis=-1)  # be consistent with color channel number
    ref_max_mask = tf.concat([max_mask, max_mask, max_mask], axis=-1)
    b_equal = tf.equal(res_mask, ref_max_mask)
    alpha_img = tf.where(b_equal,
                         tf.cast(tf.scalar_mul(1 - alpha, tf.cast(image, tf.float32)), tf.uint8),
                         image)
    alpha_color = alpha * np.array(color)
    ref_color = np.tile(alpha_color.astype(int), (b, h, w, 1))
    ref_color = tf.constant(ref_color, dtype=tf.uint8)
    res_image = tf.where(b_equal, ref_color + alpha_img, alpha_img)
    return res_image

def color_mask(image, pred_mask_s, colors, alpha=0.4):
    '''Iterate to assign each color to the corresponding mask.

    Based on experiment, the first plane will be red and the second one
    green-blue when num=2, and dark blue, red, green for num=3.
    '''
    N = len(colors)
    masked_image = image
    for i in range(N):
        color = colors[i]
        mask = tf.expand_dims(pred_mask_s[:, :, :, i], axis=-1)
        max_mask = tf.reduce_max(pred_mask_s, axis=-1, keep_dims=True)
        masked_image = apply_mask(masked_image, mask, max_mask, color, alpha)
    return masked_image

#*********************************************************************************************

def meshgrid(batch, height, width, is_homogeneous=True):
    """Construct a 2D meshgrid.

    Args:
      batch: batch size
      height: height of the grid
      width: width of the grid
      is_homogeneous: whether to return in homogeneous coordinates
    Returns:
      x,y grid coordinates [batch, 2 (3 if homogeneous), height, width]
    """
    x_t = tf.matmul(tf.ones(shape=tf.stack([height, 1])),
                    tf.transpose(tf.expand_dims(tf.linspace(-1.0, 1.0, width), 1), [1, 0]))  # tf.linspace(start, stop, num)
    y_t = tf.matmul(tf.expand_dims(tf.linspace(-1.0, 1.0, height), 1),
                    tf.ones(shape=tf.stack([1, width])))
    x_t = (x_t + 1.0) * 0.5 * tf.cast(width - 1, tf.float32)
    y_t = (y_t + 1.0) * 0.5 * tf.cast(height - 1, tf.float32)
    if is_homogeneous:
        ones = tf.ones_like(x_t)
        coords = tf.stack([x_t, y_t, ones], axis=0)
    else:
        coords = tf.stack([x_t, y_t], axis=0)
    coords = tf.tile(tf.expand_dims(coords, 0), [batch, 1, 1, 1])
    return coords

def compute_depth(img, pred_param, num_plane, intrinsics):
    batch, height, width, _ = img.get_shape().as_list()
    # Construct pixel grid coordinates
    pixel_coords = meshgrid(batch, height, width)  # 3*128*416
    cam_coords = tf.reshape(pixel_coords, [batch, 3, -1])
    unscaled_ray_Q = tf.matmul(tf.matrix_inverse(intrinsics), cam_coords)
    for k in range(num_plane):
        n_div_d = tf.expand_dims(pred_param[:, k, :], axis=1)
        scale = tf.matmul(n_div_d, tf.matmul(tf.matrix_inverse(intrinsics), cam_coords))
        plane_based_Q = scale * 1. / (unscaled_ray_Q + 1e-10)
        plane_based_Q = tf.reshape(plane_based_Q, [batch, 3, height, width])
        plane_based_Q = tf.transpose(plane_based_Q, perm=[0, 2, 3, 1])
        if k == 0:
            plane_inv_depth_stack = plane_based_Q[:, :, :, -1:]
        else:
            plane_inv_depth_stack = tf.concat([plane_inv_depth_stack, plane_based_Q[:, :, :, -1:]], axis=-1)
    return plane_inv_depth_stack

def compute_unscaled_ray(img, intrinsics):
    batch, height, width, _ = img.get_shape().as_list()
    # Construct pixel grid coordinates
    pixel_coords = meshgrid(batch, height, width)  # 3*128*416
    cam_coords = tf.reshape(pixel_coords, [batch, 3, -1])
    unscaled_ray_Q = tf.matmul(tf.matrix_inverse(intrinsics), cam_coords)
    return unscaled_ray_Q
def compute_plane_equation(img, pred_param, ray, depth):
    batch, height, width, _ = img.get_shape().as_list()
    n_time_ray = tf.matmul(pred_param, ray)
    n_time_ray = tf.reshape(n_time_ray, [batch, 1, height, width])
    n_time_ray = tf.transpose(n_time_ray, perm=[0, 2, 3, 1])
    left_eq = n_time_ray * depth
    return left_eq


def val2uint8(mat, maxVal):
    maxVal_mat = tf.constant(maxVal, tf.float32, mat.get_shape())
    mat_vis = tf.where(tf.greater(mat, maxVal_mat), maxVal_mat, mat)
    return tf.cast(mat_vis * 255. / maxVal, tf.uint8)


================================================
FILE: rpvio.patch
================================================
From 3d295e1ff4c292ce203e636f363520f9b633908c Mon Sep 17 00:00:00 2001
From: Karnik Ram
Date: Thu, 17 Sep 2020 12:22:47 +0530
Subject: [PATCH 01/42] Add code for homography-based initialization

---
 vins_estimator/src/estimator.cpp          | 14 ++--
 vins_estimator/src/initial/solve_5pts.cpp | 80 ++++++++++++++++++-----
 vins_estimator/src/initial/solve_5pts.h   |  6 +-
 3 files changed, 73 insertions(+), 27 deletions(-)

diff --git a/vins_estimator/src/estimator.cpp b/vins_estimator/src/estimator.cpp
index cb7a4af..46c93ef 100644
--- a/vins_estimator/src/estimator.cpp
+++ b/vins_estimator/src/estimator.cpp
@@ -491,14 +491,16 @@ bool Estimator::relativeHPose(Matrix3d &relative_R, Vector3d &relative_T, int &l
            }
            average_parallax = 1.0 * sum_parallax / int(corres.size());
-            MatrixXd::Identity(4,4) TrIC;
-            TrIC.block(0, 0, 3, 3) = RIC;
-            TrIC.col(3) = TIC;
+            Matrix4d TrIC = Matrix4d::Identity();
+            TrIC.block(0, 0, 3, 3) = RIC[0];
+            TrIC.block(0,3,3,0) = TIC[0];

-            //compute preintegrated rotation
-
+            //compute corresponding preintegrated rotation
+            Matrix3d R_imu = Matrix3d::Identity();
+            for(int k = WINDOW_SIZE - 1; k > i; k--)
+                R_imu = R_imu * (pre_integrations[k]->delta_q).toRotationMatrix();

-            if(average_parallax * 660 > 30 && m_estimator.solveRelativeHRT(corres, relative_R, relative_T, R_imu, TrIC))
+            if(average_parallax * 660 > 30 &&
m_estimator.solveRelativeHRT(corres, R_imu, TrIC, relative_R, relative_T)) { l = i; ROS_DEBUG("average_parallax %f choose l %d and newest frame to triangulate the whole structure", average_parallax * 460, l); diff --git a/vins_estimator/src/initial/solve_5pts.cpp b/vins_estimator/src/initial/solve_5pts.cpp index a1c170d..61440da 100644 --- a/vins_estimator/src/initial/solve_5pts.cpp +++ b/vins_estimator/src/initial/solve_5pts.cpp @@ -1,4 +1,5 @@ #include "solve_5pts.h" +#include namespace cv { @@ -241,27 +242,70 @@ bool MotionEstimator::solveRelativeHRT(const vector> &c cv::Mat H = cv::findHomography(ll, rr, cv::RANSAC, 0.3/665); cv::Mat K = (cv::Mat_(3, 3) << 1, 0, 0, 0, 1, 0, 0, 0, 1); - decomposeH(H, K, R_imu, TIC); - - Eigen::Matrix3d R; - Eigen::Vector3d T; - for (int i = 0; i < 3; i++) - { - T(i) = trans.at(i, 0); - for (int j = 0; j < 3; j++) - R(i, j) = rot.at(i, j); - } - - Rotation = R.transpose(); - Translation = -R.transpose() * T; - if(inlier_cnt > 12) - return true; - else - return false; + Eigen::Matrix4d Tr = decomposeH(H, K, R_imu, TrIC); + Rotation = Tr.block(0,0,3,3); + Translation = Tr.block(0,3,3,0); + return true; } return false; } -void MotionEstimator::decomposeH(const cv::Mat &H, const cv::Mat &K, const Matrix3d &R_imu, const Matrix4d &TrIC) +Matrix4d MotionEstimator::decomposeH(const cv::Mat &H, const cv::Mat &K, const Matrix3d &R_imu, const Matrix4d &TrIC) { + vector cv_Rs, cv_ts, cv_ns; + int n_sols = cv::decomposeHomographyMat(H, K, cv_Rs, cv_ts, cv_ns); + + vector positive_depth_transforms; + if(n_sols > 1) + { + for(int i = 0; i < n_sols; i++) + { + Matrix4d Tr = Matrix4d::Identity(); + Matrix3d R; + cv::cv2eigen(cv_Rs[i], R); + Tr.block(0,0,3,3) = R; + + Vector3d t; + cv::cv2eigen(cv_ts[i], t); + Tr.block(0,3,3,0) = t; + + //Tr = TrIC * Tr * TrIC.inverse(); + Tr = Tr.inverse(); + + Vector3d e3(0, 0, 1); + Vector3d n; + cv::cv2eigen(cv_ns[i], n); + if(n.dot(e3) > 0) + positive_depth_transforms.push_back(Tr); + } + + vector rot_diff; + 
for(size_t i = 0; i < positive_depth_transforms.size(); i++) + { + Eigen::Matrix4d Tr = TrIC * positive_depth_transforms[i] * TrIC.inverse(); + Eigen::Matrix3d R = Tr.block(0,0,3,3); + double f = (R.transpose()*R_imu - MatrixXd::Identity(3,3)).norm(); + rot_diff.push_back(f); + } + + int min_index = std::min_element(rot_diff.begin(), rot_diff.end()) - rot_diff.begin(); + Matrix4d Tr = positive_depth_transforms[min_index]; + return Tr; + } + + else + { + Matrix4d Tr = Matrix4d::Identity(); + Matrix3d R; + cv::cv2eigen(cv_Rs[0], R); + Tr.block(0,0,3,3) = R; + + Vector3d t; + cv::cv2eigen(cv_ts[0], t); + Tr.block(0,3,3,0) = t; + + //Tr = TrIC * Tr * TrIC.inverse(); + Tr = Tr.inverse(); + return Tr; + } } diff --git a/vins_estimator/src/initial/solve_5pts.h b/vins_estimator/src/initial/solve_5pts.h index 3e95c9e..9fb118e 100644 --- a/vins_estimator/src/initial/solve_5pts.h +++ b/vins_estimator/src/initial/solve_5pts.h @@ -4,8 +4,8 @@ using namespace std; #include -//#include #include +#include using namespace Eigen; #include @@ -15,8 +15,8 @@ class MotionEstimator public: bool solveRelativeRT(const vector> &corres, Matrix3d &R, Vector3d &T); - bool solveRelativeHRT(const vector> &corres, const Matrix3d &R_imu, const Matrix4d &TrIC, const Matrix3d &R, Vector3d &T); - void decomposeH(const cv::Mat &H, const cv::Mat &K, const Matrix3d &R_imu, const Matrix4d &TrIC); + bool solveRelativeHRT(const vector> &corres, const Matrix3d &R_imu, const Matrix4d &TrIC, Matrix3d &R, Vector3d &T); + Matrix4d decomposeH(const cv::Mat &H, const cv::Mat &K, const Matrix3d &R_imu, const Matrix4d &TrIC); private: double testTriangulation(const vector &l, -- 2.17.1 From 56f9ef55ba6caf95001bba4e8cd34524fd79a089 Mon Sep 17 00:00:00 2001 From: Karnik Ram Date: Thu, 17 Sep 2020 14:19:09 +0530 Subject: [PATCH 02/42] Fix runtime errors --- vins_estimator/src/estimator.cpp | 4 ++-- vins_estimator/src/initial/solve_5pts.cpp | 10 +++++----- 2 files changed, 7 insertions(+), 7 deletions(-) diff --git 
a/vins_estimator/src/estimator.cpp b/vins_estimator/src/estimator.cpp index 46c93ef..733a968 100644 --- a/vins_estimator/src/estimator.cpp +++ b/vins_estimator/src/estimator.cpp @@ -492,8 +492,8 @@ bool Estimator::relativeHPose(Matrix3d &relative_R, Vector3d &relative_T, int &l average_parallax = 1.0 * sum_parallax / int(corres.size()); Matrix4d TrIC = Matrix4d::Identity(); - TrIC.block(0, 0, 3, 3) = RIC[0]; - TrIC.block(0,3,3,0) = TIC[0]; + TrIC.block(0,0,3,3) = ric[0]; + TrIC.block(0,3,3,1) = tic[0]; //compute corresponding preintegrated rotation Matrix3d R_imu = Matrix3d::Identity(); diff --git a/vins_estimator/src/initial/solve_5pts.cpp b/vins_estimator/src/initial/solve_5pts.cpp index 61440da..6be2bf1 100644 --- a/vins_estimator/src/initial/solve_5pts.cpp +++ b/vins_estimator/src/initial/solve_5pts.cpp @@ -244,7 +244,7 @@ bool MotionEstimator::solveRelativeHRT(const vector> &c Eigen::Matrix4d Tr = decomposeH(H, K, R_imu, TrIC); Rotation = Tr.block(0,0,3,3); - Translation = Tr.block(0,3,3,0); + Translation = Tr.block(0,3,3,1); return true; } return false; @@ -267,10 +267,10 @@ Matrix4d MotionEstimator::decomposeH(const cv::Mat &H, const cv::Mat &K, const M Vector3d t; cv::cv2eigen(cv_ts[i], t); - Tr.block(0,3,3,0) = t; + Tr.block(0,3,3,1) = t; //Tr = TrIC * Tr * TrIC.inverse(); - Tr = Tr.inverse(); + Tr = Tr.inverse().eval(); Vector3d e3(0, 0, 1); Vector3d n; @@ -302,10 +302,10 @@ Matrix4d MotionEstimator::decomposeH(const cv::Mat &H, const cv::Mat &K, const M Vector3d t; cv::cv2eigen(cv_ts[0], t); - Tr.block(0,3,3,0) = t; + Tr.block(0,3,3,1) = t; //Tr = TrIC * Tr * TrIC.inverse(); - Tr = Tr.inverse(); + Tr = Tr.inverse().eval(); return Tr; } } -- 2.17.1 From f9db6ec5f6d62bc608e5acd96c45414f384eadc4 Mon Sep 17 00:00:00 2001 From: Karnik Ram Date: Sat, 19 Sep 2020 08:21:32 +0530 Subject: [PATCH 03/42] Reject with H --- feature_tracker/src/feature_tracker.cpp | 3 ++- vins_estimator/src/initial/solve_5pts.cpp | 3 +++ 2 files changed, 5 insertions(+), 1 deletion(-) 
diff --git a/feature_tracker/src/feature_tracker.cpp b/feature_tracker/src/feature_tracker.cpp index d33709c..41322b8 100644 --- a/feature_tracker/src/feature_tracker.cpp +++ b/feature_tracker/src/feature_tracker.cpp @@ -188,7 +188,8 @@ void FeatureTracker::rejectWithF() } vector status; - cv::findFundamentalMat(un_cur_pts, un_forw_pts, cv::FM_RANSAC, F_THRESHOLD, 0.99, status); + //cv::findFundamentalMat(un_cur_pts, un_forw_pts, cv::FM_RANSAC, F_THRESHOLD, 0.99, status); + cv::findHomography(un_cur_pts, un_forw_pts, cv::RANSAC, 3, status); int size_a = cur_pts.size(); reduceVector(prev_pts, status); reduceVector(cur_pts, status); diff --git a/vins_estimator/src/initial/solve_5pts.cpp b/vins_estimator/src/initial/solve_5pts.cpp index 6be2bf1..50999c3 100644 --- a/vins_estimator/src/initial/solve_5pts.cpp +++ b/vins_estimator/src/initial/solve_5pts.cpp @@ -227,6 +227,7 @@ bool MotionEstimator::solveRelativeRT(const vector> &co return false; } + bool MotionEstimator::solveRelativeHRT(const vector> &corres, const Matrix3d &R_imu, const Matrix4d &TrIC, Matrix3d &Rotation, Vector3d &Translation) { if (corres.size() >= 15) @@ -241,6 +242,7 @@ bool MotionEstimator::solveRelativeHRT(const vector> &c //cv::Mat E = cv::findFundamentalMat(ll, rr, cv::FM_RANSAC, 0.3 / 460, 0.99, mask); cv::Mat H = cv::findHomography(ll, rr, cv::RANSAC, 0.3/665); cv::Mat K = (cv::Mat_(3, 3) << 1, 0, 0, 0, 1, 0, 0, 0, 1); + //cv::Mat K = (cv::Mat_(3,3) << 665, 0, 511, 0, 665, 383, 0, 0, 1); Eigen::Matrix4d Tr = decomposeH(H, K, R_imu, TrIC); Rotation = Tr.block(0,0,3,3); @@ -250,6 +252,7 @@ bool MotionEstimator::solveRelativeHRT(const vector> &c return false; } + Matrix4d MotionEstimator::decomposeH(const cv::Mat &H, const cv::Mat &K, const Matrix3d &R_imu, const Matrix4d &TrIC) { vector cv_Rs, cv_ts, cv_ns; -- 2.17.1 From 64247beda4b05f162fbec72127dd1f79bade8709 Mon Sep 17 00:00:00 2001 From: Karnik Ram Date: Tue, 22 Sep 2020 10:39:26 +0530 Subject: [PATCH 04/42] Add code for 
homography-constrained visual BA for init --- vins_estimator/src/estimator.cpp | 11 +- vins_estimator/src/estimator.h | 2 +- vins_estimator/src/initial/initial_sfm.cpp | 216 ++++++++++++++++++++- vins_estimator/src/initial/initial_sfm.h | 40 +++- vins_estimator/src/initial/solve_5pts.cpp | 27 ++- vins_estimator/src/initial/solve_5pts.h | 4 +- 6 files changed, 280 insertions(+), 20 deletions(-) diff --git a/vins_estimator/src/estimator.cpp b/vins_estimator/src/estimator.cpp index 733a968..c561fd6 100644 --- a/vins_estimator/src/estimator.cpp +++ b/vins_estimator/src/estimator.cpp @@ -267,15 +267,16 @@ bool Estimator::initialStructure() } Matrix3d relative_R; Vector3d relative_T; + Vector3d n; int l; - if (!relativeHPose(relative_R, relative_T, l)) + if (!relativeHPose(relative_R, relative_T, n, l)) { ROS_INFO("Not enough features or parallax; Move device around"); return false; } GlobalSFM sfm; - if(!sfm.construct(frame_count + 1, Q, T, l, - relative_R, relative_T, + if(!sfm.constructH(frame_count + 1, Q, T, l, + relative_R, relative_T, n, sfm_f, sfm_tracked_points)) { ROS_DEBUG("global SFM failed!"); @@ -470,7 +471,7 @@ bool Estimator::relativePose(Matrix3d &relative_R, Vector3d &relative_T, int &l) return false; } -bool Estimator::relativeHPose(Matrix3d &relative_R, Vector3d &relative_T, int &l) +bool Estimator::relativeHPose(Matrix3d &relative_R, Vector3d &relative_T, Vector3d &n, int &l) { // find previous frame which contains enough correspondence and parallex with newest frame for (int i = 0; i < WINDOW_SIZE; i++) @@ -500,7 +501,7 @@ bool Estimator::relativeHPose(Matrix3d &relative_R, Vector3d &relative_T, int &l for(int k = WINDOW_SIZE - 1; k > i; k--) R_imu = R_imu * (pre_integrations[k]->delta_q).toRotationMatrix(); - if(average_parallax * 660 > 30 && m_estimator.solveRelativeHRT(corres, R_imu, TrIC, relative_R, relative_T)) + if(average_parallax * 660 > 30 && m_estimator.solveRelativeHRT(corres, R_imu, TrIC, relative_R, relative_T, n)) { l = i; 
ROS_DEBUG("average_parallax %f choose l %d and newest frame to triangulate the whole structure", average_parallax * 460, l); diff --git a/vins_estimator/src/estimator.h b/vins_estimator/src/estimator.h index 7f912da..9b2a731 100644 --- a/vins_estimator/src/estimator.h +++ b/vins_estimator/src/estimator.h @@ -40,7 +40,7 @@ class Estimator bool initialStructure(); bool visualInitialAlign(); bool relativePose(Matrix3d &relative_R, Vector3d &relative_T, int &l); - bool relativeHPose(Matrix3d &relative_R, Vector3d &relative_T, int &l); + bool relativeHPose(Matrix3d &relative_R, Vector3d &relative_T, Vector3d &n, int &l); void slideWindow(); void solveOdometry(); void slideWindowNew(); diff --git a/vins_estimator/src/initial/initial_sfm.cpp b/vins_estimator/src/initial/initial_sfm.cpp index 6fbcc17..c7b1661 100644 --- a/vins_estimator/src/initial/initial_sfm.cpp +++ b/vins_estimator/src/initial/initial_sfm.cpp @@ -280,11 +280,11 @@ bool GlobalSFM::construct(int frame_num, Quaterniond* q, Vector3d* T, int l, //std::cout << summary.BriefReport() << "\n"; if (summary.termination_type == ceres::CONVERGENCE || summary.final_cost < 5e-03) { - //cout << "vision only BA converge" << endl; + cout << "vision only BA converge" << endl; } else { - //cout << "vision only BA not converge " << endl; + cout << "vision only BA not converge " << endl; return false; } for (int i = 0; i < frame_num; i++) @@ -311,3 +311,215 @@ bool GlobalSFM::construct(int frame_num, Quaterniond* q, Vector3d* T, int l, } + +bool GlobalSFM::constructH(int frame_num, Quaterniond* q, Vector3d* T, int l, + const Matrix3d relative_R, const Vector3d relative_T, const Vector3d n, + vector &sfm_f, map &sfm_tracked_points) +{ + feature_num = sfm_f.size(); + //cout << "set 0 and " << l << " as known " << endl; + // have relative_r relative_t + // intial two view + q[l].w() = 1; + q[l].x() = 0; + q[l].y() = 0; + q[l].z() = 0; + T[l].setZero(); + q[frame_num - 1] = q[l] * Quaterniond(relative_R); + T[frame_num - 1] = 
relative_T; + //cout << "init q_l " << q[l].w() << " " << q[l].vec().transpose() << endl; + //cout << "init t_l " << T[l].transpose() << endl; + + //rotate to cam frame + Matrix3d c_Rotation[frame_num]; + Vector3d c_Translation[frame_num]; + Quaterniond c_Quat[frame_num]; + double c_rotation[frame_num][4]; + double c_translation[frame_num][3]; + double c_normal[3]; + Eigen::Matrix Pose[frame_num]; + + c_Quat[l] = q[l].inverse(); + c_Rotation[l] = c_Quat[l].toRotationMatrix(); + c_Translation[l] = -1 * (c_Rotation[l] * T[l]); + Pose[l].block<3, 3>(0, 0) = c_Rotation[l]; + Pose[l].block<3, 1>(0, 3) = c_Translation[l]; + + c_Quat[frame_num - 1] = q[frame_num - 1].inverse(); + c_Rotation[frame_num - 1] = c_Quat[frame_num - 1].toRotationMatrix(); + c_Translation[frame_num - 1] = -1 * (c_Rotation[frame_num - 1] * T[frame_num - 1]); + Pose[frame_num - 1].block<3, 3>(0, 0) = c_Rotation[frame_num - 1]; + Pose[frame_num - 1].block<3, 1>(0, 3) = c_Translation[frame_num - 1]; + + //1: trangulate between l ----- frame_num - 1 + //2: solve pnp l + 1; trangulate l + 1 ------- frame_num - 1; + for (int i = l; i < frame_num - 1 ; i++) + { + // solve pnp + if (i > l) + { + Matrix3d R_initial = c_Rotation[i - 1]; + Vector3d P_initial = c_Translation[i - 1]; + if(!solveFrameByPnP(R_initial, P_initial, i, sfm_f)) + return false; + c_Rotation[i] = R_initial; + c_Translation[i] = P_initial; + c_Quat[i] = c_Rotation[i]; + Pose[i].block<3, 3>(0, 0) = c_Rotation[i]; + Pose[i].block<3, 1>(0, 3) = c_Translation[i]; + } + + // triangulate point based on the solve pnp result + triangulateTwoFrames(i, Pose[i], frame_num - 1, Pose[frame_num - 1], sfm_f); + } + //3: triangulate l-----l+1 l+2 ... 
frame_num -2 + for (int i = l + 1; i < frame_num - 1; i++) + triangulateTwoFrames(l, Pose[l], i, Pose[i], sfm_f); + //4: solve pnp l-1; triangulate l-1 ----- l + // l-2 l-2 ----- l + for (int i = l - 1; i >= 0; i--) + { + //solve pnp + Matrix3d R_initial = c_Rotation[i + 1]; + Vector3d P_initial = c_Translation[i + 1]; + if(!solveFrameByPnP(R_initial, P_initial, i, sfm_f)) + return false; + c_Rotation[i] = R_initial; + c_Translation[i] = P_initial; + c_Quat[i] = c_Rotation[i]; + Pose[i].block<3, 3>(0, 0) = c_Rotation[i]; + Pose[i].block<3, 1>(0, 3) = c_Translation[i]; + //triangulate + triangulateTwoFrames(i, Pose[i], l, Pose[l], sfm_f); + } + //5: triangulate all other points + for (int j = 0; j < feature_num; j++) + { + if (sfm_f[j].state == true) + continue; + if ((int)sfm_f[j].observation.size() >= 2) + { + Vector2d point0, point1; + int frame_0 = sfm_f[j].observation[0].first; + point0 = sfm_f[j].observation[0].second; + int frame_1 = sfm_f[j].observation.back().first; + point1 = sfm_f[j].observation.back().second; + Vector3d point_3d; + triangulatePoint(Pose[frame_0], Pose[frame_1], point0, point1, point_3d); + sfm_f[j].state = true; + sfm_f[j].position[0] = point_3d(0); + sfm_f[j].position[1] = point_3d(1); + sfm_f[j].position[2] = point_3d(2); + //cout << "trangulated : " << frame_0 << " " << frame_1 << " 3d point : " << j << " " << point_3d.transpose() << endl; + } + } + +/* + for (int i = 0; i < frame_num; i++) + { + q[i] = c_Rotation[i].transpose(); + cout << "solvePnP q" << " i " << i <<" " < + bool operator()(const T* const R, const T* const t, const T* const n, const T* point, T* residuals) const + { + T p[3]; + ceres::QuaternionRotatePoint(R, point, p); + T np = n[0] * point[0] + n[1] * point[1] + n[2] * point[2]; + p[0] += t[0]*np; p[1] += t[1]*np; p[2] += t[2]*np; + T xp = p[0] / p[2]; + T yp = p[1] / p[2]; + residuals[0] = xp - T(observed_u); + residuals[1] = yp - T(observed_v); + return true; + } + + static ceres::CostFunction* Create(const 
double observed_x, + const double observed_y) + { + return (new ceres::AutoDiffCostFunction< + ReprojectionErrorH, 2, 4, 3, 3, 3>( + new ReprojectionErrorH(observed_x,observed_y))); + } + + double observed_u; + double observed_v; +}; + + class GlobalSFM { public: @@ -61,6 +95,10 @@ public: const Matrix3d relative_R, const Vector3d relative_T, vector &sfm_f, map &sfm_tracked_points); + bool constructH(int frame_num, Quaterniond* q, Vector3d* T, int l, + const Matrix3d relative_R, const Vector3d relative_T, const Vector3d n, + vector &sfm_f, map &sfm_tracked_points); + private: bool solveFrameByPnP(Matrix3d &R_initial, Vector3d &P_initial, int i, vector &sfm_f); @@ -71,4 +109,4 @@ private: vector &sfm_f); int feature_num; -}; \ No newline at end of file +}; diff --git a/vins_estimator/src/initial/solve_5pts.cpp b/vins_estimator/src/initial/solve_5pts.cpp index 50999c3..bdc87a3 100644 --- a/vins_estimator/src/initial/solve_5pts.cpp +++ b/vins_estimator/src/initial/solve_5pts.cpp @@ -228,7 +228,7 @@ bool MotionEstimator::solveRelativeRT(const vector> &co } -bool MotionEstimator::solveRelativeHRT(const vector> &corres, const Matrix3d &R_imu, const Matrix4d &TrIC, Matrix3d &Rotation, Vector3d &Translation) +bool MotionEstimator::solveRelativeHRT(const vector> &corres, const Matrix3d &R_imu, const Matrix4d &TrIC, Matrix3d &Rotation, Vector3d &Translation, Vector3d &n) { if (corres.size() >= 15) { @@ -244,21 +244,26 @@ bool MotionEstimator::solveRelativeHRT(const vector> &c cv::Mat K = (cv::Mat_(3, 3) << 1, 0, 0, 0, 1, 0, 0, 0, 1); //cv::Mat K = (cv::Mat_(3,3) << 665, 0, 511, 0, 665, 383, 0, 0, 1); - Eigen::Matrix4d Tr = decomposeH(H, K, R_imu, TrIC); - Rotation = Tr.block(0,0,3,3); - Translation = Tr.block(0,3,3,1); + Eigen::Matrix4d est_Tr; + Eigen::Vector3d est_n; + decomposeH(H, K, R_imu, TrIC, est_Tr, est_n); + Rotation = est_Tr.block(0,0,3,3); + Translation = est_Tr.block(0,3,3,1); + n = est_n; return true; } return false; } -Matrix4d 
MotionEstimator::decomposeH(const cv::Mat &H, const cv::Mat &K, const Matrix3d &R_imu, const Matrix4d &TrIC) +void MotionEstimator::decomposeH(const cv::Mat &H, const cv::Mat &K, const Matrix3d &R_imu, const Matrix4d &TrIC, Matrix4d &est_Tr, Vector3d &est_n) { vector cv_Rs, cv_ts, cv_ns; int n_sols = cv::decomposeHomographyMat(H, K, cv_Rs, cv_ts, cv_ns); vector positive_depth_transforms; + vector positive_depth_normals; + if(n_sols > 1) { for(int i = 0; i < n_sols; i++) @@ -280,6 +285,7 @@ Matrix4d MotionEstimator::decomposeH(const cv::Mat &H, const cv::Mat &K, const M cv::cv2eigen(cv_ns[i], n); if(n.dot(e3) > 0) positive_depth_transforms.push_back(Tr); + positive_depth_normals.push_back(n); } vector rot_diff; @@ -292,8 +298,8 @@ Matrix4d MotionEstimator::decomposeH(const cv::Mat &H, const cv::Mat &K, const M } int min_index = std::min_element(rot_diff.begin(), rot_diff.end()) - rot_diff.begin(); - Matrix4d Tr = positive_depth_transforms[min_index]; - return Tr; + est_Tr = positive_depth_transforms[min_index]; + est_n = positive_depth_normals[min_index]; } else @@ -307,8 +313,11 @@ Matrix4d MotionEstimator::decomposeH(const cv::Mat &H, const cv::Mat &K, const M cv::cv2eigen(cv_ts[0], t); Tr.block(0,3,3,1) = t; + Vector3d n; + cv::cv2eigen(cv_ns[0], n); + //Tr = TrIC * Tr * TrIC.inverse(); - Tr = Tr.inverse().eval(); - return Tr; + est_Tr = Tr.inverse().eval(); + est_n = n; } } diff --git a/vins_estimator/src/initial/solve_5pts.h b/vins_estimator/src/initial/solve_5pts.h index 9fb118e..f090370 100644 --- a/vins_estimator/src/initial/solve_5pts.h +++ b/vins_estimator/src/initial/solve_5pts.h @@ -15,8 +15,8 @@ class MotionEstimator public: bool solveRelativeRT(const vector> &corres, Matrix3d &R, Vector3d &T); - bool solveRelativeHRT(const vector> &corres, const Matrix3d &R_imu, const Matrix4d &TrIC, Matrix3d &R, Vector3d &T); - Matrix4d decomposeH(const cv::Mat &H, const cv::Mat &K, const Matrix3d &R_imu, const Matrix4d &TrIC); + bool solveRelativeHRT(const vector> 
&corres, const Matrix3d &R_imu, const Matrix4d &TrIC, Matrix3d &R, Vector3d &T, Vector3d &n); + void decomposeH(const cv::Mat &H, const cv::Mat &K, const Matrix3d &R_imu, const Matrix4d &TrIC, Matrix4d &est_Tr, Vector3d &est_n); private: double testTriangulation(const vector &l, -- 2.17.1 From fb6f3cf6cf15d5119c170b55e86ca5784773f970 Mon Sep 17 00:00:00 2001 From: Karnik Ram Date: Wed, 23 Sep 2020 08:12:13 +0530 Subject: [PATCH 05/42] Fix parameter bugs --- vins_estimator/src/initial/initial_sfm.cpp | 15 +++++++++++---- vins_estimator/src/initial/initial_sfm.h | 1 + 2 files changed, 12 insertions(+), 4 deletions(-) diff --git a/vins_estimator/src/initial/initial_sfm.cpp b/vins_estimator/src/initial/initial_sfm.cpp index c7b1661..05917a0 100644 --- a/vins_estimator/src/initial/initial_sfm.cpp +++ b/vins_estimator/src/initial/initial_sfm.cpp @@ -104,6 +104,9 @@ void GlobalSFM::triangulateTwoFrames(int frame0, Eigen::Matrix &Po sfm_f[j].position[0] = point_3d(0); sfm_f[j].position[1] = point_3d(1); sfm_f[j].position[2] = point_3d(2); + sfm_f[j].image[0] = point1[0]; + sfm_f[j].image[1] = point1[1]; + sfm_f[j].image[2] = 1; //cout << "trangulated : " << frame1 << " 3d point : " << j << " " << point_3d.transpose() << endl; } } @@ -212,6 +215,9 @@ bool GlobalSFM::construct(int frame_num, Quaterniond* q, Vector3d* T, int l, sfm_f[j].position[0] = point_3d(0); sfm_f[j].position[1] = point_3d(1); sfm_f[j].position[2] = point_3d(2); + sfm_f[j].image[0] = point1[0]; + sfm_f[j].image[1] = point1[1]; + sfm_f[j].image[2] = 1; //cout << "trangulated : " << frame_0 << " " << frame_1 << " 3d point : " << j << " " << point_3d.transpose() << endl; } } @@ -410,6 +416,9 @@ bool GlobalSFM::constructH(int frame_num, Quaterniond* q, Vector3d* T, int l, sfm_f[j].position[0] = point_3d(0); sfm_f[j].position[1] = point_3d(1); sfm_f[j].position[2] = point_3d(2); + sfm_f[j].image[0] = point1[0]; + sfm_f[j].image[1] = point1[1]; + sfm_f[j].image[2] = 1; //cout << "trangulated : " << frame_0 << 
" " << frame_1 << " 3d point : " << j << " " << point_3d.transpose() << endl; } } @@ -471,16 +480,14 @@ bool GlobalSFM::constructH(int frame_num, Quaterniond* q, Vector3d* T, int l, sfm_f[i].observation[j].second.y()); problem.AddResidualBlock(b_cost_function, NULL, c_rotation[l], c_translation[l], - sfm_f[i].position); + sfm_f[i].position); ceres::CostFunction* h_cost_function = ReprojectionErrorH::Create( sfm_f[i].observation[j].second.x(), sfm_f[i].observation[j].second.y()); - problem.AddResidualBlock(h_cost_function, NULL, c_rotation[l], c_translation[l],c_normal, - sfm_f[i].position); + problem.AddResidualBlock(h_cost_function, NULL, c_rotation[l], c_translation[l], c_normal, sfm_f[i].image); } - } ceres::Solver::Options options; options.linear_solver_type = ceres::DENSE_SCHUR; diff --git a/vins_estimator/src/initial/initial_sfm.h b/vins_estimator/src/initial/initial_sfm.h index a1e6854..7780732 100644 --- a/vins_estimator/src/initial/initial_sfm.h +++ b/vins_estimator/src/initial/initial_sfm.h @@ -19,6 +19,7 @@ struct SFMFeature int id; vector> observation; double position[3]; + double image[3]; // u v 1 double depth; }; -- 2.17.1 From d33f9314e26d538f07fdb97cc6278b5a4f449b41 Mon Sep 17 00:00:00 2001 From: Karnik Ram Date: Thu, 1 Oct 2020 11:11:06 +0530 Subject: [PATCH 06/42] Add homography const functor into estimator --- vins_estimator/src/estimator.cpp | 5 +- vins_estimator/src/estimator.h | 2 + vins_estimator/src/factor/homography_factor.h | 47 +++++++++++++++++++ vins_estimator/src/initial/initial_sfm.cpp | 7 ++- vins_estimator/src/initial/initial_sfm.h | 4 +- 5 files changed, 61 insertions(+), 4 deletions(-) create mode 100644 vins_estimator/src/factor/homography_factor.h diff --git a/vins_estimator/src/estimator.cpp b/vins_estimator/src/estimator.cpp index c561fd6..97a3821 100644 --- a/vins_estimator/src/estimator.cpp +++ b/vins_estimator/src/estimator.cpp @@ -435,7 +435,7 @@ bool Estimator::visualInitialAlign() Vs[i] = rot_diff * Vs[i]; } 
ROS_DEBUG_STREAM("g0 " << g.transpose()); - ROS_DEBUG_STREAM("my R0 " << Utility::R2ypr(Rs[0]).transpose()); + ROS_DEBUG_STREAM("my R0 " << Utility::R2ypr(Rs[0]).transpose()); return true; } @@ -800,6 +800,9 @@ void Estimator::optimization() { ProjectionFactor *f = new ProjectionFactor(pts_i, pts_j); problem.AddResidualBlock(f, loss_function, para_Pose[imu_i], para_Pose[imu_j], para_Ex_Pose[0], para_Feature[feature_index]); + + ceres::CostFunction *h_cost_function = HomographyFactor::Create(pts_i, pts_j); + problem.AddResidualBlock(h_cost_function, loss_function, para_Pose[imu_i], para_Pose[imu_j], para_N, para_Ex_Pose[0]); } f_m_cnt++; } diff --git a/vins_estimator/src/estimator.h b/vins_estimator/src/estimator.h index 9b2a731..f3feedf 100644 --- a/vins_estimator/src/estimator.h +++ b/vins_estimator/src/estimator.h @@ -17,6 +17,7 @@ #include "factor/projection_factor.h" #include "factor/projection_td_factor.h" #include "factor/marginalization_factor.h" +#include "factor/homography_factor.h" #include #include @@ -114,6 +115,7 @@ class Estimator double para_Retrive_Pose[SIZE_POSE]; double para_Td[1][1]; double para_Tr[1][1]; + double para_N[3]; int loop_window_index; diff --git a/vins_estimator/src/factor/homography_factor.h b/vins_estimator/src/factor/homography_factor.h new file mode 100644 index 0000000..86b03ff --- /dev/null +++ b/vins_estimator/src/factor/homography_factor.h @@ -0,0 +1,47 @@ +#pragma once +#include +#include + +struct HomographyFactor +{ + HomographyFactor(const Eigen::Vector3d &_pts_i, const Eigen::Vector3d &_pts_j) : pts_i(_pts_i), pts_j(_pts_j) {} + + template + bool operator()(const T* const pose_i, const T* const pose_j, const T* const para_n, const T* const ex_pose, + T* residuals) const + { + Eigen::Map> pi(pose_i); + Eigen::Quaternion qi; + qi.coeffs() << pose_i[3], pose_i[4], pose_i[5], pose_i[6]; + + Eigen::Map> pj(pose_j); + Eigen::Quaternion qj; + qj.coeffs() << pose_j[3], pose_j[4], pose_j[5], pose_j[6]; + + Eigen::Map> 
tic(ex_pose); + Eigen::Quaternion qic; + qic.coeffs() << ex_pose[3], ex_pose[4], ex_pose[5], ex_pose[6]; + + Eigen::Map> n(para_n); + + Eigen::Quaternion qji = qj.inverse() * qi; + Eigen::Matrix tji = qj.inverse().toRotationMatrix() * (pi - pj); + Eigen::Matrix pts_imu_i = qic * pts_i.cast() + tic; + Eigen::Matrix pts_imu_j = qji * pts_imu_i + tji * n.transpose() * pts_imu_i; + Eigen::Matrix pts_cam_j = qic.inverse() * (pts_imu_j - tic); + + pts_cam_j = (pts_cam_j / pts_cam_j[2]); + residuals[0] = pts_cam_j[0] - pts_j[0]; + residuals[1] = pts_cam_j[1] - pts_j[1]; + + return true; + } + + static ceres::CostFunction* Create(const Eigen::Vector3d &_pts_i, const Eigen::Vector3d &_pts_j) + { + return (new ceres::AutoDiffCostFunction + (new HomographyFactor(_pts_i, _pts_j))); + } + + Eigen::Vector3d pts_i, pts_j; +}; diff --git a/vins_estimator/src/initial/initial_sfm.cpp b/vins_estimator/src/initial/initial_sfm.cpp index 05917a0..76a4f88 100644 --- a/vins_estimator/src/initial/initial_sfm.cpp +++ b/vins_estimator/src/initial/initial_sfm.cpp @@ -319,7 +319,7 @@ bool GlobalSFM::construct(int frame_num, Quaterniond* q, Vector3d* T, int l, bool GlobalSFM::constructH(int frame_num, Quaterniond* q, Vector3d* T, int l, - const Matrix3d relative_R, const Vector3d relative_T, const Vector3d n, + const Matrix3d relative_R, const Vector3d relative_T, Vector3d &n, vector &sfm_f, map &sfm_tracked_points) { feature_num = sfm_f.size(); @@ -525,6 +525,11 @@ bool GlobalSFM::constructH(int frame_num, Quaterniond* q, Vector3d* T, int l, if(sfm_f[i].state) sfm_tracked_points[sfm_f[i].id] = Vector3d(sfm_f[i].position[0], sfm_f[i].position[1], sfm_f[i].position[2]); } + + n[0] = c_normal[0]; + n[1] = c_normal[1]; + n[2] = c_normal[2]; + return true; } diff --git a/vins_estimator/src/initial/initial_sfm.h b/vins_estimator/src/initial/initial_sfm.h index 7780732..e79dcea 100644 --- a/vins_estimator/src/initial/initial_sfm.h +++ b/vins_estimator/src/initial/initial_sfm.h @@ -1,4 +1,4 @@ 
-#pragma once
+#pragma once
 #include
 #include
 #include
@@ -97,7 +97,7 @@ public:
			  vector<SFMFeature> &sfm_f, map<int, Vector3d> &sfm_tracked_points);
	bool constructH(int frame_num, Quaterniond* q, Vector3d* T, int l,
-			  const Matrix3d relative_R, const Vector3d relative_T, const Vector3d n,
+			  const Matrix3d relative_R, const Vector3d relative_T, Vector3d &n,
			  vector<SFMFeature> &sfm_f, map<int, Vector3d> &sfm_tracked_points);
 
 private:
-- 
2.17.1

From d3d1faeaa37f6b413622e6b3373c80cffc4182f7 Mon Sep 17 00:00:00 2001
From: Karnik Ram
Date: Thu, 1 Oct 2020 11:57:44 +0530
Subject: [PATCH 07/42] Add code for transforming normal vector

---
 vins_estimator/src/estimator.cpp              | 3 +++
 vins_estimator/src/factor/homography_factor.h | 5 ++++-
 2 files changed, 7 insertions(+), 1 deletion(-)

diff --git a/vins_estimator/src/estimator.cpp b/vins_estimator/src/estimator.cpp
index 97a3821..0112bcd 100644
--- a/vins_estimator/src/estimator.cpp
+++ b/vins_estimator/src/estimator.cpp
@@ -283,6 +283,9 @@ bool Estimator::initialStructure()
         marginalization_flag = MARGIN_OLD;
         return false;
     }
+    para_N[0] = n[0];
+    para_N[1] = n[1];
+    para_N[2] = n[2];
 
     //solve pnp for all frame
     map<double, ImageFrame>::iterator frame_it;
diff --git a/vins_estimator/src/factor/homography_factor.h b/vins_estimator/src/factor/homography_factor.h
index 86b03ff..8473930 100644
--- a/vins_estimator/src/factor/homography_factor.h
+++ b/vins_estimator/src/factor/homography_factor.h
@@ -23,11 +23,14 @@ struct HomographyFactor
         qic.coeffs() << ex_pose[3], ex_pose[4], ex_pose[5], ex_pose[6];
 
         Eigen::Map<const Eigen::Matrix<T, 3, 1>> n(para_n);
+        Eigen::Matrix<T, 3, 1> n_imu_1 = qic*(n.normalized()) + tic;
+        Eigen::Matrix<T, 3, 1> n_imu_i = qi.inverse()*n_imu_1 - qi.inverse()*pi;
+        Eigen::Matrix<T, 3, 1> n_cam_i = qic.inverse()*n_imu_i - qic.inverse()*tic;
 
         Eigen::Quaternion<T> qji = qj.inverse() * qi;
         Eigen::Matrix<T, 3, 1> tji = qj.inverse().toRotationMatrix() * (pi - pj);
 
         Eigen::Matrix<T, 3, 1> pts_imu_i = qic * pts_i.cast<T>() + tic;
-        Eigen::Matrix<T, 3, 1> pts_imu_j = qji * pts_imu_i + tji * n.transpose() * pts_imu_i;
+        Eigen::Matrix<T, 3, 1> pts_imu_j = qji * pts_imu_i + tji * n_cam_i.transpose() * pts_imu_i;
         Eigen::Matrix<T, 3, 1> pts_cam_j = qic.inverse() * (pts_imu_j - tic);
         pts_cam_j = (pts_cam_j / pts_cam_j[2]);
-- 
2.17.1

From dbbe91b1f26b80fe235e3f7dfee9a55c98f108ad Mon Sep 17 00:00:00 2001
From: Karnik Ram
Date: Fri, 2 Oct 2020 10:27:49 +0530
Subject: [PATCH 08/42] Change to pixel coordinates in h cost functor

---
 vins_estimator/src/estimator.cpp | 6 +++++-
 1 file changed, 5 insertions(+), 1 deletion(-)

diff --git a/vins_estimator/src/estimator.cpp b/vins_estimator/src/estimator.cpp
index 0112bcd..6f31fb5 100644
--- a/vins_estimator/src/estimator.cpp
+++ b/vins_estimator/src/estimator.cpp
@@ -774,6 +774,7 @@ void Estimator::optimization()
             int imu_i = it_per_id.start_frame, imu_j = imu_i - 1;
 
             Vector3d pts_i = it_per_id.feature_per_frame[0].point;
+            Vector3d img_pts_i(it_per_id.feature_per_frame[0].uv.x(), it_per_id.feature_per_frame[1].uv.y(), 1);
 
             for (auto &it_per_frame : it_per_id.feature_per_frame)
             {
@@ -782,7 +783,10 @@ void Estimator::optimization()
                 {
                     continue;
                 }
+
                 Vector3d pts_j = it_per_frame.point;
+                Vector3d img_pts_j(it_per_frame.uv.x(), it_per_frame.uv.y(), 1);
+
                 if (ESTIMATE_TD)
                 {
                     ProjectionTdFactor *f_td = new ProjectionTdFactor(pts_i, pts_j, it_per_id.feature_per_frame[0].velocity, it_per_frame.velocity,
@@ -804,7 +808,7 @@ void Estimator::optimization()
                     ProjectionFactor *f = new ProjectionFactor(pts_i, pts_j);
                     problem.AddResidualBlock(f, loss_function, para_Pose[imu_i], para_Pose[imu_j], para_Ex_Pose[0], para_Feature[feature_index]);
 
-                    ceres::CostFunction *h_cost_function = HomographyFactor::Create(pts_i, pts_j);
+                    ceres::CostFunction *h_cost_function = HomographyFactor::Create(img_pts_i, img_pts_j);
                     problem.AddResidualBlock(h_cost_function, loss_function, para_Pose[imu_i], para_Pose[imu_j], para_N, para_Ex_Pose[0]);
                 }
                 f_m_cnt++;
-- 
2.17.1

From 9cfa806aac4f59176a59e778e3570c4e69ebb1d2 Mon Sep 17 00:00:00 2001
From: Karnik Ram
Date: Fri, 2 Oct 2020 12:21:06 +0530
Subject: [PATCH 09/42] Fix reference frame bug

---
 vins_estimator/src/factor/homography_factor.h | 5 ++---
 1 file changed, 2 insertions(+), 3 deletions(-)

diff --git a/vins_estimator/src/factor/homography_factor.h b/vins_estimator/src/factor/homography_factor.h
index 8473930..2a35d40 100644
--- a/vins_estimator/src/factor/homography_factor.h
+++ b/vins_estimator/src/factor/homography_factor.h
@@ -25,12 +25,11 @@ struct HomographyFactor
         Eigen::Map<const Eigen::Matrix<T, 3, 1>> n(para_n);
         Eigen::Matrix<T, 3, 1> n_imu_1 = qic*(n.normalized()) + tic;
         Eigen::Matrix<T, 3, 1> n_imu_i = qi.inverse()*n_imu_1 - qi.inverse()*pi;
-        Eigen::Matrix<T, 3, 1> n_cam_i = qic.inverse()*n_imu_i - qic.inverse()*tic;
 
         Eigen::Quaternion<T> qji = qj.inverse() * qi;
-        Eigen::Matrix<T, 3, 1> tji = qj.inverse().toRotationMatrix() * (pi - pj);
+        Eigen::Matrix<T, 3, 1> tji = qj.inverse() * (pi - pj);
 
         Eigen::Matrix<T, 3, 1> pts_imu_i = qic * pts_i.cast<T>() + tic;
-        Eigen::Matrix<T, 3, 1> pts_imu_j = qji * pts_imu_i + tji * n_cam_i.transpose() * pts_imu_i;
+        Eigen::Matrix<T, 3, 1> pts_imu_j = qji * pts_imu_i + tji * n_imu_i.transpose() * pts_imu_i;
         Eigen::Matrix<T, 3, 1> pts_cam_j = qic.inverse() * (pts_imu_j - tic);
         pts_cam_j = (pts_cam_j / pts_cam_j[2]);
-- 
2.17.1

From 80c091ccbffd376f8bef7f19c1129f47d321e8c2 Mon Sep 17 00:00:00 2001
From: Karnik Ram
Date: Fri, 2 Oct 2020 12:39:00 +0530
Subject: [PATCH 10/42] Fix indexing bug

---
 vins_estimator/src/estimator.cpp | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/vins_estimator/src/estimator.cpp b/vins_estimator/src/estimator.cpp
index 6f31fb5..141ba1c 100644
--- a/vins_estimator/src/estimator.cpp
+++ b/vins_estimator/src/estimator.cpp
@@ -774,7 +774,7 @@ void Estimator::optimization()
             int imu_i = it_per_id.start_frame, imu_j = imu_i - 1;
 
             Vector3d pts_i = it_per_id.feature_per_frame[0].point;
-            Vector3d img_pts_i(it_per_id.feature_per_frame[0].uv.x(), it_per_id.feature_per_frame[1].uv.y(), 1);
+            Vector3d img_pts_i(it_per_id.feature_per_frame[0].uv.x(), it_per_id.feature_per_frame[0].uv.y(), 1);
 
             for (auto &it_per_frame : it_per_id.feature_per_frame)
             {
-- 
2.17.1

From
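Patches 07-10 above iterate on which frame the plane normal must be expressed in before it enters the point-transfer term. The identity they all build on is the plane-induced homography: for a plane satisfying n^T X = d in frame i and relative motion X_j = R X_i + t, every point on the plane obeys X_j = (R + t n^T / d) X_i. A minimal numeric check of that identity (all values hypothetical; plain arrays stand in for the Eigen types used in the patches):

```cpp
#include <cassert>
#include <cmath>

// Verifies that X_j = R X_i + t equals (R + t n^T / d) X_i
// for a point X_i lying on the plane n^T X = d in frame i.
inline bool homography_transfer_matches() {
    const double d = 2.0;                        // plane: z = 2 in frame i
    const double n[3] = {0.0, 0.0, 1.0};         // unit plane normal in frame i

    // Hypothetical relative motion: rotation about y by 0.1 rad, small translation.
    const double c = std::cos(0.1), s = std::sin(0.1);
    const double R[3][3] = {{c, 0, s}, {0, 1, 0}, {-s, 0, c}};
    const double t[3] = {0.1, 0.2, -0.05};

    const double Xi[3] = {0.4, -0.3, 2.0};       // on the plane: n . Xi = d

    for (int r = 0; r < 3; ++r) {
        double direct = t[r];                    // X_j = R X_i + t
        double homog  = 0.0;                     // X_j = (R + t n^T / d) X_i
        for (int k = 0; k < 3; ++k) {
            direct += R[r][k] * Xi[k];
            homog  += (R[r][k] + t[r] * n[k] / d) * Xi[k];
        }
        if (std::fabs(direct - homog) > 1e-12) return false;
    }
    return true;
}
```

With the plane depth d folded into the parameterization, the `tji * n^T * pts_imu_i` term in `HomographyFactor` above plays the role of the `t n^T / d` rank-one correction; the patches are about expressing `n` in the frame where `tji` lives.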
608edba769955088a250ac4ac416a5f8bb766588 Mon Sep 17 00:00:00 2001
From: Karnik Ram
Date: Sat, 3 Oct 2020 11:21:59 +0530
Subject: [PATCH 11/42] Change to point values in cost functor

---
 vins_estimator/src/estimator.cpp | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/vins_estimator/src/estimator.cpp b/vins_estimator/src/estimator.cpp
index 141ba1c..096ef10 100644
--- a/vins_estimator/src/estimator.cpp
+++ b/vins_estimator/src/estimator.cpp
@@ -808,7 +808,7 @@ void Estimator::optimization()
                     ProjectionFactor *f = new ProjectionFactor(pts_i, pts_j);
                     problem.AddResidualBlock(f, loss_function, para_Pose[imu_i], para_Pose[imu_j], para_Ex_Pose[0], para_Feature[feature_index]);
 
-                    ceres::CostFunction *h_cost_function = HomographyFactor::Create(img_pts_i, img_pts_j);
+                    ceres::CostFunction *h_cost_function = HomographyFactor::Create(pts_i, pts_j);
                     problem.AddResidualBlock(h_cost_function, loss_function, para_Pose[imu_i], para_Pose[imu_j], para_N, para_Ex_Pose[0]);
                 }
                 f_m_cnt++;
-- 
2.17.1

From 55e116d9bfa5350e2e717a758fcd9dd1dfa7d0f9 Mon Sep 17 00:00:00 2001
From: Karnik Ram
Date: Mon, 19 Oct 2020 10:40:18 +0530
Subject: [PATCH 12/42] Apply mask before feature extraction

---
 config/vins_rviz_config.rviz                 | 192 +++++++++---------
 feature_tracker/CMakeLists.txt               |   1 +
 feature_tracker/src/feature_tracker.cpp      |   9 +-
 feature_tracker/src/feature_tracker.h        |   4 +-
 feature_tracker/src/feature_tracker_node.cpp | 195 ++++++++++++++++++-
 feature_tracker/src/parameters.cpp           |   2 +
 feature_tracker/src/parameters.h             |   1 +
 7 files changed, 302 insertions(+), 102 deletions(-)

diff --git a/config/vins_rviz_config.rviz b/config/vins_rviz_config.rviz
index 6ed63a9..92ff49d 100644
--- a/config/vins_rviz_config.rviz
+++ b/config/vins_rviz_config.rviz
@@ -4,8 +4,8 @@ Panels:
     Name: Displays
     Property Tree Widget:
       Expanded: ~
-      Splitter Ratio: 0.465115994
-      Tree Height: 221
+      Splitter Ratio: 0.4651159942150116
+      Tree Height: 151
   - Class: rviz/Selection
     Name: Selection
   - Class: rviz/Tool
Properties @@ -14,7 +14,7 @@ Panels: - /2D Nav Goal1 - /Publish Point1 Name: Tool Properties - Splitter Ratio: 0.588679016 + Splitter Ratio: 0.5886790156364441 - Class: rviz/Views Expanded: - /Current View1 @@ -24,14 +24,18 @@ Panels: Experimental: false Name: Time SyncMode: 0 - SyncSource: tracked image + SyncSource: "" - Class: rviz/Displays Help Height: 78 Name: Displays Property Tree Widget: Expanded: ~ Splitter Ratio: 0.5 - Tree Height: 359 + Tree Height: 363 +Preferences: + PromptSaveOnExit: true +Toolbars: + toolButtonStyle: 2 Visualization Manager: Class: "" Displays: @@ -41,7 +45,7 @@ Visualization Manager: Color: 130; 130; 130 Enabled: true Line Style: - Line Width: 0.0299999993 + Line Width: 0.029999999329447746 Value: Lines Name: Grid Normal Cell Count: 0 @@ -57,7 +61,7 @@ Visualization Manager: Enabled: true Length: 1 Name: Axes - Radius: 0.100000001 + Radius: 0.10000000149011612 Reference Frame: Value: true - Alpha: 1 @@ -65,9 +69,9 @@ Visualization Manager: Class: rviz/Path Color: 255; 0; 0 Enabled: true - Head Diameter: 0.300000012 - Head Length: 0.200000003 - Length: 0.300000012 + Head Diameter: 0.30000001192092896 + Head Length: 0.20000000298023224 + Length: 0.30000001192092896 Line Style: Lines Line Width: 0.5 Name: ground_truth_path @@ -77,9 +81,9 @@ Visualization Manager: Z: 0 Pose Color: 255; 85; 255 Pose Style: None - Radius: 0.0299999993 - Shaft Diameter: 0.100000001 - Shaft Length: 0.100000001 + Radius: 0.029999999329447746 + Shaft Diameter: 0.10000000149011612 + Shaft Length: 0.10000000149011612 Topic: /benchmark_publisher/path Unreliable: false Value: true @@ -114,11 +118,11 @@ Visualization Manager: Class: rviz/Path Color: 0; 255; 0 Enabled: true - Head Diameter: 0.300000012 - Head Length: 0.200000003 - Length: 0.300000012 + Head Diameter: 0.30000001192092896 + Head Length: 0.20000000298023224 + Length: 0.30000001192092896 Line Style: Lines - Line Width: 0.0299999993 + Line Width: 0.029999999329447746 Name: Path Offset: X: 0 @@ -126,9 
+130,9 @@ Visualization Manager: Z: 0 Pose Color: 255; 85; 255 Pose Style: None - Radius: 0.0299999993 - Shaft Diameter: 0.100000001 - Shaft Length: 0.100000001 + Radius: 0.029999999329447746 + Shaft Diameter: 0.10000000149011612 + Shaft Length: 0.10000000149011612 Topic: /vins_estimator/path Unreliable: false Value: true @@ -163,7 +167,7 @@ Visualization Manager: Queue Size: 10 Selectable: true Size (Pixels): 1 - Size (m): 0.00999999978 + Size (m): 0.009999999776482582 Style: Points Topic: /vins_estimator/point_cloud Unreliable: false @@ -214,7 +218,7 @@ Visualization Manager: {} Update Interval: 0 Value: true - Enabled: false + Enabled: true Name: VIO - Class: rviz/Group Displays: @@ -223,11 +227,11 @@ Visualization Manager: Class: rviz/Path Color: 0; 255; 0 Enabled: true - Head Diameter: 0.300000012 - Head Length: 0.200000003 - Length: 0.300000012 + Head Diameter: 0.30000001192092896 + Head Length: 0.20000000298023224 + Length: 0.30000001192092896 Line Style: Lines - Line Width: 0.0299999993 + Line Width: 0.029999999329447746 Name: pose_graph_path Offset: X: 0 @@ -235,9 +239,9 @@ Visualization Manager: Z: 0 Pose Color: 255; 85; 255 Pose Style: None - Radius: 0.0299999993 - Shaft Diameter: 0.100000001 - Shaft Length: 0.100000001 + Radius: 0.029999999329447746 + Shaft Diameter: 0.10000000149011612 + Shaft Length: 0.10000000149011612 Topic: /pose_graph/pose_graph_path Unreliable: false Value: true @@ -246,11 +250,11 @@ Visualization Manager: Class: rviz/Path Color: 255; 170; 0 Enabled: true - Head Diameter: 0.300000012 - Head Length: 0.200000003 - Length: 0.300000012 + Head Diameter: 0.30000001192092896 + Head Length: 0.20000000298023224 + Length: 0.30000001192092896 Line Style: Lines - Line Width: 0.0299999993 + Line Width: 0.029999999329447746 Name: base_path Offset: X: 0 @@ -258,9 +262,9 @@ Visualization Manager: Z: 0 Pose Color: 255; 85; 255 Pose Style: None - Radius: 0.0299999993 - Shaft Diameter: 0.100000001 - Shaft Length: 0.100000001 + Radius: 
0.029999999329447746 + Shaft Diameter: 0.10000000149011612 + Shaft Length: 0.10000000149011612 Topic: /pose_graph/base_path Unreliable: false Value: true @@ -305,11 +309,11 @@ Visualization Manager: Class: rviz/Path Color: 25; 255; 0 Enabled: true - Head Diameter: 0.300000012 - Head Length: 0.200000003 - Length: 0.300000012 + Head Diameter: 0.30000001192092896 + Head Length: 0.20000000298023224 + Length: 0.30000001192092896 Line Style: Lines - Line Width: 0.300000012 + Line Width: 0.30000001192092896 Name: Sequence1 Offset: X: 0 @@ -317,9 +321,9 @@ Visualization Manager: Z: 0 Pose Color: 255; 85; 255 Pose Style: None - Radius: 0.0299999993 - Shaft Diameter: 0.100000001 - Shaft Length: 0.100000001 + Radius: 0.029999999329447746 + Shaft Diameter: 0.10000000149011612 + Shaft Length: 0.10000000149011612 Topic: /pose_graph/path_1 Unreliable: false Value: true @@ -328,11 +332,11 @@ Visualization Manager: Class: rviz/Path Color: 255; 0; 0 Enabled: true - Head Diameter: 0.300000012 - Head Length: 0.200000003 - Length: 0.300000012 + Head Diameter: 0.30000001192092896 + Head Length: 0.20000000298023224 + Length: 0.30000001192092896 Line Style: Lines - Line Width: 0.300000012 + Line Width: 0.30000001192092896 Name: Sequence2 Offset: X: 0 @@ -340,9 +344,9 @@ Visualization Manager: Z: 0 Pose Color: 255; 85; 255 Pose Style: None - Radius: 0.0299999993 - Shaft Diameter: 0.100000001 - Shaft Length: 0.100000001 + Radius: 0.029999999329447746 + Shaft Diameter: 0.10000000149011612 + Shaft Length: 0.10000000149011612 Topic: /pose_graph/path_2 Unreliable: false Value: true @@ -351,11 +355,11 @@ Visualization Manager: Class: rviz/Path Color: 0; 85; 255 Enabled: true - Head Diameter: 0.300000012 - Head Length: 0.200000003 - Length: 0.300000012 + Head Diameter: 0.30000001192092896 + Head Length: 0.20000000298023224 + Length: 0.30000001192092896 Line Style: Lines - Line Width: 0.300000012 + Line Width: 0.30000001192092896 Name: Sequence3 Offset: X: 0 @@ -363,9 +367,9 @@ Visualization 
Manager: Z: 0 Pose Color: 255; 85; 255 Pose Style: None - Radius: 0.0299999993 - Shaft Diameter: 0.100000001 - Shaft Length: 0.100000001 + Radius: 0.029999999329447746 + Shaft Diameter: 0.10000000149011612 + Shaft Length: 0.10000000149011612 Topic: /pose_graph/path_3 Unreliable: false Value: true @@ -374,11 +378,11 @@ Visualization Manager: Class: rviz/Path Color: 255; 170; 0 Enabled: true - Head Diameter: 0.300000012 - Head Length: 0.200000003 - Length: 0.300000012 + Head Diameter: 0.30000001192092896 + Head Length: 0.20000000298023224 + Length: 0.30000001192092896 Line Style: Lines - Line Width: 0.300000012 + Line Width: 0.30000001192092896 Name: Sequence4 Offset: X: 0 @@ -386,9 +390,9 @@ Visualization Manager: Z: 0 Pose Color: 255; 85; 255 Pose Style: None - Radius: 0.0299999993 - Shaft Diameter: 0.100000001 - Shaft Length: 0.100000001 + Radius: 0.029999999329447746 + Shaft Diameter: 0.10000000149011612 + Shaft Length: 0.10000000149011612 Topic: /pose_graph/path_4 Unreliable: false Value: true @@ -397,11 +401,11 @@ Visualization Manager: Class: rviz/Path Color: 255; 170; 255 Enabled: true - Head Diameter: 0.300000012 - Head Length: 0.200000003 - Length: 0.300000012 + Head Diameter: 0.30000001192092896 + Head Length: 0.20000000298023224 + Length: 0.30000001192092896 Line Style: Lines - Line Width: 0.0299999993 + Line Width: 0.029999999329447746 Name: Sequence5 Offset: X: 0 @@ -409,9 +413,9 @@ Visualization Manager: Z: 0 Pose Color: 255; 85; 255 Pose Style: None - Radius: 0.0299999993 - Shaft Diameter: 0.100000001 - Shaft Length: 0.100000001 + Radius: 0.029999999329447746 + Shaft Diameter: 0.10000000149011612 + Shaft Length: 0.10000000149011612 Topic: /pose_graph/path_5 Unreliable: false Value: true @@ -420,11 +424,11 @@ Visualization Manager: Class: rviz/Path Color: 25; 255; 0 Enabled: true - Head Diameter: 0.300000012 - Head Length: 0.200000003 - Length: 0.300000012 + Head Diameter: 0.30000001192092896 + Head Length: 0.20000000298023224 + Length: 
0.30000001192092896 Line Style: Lines - Line Width: 0.0299999993 + Line Width: 0.029999999329447746 Name: no_loop_path Offset: X: 0 @@ -432,9 +436,9 @@ Visualization Manager: Z: 0 Pose Color: 255; 85; 255 Pose Style: None - Radius: 0.0299999993 - Shaft Diameter: 0.100000001 - Shaft Length: 0.100000001 + Radius: 0.029999999329447746 + Shaft Diameter: 0.10000000149011612 + Shaft Length: 0.10000000149011612 Topic: /pose_graph/no_loop_path Unreliable: false Value: true @@ -443,6 +447,7 @@ Visualization Manager: Enabled: true Global Options: Background Color: 0; 0; 0 + Default Light: true Fixed Frame: world Frame Rate: 30 Name: root @@ -452,7 +457,10 @@ Visualization Manager: - Class: rviz/FocusCamera - Class: rviz/Measure - Class: rviz/SetInitialPose + Theta std deviation: 0.2617993950843811 Topic: /initialpose + X std deviation: 0.5 + Y std deviation: 0.5 - Class: rviz/SetGoal Topic: /move_base_simple/goal - Class: rviz/PublishPoint @@ -462,33 +470,33 @@ Visualization Manager: Views: Current: Class: rviz/XYOrbit - Distance: 20.9684067 + Distance: 20.968406677246094 Enable Stereo Rendering: - Stereo Eye Separation: 0.0599999987 + Stereo Eye Separation: 0.05999999865889549 Stereo Focal Distance: 1 Swap Stereo Eyes: false Value: false Focal Point: - X: -2.7069304 - Y: -1.26974416 - Z: 2.1410624e-05 + X: -2.70693039894104 + Y: -1.2697441577911377 + Z: 2.141062395821791e-5 Focal Shape Fixed Size: true - Focal Shape Size: 0.0500000007 + Focal Shape Size: 0.05000000074505806 Invert Z Axis: false Name: Current View - Near Clip Distance: 0.00999999978 - Pitch: 1.0797962 + Near Clip Distance: 0.009999999776482582 + Pitch: 1.0797961950302124 Target Frame: Value: XYOrbit (rviz) - Yaw: 3.08722663 + Yaw: 3.087226629257202 Saved: ~ Window Geometry: Displays: collapsed: false - Height: 912 + Height: 734 Hide Left Dock: false Hide Right Dock: true - QMainWindow State: 
000000ff00000000fd0000000400000000000001df00000309fc0200000010fb0000000a0049006d00610067006501000000280000013d0000000000000000fb0000001200530065006c0065006300740069006f006e00000001e10000009b0000006400fffffffb0000001e0054006f006f006c002000500072006f007000650072007400690065007302000001ed000001df00000185000000a3fb000000120056006900650077007300200054006f006f02000001df000002110000018500000122fb000000200054006f006f006c002000500072006f0070006500720074006900650073003203000002880000011d000002210000017afb0000002000730065006c0065006300740069006f006e00200062007500660066006500720200000138000000aa0000023a00000294fb00000014005700690064006500530074006500720065006f02000000e6000000d2000003ee0000030bfb0000000c004b0069006e0065006300740200000186000001060000030c00000261fb0000001200720061007700200049006d0061006700650000000028000000300000000000000000fb00000012007200610077005f0069006d0061006700650000000028000000f90000001600fffffffb0000001a0074007200610063006b0065006400200069006d00610067006501000000280000012c0000001600fffffffb00000020006c006f006f0070005f006d0061007400630068005f0069006d006100670065010000015a000000b30000001600fffffffb000000100044006900730070006c00610079007301000002130000011e000000dd00fffffffc000000280000011e0000000000fffffffa000000000100000002fb0000001200720061007700200049006d0061006700650000000000ffffffff0000000000000000fb0000001a0074007200610063006b0065006400200069006d0061006700650100000000000002370000000000000000fb0000001000410052005f0069006d0061006700650100000373000000160000000000000000fb0000001200720061007700200069006d006100670065010000038f000000160000000000000000000000010000020800000399fc0200000003fb0000001e0054006f006f006c002000500072006f00700065007200740069006500730100000041000000780000000000000000fc00000028000003990000000000fffffffaffffffff0100000002fb000000100044006900730070006c0061007900730000000000ffffffff0000016a00fffffffb0000000a00560069006500770073000000023f0000016a0000010f00fffffffb0000001200530065006c0065006300740069006f006e010000025a000000b2000000000000000000
00000200000490000000a9fc0100000001fb0000000a00560069006500770073030000004e00000080000002e100000197000000030000056e0000003bfc0100000002fb0000000800540069006d006501000000000000056e0000030000fffffffb0000000800540069006d00650100000000000004500000000000000000000003890000030900000004000000040000000800000008fc0000000100000002000000010000000a0054006f006f006c00730100000000ffffffff0000000000000000 + QMainWindow State: 000000ff00000000fd00000004000000000000015600000243fc0200000010fb0000000a0049006d00610067006501000000280000013d0000000000000000fb0000001200530065006c0065006300740069006f006e00000001e10000009b0000005c00fffffffb0000001e0054006f006f006c002000500072006f007000650072007400690065007302000001ed000001df00000185000000a3fb000000120056006900650077007300200054006f006f02000001df000002110000018500000122fb000000200054006f006f006c002000500072006f0070006500720074006900650073003203000002880000011d000002210000017afb0000002000730065006c0065006300740069006f006e00200062007500660066006500720200000138000000aa0000023a00000294fb00000014005700690064006500530074006500720065006f02000000e6000000d2000003ee0000030bfb0000000c004b0069006e0065006300740200000186000001060000030c00000261fb0000001200720061007700200049006d0061006700650000000028000000300000000000000000fb00000012007200610077005f0069006d0061006700650000000028000000f90000001600fffffffb0000001a0074007200610063006b0065006400200069006d006100670065010000003d000000de0000001600fffffffb00000020006c006f006f0070005f006d0061007400630068005f0069006d0061006700650100000121000000850000001600fffffffb000000100044006900730070006c00610079007301000001ac000000d4000000c900fffffffc000000280000011e0000000000fffffffa000000000100000002fb0000001200720061007700200049006d0061006700650000000000ffffffff0000000000000000fb0000001a0074007200610063006b0065006400200069006d0061006700650100000000000002370000000000000000fb0000001000410052005f0069006d0061006700650100000373000000160000000000000000fb0000001200720061007700200069006d006100670065010000038f0000001600000000000000000000
00010000020800000399fc0200000003fb0000001e0054006f006f006c002000500072006f00700065007200740069006500730100000041000000780000000000000000fc00000028000003990000000000fffffffaffffffff0100000002fb000000100044006900730070006c0061007900730000000000ffffffff0000015600fffffffb0000000a00560069006500770073000000023f0000016a0000010000fffffffb0000001200530065006c0065006300740069006f006e010000025a000000b200000000000000000000000200000490000000a9fc0100000001fb0000000a00560069006500770073030000004e00000080000002e10000019700000003000002a30000003bfc0100000002fb0000000800540069006d00650100000000000002a3000002eb00fffffffb0000000800540069006d00650100000000000004500000000000000000000001470000024300000004000000040000000800000008fc0000000100000002000000010000000a0054006f006f006c00730100000000ffffffff0000000000000000 Selection: collapsed: false Time: @@ -497,9 +505,9 @@ Window Geometry: collapsed: false Views: collapsed: true - Width: 1390 - X: 2127 - Y: 109 + Width: 675 + X: 685 + Y: 4 loop_match_image: collapsed: false raw_image: diff --git a/feature_tracker/CMakeLists.txt b/feature_tracker/CMakeLists.txt index bc3fb44..439f706 100644 --- a/feature_tracker/CMakeLists.txt +++ b/feature_tracker/CMakeLists.txt @@ -11,6 +11,7 @@ find_package(catkin REQUIRED COMPONENTS sensor_msgs cv_bridge camera_model + message_filters ) find_package(OpenCV REQUIRED) diff --git a/feature_tracker/src/feature_tracker.cpp b/feature_tracker/src/feature_tracker.cpp index 41322b8..9d5de16 100644 --- a/feature_tracker/src/feature_tracker.cpp +++ b/feature_tracker/src/feature_tracker.cpp @@ -33,12 +33,13 @@ FeatureTracker::FeatureTracker() { } -void FeatureTracker::setMask() +void FeatureTracker::setMask(const cv::Mat &_mask) { if(FISHEYE) mask = fisheye_mask.clone(); else - mask = cv::Mat(ROW, COL, CV_8UC1, cv::Scalar(255)); + //mask = cv::Mat(ROW, COL, CV_8UC1, cv::Scalar(255)); + mask = _mask.clone(); // prefer to keep features that are tracked for long time @@ -78,7 +79,7 @@ void FeatureTracker::addPoints() } } 
-void FeatureTracker::readImage(const cv::Mat &_img, double _cur_time) +void FeatureTracker::readImage(const cv::Mat &_img, const cv::Mat &_mask, double _cur_time) { cv::Mat img; TicToc t_r; @@ -132,7 +133,7 @@ void FeatureTracker::readImage(const cv::Mat &_img, double _cur_time) rejectWithF(); ROS_DEBUG("set mask begins"); TicToc t_m; - setMask(); + setMask(_mask); ROS_DEBUG("set mask costs %fms", t_m.toc()); ROS_DEBUG("detect feature begins"); diff --git a/feature_tracker/src/feature_tracker.h b/feature_tracker/src/feature_tracker.h index 9862ec1..cb376db 100644 --- a/feature_tracker/src/feature_tracker.h +++ b/feature_tracker/src/feature_tracker.h @@ -30,9 +30,9 @@ class FeatureTracker public: FeatureTracker(); - void readImage(const cv::Mat &_img,double _cur_time); + void readImage(const cv::Mat &_img, const cv::Mat &_mask, double _cur_time); - void setMask(); + void setMask(const cv::Mat &_mask); void addPoints(); diff --git a/feature_tracker/src/feature_tracker_node.cpp b/feature_tracker/src/feature_tracker_node.cpp index 5a5261c..a0dbd25 100644 --- a/feature_tracker/src/feature_tracker_node.cpp +++ b/feature_tracker/src/feature_tracker_node.cpp @@ -6,6 +6,7 @@ #include #include #include +#include #include "feature_tracker.h" @@ -25,7 +26,7 @@ bool first_image_flag = true; double last_image_time = 0; bool init_pub = 0; -void img_callback(const sensor_msgs::ImageConstPtr &img_msg) +void callback(const sensor_msgs::ImageConstPtr &img_msg, const sensor_msgs::ImageConstPtr &mask_msg) { if(first_image_flag) { @@ -61,7 +62,7 @@ void img_callback(const sensor_msgs::ImageConstPtr &img_msg) else PUB_THIS_FRAME = false; - cv_bridge::CvImageConstPtr ptr; + cv_bridge::CvImageConstPtr ptr, mask_ptr; if (img_msg->encoding == "8UC1") { sensor_msgs::Image img; @@ -77,13 +78,15 @@ void img_callback(const sensor_msgs::ImageConstPtr &img_msg) else ptr = cv_bridge::toCvCopy(img_msg, sensor_msgs::image_encodings::MONO8); + mask_ptr = cv_bridge::toCvCopy(mask_msg, 
sensor_msgs::image_encodings::MONO8); cv::Mat show_img = ptr->image; + TicToc t_r; for (int i = 0; i < NUM_OF_CAM; i++) { ROS_DEBUG("processing camera %d", i); if (i != 1 || !STEREO_TRACK) - trackerData[i].readImage(ptr->image.rowRange(ROW * i, ROW * (i + 1)), img_msg->header.stamp.toSec()); + trackerData[i].readImage(ptr->image.rowRange(ROW * i, ROW * (i + 1)), mask_ptr->image, img_msg->header.stamp.toSec()); else { if (EQUALIZE) @@ -201,8 +204,187 @@ void img_callback(const sensor_msgs::ImageConstPtr &img_msg) } } ROS_INFO("whole feature tracker processing costs: %f", t_r.toc()); + } +//void img_callback(const sensor_msgs::ImageConstPtr &img_msg) +//{ +// if(first_image_flag) +// { +// first_image_flag = false; +// first_image_time = img_msg->header.stamp.toSec(); +// last_image_time = img_msg->header.stamp.toSec(); +// return; +// } +// // detect unstable camera stream +// if (img_msg->header.stamp.toSec() - last_image_time > 1.0 || img_msg->header.stamp.toSec() < last_image_time) +// { +// ROS_WARN("image discontinue! 
reset the feature tracker!"); +// first_image_flag = true; +// last_image_time = 0; +// pub_count = 1; +// std_msgs::Bool restart_flag; +// restart_flag.data = true; +// pub_restart.publish(restart_flag); +// return; +// } +// last_image_time = img_msg->header.stamp.toSec(); +// // frequency control +// if (round(1.0 * pub_count / (img_msg->header.stamp.toSec() - first_image_time)) <= FREQ) +// { +// PUB_THIS_FRAME = true; +// // reset the frequency control +// if (abs(1.0 * pub_count / (img_msg->header.stamp.toSec() - first_image_time) - FREQ) < 0.01 * FREQ) +// { +// first_image_time = img_msg->header.stamp.toSec(); +// pub_count = 0; +// } +// } +// else +// PUB_THIS_FRAME = false; +// +// cv_bridge::CvImageConstPtr ptr; +// if (img_msg->encoding == "8UC1") +// { +// sensor_msgs::Image img; +// img.header = img_msg->header; +// img.height = img_msg->height; +// img.width = img_msg->width; +// img.is_bigendian = img_msg->is_bigendian; +// img.step = img_msg->step; +// img.data = img_msg->data; +// img.encoding = "mono8"; +// ptr = cv_bridge::toCvCopy(img, sensor_msgs::image_encodings::MONO8); +// } +// else +// ptr = cv_bridge::toCvCopy(img_msg, sensor_msgs::image_encodings::MONO8); +// +// cv::Mat show_img = ptr->image; +// TicToc t_r; +// for (int i = 0; i < NUM_OF_CAM; i++) +// { +// ROS_DEBUG("processing camera %d", i); +// if (i != 1 || !STEREO_TRACK) +// trackerData[i].readImage(ptr->image.rowRange(ROW * i, ROW * (i + 1)), img_msg->header.stamp.toSec()); +// else +// { +// if (EQUALIZE) +// { +// cv::Ptr clahe = cv::createCLAHE(); +// clahe->apply(ptr->image.rowRange(ROW * i, ROW * (i + 1)), trackerData[i].cur_img); +// } +// else +// trackerData[i].cur_img = ptr->image.rowRange(ROW * i, ROW * (i + 1)); +// } +// +//#if SHOW_UNDISTORTION +// trackerData[i].showUndistortion("undistrotion_" + std::to_string(i)); +//#endif +// } +// +// for (unsigned int i = 0;; i++) +// { +// bool completed = false; +// for (int j = 0; j < NUM_OF_CAM; j++) +// if (j != 1 || 
!STEREO_TRACK) +// completed |= trackerData[j].updateID(i); +// if (!completed) +// break; +// } +// +// if (PUB_THIS_FRAME) +// { +// pub_count++; +// sensor_msgs::PointCloudPtr feature_points(new sensor_msgs::PointCloud); +// sensor_msgs::ChannelFloat32 id_of_point; +// sensor_msgs::ChannelFloat32 u_of_point; +// sensor_msgs::ChannelFloat32 v_of_point; +// sensor_msgs::ChannelFloat32 velocity_x_of_point; +// sensor_msgs::ChannelFloat32 velocity_y_of_point; +// +// feature_points->header = img_msg->header; +// feature_points->header.frame_id = "world"; +// +// vector> hash_ids(NUM_OF_CAM); +// for (int i = 0; i < NUM_OF_CAM; i++) +// { +// auto &un_pts = trackerData[i].cur_un_pts; +// auto &cur_pts = trackerData[i].cur_pts; +// auto &ids = trackerData[i].ids; +// auto &pts_velocity = trackerData[i].pts_velocity; +// for (unsigned int j = 0; j < ids.size(); j++) +// { +// if (trackerData[i].track_cnt[j] > 1) +// { +// int p_id = ids[j]; +// hash_ids[i].insert(p_id); +// geometry_msgs::Point32 p; +// p.x = un_pts[j].x; +// p.y = un_pts[j].y; +// p.z = 1; +// +// feature_points->points.push_back(p); +// id_of_point.values.push_back(p_id * NUM_OF_CAM + i); +// u_of_point.values.push_back(cur_pts[j].x); +// v_of_point.values.push_back(cur_pts[j].y); +// velocity_x_of_point.values.push_back(pts_velocity[j].x); +// velocity_y_of_point.values.push_back(pts_velocity[j].y); +// } +// } +// } +// feature_points->channels.push_back(id_of_point); +// feature_points->channels.push_back(u_of_point); +// feature_points->channels.push_back(v_of_point); +// feature_points->channels.push_back(velocity_x_of_point); +// feature_points->channels.push_back(velocity_y_of_point); +// ROS_DEBUG("publish %f, at %f", feature_points->header.stamp.toSec(), ros::Time::now().toSec()); +// // skip the first image; since no optical speed on frist image +// if (!init_pub) +// { +// init_pub = 1; +// } +// else +// pub_img.publish(feature_points); +// +// if (SHOW_TRACK) +// { +// ptr = 
cv_bridge::cvtColor(ptr, sensor_msgs::image_encodings::BGR8); +// //cv::Mat stereo_img(ROW * NUM_OF_CAM, COL, CV_8UC3); +// cv::Mat stereo_img = ptr->image; +// +// for (int i = 0; i < NUM_OF_CAM; i++) +// { +// cv::Mat tmp_img = stereo_img.rowRange(i * ROW, (i + 1) * ROW); +// cv::cvtColor(show_img, tmp_img, CV_GRAY2RGB); +// +// for (unsigned int j = 0; j < trackerData[i].cur_pts.size(); j++) +// { +// double len = std::min(1.0, 1.0 * trackerData[i].track_cnt[j] / WINDOW_SIZE); +// cv::circle(tmp_img, trackerData[i].cur_pts[j], 2, cv::Scalar(255 * (1 - len), 0, 255 * len), 2); +// //draw speed line +// /* +// Vector2d tmp_cur_un_pts (trackerData[i].cur_un_pts[j].x, trackerData[i].cur_un_pts[j].y); +// Vector2d tmp_pts_velocity (trackerData[i].pts_velocity[j].x, trackerData[i].pts_velocity[j].y); +// Vector3d tmp_prev_un_pts; +// tmp_prev_un_pts.head(2) = tmp_cur_un_pts - 0.10 * tmp_pts_velocity; +// tmp_prev_un_pts.z() = 1; +// Vector2d tmp_prev_uv; +// trackerData[i].m_camera->spaceToPlane(tmp_prev_un_pts, tmp_prev_uv); +// cv::line(tmp_img, trackerData[i].cur_pts[j], cv::Point2f(tmp_prev_uv.x(), tmp_prev_uv.y()), cv::Scalar(255 , 0, 0), 1 , 8, 0); +// */ +// //char name[10]; +// //sprintf(name, "%d", trackerData[i].ids[j]); +// //cv::putText(tmp_img, name, trackerData[i].cur_pts[j], cv::FONT_HERSHEY_SIMPLEX, 0.5, cv::Scalar(0, 0, 0)); +// } +// } +// //cv::imshow("vis", stereo_img); +// //cv::waitKey(5); +// pub_match.publish(ptr->toImageMsg()); +// } +// } +// ROS_INFO("whole feature tracker processing costs: %f", t_r.toc()); +//} +// int main(int argc, char **argv) { ros::init(argc, argv, "feature_tracker"); @@ -228,7 +410,12 @@ int main(int argc, char **argv) } } - ros::Subscriber sub_img = n.subscribe(IMAGE_TOPIC, 100, img_callback); + //ros::Subscriber sub_img = n.subscribe(IMAGE_TOPIC, 100, img_callback); + message_filters::Subscriber image_sub(n, IMAGE_TOPIC, 100); + message_filters::Subscriber mask_image_sub(n, MASK_TOPIC, 100); + + 
    message_filters::TimeSynchronizer<sensor_msgs::Image, sensor_msgs::Image> sync(image_sub, mask_image_sub, 10);
+    sync.registerCallback(boost::bind(&callback, _1, _2));
 
     pub_img = n.advertise<sensor_msgs::PointCloud>("feature", 1000);
     pub_match = n.advertise<sensor_msgs::Image>("feature_img",1000);
diff --git a/feature_tracker/src/parameters.cpp b/feature_tracker/src/parameters.cpp
index 23c35c8..417efb2 100644
--- a/feature_tracker/src/parameters.cpp
+++ b/feature_tracker/src/parameters.cpp
@@ -1,6 +1,7 @@
 #include "parameters.h"
 
 std::string IMAGE_TOPIC;
+std::string MASK_TOPIC;
 std::string IMU_TOPIC;
 std::vector<std::string> CAM_NAMES;
 std::string FISHEYE_MASK;
@@ -46,6 +47,7 @@ void readParameters(ros::NodeHandle &n)
     std::string VINS_FOLDER_PATH = readParam<std::string>(n, "vins_folder");
 
     fsSettings["image_topic"] >> IMAGE_TOPIC;
+    fsSettings["mask_topic"] >> MASK_TOPIC;
     fsSettings["imu_topic"] >> IMU_TOPIC;
     MAX_CNT = fsSettings["max_cnt"];
     MIN_DIST = fsSettings["min_dist"];
diff --git a/feature_tracker/src/parameters.h b/feature_tracker/src/parameters.h
index 1bb578b..44e5f9c 100644
--- a/feature_tracker/src/parameters.h
+++ b/feature_tracker/src/parameters.h
@@ -9,6 +9,7 @@ const int NUM_OF_CAM = 1;
 
 extern std::string IMAGE_TOPIC;
+extern std::string MASK_TOPIC;
 extern std::string IMU_TOPIC;
 extern std::string FISHEYE_MASK;
 extern std::vector<std::string> CAM_NAMES;
-- 
2.17.1

From 8948e61ce641d532d902f6adacd97a890c7456dd Mon Sep 17 00:00:00 2001
From: Karnik Ram
Date: Mon, 19 Oct 2020 11:11:28 +0530
Subject: [PATCH 13/42] Apply mask after extraction and with erosion

---
 feature_tracker/src/feature_tracker.cpp | 22 +++++++++++---------
 feature_tracker/src/feature_tracker.h   |  4 ++--
 2 files changed, 15 insertions(+), 11 deletions(-)

diff --git a/feature_tracker/src/feature_tracker.cpp b/feature_tracker/src/feature_tracker.cpp
index 9d5de16..4384928 100644
--- a/feature_tracker/src/feature_tracker.cpp
+++ b/feature_tracker/src/feature_tracker.cpp
@@ -33,13 +33,12 @@ FeatureTracker::FeatureTracker()
 {
 }
 
-void FeatureTracker::setMask(const cv::Mat &_mask)
+void FeatureTracker::setMask()
 {
     if(FISHEYE)
         mask = fisheye_mask.clone();
     else
-        //mask = cv::Mat(ROW, COL, CV_8UC1, cv::Scalar(255));
-        mask = _mask.clone();
+        mask = cv::Mat(ROW, COL, CV_8UC1, cv::Scalar(255));
 
     // prefer to keep features that are tracked for long time
@@ -69,13 +68,18 @@ void FeatureTracker::setMask(const cv::Mat &_mask)
     }
 }
 
-void FeatureTracker::addPoints()
+void FeatureTracker::addPoints(const cv::Mat &_mask)
 {
+    cv::Mat dynamic_mask;
+    cv::erode(_mask, dynamic_mask, cv::Mat(), cv::Point(-1, -1), 2, 1, 1);
     for (auto &p : n_pts)
    {
-        forw_pts.push_back(p);
-        ids.push_back(-1);
-        track_cnt.push_back(1);
+        if(dynamic_mask.at<uchar>(p) == 255)
+        {
+            forw_pts.push_back(p);
+            ids.push_back(-1);
+            track_cnt.push_back(1);
+        }
     }
 }
 
@@ -133,7 +137,7 @@ void FeatureTracker::readImage(const cv::Mat &_img, const cv::Mat &_mask, double
         rejectWithF();
         ROS_DEBUG("set mask begins");
         TicToc t_m;
-        setMask(_mask);
+        setMask();
         ROS_DEBUG("set mask costs %fms", t_m.toc());
 
         ROS_DEBUG("detect feature begins");
@@ -155,7 +159,7 @@ void FeatureTracker::readImage(const cv::Mat &_img, const cv::Mat &_mask, double
 
         ROS_DEBUG("add feature begins");
         TicToc t_a;
-        addPoints();
+        addPoints(_mask);
         ROS_DEBUG("selectFeature costs: %fms", t_a.toc());
     }
     prev_img = cur_img;
diff --git a/feature_tracker/src/feature_tracker.h b/feature_tracker/src/feature_tracker.h
index cb376db..c98c7ac 100644
--- a/feature_tracker/src/feature_tracker.h
+++ b/feature_tracker/src/feature_tracker.h
@@ -32,9 +32,9 @@ class FeatureTracker
 
     void readImage(const cv::Mat &_img, const cv::Mat &_mask, double _cur_time);
 
-    void setMask(const cv::Mat &_mask);
+    void setMask();
 
-    void addPoints();
+    void addPoints(const cv::Mat &_mask);
 
     bool updateID(unsigned int i);
-- 
2.17.1

From cac43fd693103c98cbf95fa628b5f953681b3384 Mon Sep 17 00:00:00 2001
From: Karnik Ram
Date: Sun, 29 Nov 2020 11:29:04 +0530
Subject: [PATCH 14/42] Changes for debugging

---
 vins_estimator/src/estimator.cpp          | 17 ++++++++++-------
 vins_estimator/src/initial/solve_5pts.cpp |  7 ++++---
++++--- 2 files changed, 14 insertions(+), 10 deletions(-) diff --git a/vins_estimator/src/estimator.cpp b/vins_estimator/src/estimator.cpp index 096ef10..5b302ea 100644 --- a/vins_estimator/src/estimator.cpp +++ b/vins_estimator/src/estimator.cpp @@ -275,9 +275,12 @@ bool Estimator::initialStructure() return false; } GlobalSFM sfm; - if(!sfm.constructH(frame_count + 1, Q, T, l, - relative_R, relative_T, n, - sfm_f, sfm_tracked_points)) + //if(!sfm.constructH(frame_count + 1, Q, T, l, + // relative_R, relative_T, n, + // sfm_f, sfm_tracked_points)) + if(!sfm.construct(frame_count + 1, Q, T, l, + relative_R, relative_T, + sfm_f, sfm_tracked_points)) { ROS_DEBUG("global SFM failed!"); marginalization_flag = MARGIN_OLD; @@ -808,8 +811,8 @@ void Estimator::optimization() ProjectionFactor *f = new ProjectionFactor(pts_i, pts_j); problem.AddResidualBlock(f, loss_function, para_Pose[imu_i], para_Pose[imu_j], para_Ex_Pose[0], para_Feature[feature_index]); - ceres::CostFunction *h_cost_function = HomographyFactor::Create(pts_i, pts_j); - problem.AddResidualBlock(h_cost_function, loss_function, para_Pose[imu_i], para_Pose[imu_j], para_N, para_Ex_Pose[0]); + //ceres::CostFunction *h_cost_function = HomographyFactor::Create(pts_i, pts_j); + //problem.AddResidualBlock(h_cost_function, loss_function, para_Pose[imu_i], para_Pose[imu_j], para_N, para_Ex_Pose[0]); } f_m_cnt++; } @@ -846,7 +849,7 @@ void Estimator::optimization() ProjectionFactor *f = new ProjectionFactor(pts_i, pts_j); problem.AddResidualBlock(f, loss_function, para_Pose[start], relo_Pose, para_Ex_Pose[0], para_Feature[feature_index]); retrive_feature_index++; - } + } } } @@ -882,7 +885,7 @@ void Estimator::optimization() if (last_marginalization_info) { - vector drop_set; + vector drop_set; for (int i = 0; i < static_cast(last_marginalization_parameter_blocks.size()); i++) { if (last_marginalization_parameter_blocks[i] == para_Pose[0] || diff --git a/vins_estimator/src/initial/solve_5pts.cpp 
b/vins_estimator/src/initial/solve_5pts.cpp index bdc87a3..57485af 100644 --- a/vins_estimator/src/initial/solve_5pts.cpp +++ b/vins_estimator/src/initial/solve_5pts.cpp @@ -275,7 +275,7 @@ void MotionEstimator::decomposeH(const cv::Mat &H, const cv::Mat &K, const Matri Vector3d t; cv::cv2eigen(cv_ts[i], t); - Tr.block(0,3,3,1) = t; + Tr.block(0,3,3,1) = t.normalized(); //Tr = TrIC * Tr * TrIC.inverse(); Tr = Tr.inverse().eval(); @@ -283,6 +283,7 @@ void MotionEstimator::decomposeH(const cv::Mat &H, const cv::Mat &K, const Matri Vector3d e3(0, 0, 1); Vector3d n; cv::cv2eigen(cv_ns[i], n); + n.normalize(); if(n.dot(e3) > 0) positive_depth_transforms.push_back(Tr); positive_depth_normals.push_back(n); @@ -311,13 +312,13 @@ void MotionEstimator::decomposeH(const cv::Mat &H, const cv::Mat &K, const Matri Vector3d t; cv::cv2eigen(cv_ts[0], t); - Tr.block(0,3,3,1) = t; + Tr.block(0,3,3,1) = t.normalized(); Vector3d n; cv::cv2eigen(cv_ns[0], n); //Tr = TrIC * Tr * TrIC.inverse(); est_Tr = Tr.inverse().eval(); - est_n = n; + est_n = n.normalized(); } } -- 2.17.1 From 592df9a9008d83b575ec808197e53cfdf0084dff Mon Sep 17 00:00:00 2001 From: Karnik Ram Date: Tue, 1 Dec 2020 11:28:04 +0530 Subject: [PATCH 15/42] Change K, avg parallax, and decomposition parameters --- feature_tracker/src/feature_tracker.cpp | 2 +- vins_estimator/src/estimator.cpp | 15 ++++++--------- vins_estimator/src/initial/solve_5pts.cpp | 11 +++++++---- 3 files changed, 14 insertions(+), 14 deletions(-) diff --git a/feature_tracker/src/feature_tracker.cpp b/feature_tracker/src/feature_tracker.cpp index 4384928..50251ca 100644 --- a/feature_tracker/src/feature_tracker.cpp +++ b/feature_tracker/src/feature_tracker.cpp @@ -173,7 +173,7 @@ void FeatureTracker::readImage(const cv::Mat &_img, const cv::Mat &_mask, double void FeatureTracker::rejectWithF() { - if (forw_pts.size() >= 8) + if (forw_pts.size() >= 4) { ROS_DEBUG("FM ransac begins"); TicToc t_f; diff --git a/vins_estimator/src/estimator.cpp 
b/vins_estimator/src/estimator.cpp index 5b302ea..d493036 100644 --- a/vins_estimator/src/estimator.cpp +++ b/vins_estimator/src/estimator.cpp @@ -275,12 +275,9 @@ bool Estimator::initialStructure() return false; } GlobalSFM sfm; - //if(!sfm.constructH(frame_count + 1, Q, T, l, - // relative_R, relative_T, n, - // sfm_f, sfm_tracked_points)) - if(!sfm.construct(frame_count + 1, Q, T, l, - relative_R, relative_T, - sfm_f, sfm_tracked_points)) + if(!sfm.constructH(frame_count + 1, Q, T, l, + relative_R, relative_T, n, + sfm_f, sfm_tracked_points)) { ROS_DEBUG("global SFM failed!"); marginalization_flag = MARGIN_OLD; @@ -507,7 +504,7 @@ bool Estimator::relativeHPose(Matrix3d &relative_R, Vector3d &relative_T, Vector for(int k = WINDOW_SIZE - 1; k > i; k--) R_imu = R_imu * (pre_integrations[k]->delta_q).toRotationMatrix(); - if(average_parallax * 660 > 30 && m_estimator.solveRelativeHRT(corres, R_imu, TrIC, relative_R, relative_T, n)) + if(average_parallax * 480 > 30 && m_estimator.solveRelativeHRT(corres, R_imu, TrIC, relative_R, relative_T, n)) { l = i; ROS_DEBUG("average_parallax %f choose l %d and newest frame to triangulate the whole structure", average_parallax * 460, l); @@ -811,8 +808,8 @@ void Estimator::optimization() ProjectionFactor *f = new ProjectionFactor(pts_i, pts_j); problem.AddResidualBlock(f, loss_function, para_Pose[imu_i], para_Pose[imu_j], para_Ex_Pose[0], para_Feature[feature_index]); - //ceres::CostFunction *h_cost_function = HomographyFactor::Create(pts_i, pts_j); - //problem.AddResidualBlock(h_cost_function, loss_function, para_Pose[imu_i], para_Pose[imu_j], para_N, para_Ex_Pose[0]); + ceres::CostFunction *h_cost_function = HomographyFactor::Create(pts_i, pts_j); + problem.AddResidualBlock(h_cost_function, loss_function, para_Pose[imu_i], para_Pose[imu_j], para_N, para_Ex_Pose[0]); } f_m_cnt++; } diff --git a/vins_estimator/src/initial/solve_5pts.cpp b/vins_estimator/src/initial/solve_5pts.cpp index 57485af..94a2a04 100644 --- 
a/vins_estimator/src/initial/solve_5pts.cpp +++ b/vins_estimator/src/initial/solve_5pts.cpp @@ -240,7 +240,8 @@ bool MotionEstimator::solveRelativeHRT(const vector> &c } cv::Mat mask; //cv::Mat E = cv::findFundamentalMat(ll, rr, cv::FM_RANSAC, 0.3 / 460, 0.99, mask); - cv::Mat H = cv::findHomography(ll, rr, cv::RANSAC, 0.3/665); + cv::Mat H = cv::findHomography(ll, rr, cv::RANSAC, 0.3/480); + //cv::Mat K = (cv::Mat_(3, 3) << 320, 0, 320, 0, 320, 240, 0, 0, 1); cv::Mat K = (cv::Mat_(3, 3) << 1, 0, 0, 0, 1, 0, 0, 0, 1); //cv::Mat K = (cv::Mat_(3,3) << 665, 0, 511, 0, 665, 383, 0, 0, 1); @@ -275,7 +276,7 @@ void MotionEstimator::decomposeH(const cv::Mat &H, const cv::Mat &K, const Matri Vector3d t; cv::cv2eigen(cv_ts[i], t); - Tr.block(0,3,3,1) = t.normalized(); + Tr.block(0,3,3,1) = t; //Tr = TrIC * Tr * TrIC.inverse(); Tr = Tr.inverse().eval(); @@ -285,8 +286,10 @@ void MotionEstimator::decomposeH(const cv::Mat &H, const cv::Mat &K, const Matri cv::cv2eigen(cv_ns[i], n); n.normalize(); if(n.dot(e3) > 0) + { positive_depth_transforms.push_back(Tr); positive_depth_normals.push_back(n); + } } vector rot_diff; @@ -300,7 +303,7 @@ void MotionEstimator::decomposeH(const cv::Mat &H, const cv::Mat &K, const Matri int min_index = std::min_element(rot_diff.begin(), rot_diff.end()) - rot_diff.begin(); est_Tr = positive_depth_transforms[min_index]; - est_n = positive_depth_normals[min_index]; + est_n = positive_depth_normals[min_index].normalized(); } else @@ -312,7 +315,7 @@ void MotionEstimator::decomposeH(const cv::Mat &H, const cv::Mat &K, const Matri Vector3d t; cv::cv2eigen(cv_ts[0], t); - Tr.block(0,3,3,1) = t.normalized(); + Tr.block(0,3,3,1) = t; Vector3d n; cv::cv2eigen(cv_ns[0], n); -- 2.17.1 From 0cf646f9323df0d94551e541e1221febd4ad3e7a Mon Sep 17 00:00:00 2001 From: Karnik Ram Date: Wed, 2 Dec 2020 08:17:02 +0530 Subject: [PATCH 16/42] Use h only for two-frame init --- config/vins_rviz_config.rviz | 28 +++++++++++++++++++--------- vins_estimator/src/estimator.cpp | 
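Patch 15 above retunes the keyframe-selection gate `average_parallax * 480 > 30`: parallax is averaged over correspondences in normalized image coordinates and then scaled by a nominal focal length (660, 480, and 460 appear across these patches) so it can be compared against a pixel threshold. A small sketch of that check, with `Pt2` as a stand-in for the `Vector2d` points and the 30 px threshold taken from the patched code; everything else is illustrative.

```cpp
#include <cassert>
#include <cmath>
#include <utility>
#include <vector>

struct Pt2 { double x, y; };

// Mean displacement between matched points, in normalized image coordinates.
double averageParallax(const std::vector<std::pair<Pt2, Pt2>> &corres) {
    double sum = 0.0;
    for (const auto &c : corres)
        sum += std::hypot(c.first.x - c.second.x, c.first.y - c.second.y);
    return sum / (double)corres.size();
}

// The gate used when picking frame l for two-frame initialization:
// normalized parallax times a nominal focal length must exceed ~30 px.
bool enoughParallax(const std::vector<std::pair<Pt2, Pt2>> &corres, double focal) {
    return averageParallax(corres) * focal > 30.0;
}
```

Because the parallax is computed in normalized coordinates, changing the focal-length multiplier (as patches 15 and 17 do) directly rescales how much camera motion is demanded before initialization is attempted.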
6 +++--- 2 files changed, 22 insertions(+), 12 deletions(-) diff --git a/config/vins_rviz_config.rviz b/config/vins_rviz_config.rviz index 92ff49d..55ed131 100644 --- a/config/vins_rviz_config.rviz +++ b/config/vins_rviz_config.rviz @@ -3,7 +3,8 @@ Panels: Help Height: 0 Name: Displays Property Tree Widget: - Expanded: ~ + Expanded: + - /VIO1 Splitter Ratio: 0.4651159942150116 Tree Height: 151 - Class: rviz/Selection @@ -24,7 +25,7 @@ Panels: Experimental: false Name: Time SyncMode: 0 - SyncSource: "" + SyncSource: tracked image - Class: rviz/Displays Help Height: 78 Name: Displays @@ -141,7 +142,7 @@ Visualization Manager: Marker Topic: /vins_estimator/camera_pose_visual Name: camera_visual Namespaces: - {} + CameraPoseVisualization: true Queue Size: 100 Value: true - Alpha: 1 @@ -186,7 +187,7 @@ Visualization Manager: Color: 255; 255; 255 Color Transformer: FlatColor Decay Time: 3000 - Enabled: false + Enabled: true Invert Rainbow: true Max Color: 0; 0; 0 Max Intensity: 4096 @@ -203,19 +204,28 @@ Visualization Manager: Unreliable: false Use Fixed Frame: true Use rainbow: false - Value: false + Value: true - Class: rviz/TF Enabled: true Frame Timeout: 15 Frames: All Enabled: true + body: + Value: true + camera: + Value: true + world: + Value: true Marker Scale: 1 Name: TF Show Arrows: true Show Axes: true Show Names: true Tree: - {} + world: + body: + camera: + {} Update Interval: 0 Value: true Enabled: true @@ -470,7 +480,7 @@ Visualization Manager: Views: Current: Class: rviz/XYOrbit - Distance: 20.968406677246094 + Distance: 117.1939697265625 Enable Stereo Rendering: Stereo Eye Separation: 0.05999999865889549 Stereo Focal Distance: 1 @@ -485,10 +495,10 @@ Visualization Manager: Invert Z Axis: false Name: Current View Near Clip Distance: 0.009999999776482582 - Pitch: 1.0797961950302124 + Pitch: 0.01979677751660347 Target Frame: Value: XYOrbit (rviz) - Yaw: 3.087226629257202 + Yaw: 3.0572268962860107 Saved: ~ Window Geometry: Displays: diff --git 
a/vins_estimator/src/estimator.cpp b/vins_estimator/src/estimator.cpp index d493036..84fa09e 100644 --- a/vins_estimator/src/estimator.cpp +++ b/vins_estimator/src/estimator.cpp @@ -275,8 +275,8 @@ bool Estimator::initialStructure() return false; } GlobalSFM sfm; - if(!sfm.constructH(frame_count + 1, Q, T, l, - relative_R, relative_T, n, + if(!sfm.construct(frame_count + 1, Q, T, l, + relative_R, relative_T, sfm_f, sfm_tracked_points)) { ROS_DEBUG("global SFM failed!"); @@ -809,7 +809,7 @@ void Estimator::optimization() problem.AddResidualBlock(f, loss_function, para_Pose[imu_i], para_Pose[imu_j], para_Ex_Pose[0], para_Feature[feature_index]); ceres::CostFunction *h_cost_function = HomographyFactor::Create(pts_i, pts_j); - problem.AddResidualBlock(h_cost_function, loss_function, para_Pose[imu_i], para_Pose[imu_j], para_N, para_Ex_Pose[0]); + //problem.AddResidualBlock(h_cost_function, loss_function, para_Pose[imu_i], para_Pose[imu_j], para_N, para_Ex_Pose[0]); } f_m_cnt++; } -- 2.17.1 From d625b4dc0982f33cdd17d64620d40de34244f78c Mon Sep 17 00:00:00 2001 From: Karnik Ram Date: Thu, 17 Dec 2020 13:42:40 +0530 Subject: [PATCH 17/42] Change to qn parameterization, with plane depth --- vins_estimator/src/estimator.cpp | 22 +++++++++++------- vins_estimator/src/estimator.h | 4 +++- vins_estimator/src/factor/homography_factor.h | 12 ++++++---- vins_estimator/src/initial/initial_sfm.cpp | 23 ++++++++++--------- vins_estimator/src/initial/initial_sfm.h | 4 ++-- vins_estimator/src/initial/solve_5pts.cpp | 4 +--- 6 files changed, 40 insertions(+), 29 deletions(-) diff --git a/vins_estimator/src/estimator.cpp b/vins_estimator/src/estimator.cpp index 84fa09e..7a5245c 100644 --- a/vins_estimator/src/estimator.cpp +++ b/vins_estimator/src/estimator.cpp @@ -275,17 +275,18 @@ bool Estimator::initialStructure() return false; } GlobalSFM sfm; - if(!sfm.construct(frame_count + 1, Q, T, l, - relative_R, relative_T, + if(!sfm.constructH(frame_count + 1, Q, T, l, + relative_R, 
relative_T, n, sfm_f, sfm_tracked_points)) { ROS_DEBUG("global SFM failed!"); marginalization_flag = MARGIN_OLD; return false; } - para_N[0] = n[0]; - para_N[1] = n[1]; - para_N[2] = n[2]; + para_N[0] = 0; + para_N[1] = n[0]; + para_N[2] = n[1]; + para_N[3] = n[2]; //solve pnp for all frame map::iterator frame_it; @@ -400,7 +401,8 @@ bool Estimator::visualInitialAlign() f_manager.setRic(ric); f_manager.triangulate(Ps, &(TIC_TMP[0]), &(RIC[0])); - double s = (x.tail<1>())(0); + s = (x.tail<1>())(0); + para_s[0] = s; for (int i = 0; i <= WINDOW_SIZE; i++) { pre_integrations[i]->repropagate(Vector3d::Zero(), Bgs[i]); @@ -504,7 +506,7 @@ bool Estimator::relativeHPose(Matrix3d &relative_R, Vector3d &relative_T, Vector for(int k = WINDOW_SIZE - 1; k > i; k--) R_imu = R_imu * (pre_integrations[k]->delta_q).toRotationMatrix(); - if(average_parallax * 480 > 30 && m_estimator.solveRelativeHRT(corres, R_imu, TrIC, relative_R, relative_T, n)) + if(average_parallax * 460 > 30 && m_estimator.solveRelativeHRT(corres, R_imu, TrIC, relative_R, relative_T, n)) { l = i; ROS_DEBUG("average_parallax %f choose l %d and newest frame to triangulate the whole structure", average_parallax * 460, l); @@ -724,6 +726,10 @@ void Estimator::optimization() problem.AddParameterBlock(para_Pose[i], SIZE_POSE, local_parameterization); problem.AddParameterBlock(para_SpeedBias[i], SIZE_SPEEDBIAS); } + + ceres::LocalParameterization *local_n_parameterization = new ceres::QuaternionParameterization(); + problem.AddParameterBlock(para_N, 4, local_n_parameterization); + for (int i = 0; i < NUM_OF_CAM; i++) { ceres::LocalParameterization *local_parameterization = new PoseLocalParameterization(); @@ -809,7 +815,7 @@ void Estimator::optimization() problem.AddResidualBlock(f, loss_function, para_Pose[imu_i], para_Pose[imu_j], para_Ex_Pose[0], para_Feature[feature_index]); ceres::CostFunction *h_cost_function = HomographyFactor::Create(pts_i, pts_j); - //problem.AddResidualBlock(h_cost_function, loss_function, 
para_Pose[imu_i], para_Pose[imu_j], para_N, para_Ex_Pose[0]); + problem.AddResidualBlock(h_cost_function, loss_function, para_Pose[imu_i], para_Pose[imu_j], para_N, para_Feature[feature_index], para_Ex_Pose[0]); } f_m_cnt++; } diff --git a/vins_estimator/src/estimator.h b/vins_estimator/src/estimator.h index f3feedf..ba73f41 100644 --- a/vins_estimator/src/estimator.h +++ b/vins_estimator/src/estimator.h @@ -79,6 +79,7 @@ class Estimator Vector3d Bas[(WINDOW_SIZE + 1)]; Vector3d Bgs[(WINDOW_SIZE + 1)]; double td; + double s; Matrix3d back_R0, last_R, last_R0; Vector3d back_P0, last_P, last_P0; @@ -115,7 +116,8 @@ class Estimator double para_Retrive_Pose[SIZE_POSE]; double para_Td[1][1]; double para_Tr[1][1]; - double para_N[3]; + double para_N[4]; + double para_s[1]; int loop_window_index; diff --git a/vins_estimator/src/factor/homography_factor.h b/vins_estimator/src/factor/homography_factor.h index 2a35d40..2fdccb5 100644 --- a/vins_estimator/src/factor/homography_factor.h +++ b/vins_estimator/src/factor/homography_factor.h @@ -7,7 +7,7 @@ struct HomographyFactor HomographyFactor(const Eigen::Vector3d &_pts_i, const Eigen::Vector3d &_pts_j) : pts_i(_pts_i), pts_j(_pts_j) {} template - bool operator()(const T* const pose_i, const T* const pose_j, const T* const para_n, const T* const ex_pose, + bool operator()(const T* const pose_i, const T* const pose_j, const T* const para_n, const T* const para_inv_depth, const T* const ex_pose, T* residuals) const { Eigen::Map> pi(pose_i); @@ -22,14 +22,18 @@ struct HomographyFactor Eigen::Quaternion qic; qic.coeffs() << ex_pose[3], ex_pose[4], ex_pose[5], ex_pose[6]; - Eigen::Map> n(para_n); + Eigen::Map> q_n(para_n); + Eigen::Matrix n; + n << q_n[1], q_n[2], q_n[3]; Eigen::Matrix n_imu_1 = qic*(n.normalized()) + tic; Eigen::Matrix n_imu_i = qi.inverse()*n_imu_1 - qi.inverse()*pi; + Eigen::Map> inv_depth(para_inv_depth); + Eigen::Quaternion qji = qj.inverse() * qi; Eigen::Matrix tji = qj.inverse() * (pi - pj); Eigen::Matrix 
pts_imu_i = qic * pts_i.cast() + tic; - Eigen::Matrix pts_imu_j = qji * pts_imu_i + tji * n_imu_i.transpose() * pts_imu_i; + Eigen::Matrix pts_imu_j = qji * pts_imu_i + ((tji*inv_depth[0]) * n_imu_i.transpose()) * pts_imu_i; Eigen::Matrix pts_cam_j = qic.inverse() * (pts_imu_j - tic); pts_cam_j = (pts_cam_j / pts_cam_j[2]); @@ -41,7 +45,7 @@ struct HomographyFactor static ceres::CostFunction* Create(const Eigen::Vector3d &_pts_i, const Eigen::Vector3d &_pts_j) { - return (new ceres::AutoDiffCostFunction + return (new ceres::AutoDiffCostFunction (new HomographyFactor(_pts_i, _pts_j))); } diff --git a/vins_estimator/src/initial/initial_sfm.cpp b/vins_estimator/src/initial/initial_sfm.cpp index 76a4f88..2877082 100644 --- a/vins_estimator/src/initial/initial_sfm.cpp +++ b/vins_estimator/src/initial/initial_sfm.cpp @@ -342,7 +342,7 @@ bool GlobalSFM::constructH(int frame_num, Quaterniond* q, Vector3d* T, int l, Quaterniond c_Quat[frame_num]; double c_rotation[frame_num][4]; double c_translation[frame_num][3]; - double c_normal[3]; + double c_normal[4]; Eigen::Matrix Pose[frame_num]; c_Quat[l] = q[l].inverse(); @@ -462,10 +462,11 @@ bool GlobalSFM::constructH(int frame_num, Quaterniond* q, Vector3d* T, int l, } } - c_normal[0] = n[0]; - c_normal[1] = n[1]; - c_normal[2] = n[2]; - problem.AddParameterBlock(c_normal, 3); + c_normal[0] = 0; + c_normal[1] = n[0]; + c_normal[2] = n[1]; + c_normal[3] = n[2]; + problem.AddParameterBlock(c_normal, 4, local_parameterization); for (int i = 0; i < feature_num; i++) { @@ -479,14 +480,14 @@ bool GlobalSFM::constructH(int frame_num, Quaterniond* q, Vector3d* T, int l, sfm_f[i].observation[j].second.x(), sfm_f[i].observation[j].second.y()); - problem.AddResidualBlock(b_cost_function, NULL, c_rotation[l], c_translation[l], - sfm_f[i].position); + //problem.AddResidualBlock(b_cost_function, NULL, c_rotation[l], c_translation[l], + // sfm_f[i].position); ceres::CostFunction* h_cost_function = ReprojectionErrorH::Create( 
sfm_f[i].observation[j].second.x(), sfm_f[i].observation[j].second.y()); - problem.AddResidualBlock(h_cost_function, NULL, c_rotation[l], c_translation[l], c_normal, sfm_f[i].image); + problem.AddResidualBlock(h_cost_function, NULL, c_rotation[l], c_translation[l], c_normal, sfm_f[i].position); } } ceres::Solver::Options options; @@ -526,9 +527,9 @@ bool GlobalSFM::constructH(int frame_num, Quaterniond* q, Vector3d* T, int l, sfm_tracked_points[sfm_f[i].id] = Vector3d(sfm_f[i].position[0], sfm_f[i].position[1], sfm_f[i].position[2]); } - n[0] = c_normal[0]; - n[1] = c_normal[1]; - n[2] = c_normal[2]; + n[0] = c_normal[1]; + n[1] = c_normal[2]; + n[2] = c_normal[3]; return true; diff --git a/vins_estimator/src/initial/initial_sfm.h b/vins_estimator/src/initial/initial_sfm.h index e79dcea..f191434 100644 --- a/vins_estimator/src/initial/initial_sfm.h +++ b/vins_estimator/src/initial/initial_sfm.h @@ -66,7 +66,7 @@ struct ReprojectionErrorH { T p[3]; ceres::QuaternionRotatePoint(R, point, p); - T np = n[0] * point[0] + n[1] * point[1] + n[2] * point[2]; + T np = n[1] * point[0] + n[2] * point[1] + n[3] * point[2]; p[0] += t[0]*np; p[1] += t[1]*np; p[2] += t[2]*np; T xp = p[0] / p[2]; T yp = p[1] / p[2]; @@ -79,7 +79,7 @@ struct ReprojectionErrorH const double observed_y) { return (new ceres::AutoDiffCostFunction< - ReprojectionErrorH, 2, 4, 3, 3, 3>( + ReprojectionErrorH, 2, 4, 3, 4, 3>( new ReprojectionErrorH(observed_x,observed_y))); } diff --git a/vins_estimator/src/initial/solve_5pts.cpp b/vins_estimator/src/initial/solve_5pts.cpp index 94a2a04..d05faa5 100644 --- a/vins_estimator/src/initial/solve_5pts.cpp +++ b/vins_estimator/src/initial/solve_5pts.cpp @@ -241,9 +241,7 @@ bool MotionEstimator::solveRelativeHRT(const vector> &c cv::Mat mask; //cv::Mat E = cv::findFundamentalMat(ll, rr, cv::FM_RANSAC, 0.3 / 460, 0.99, mask); cv::Mat H = cv::findHomography(ll, rr, cv::RANSAC, 0.3/480); - //cv::Mat K = (cv::Mat_(3, 3) << 320, 0, 320, 0, 320, 240, 0, 0, 1); cv::Mat K 
= (cv::Mat_(3, 3) << 1, 0, 0, 0, 1, 0, 0, 0, 1); - //cv::Mat K = (cv::Mat_(3,3) << 665, 0, 511, 0, 665, 383, 0, 0, 1); Eigen::Matrix4d est_Tr; Eigen::Vector3d est_n; @@ -303,7 +301,7 @@ void MotionEstimator::decomposeH(const cv::Mat &H, const cv::Mat &K, const Matri int min_index = std::min_element(rot_diff.begin(), rot_diff.end()) - rot_diff.begin(); est_Tr = positive_depth_transforms[min_index]; - est_n = positive_depth_normals[min_index].normalized(); + est_n = positive_depth_normals[min_index]; } else -- 2.17.1 From 208ef88ed7f9fc41793852797b86c3d411f1c1e4 Mon Sep 17 00:00:00 2001 From: Karnik Ram Date: Fri, 1 Jan 2021 12:53:32 +0530 Subject: [PATCH 18/42] Use mean point vector to disambiguate H, change thresholds --- feature_tracker/src/feature_tracker.cpp | 2 +- vins_estimator/src/initial/solve_5pts.cpp | 20 ++++++++++++++++---- vins_estimator/src/initial/solve_5pts.h | 2 +- 3 files changed, 18 insertions(+), 6 deletions(-) diff --git a/feature_tracker/src/feature_tracker.cpp b/feature_tracker/src/feature_tracker.cpp index 50251ca..afc174e 100644 --- a/feature_tracker/src/feature_tracker.cpp +++ b/feature_tracker/src/feature_tracker.cpp @@ -194,7 +194,7 @@ void FeatureTracker::rejectWithF() vector status; //cv::findFundamentalMat(un_cur_pts, un_forw_pts, cv::FM_RANSAC, F_THRESHOLD, 0.99, status); - cv::findHomography(un_cur_pts, un_forw_pts, cv::RANSAC, 3, status); + cv::findHomography(un_cur_pts, un_forw_pts, cv::RANSAC, 0.99, status); int size_a = cur_pts.size(); reduceVector(prev_pts, status); reduceVector(cur_pts, status); diff --git a/vins_estimator/src/initial/solve_5pts.cpp b/vins_estimator/src/initial/solve_5pts.cpp index d05faa5..f86c2bf 100644 --- a/vins_estimator/src/initial/solve_5pts.cpp +++ b/vins_estimator/src/initial/solve_5pts.cpp @@ -240,12 +240,23 @@ bool MotionEstimator::solveRelativeHRT(const vector> &c } cv::Mat mask; //cv::Mat E = cv::findFundamentalMat(ll, rr, cv::FM_RANSAC, 0.3 / 460, 0.99, mask); - cv::Mat H = cv::findHomography(ll, 
rr, cv::RANSAC, 0.3/480); + cv::Mat H = cv::findHomography(ll, rr, cv::RANSAC, 0.3/460); cv::Mat K = (cv::Mat_(3, 3) << 1, 0, 0, 0, 1, 0, 0, 0, 1); + // Compute mean point vector + Eigen::Vector3d mean_l(0, 0, 1); + for(cv::Point2f &point : ll) + { + mean_l(0) += point.x; + mean_l(1) += point.y; + } + + mean_l(0) /= int(ll.size()); + mean_l(1) /= int(ll.size()); + Eigen::Matrix4d est_Tr; Eigen::Vector3d est_n; - decomposeH(H, K, R_imu, TrIC, est_Tr, est_n); + decomposeH(H, K, R_imu, TrIC, mean_l, est_Tr, est_n); Rotation = est_Tr.block(0,0,3,3); Translation = est_Tr.block(0,3,3,1); n = est_n; @@ -255,7 +266,8 @@ bool MotionEstimator::solveRelativeHRT(const vector> &c } -void MotionEstimator::decomposeH(const cv::Mat &H, const cv::Mat &K, const Matrix3d &R_imu, const Matrix4d &TrIC, Matrix4d &est_Tr, Vector3d &est_n) +void MotionEstimator::decomposeH(const cv::Mat &H, const cv::Mat &K, const Matrix3d &R_imu, + const Matrix4d &TrIC, const Vector3d &mean_l, Matrix4d &est_Tr, Vector3d &est_n) { vector cv_Rs, cv_ts, cv_ns; int n_sols = cv::decomposeHomographyMat(H, K, cv_Rs, cv_ts, cv_ns); @@ -283,7 +295,7 @@ void MotionEstimator::decomposeH(const cv::Mat &H, const cv::Mat &K, const Matri Vector3d n; cv::cv2eigen(cv_ns[i], n); n.normalize(); - if(n.dot(e3) > 0) + if(n.dot(mean_l) > 0) { positive_depth_transforms.push_back(Tr); positive_depth_normals.push_back(n); diff --git a/vins_estimator/src/initial/solve_5pts.h b/vins_estimator/src/initial/solve_5pts.h index f090370..0da7ba2 100644 --- a/vins_estimator/src/initial/solve_5pts.h +++ b/vins_estimator/src/initial/solve_5pts.h @@ -16,7 +16,7 @@ class MotionEstimator bool solveRelativeRT(const vector> &corres, Matrix3d &R, Vector3d &T); bool solveRelativeHRT(const vector> &corres, const Matrix3d &R_imu, const Matrix4d &TrIC, Matrix3d &R, Vector3d &T, Vector3d &n); - void decomposeH(const cv::Mat &H, const cv::Mat &K, const Matrix3d &R_imu, const Matrix4d &TrIC, Matrix4d &est_Tr, Vector3d &est_n); + void decomposeH(const 
cv::Mat &H, const cv::Mat &K, const Matrix3d &R_imu, const Matrix4d &TrIC, const Vector3d &mean_l, Matrix4d &est_Tr, Vector3d &est_n); private: double testTriangulation(const vector &l, -- 2.17.1 From c3c81a017e2a2ecf8d54ce1efacc0e602a8bea3a Mon Sep 17 00:00:00 2001 From: Karnik Ram Date: Fri, 1 Jan 2021 12:56:14 +0530 Subject: [PATCH 19/42] Propogate plane-d from initial window, instead of using per-point depth --- vins_estimator/src/estimator.cpp | 5 ++--- vins_estimator/src/estimator.h | 2 +- vins_estimator/src/factor/homography_factor.h | 4 +++- 3 files changed, 6 insertions(+), 5 deletions(-) diff --git a/vins_estimator/src/estimator.cpp b/vins_estimator/src/estimator.cpp index 7a5245c..7105514 100644 --- a/vins_estimator/src/estimator.cpp +++ b/vins_estimator/src/estimator.cpp @@ -402,7 +402,7 @@ bool Estimator::visualInitialAlign() f_manager.triangulate(Ps, &(TIC_TMP[0]), &(RIC[0])); s = (x.tail<1>())(0); - para_s[0] = s; + para_d[0] = s; for (int i = 0; i <= WINDOW_SIZE; i++) { pre_integrations[i]->repropagate(Vector3d::Zero(), Bgs[i]); @@ -462,7 +462,6 @@ bool Estimator::relativePose(Matrix3d &relative_R, Vector3d &relative_T, int &l) Vector2d pts_1(corres[j].second(0), corres[j].second(1)); double parallax = (pts_0 - pts_1).norm(); sum_parallax = sum_parallax + parallax; - } average_parallax = 1.0 * sum_parallax / int(corres.size()); if(average_parallax * 460 > 30 && m_estimator.solveRelativeRT(corres, relative_R, relative_T)) @@ -815,7 +814,7 @@ void Estimator::optimization() problem.AddResidualBlock(f, loss_function, para_Pose[imu_i], para_Pose[imu_j], para_Ex_Pose[0], para_Feature[feature_index]); ceres::CostFunction *h_cost_function = HomographyFactor::Create(pts_i, pts_j); - problem.AddResidualBlock(h_cost_function, loss_function, para_Pose[imu_i], para_Pose[imu_j], para_N, para_Feature[feature_index], para_Ex_Pose[0]); + problem.AddResidualBlock(h_cost_function, loss_function, para_Pose[imu_i], para_Pose[imu_j], para_N, para_d, para_Ex_Pose[0]); } 
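Patch 18 above replaces the fixed optical-axis test `n.dot(e3) > 0` with a test against the mean point vector of the correspondences: `cv::decomposeHomographyMat` returns up to four `(R, t, n)` candidates, and only those whose plane normal faces the mean viewing ray of the observed points are physically plausible. A stdlib-only sketch of that filter; `Vec3` stands in for `Eigen::Vector3d`, and only the normal test is shown (the rotation-difference tie-break in `decomposeH` is omitted).

```cpp
#include <array>
#include <cassert>
#include <vector>

using Vec3 = std::array<double, 3>;

double dot(const Vec3 &a, const Vec3 &b) {
    return a[0]*b[0] + a[1]*b[1] + a[2]*b[2];
}

// Mean point vector of the first-view correspondences: (mean_x, mean_y, 1),
// matching the mean_l computed in solveRelativeHRT after the patch.
Vec3 meanPointVector(const std::vector<Vec3> &pts) {
    Vec3 m{0.0, 0.0, 1.0};
    for (const Vec3 &p : pts) { m[0] += p[0]; m[1] += p[1]; }
    m[0] /= (double)pts.size();
    m[1] /= (double)pts.size();
    return m;
}

// Keep only candidate plane normals on the visible side of the mean ray.
std::vector<Vec3> filterNormals(const std::vector<Vec3> &candidates, const Vec3 &mean_l) {
    std::vector<Vec3> kept;
    for (const Vec3 &n : candidates)
        if (dot(n, mean_l) > 0.0)
            kept.push_back(n);
    return kept;
}
```

Using the mean ray instead of `e3` matters when the tracked plane is far from the image center, where a valid normal can have a negative z component in the camera frame.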
f_m_cnt++; } diff --git a/vins_estimator/src/estimator.h b/vins_estimator/src/estimator.h index ba73f41..adad3cd 100644 --- a/vins_estimator/src/estimator.h +++ b/vins_estimator/src/estimator.h @@ -117,7 +117,7 @@ class Estimator double para_Td[1][1]; double para_Tr[1][1]; double para_N[4]; - double para_s[1]; + double para_d[1]; int loop_window_index; diff --git a/vins_estimator/src/factor/homography_factor.h b/vins_estimator/src/factor/homography_factor.h index 2fdccb5..ce6612b 100644 --- a/vins_estimator/src/factor/homography_factor.h +++ b/vins_estimator/src/factor/homography_factor.h @@ -32,8 +32,10 @@ struct HomographyFactor Eigen::Quaternion qji = qj.inverse() * qi; Eigen::Matrix tji = qj.inverse() * (pi - pj); + Eigen::Matrix di; + di(0,0) = inv_depth(0,0) - pi.dot(n_imu_i); Eigen::Matrix pts_imu_i = qic * pts_i.cast() + tic; - Eigen::Matrix pts_imu_j = qji * pts_imu_i + ((tji*inv_depth[0]) * n_imu_i.transpose()) * pts_imu_i; + Eigen::Matrix pts_imu_j = qji * pts_imu_i + (tji*(1.0/di(0,0)) * n_imu_i.transpose()) * pts_imu_i; Eigen::Matrix pts_cam_j = qic.inverse() * (pts_imu_j - tic); pts_cam_j = (pts_cam_j / pts_cam_j[2]); -- 2.17.1 From c9f6349946574ca616acb11987d76d40055e59cf Mon Sep 17 00:00:00 2001 From: Karnik Ram Date: Thu, 7 Jan 2021 11:05:10 +0530 Subject: [PATCH 20/42] Revert to previous thresholds --- feature_tracker/src/feature_tracker.cpp | 2 +- vins_estimator/src/estimator.cpp | 2 -- vins_estimator/src/initial/initial_sfm.cpp | 4 ++-- vins_estimator/src/initial/initial_sfm.h | 10 +++++----- vins_estimator/src/initial/solve_5pts.cpp | 2 +- 5 files changed, 9 insertions(+), 11 deletions(-) diff --git a/feature_tracker/src/feature_tracker.cpp b/feature_tracker/src/feature_tracker.cpp index afc174e..50251ca 100644 --- a/feature_tracker/src/feature_tracker.cpp +++ b/feature_tracker/src/feature_tracker.cpp @@ -194,7 +194,7 @@ void FeatureTracker::rejectWithF() vector status; //cv::findFundamentalMat(un_cur_pts, un_forw_pts, cv::FM_RANSAC, 
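Patch 19 above rewrites the `HomographyFactor` so that a single plane depth from the initial window is propagated per frame, `d_i = d - p_i . n`, instead of using a per-point inverse depth; the factor then transfers a plane point between frames via the plane-induced homography `p_j = R_ji p_i + t_ji (n^T p_i) / d_i`. A numeric sketch of that transfer with plain arrays standing in for the Eigen types (no Ceres templating, and the extrinsic `qic`/`tic` chain is omitted):

```cpp
#include <array>
#include <cassert>

using V3 = std::array<double, 3>;
using M3 = std::array<V3, 3>;  // row-major 3x3 rotation

// Plane-induced point transfer: p_j = R_ji * p_i + t_ji * (n^T p_i) / d_i,
// where d_i is the plane depth in frame i (propagated as d - p_i . n in the
// patch). For p_i on the plane, n^T p_i = d_i and this is an exact mapping.
V3 transferPoint(const M3 &R_ji, const V3 &t_ji,
                 const V3 &n, double plane_depth, const V3 &p_i) {
    double np = n[0]*p_i[0] + n[1]*p_i[1] + n[2]*p_i[2]; // n^T p_i
    V3 p_j{};
    for (int r = 0; r < 3; ++r) {
        p_j[r] = t_ji[r] * np / plane_depth; // plane-induced translation term
        for (int c = 0; c < 3; ++c)
            p_j[r] += R_ji[r][c] * p_i[c];   // rotation term
    }
    return p_j;
}
```

Dividing by the propagated plane depth is what lets the factor share one `para_d` across all plane features instead of one `para_Feature` inverse depth per point.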
F_THRESHOLD, 0.99, status); - cv::findHomography(un_cur_pts, un_forw_pts, cv::RANSAC, 0.99, status); + cv::findHomography(un_cur_pts, un_forw_pts, cv::RANSAC, 3, status); int size_a = cur_pts.size(); reduceVector(prev_pts, status); reduceVector(cur_pts, status); diff --git a/vins_estimator/src/estimator.cpp b/vins_estimator/src/estimator.cpp index 7105514..bfa5c1e 100644 --- a/vins_estimator/src/estimator.cpp +++ b/vins_estimator/src/estimator.cpp @@ -779,7 +779,6 @@ void Estimator::optimization() int imu_i = it_per_id.start_frame, imu_j = imu_i - 1; Vector3d pts_i = it_per_id.feature_per_frame[0].point; - Vector3d img_pts_i(it_per_id.feature_per_frame[0].uv.x(), it_per_id.feature_per_frame[0].uv.y(), 1); for (auto &it_per_frame : it_per_id.feature_per_frame) { @@ -790,7 +789,6 @@ void Estimator::optimization() } Vector3d pts_j = it_per_frame.point; - Vector3d img_pts_j(it_per_frame.uv.x(), it_per_frame.uv.y(), 1); if (ESTIMATE_TD) { diff --git a/vins_estimator/src/initial/initial_sfm.cpp b/vins_estimator/src/initial/initial_sfm.cpp index 2877082..ad9f729 100644 --- a/vins_estimator/src/initial/initial_sfm.cpp +++ b/vins_estimator/src/initial/initial_sfm.cpp @@ -480,8 +480,8 @@ bool GlobalSFM::constructH(int frame_num, Quaterniond* q, Vector3d* T, int l, sfm_f[i].observation[j].second.x(), sfm_f[i].observation[j].second.y()); - //problem.AddResidualBlock(b_cost_function, NULL, c_rotation[l], c_translation[l], - // sfm_f[i].position); + problem.AddResidualBlock(b_cost_function, NULL, c_rotation[l], c_translation[l], + sfm_f[i].position); ceres::CostFunction* h_cost_function = ReprojectionErrorH::Create( sfm_f[i].observation[j].second.x(), diff --git a/vins_estimator/src/initial/initial_sfm.h b/vins_estimator/src/initial/initial_sfm.h index f191434..a070742 100644 --- a/vins_estimator/src/initial/initial_sfm.h +++ b/vins_estimator/src/initial/initial_sfm.h @@ -64,12 +64,12 @@ struct ReprojectionErrorH template bool operator()(const T* const R, const T* const t, const 
T* const n, const T* point, T* residuals) const { - T p[3]; - ceres::QuaternionRotatePoint(R, point, p); + T rp[3]; + ceres::QuaternionRotatePoint(R, point, rp); T np = n[1] * point[0] + n[2] * point[1] + n[3] * point[2]; - p[0] += t[0]*np; p[1] += t[1]*np; p[2] += t[2]*np; - T xp = p[0] / p[2]; - T yp = p[1] / p[2]; + rp[0] += t[0]*np; rp[1] += t[1]*np; rp[2] += t[2]*np; + T xp = rp[0] / rp[2]; + T yp = rp[1] / rp[2]; residuals[0] = xp - T(observed_u); residuals[1] = yp - T(observed_v); return true; diff --git a/vins_estimator/src/initial/solve_5pts.cpp b/vins_estimator/src/initial/solve_5pts.cpp index f86c2bf..a14ec1c 100644 --- a/vins_estimator/src/initial/solve_5pts.cpp +++ b/vins_estimator/src/initial/solve_5pts.cpp @@ -240,7 +240,7 @@ bool MotionEstimator::solveRelativeHRT(const vector> &c } cv::Mat mask; //cv::Mat E = cv::findFundamentalMat(ll, rr, cv::FM_RANSAC, 0.3 / 460, 0.99, mask); - cv::Mat H = cv::findHomography(ll, rr, cv::RANSAC, 0.3/460); + cv::Mat H = cv::findHomography(ll, rr, cv::RANSAC, 0.3/480); cv::Mat K = (cv::Mat_(3, 3) << 1, 0, 0, 0, 1, 0, 0, 0, 1); // Compute mean point vector -- 2.17.1 From 2c6255f97f3f02da71082d34e861befc5e9accec Mon Sep 17 00:00:00 2001 From: Karnik Ram Date: Fri, 8 Jan 2021 11:40:55 +0530 Subject: [PATCH 21/42] Add code for generic mask to single mask conversion --- feature_tracker/src/feature_tracker.cpp | 40 ++++++++++++++++++-- feature_tracker/src/feature_tracker_node.cpp | 2 +- 2 files changed, 37 insertions(+), 5 deletions(-) diff --git a/feature_tracker/src/feature_tracker.cpp b/feature_tracker/src/feature_tracker.cpp index 50251ca..178a7a7 100644 --- a/feature_tracker/src/feature_tracker.cpp +++ b/feature_tracker/src/feature_tracker.cpp @@ -28,7 +28,6 @@ void reduceVector(vector &v, vector status) v.resize(j); } - FeatureTracker::FeatureTracker() { } @@ -70,11 +69,44 @@ void FeatureTracker::setMask() void FeatureTracker::addPoints(const cv::Mat &_mask) { - cv::Mat dynamic_mask; - cv::erode(_mask, dynamic_mask, 
cv::Mat(), cv::Point(-1, -1), 2, 1, 1); + // Find unique labels + cv::Mat mask_copy = _mask.clone(); + std::sort(mask_copy.begin(), mask_copy.end()); + auto last = std::unique(mask_copy.begin(), mask_copy.end()); + std::vector unique_labels(mask_copy.begin(), last); + + // Create mask for each label and erode, find largest mask + std::map unique_masks; + for(auto &label : unique_labels) + { + unique_masks.emplace(label, cv::Mat::zeros(_mask.size(), CV_8UC1)); + std::cout << int(label) << std::endl; + } + + for(int i = 0; i < _mask.rows; i++) + for(int j = 0; j < _mask.cols; j++) + unique_masks[_mask.at(i,j)].at(i,j) = 255; + + uchar largest_label = 0; + int largest_label_count = 0; + for(auto &mask : unique_masks) + { + if(mask.first == 0) + continue; + cv::erode(mask.second, mask.second, cv::Mat(), cv::Point(-1, -1), 2, 1, 1); + mask.second); + int label_count = cv::countNonZero(mask.second); + if(label_count > largest_label_count) + { + largest_label_count = label_count; + largest_label = mask.first; + } + } + + // Track features that only belong to largest mask for (auto &p : n_pts) { - if(dynamic_mask.at(p) == 255) + if(unique_masks[largest_label].at(p) == 255) { forw_pts.push_back(p); ids.push_back(-1); diff --git a/feature_tracker/src/feature_tracker_node.cpp b/feature_tracker/src/feature_tracker_node.cpp index a0dbd25..d1a9432 100644 --- a/feature_tracker/src/feature_tracker_node.cpp +++ b/feature_tracker/src/feature_tracker_node.cpp @@ -86,7 +86,7 @@ void callback(const sensor_msgs::ImageConstPtr &img_msg, const sensor_msgs::Imag { ROS_DEBUG("processing camera %d", i); if (i != 1 || !STEREO_TRACK) - trackerData[i].readImage(ptr->image.rowRange(ROW * i, ROW * (i + 1)), mask_ptr->image, img_msg->header.stamp.toSec()); + trackerData[i].readImage(ptr->image.rowRange(ROW * i, ROW * (i + 1)), mask_ptr->image.rowRange(ROW*i,ROW*(i+1)), img_msg->header.stamp.toSec()); else { if (EQUALIZE) -- 2.17.1 From cb655fa6061ceb11c7ae7fbddd11af779d738430 Mon Sep 17 00:00:00 
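Patch 21 above converts a generic segmentation mask into a single plane mask: it collects the unique labels of the label image, builds a binary mask per label (label 0 being non-planar background), erodes each, and initially keeps only the largest plane. A stdlib-only sketch of the label-counting core; `LabelImage` stands in for the `CV_8UC1` label mask, and the per-label erosion step is omitted here.

```cpp
#include <cassert>
#include <map>
#include <vector>

using LabelImage = std::vector<std::vector<int>>;

// Count pixels per label and return the non-zero label covering the most
// pixels -- the "largest mask" selected in the patch before it was replaced
// by per-plane tracking in patch 22.
int largestPlaneLabel(const LabelImage &mask) {
    std::map<int, int> counts;
    for (const auto &row : mask)
        for (int label : row)
            ++counts[label];
    int best_label = 0, best_count = 0;
    for (const auto &kv : counts) {
        if (kv.first == 0) continue; // label 0 = background, never a plane
        if (kv.second > best_count) { best_count = kv.second; best_label = kv.first; }
    }
    return best_label;
}
```

Iterating a `std::map` keyed by label mirrors the patch's `std::map<uchar, cv::Mat> unique_masks`; the later patches keep every non-zero label rather than just the winner.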
2001 From: Karnik Ram Date: Wed, 13 Jan 2021 11:48:28 +0530 Subject: [PATCH 22/42] Publish plane id for each feature --- feature_tracker/src/feature_tracker.cpp | 58 ++++++++++---------- feature_tracker/src/feature_tracker.h | 1 + feature_tracker/src/feature_tracker_node.cpp | 10 +++- 3 files changed, 37 insertions(+), 32 deletions(-) diff --git a/feature_tracker/src/feature_tracker.cpp b/feature_tracker/src/feature_tracker.cpp index 178a7a7..0de4262 100644 --- a/feature_tracker/src/feature_tracker.cpp +++ b/feature_tracker/src/feature_tracker.cpp @@ -41,26 +41,31 @@ void FeatureTracker::setMask() // prefer to keep features that are tracked for long time - vector>> cnt_pts_id; + vector>>> cnt_pts_id_pid; for (unsigned int i = 0; i < forw_pts.size(); i++) - cnt_pts_id.push_back(make_pair(track_cnt[i], make_pair(forw_pts[i], ids[i]))); + cnt_pts_id_pid.push_back(make_pair(track_cnt[i], + make_pair(forw_pts[i], make_pair(ids[i], plane_ids[i])))); - sort(cnt_pts_id.begin(), cnt_pts_id.end(), [](const pair> &a, const pair> &b) + sort(cnt_pts_id_pid.begin(), cnt_pts_id_pid.end(), + [](const pair>> &a, + const pair>> &b) { return a.first > b.first; }); forw_pts.clear(); ids.clear(); + plane_ids.clear(); track_cnt.clear(); - for (auto &it : cnt_pts_id) + for (auto &it : cnt_pts_id_pid) { if (mask.at(it.second.first) == 255) { forw_pts.push_back(it.second.first); - ids.push_back(it.second.second); + ids.push_back(it.second.second.first); + plane_ids.push_back(it.second.second.second); track_cnt.push_back(it.first); cv::circle(mask, it.second.first, MIN_DIST, 0, -1); } @@ -75,42 +80,35 @@ void FeatureTracker::addPoints(const cv::Mat &_mask) auto last = std::unique(mask_copy.begin(), mask_copy.end()); std::vector unique_labels(mask_copy.begin(), last); - // Create mask for each label and erode, find largest mask + // Create mask for each label std::map unique_masks; for(auto &label : unique_labels) - { unique_masks.emplace(label, cv::Mat::zeros(_mask.size(), CV_8UC1)); - 
std::cout << int(label) << std::endl; - } for(int i = 0; i < _mask.rows; i++) for(int j = 0; j < _mask.cols; j++) unique_masks[_mask.at(i,j)].at(i,j) = 255; - uchar largest_label = 0; - int largest_label_count = 0; - for(auto &mask : unique_masks) - { - if(mask.first == 0) - continue; - cv::erode(mask.second, mask.second, cv::Mat(), cv::Point(-1, -1), 2, 1, 1); - mask.second); - int label_count = cv::countNonZero(mask.second); - if(label_count > largest_label_count) - { - largest_label_count = label_count; - largest_label = mask.first; - } - } + // Erode each mask to avoid dynamic edges + for(auto &unique_mask : unique_masks) + cv::erode(unique_mask.second, unique_mask.second, cv::Mat(), cv::Point(-1, -1), 2, 1, 1); - // Track features that only belong to largest mask + // Store plane id for each feature for (auto &p : n_pts) { - if(unique_masks[largest_label].at(p) == 255) + for(auto &unique_mask : unique_masks) { - forw_pts.push_back(p); - ids.push_back(-1); - track_cnt.push_back(1); + if(unique_mask.first == 0) + continue; + + if(unique_mask.second.at(p) == 255) + { + forw_pts.push_back(p); + ids.push_back(-1); + track_cnt.push_back(1); + plane_ids.push_back(unique_mask.first); + break; + } } } } @@ -156,6 +154,7 @@ void FeatureTracker::readImage(const cv::Mat &_img, const cv::Mat &_mask, double reduceVector(cur_pts, status); reduceVector(forw_pts, status); reduceVector(ids, status); + reduceVector(plane_ids, status); reduceVector(cur_un_pts, status); reduceVector(track_cnt, status); ROS_DEBUG("temporal optical flow costs: %fms", t_o.toc()); @@ -233,6 +232,7 @@ void FeatureTracker::rejectWithF() reduceVector(forw_pts, status); reduceVector(cur_un_pts, status); reduceVector(ids, status); + reduceVector(plane_ids, status); reduceVector(track_cnt, status); ROS_DEBUG("FM ransac: %d -> %lu: %f", size_a, forw_pts.size(), 1.0 * forw_pts.size() / size_a); ROS_DEBUG("FM ransac costs: %fms", t_f.toc()); diff --git a/feature_tracker/src/feature_tracker.h 
b/feature_tracker/src/feature_tracker.h index c98c7ac..5db3810 100644 --- a/feature_tracker/src/feature_tracker.h +++ b/feature_tracker/src/feature_tracker.h @@ -54,6 +54,7 @@ class FeatureTracker vector prev_un_pts, cur_un_pts; vector pts_velocity; vector ids; + vector plane_ids; vector track_cnt; map cur_un_pts_map; map prev_un_pts_map; diff --git a/feature_tracker/src/feature_tracker_node.cpp b/feature_tracker/src/feature_tracker_node.cpp index d1a9432..b9de94a 100644 --- a/feature_tracker/src/feature_tracker_node.cpp +++ b/feature_tracker/src/feature_tracker_node.cpp @@ -118,6 +118,7 @@ void callback(const sensor_msgs::ImageConstPtr &img_msg, const sensor_msgs::Imag pub_count++; sensor_msgs::PointCloudPtr feature_points(new sensor_msgs::PointCloud); sensor_msgs::ChannelFloat32 id_of_point; + sensor_msgs::ChannelFloat32 plane_id_of_point; sensor_msgs::ChannelFloat32 u_of_point; sensor_msgs::ChannelFloat32 v_of_point; sensor_msgs::ChannelFloat32 velocity_x_of_point; @@ -132,6 +133,7 @@ void callback(const sensor_msgs::ImageConstPtr &img_msg, const sensor_msgs::Imag auto &un_pts = trackerData[i].cur_un_pts; auto &cur_pts = trackerData[i].cur_pts; auto &ids = trackerData[i].ids; + auto &plane_ids = trackerData[i].plane_ids; auto &pts_velocity = trackerData[i].pts_velocity; for (unsigned int j = 0; j < ids.size(); j++) { @@ -146,6 +148,7 @@ void callback(const sensor_msgs::ImageConstPtr &img_msg, const sensor_msgs::Imag feature_points->points.push_back(p); id_of_point.values.push_back(p_id * NUM_OF_CAM + i); + plane_id_of_point.values.push_back(plane_ids[j]); u_of_point.values.push_back(cur_pts[j].x); v_of_point.values.push_back(cur_pts[j].y); velocity_x_of_point.values.push_back(pts_velocity[j].x); @@ -154,6 +157,7 @@ void callback(const sensor_msgs::ImageConstPtr &img_msg, const sensor_msgs::Imag } } feature_points->channels.push_back(id_of_point); + feature_points->channels.push_back(plane_id_of_point); feature_points->channels.push_back(u_of_point); 
feature_points->channels.push_back(v_of_point); feature_points->channels.push_back(velocity_x_of_point); @@ -193,9 +197,9 @@ void callback(const sensor_msgs::ImageConstPtr &img_msg, const sensor_msgs::Imag trackerData[i].m_camera->spaceToPlane(tmp_prev_un_pts, tmp_prev_uv); cv::line(tmp_img, trackerData[i].cur_pts[j], cv::Point2f(tmp_prev_uv.x(), tmp_prev_uv.y()), cv::Scalar(255 , 0, 0), 1 , 8, 0); */ - //char name[10]; - //sprintf(name, "%d", trackerData[i].ids[j]); - //cv::putText(tmp_img, name, trackerData[i].cur_pts[j], cv::FONT_HERSHEY_SIMPLEX, 0.5, cv::Scalar(0, 0, 0)); + char name[10]; + sprintf(name, "%d", trackerData[i].plane_ids[j]); + cv::putText(tmp_img, name, trackerData[i].cur_pts[j], cv::FONT_HERSHEY_SIMPLEX, 0.5, cv::Scalar(0, 0, 0)); } } //cv::imshow("vis", stereo_img); -- 2.17.1 From b54e7f50b03e22ccf7bf9acd91a75f7486d828c6 Mon Sep 17 00:00:00 2001 From: Karnik Ram Date: Mon, 18 Jan 2021 18:34:33 +0530 Subject: [PATCH 23/42] process plane id up to homography computation --- vins_estimator/src/estimator.cpp | 121 ++++++++++-------- vins_estimator/src/estimator.h | 2 +- vins_estimator/src/estimator_node.cpp | 17 +-- vins_estimator/src/feature_manager.cpp | 11 +- vins_estimator/src/feature_manager.h | 12 +- .../src/initial/initial_alignment.h | 4 +- vins_estimator/src/initial/initial_sfm.h | 1 + 7 files changed, 91 insertions(+), 77 deletions(-) diff --git a/vins_estimator/src/estimator.cpp b/vins_estimator/src/estimator.cpp index bfa5c1e..d4b94b3 100644 --- a/vins_estimator/src/estimator.cpp +++ b/vins_estimator/src/estimator.cpp @@ -117,7 +117,7 @@ void Estimator::processIMU(double dt, const Vector3d &linear_acceleration, const gyr_0 = angular_velocity; } -void Estimator::processImage(const map>>> &image, const std_msgs::Header &header) +void Estimator::processImage(const map>>> &image, const std_msgs::Header &header) { ROS_DEBUG("new image coming ------------------------------------------"); ROS_DEBUG("Adding feature points %lu", image.size()); @@ 
-137,23 +137,23 @@ void Estimator::processImage(const map> corres = f_manager.getCorresponding(frame_count - 1, frame_count); - Matrix3d calib_ric; - if (initial_ex_rotation.CalibrationExRotation(corres, pre_integrations[frame_count]->delta_q, calib_ric)) - { - ROS_WARN("initial extrinsic rotation calib success"); - ROS_WARN_STREAM("initial extrinsic rotation: " << endl << calib_ric); - ric[0] = calib_ric; - RIC[0] = calib_ric; - ESTIMATE_EXTRINSIC = 1; - } - } - } + //if(ESTIMATE_EXTRINSIC == 2) + //{ + // ROS_INFO("calibrating extrinsic param, rotation movement is needed"); + // if (frame_count != 0) + // { + // vector> corres = f_manager.getCorresponding(frame_count - 1, frame_count); + // Matrix3d calib_ric; + // if (initial_ex_rotation.CalibrationExRotation(corres, pre_integrations[frame_count]->delta_q, calib_ric)) + // { + // ROS_WARN("initial extrinsic rotation calib success"); + // ROS_WARN_STREAM("initial extrinsic rotation: " << endl << calib_ric); + // ric[0] = calib_ric; + // RIC[0] = calib_ric; + // ESTIMATE_EXTRINSIC = 1; + // } + // } + //} if (solver_flag == INITIAL) { @@ -257,6 +257,7 @@ bool Estimator::initialStructure() SFMFeature tmp_feature; tmp_feature.state = false; tmp_feature.id = it_per_id.feature_id; + tmp_feature.plane_id = it_per_id.plane_id; for (auto &it_per_frame : it_per_id.feature_per_frame) { imu_j++; @@ -445,56 +446,66 @@ bool Estimator::visualInitialAlign() return true; } -bool Estimator::relativePose(Matrix3d &relative_R, Vector3d &relative_T, int &l) +//bool Estimator::relativePose(Matrix3d &relative_R, Vector3d &relative_T, int &l) +//{ +// // find previous frame which contians enough correspondance and parallex with newest frame +// for (int i = 0; i < WINDOW_SIZE; i++) +// { +// vector> corres; +// corres = f_manager.getCorresponding(i, WINDOW_SIZE); +// if (corres.size() > 20) +// { +// double sum_parallax = 0; +// double average_parallax; +// for (int j = 0; j < int(corres.size()); j++) +// { +// Vector2d 
pts_0(corres[j].first(0), corres[j].first(1)); +// Vector2d pts_1(corres[j].second(0), corres[j].second(1)); +// double parallax = (pts_0 - pts_1).norm(); +// sum_parallax = sum_parallax + parallax; +// } +// average_parallax = 1.0 * sum_parallax / int(corres.size()); +// if(average_parallax * 460 > 30 && m_estimator.solveRelativeRT(corres, relative_R, relative_T)) +// { +// l = i; +// ROS_DEBUG("average_parallax %f choose l %d and newest frame to triangulate the whole structure", average_parallax * 460, l); +// return true; +// } +// } +// } +// return false; +//} + +bool Estimator::relativeHPose(Matrix3d &relative_R, Vector3d &relative_T, Vector3d &n, int &l) { - // find previous frame which contians enough correspondance and parallex with newest frame + // find previous frame which contains enough correspondence and parallex with newest frame for (int i = 0; i < WINDOW_SIZE; i++) { - vector> corres; - corres = f_manager.getCorresponding(i, WINDOW_SIZE); - if (corres.size() > 20) + int lplane_id, lplane_size = 0; + vector> lplane_corres; + map>> plane_corres = f_manager.getCorresponding(i, WINDOW_SIZE); + for(auto &corres : plane_corres) { - double sum_parallax = 0; - double average_parallax; - for (int j = 0; j < int(corres.size()); j++) + if(corres.second.size() > lplane_size) { - Vector2d pts_0(corres[j].first(0), corres[j].first(1)); - Vector2d pts_1(corres[j].second(0), corres[j].second(1)); - double parallax = (pts_0 - pts_1).norm(); - sum_parallax = sum_parallax + parallax; - } - average_parallax = 1.0 * sum_parallax / int(corres.size()); - if(average_parallax * 460 > 30 && m_estimator.solveRelativeRT(corres, relative_R, relative_T)) - { - l = i; - ROS_DEBUG("average_parallax %f choose l %d and newest frame to triangulate the whole structure", average_parallax * 460, l); - return true; + lplane_id = corres.first; + lplane_size = corres.second.size(); + lplane_corres = corres.second; } } - } - return false; -} - -bool Estimator::relativeHPose(Matrix3d 
&relative_R, Vector3d &relative_T, Vector3d &n, int &l) -{ - // find previous frame which contains enough correspondence and parallex with newest frame - for (int i = 0; i < WINDOW_SIZE; i++) - { - vector> corres; - corres = f_manager.getCorresponding(i, WINDOW_SIZE); - if (corres.size() > 20) + if (lplane_size > 20) { double sum_parallax = 0; double average_parallax; - for (int j = 0; j < int(corres.size()); j++) + for (int j = 0; j < int(lplane_corres.size()); j++) { - Vector2d pts_0(corres[j].first(0), corres[j].first(1)); - Vector2d pts_1(corres[j].second(0), corres[j].second(1)); + Vector2d pts_0(lplane_corres[j].first(0), lplane_corres[j].first(1)); + Vector2d pts_1(lplane_corres[j].second(0), lplane_corres[j].second(1)); double parallax = (pts_0 - pts_1).norm(); sum_parallax = sum_parallax + parallax; - } - average_parallax = 1.0 * sum_parallax / int(corres.size()); + + average_parallax = 1.0 * sum_parallax / int(lplane_corres.size()); Matrix4d TrIC = Matrix4d::Identity(); TrIC.block(0,0,3,3) = ric[0]; @@ -505,7 +516,7 @@ bool Estimator::relativeHPose(Matrix3d &relative_R, Vector3d &relative_T, Vector for(int k = WINDOW_SIZE - 1; k > i; k--) R_imu = R_imu * (pre_integrations[k]->delta_q).toRotationMatrix(); - if(average_parallax * 460 > 30 && m_estimator.solveRelativeHRT(corres, R_imu, TrIC, relative_R, relative_T, n)) + if(average_parallax * 460 > 30 && m_estimator.solveRelativeHRT(lplane_corres, R_imu, TrIC, relative_R, relative_T, n)) { l = i; ROS_DEBUG("average_parallax %f choose l %d and newest frame to triangulate the whole structure", average_parallax * 460, l); diff --git a/vins_estimator/src/estimator.h b/vins_estimator/src/estimator.h index adad3cd..e109500 100644 --- a/vins_estimator/src/estimator.h +++ b/vins_estimator/src/estimator.h @@ -33,7 +33,7 @@ class Estimator // interface void processIMU(double t, const Vector3d &linear_acceleration, const Vector3d &angular_velocity); - void processImage(const map>>> &image, const std_msgs::Header 
&header); + void processImage(const map>>> &image, const std_msgs::Header &header); void setReloFrame(double _frame_stamp, int _frame_index, vector &_match_points, Vector3d _relo_t, Matrix3d _relo_r); // internal diff --git a/vins_estimator/src/estimator_node.cpp b/vins_estimator/src/estimator_node.cpp index 1297936..d20f9eb 100644 --- a/vins_estimator/src/estimator_node.cpp +++ b/vins_estimator/src/estimator_node.cpp @@ -293,23 +293,24 @@ void process() ROS_DEBUG("processing vision data with stamp %f \n", img_msg->header.stamp.toSec()); TicToc t_s; - map>>> image; + map>>> image; for (unsigned int i = 0; i < img_msg->points.size(); i++) { int v = img_msg->channels[0].values[i] + 0.5; int feature_id = v / NUM_OF_CAM; int camera_id = v % NUM_OF_CAM; + int plane_id = img_msg->channels[1].values[i]; double x = img_msg->points[i].x; double y = img_msg->points[i].y; double z = img_msg->points[i].z; - double p_u = img_msg->channels[1].values[i]; - double p_v = img_msg->channels[2].values[i]; - double velocity_x = img_msg->channels[3].values[i]; - double velocity_y = img_msg->channels[4].values[i]; + double p_u = img_msg->channels[2].values[i]; + double p_v = img_msg->channels[3].values[i]; + double velocity_x = img_msg->channels[4].values[i]; + double velocity_y = img_msg->channels[5].values[i]; ROS_ASSERT(z == 1); - Eigen::Matrix xyz_uv_velocity; - xyz_uv_velocity << x, y, z, p_u, p_v, velocity_x, velocity_y; - image[feature_id].emplace_back(camera_id, xyz_uv_velocity); + Eigen::Matrix xyz_uv_velocity_pid; + xyz_uv_velocity_pid << x, y, z, p_u, p_v, velocity_x, velocity_y, plane_id; + image[feature_id].emplace_back(camera_id, xyz_uv_velocity_pid); } estimator.processImage(image, img_msg->header); diff --git a/vins_estimator/src/feature_manager.cpp b/vins_estimator/src/feature_manager.cpp index 7d5aed9..17f75b9 100644 --- a/vins_estimator/src/feature_manager.cpp +++ b/vins_estimator/src/feature_manager.cpp @@ -42,7 +42,7 @@ int FeatureManager::getFeatureCount() } -bool 
FeatureManager::addFeatureCheckParallax(int frame_count, const map>>> &image, double td) +bool FeatureManager::addFeatureCheckParallax(int frame_count, const map>>> &image, double td) { ROS_DEBUG("input feature: %d", (int)image.size()); ROS_DEBUG("num of feature: %d", getFeatureCount()); @@ -61,7 +61,8 @@ bool FeatureManager::addFeatureCheckParallax(int frame_count, const mapfeature_id == feature_id) @@ -117,9 +118,9 @@ void FeatureManager::debugShow() } } -vector> FeatureManager::getCorresponding(int frame_count_l, int frame_count_r) +map>> FeatureManager::getCorresponding(int frame_count_l, int frame_count_r) { - vector> corres; + map>> corres; for (auto &it : feature) { if (it.start_frame <= frame_count_l && it.endFrame() >= frame_count_r) @@ -132,7 +133,7 @@ vector> FeatureManager::getCorresponding(int frame_coun b = it.feature_per_frame[idx_r].point; - corres.push_back(make_pair(a, b)); + corres[it.plane_id].push_back(make_pair(a, b)); } } return corres; diff --git a/vins_estimator/src/feature_manager.h b/vins_estimator/src/feature_manager.h index 4e5d3ce..ca94634 100644 --- a/vins_estimator/src/feature_manager.h +++ b/vins_estimator/src/feature_manager.h @@ -18,7 +18,7 @@ using namespace Eigen; class FeaturePerFrame { public: - FeaturePerFrame(const Eigen::Matrix &_point, double td) + FeaturePerFrame(const Eigen::Matrix &_point, double td) { point.x() = _point(0); point.y() = _point(1); @@ -44,7 +44,7 @@ class FeaturePerFrame class FeaturePerId { public: - const int feature_id; + const int feature_id, plane_id; int start_frame; vector feature_per_frame; @@ -56,8 +56,8 @@ class FeaturePerId Vector3d gt_p; - FeaturePerId(int _feature_id, int _start_frame) - : feature_id(_feature_id), start_frame(_start_frame), + FeaturePerId(int _feature_id, int _plane_id, int _start_frame) + : feature_id(_feature_id), plane_id(_plane_id), start_frame(_start_frame), used_num(0), estimated_depth(-1.0), solve_flag(0) { } @@ -76,9 +76,9 @@ class FeatureManager int 
getFeatureCount(); - bool addFeatureCheckParallax(int frame_count, const map>>> &image, double td); + bool addFeatureCheckParallax(int frame_count, const map>>> &image, double td); void debugShow(); - vector> getCorresponding(int frame_count_l, int frame_count_r); + map>> getCorresponding(int frame_count_l, int frame_count_r); //void updateDepth(const VectorXd &x); void setDepth(const VectorXd &x); diff --git a/vins_estimator/src/initial/initial_alignment.h b/vins_estimator/src/initial/initial_alignment.h index 49bc466..392337e 100644 --- a/vins_estimator/src/initial/initial_alignment.h +++ b/vins_estimator/src/initial/initial_alignment.h @@ -14,11 +14,11 @@ class ImageFrame { public: ImageFrame(){}; - ImageFrame(const map>>>& _points, double _t):t{_t},is_key_frame{false} + ImageFrame(const map>>>& _points, double _t):t{_t},is_key_frame{false} { points = _points; }; - map> > > points; + map> > > points; double t; Matrix3d R; Vector3d T; diff --git a/vins_estimator/src/initial/initial_sfm.h b/vins_estimator/src/initial/initial_sfm.h index a070742..3c51e44 100644 --- a/vins_estimator/src/initial/initial_sfm.h +++ b/vins_estimator/src/initial/initial_sfm.h @@ -17,6 +17,7 @@ struct SFMFeature { bool state; int id; + int plane_id; vector> observation; double position[3]; double image[3]; // u v 1 -- 2.17.1 From f953992bf0a51d466f28bcd0e78e155e8063560d Mon Sep 17 00:00:00 2001 From: Karnik Ram Date: Wed, 20 Jan 2021 14:53:49 +0530 Subject: [PATCH 24/42] Compute homographies for each plane --- vins_estimator/src/estimator.cpp | 61 +++++++++++++++++++++++--------- vins_estimator/src/estimator.h | 2 +- 2 files changed, 45 insertions(+), 18 deletions(-) diff --git a/vins_estimator/src/estimator.cpp b/vins_estimator/src/estimator.cpp index d4b94b3..be5bc42 100644 --- a/vins_estimator/src/estimator.cpp +++ b/vins_estimator/src/estimator.cpp @@ -246,6 +246,19 @@ bool Estimator::initialStructure() //return false; } } + + // Bootstrap SfM + map relative_R; + map relative_T; + map 
n; + int l, lplane_id; + + if (!relativeHPose(relative_R, relative_T, n, l, lplane_id)) + { + ROS_INFO("Not enough features or parallax; Move device around"); + return false; + } + // global sfm Quaterniond Q[frame_count + 1]; Vector3d T[frame_count + 1]; @@ -253,6 +266,10 @@ bool Estimator::initialStructure() vector sfm_f; for (auto &it_per_id : f_manager.feature) { + // Consider only features from largest plne + if(it_per_id.plane_id != lplane_id) + continue; + int imu_j = it_per_id.start_frame - 1; SFMFeature tmp_feature; tmp_feature.state = false; @@ -265,19 +282,11 @@ bool Estimator::initialStructure() tmp_feature.observation.push_back(make_pair(imu_j, Eigen::Vector2d{pts_j.x(), pts_j.y()})); } sfm_f.push_back(tmp_feature); - } - Matrix3d relative_R; - Vector3d relative_T; - Vector3d n; - int l; - if (!relativeHPose(relative_R, relative_T, n, l)) - { - ROS_INFO("Not enough features or parallax; Move device around"); - return false; } + GlobalSFM sfm; if(!sfm.constructH(frame_count + 1, Q, T, l, - relative_R, relative_T, n, + relative_R[lplane_id], relative_T[lplane_id], n[lplane_id], sfm_f, sfm_tracked_points)) { ROS_DEBUG("global SFM failed!"); @@ -285,9 +294,9 @@ bool Estimator::initialStructure() return false; } para_N[0] = 0; - para_N[1] = n[0]; - para_N[2] = n[1]; - para_N[3] = n[2]; + para_N[1] = n[lplane_id][0]; + para_N[2] = n[lplane_id][1]; + para_N[3] = n[lplane_id][2]; //solve pnp for all frame map::iterator frame_it; @@ -476,12 +485,12 @@ bool Estimator::visualInitialAlign() // return false; //} -bool Estimator::relativeHPose(Matrix3d &relative_R, Vector3d &relative_T, Vector3d &n, int &l) +bool Estimator::relativeHPose(map &relative_R, map &relative_t, map &n, int &l, int &lplane_id) { // find previous frame which contains enough correspondence and parallex with newest frame for (int i = 0; i < WINDOW_SIZE; i++) { - int lplane_id, lplane_size = 0; + int lplane_size = 0; vector> lplane_corres; map>> plane_corres = f_manager.getCorresponding(i, 
WINDOW_SIZE); for(auto &corres : plane_corres) @@ -507,7 +516,7 @@ bool Estimator::relativeHPose(Matrix3d &relative_R, Vector3d &relative_T, Vector average_parallax = 1.0 * sum_parallax / int(lplane_corres.size()); - Matrix4d TrIC = Matrix4d::Identity(); + Matrix4d TrIC = Matrix4d::Identity(); TrIC.block(0,0,3,3) = ric[0]; TrIC.block(0,3,3,1) = tic[0]; @@ -516,10 +525,28 @@ bool Estimator::relativeHPose(Matrix3d &relative_R, Vector3d &relative_T, Vector for(int k = WINDOW_SIZE - 1; k > i; k--) R_imu = R_imu * (pre_integrations[k]->delta_q).toRotationMatrix(); - if(average_parallax * 460 > 30 && m_estimator.solveRelativeHRT(lplane_corres, R_imu, TrIC, relative_R, relative_T, n)) + Eigen::Matrix3d est_R; + Eigen::Vector3d est_t, est_n; + if(average_parallax * 460 > 30 && m_estimator.solveRelativeHRT(lplane_corres, R_imu, TrIC, est_R, est_t, est_n)) { l = i; ROS_DEBUG("average_parallax %f choose l %d and newest frame to triangulate the whole structure", average_parallax * 460, l); + relative_R[lplane_id] = est_R; + relative_t[lplane_id] = est_t; + n[lplane_id] = est_n; + + for(auto &corres : plane_corres) + { + if(corres.first == lplane_id) + continue; + + if(m_estimator.solveRelativeHRT(corres.second, R_imu, TrIC, est_R, est_t, est_n)) + { + relative_R[corres.first] = est_R; + relative_t[corres.first] = est_t; + n[corres.first] = est_n; + } + } return true; } } diff --git a/vins_estimator/src/estimator.h b/vins_estimator/src/estimator.h index e109500..8efe106 100644 --- a/vins_estimator/src/estimator.h +++ b/vins_estimator/src/estimator.h @@ -41,7 +41,7 @@ class Estimator bool initialStructure(); bool visualInitialAlign(); bool relativePose(Matrix3d &relative_R, Vector3d &relative_T, int &l); - bool relativeHPose(Matrix3d &relative_R, Vector3d &relative_T, Vector3d &n, int &l); + bool relativeHPose(map &relative_R, map &relative_T, map &n, int &l, int &lplane_id); void slideWindow(); void solveOdometry(); void slideWindowNew(); -- 2.17.1 From 
1ae6a50d3a5dbd1b9d193a7dfe00be2adadf987c Mon Sep 17 00:00:00 2001 From: Karnik Ram Date: Thu, 21 Jan 2021 19:28:53 +0530 Subject: [PATCH 25/42] Use individual planes inside estimator --- vins_estimator/src/estimator.cpp | 46 ++++++++++++++------ vins_estimator/src/estimator.h | 8 ++-- vins_estimator/src/feature_manager.cpp | 1 + vins_estimator/src/utility/visualization.cpp | 1 + 4 files changed, 38 insertions(+), 18 deletions(-) diff --git a/vins_estimator/src/estimator.cpp b/vins_estimator/src/estimator.cpp index be5bc42..74261f9 100644 --- a/vins_estimator/src/estimator.cpp +++ b/vins_estimator/src/estimator.cpp @@ -176,7 +176,6 @@ void Estimator::processImage(const map relative_R; map relative_T; map n; - int l, lplane_id; + int l; if (!relativeHPose(relative_R, relative_T, n, l, lplane_id)) { @@ -293,10 +293,12 @@ bool Estimator::initialStructure() marginalization_flag = MARGIN_OLD; return false; } - para_N[0] = 0; - para_N[1] = n[lplane_id][0]; - para_N[2] = n[lplane_id][1]; - para_N[3] = n[lplane_id][2]; + + // Save estimated normal variables + for(auto &est_n : n) + { + para_N[est_n.first] = {0, est_n.second(0), est_n.second(1), est_n.second(2)}; + } //solve pnp for all frame map::iterator frame_it; @@ -366,17 +368,16 @@ bool Estimator::initialStructure() frame_it->second.R = R_pnp * RIC[0].transpose(); frame_it->second.T = T_pnp; } - if (visualInitialAlign()) + if (visualInitialAlign(relative_T, T[frame_count])) return true; else { ROS_INFO("misalign visual structure with IMU"); return false; } - } -bool Estimator::visualInitialAlign() +bool Estimator::visualInitialAlign(map &relative_T, Eigen::Vector3d <) { TicToc t_g; VectorXd x; @@ -412,7 +413,7 @@ bool Estimator::visualInitialAlign() f_manager.triangulate(Ps, &(TIC_TMP[0]), &(RIC[0])); s = (x.tail<1>())(0); - para_d[0] = s; + for (int i = 0; i <= WINDOW_SIZE; i++) { pre_integrations[i]->repropagate(Vector3d::Zero(), Bgs[i]); @@ -434,9 +435,19 @@ bool Estimator::visualInitialAlign() it_per_id.used_num = 
it_per_id.feature_per_frame.size(); if (!(it_per_id.used_num >= 2 && it_per_id.start_frame < WINDOW_SIZE - 2)) continue; + it_per_id.estimated_depth *= s; } + //estimate plane d variables + for(auto &t : relative_T) + { + if(t.first == lplane_id) + para_d[lplane_id] = {s}; + else + para_d[t.first] = {s*lt.norm()/t.second.norm()}; + } + Matrix3d R0 = Utility::g2R(g); double yaw = Utility::R2ypr(R0 * Rs[0]).x(); R0 = Utility::ypr2R(Eigen::Vector3d{-yaw, 0, 0}) * R0; @@ -764,8 +775,11 @@ void Estimator::optimization() problem.AddParameterBlock(para_SpeedBias[i], SIZE_SPEEDBIAS); } - ceres::LocalParameterization *local_n_parameterization = new ceres::QuaternionParameterization(); - problem.AddParameterBlock(para_N, 4, local_n_parameterization); + for(auto &N : para_N) + { + ceres::LocalParameterization *local_n_parameterization = new ceres::QuaternionParameterization(); + problem.AddParameterBlock(N.second.data(), 4, local_n_parameterization); + } for (int i = 0; i < NUM_OF_CAM; i++) { @@ -811,12 +825,16 @@ void Estimator::optimization() it_per_id.used_num = it_per_id.feature_per_frame.size(); if (!(it_per_id.used_num >= 2 && it_per_id.start_frame < WINDOW_SIZE - 2)) continue; - + ++feature_index; int imu_i = it_per_id.start_frame, imu_j = imu_i - 1; Vector3d pts_i = it_per_id.feature_per_frame[0].point; + int pid = it_per_id.plane_id; + + if(!(para_N.count(pid) > 0 && para_d.count(pid) > 0)) + continue; for (auto &it_per_frame : it_per_id.feature_per_frame) { @@ -850,7 +868,7 @@ void Estimator::optimization() problem.AddResidualBlock(f, loss_function, para_Pose[imu_i], para_Pose[imu_j], para_Ex_Pose[0], para_Feature[feature_index]); ceres::CostFunction *h_cost_function = HomographyFactor::Create(pts_i, pts_j); - problem.AddResidualBlock(h_cost_function, loss_function, para_Pose[imu_i], para_Pose[imu_j], para_N, para_d, para_Ex_Pose[0]); + problem.AddResidualBlock(h_cost_function, loss_function, para_Pose[imu_i], para_Pose[imu_j], para_N[pid].data(), 
para_d[pid].data(), para_Ex_Pose[0]); } f_m_cnt++; } diff --git a/vins_estimator/src/estimator.h b/vins_estimator/src/estimator.h index 8efe106..b7c32db 100644 --- a/vins_estimator/src/estimator.h +++ b/vins_estimator/src/estimator.h @@ -39,7 +39,7 @@ class Estimator // internal void clearState(); bool initialStructure(); - bool visualInitialAlign(); + bool visualInitialAlign(map &relative_T, Eigen::Vector3d <); bool relativePose(Matrix3d &relative_R, Vector3d &relative_T, int &l); bool relativeHPose(map &relative_R, map &relative_T, map &n, int &l, int &lplane_id); void slideWindow(); @@ -80,6 +80,7 @@ class Estimator Vector3d Bgs[(WINDOW_SIZE + 1)]; double td; double s; + int lplane_id; Matrix3d back_R0, last_R, last_R0; Vector3d back_P0, last_P, last_P0; @@ -108,7 +109,6 @@ class Estimator vector key_poses; double initial_timestamp; - double para_Pose[WINDOW_SIZE + 1][SIZE_POSE]; double para_SpeedBias[WINDOW_SIZE + 1][SIZE_SPEEDBIAS]; double para_Feature[NUM_OF_F][SIZE_FEATURE]; @@ -116,8 +116,8 @@ class Estimator double para_Retrive_Pose[SIZE_POSE]; double para_Td[1][1]; double para_Tr[1][1]; - double para_N[4]; - double para_d[1]; + map> para_N; + map> para_d; int loop_window_index; diff --git a/vins_estimator/src/feature_manager.cpp b/vins_estimator/src/feature_manager.cpp index 17f75b9..45f14b8 100644 --- a/vins_estimator/src/feature_manager.cpp +++ b/vins_estimator/src/feature_manager.cpp @@ -210,6 +210,7 @@ void FeatureManager::triangulate(Vector3d Ps[], Vector3d tic[], Matrix3d ric[]) if (it_per_id.estimated_depth > 0) continue; + int imu_i = it_per_id.start_frame, imu_j = imu_i - 1; ROS_ASSERT(NUM_OF_CAM == 1); diff --git a/vins_estimator/src/utility/visualization.cpp b/vins_estimator/src/utility/visualization.cpp index 167e913..eb87e13 100644 --- a/vins_estimator/src/utility/visualization.cpp +++ b/vins_estimator/src/utility/visualization.cpp @@ -252,6 +252,7 @@ void pubPointCloud(const Estimator &estimator, const std_msgs::Header &header) continue; if 
(it_per_id.start_frame > WINDOW_SIZE * 3.0 / 4.0 || it_per_id.solve_flag != 1) continue; + int imu_i = it_per_id.start_frame; Vector3d pts_i = it_per_id.feature_per_frame[0].point * it_per_id.estimated_depth; Vector3d w_pts_i = estimator.Rs[imu_i] * (estimator.ric[0] * pts_i + estimator.tic[0]) + estimator.Ps[imu_i]; -- 2.17.1 From 29ecb88038821e1d73c60136485cb1435ac38e06 Mon Sep 17 00:00:00 2001 From: Karnik Ram Date: Fri, 22 Jan 2021 13:03:28 +0530 Subject: [PATCH 26/42] Initialize new planes --- vins_estimator/src/estimator.cpp | 61 +++++++++++++++++++++++++- vins_estimator/src/estimator.h | 3 +- vins_estimator/src/feature_manager.cpp | 25 +++++++++++ vins_estimator/src/feature_manager.h | 2 + 4 files changed, 89 insertions(+), 2 deletions(-) diff --git a/vins_estimator/src/estimator.cpp b/vins_estimator/src/estimator.cpp index 74261f9..eee27b5 100644 --- a/vins_estimator/src/estimator.cpp +++ b/vins_estimator/src/estimator.cpp @@ -186,6 +186,7 @@ void Estimator::processImage(const map &relative_R, map lplane_corres = corres.second; } } + if (lplane_size > 20) { double sum_parallax = 0; @@ -550,12 +552,13 @@ bool Estimator::relativeHPose(map &relative_R, map { if(corres.first == lplane_id) continue; - + if(m_estimator.solveRelativeHRT(corres.second, R_imu, TrIC, est_R, est_t, est_n)) { relative_R[corres.first] = est_R; relative_t[corres.first] = est_t; n[corres.first] = est_n; + ROS_INFO("Initialized plane %d features", corres.first); } } return true; @@ -565,6 +568,62 @@ bool Estimator::relativeHPose(map &relative_R, map return false; } +void Estimator::initializePlanes() +{ + // check for new plane ids + for(auto &plane_id : f_manager.plane_ids) + { + if(para_N.count(plane_id) > 0) + continue; + else + { + for(int i = 0; i < WINDOW_SIZE - 1; i++) + { + vector> corres = f_manager.getCorrespondingByPlane( + i, WINDOW_SIZE-1, plane_id); + + if(corres.size() > 20) + { + double sum_parallax = 0; + double average_parallax; + for (int j = 0; j < int(corres.size()); 
j++) + { + Vector2d pts_0(corres[j].first(0), corres[j].first(1)); + Vector2d pts_1(corres[j].second(0), corres[j].second(1)); + double parallax = (pts_0 - pts_1).norm(); + sum_parallax = sum_parallax + parallax; + } + + average_parallax = 1.0 * sum_parallax / int(corres.size()); + + Matrix4d TrIC = Matrix4d::Identity(); + TrIC.block(0,0,3,3) = ric[0]; + TrIC.block(0,3,3,1) = tic[0]; + + //compute corresponding preintegrated rotation + Matrix3d R_imu = Matrix3d::Identity(); + for(int k = WINDOW_SIZE - 2; k > i; k--) + R_imu = R_imu * (pre_integrations[k]->delta_q).toRotationMatrix(); + + Eigen::Matrix3d est_R; + Eigen::Vector3d est_t, est_n, vi_t; + if(average_parallax * 460 > 30 && m_estimator.solveRelativeHRT(corres, R_imu, TrIC, est_R, est_t, est_n)) + { + // Transform est_n to global frame + est_n = Rs[i] * est_n; + para_N[plane_id] = {0, est_n(0), est_n(1), est_n(2)}; + + // Estimate plane d using known metric t from vio + vi_t = Rs[i].transpose() * (Ps[WINDOW_SIZE - 1] - Ps[i]); + para_d[plane_id] = {(1.0*vi_t.norm()/est_t.norm())}; + ROS_INFO("Initialized plane %d features", plane_id); + } + } + } + } + } +} + void Estimator::solveOdometry() { if (frame_count < WINDOW_SIZE) diff --git a/vins_estimator/src/estimator.h b/vins_estimator/src/estimator.h index b7c32db..7c54363 100644 --- a/vins_estimator/src/estimator.h +++ b/vins_estimator/src/estimator.h @@ -42,6 +42,7 @@ class Estimator bool visualInitialAlign(map &relative_T, Eigen::Vector3d <); bool relativePose(Matrix3d &relative_R, Vector3d &relative_T, int &l); bool relativeHPose(map &relative_R, map &relative_T, map &n, int &l, int &lplane_id); + void initializePlanes(); void slideWindow(); void solveOdometry(); void slideWindowNew(); @@ -127,7 +128,7 @@ class Estimator map all_image_frame; IntegrationBase *tmp_pre_integration; - //relocalization variable + //relocalization variables bool relocalization_info; double relo_frame_stamp; double relo_frame_index; diff --git 
a/vins_estimator/src/feature_manager.cpp b/vins_estimator/src/feature_manager.cpp index 45f14b8..64828ee 100644 --- a/vins_estimator/src/feature_manager.cpp +++ b/vins_estimator/src/feature_manager.cpp @@ -62,6 +62,7 @@ bool FeatureManager::addFeatureCheckParallax(int frame_count, const map>> FeatureManager::getCorresponding(int return corres; } +vector> FeatureManager::getCorrespondingByPlane(int frame_count_l, int frame_count_r, int plane_id) +{ + vector> corres; + for (auto &it : feature) + { + if(it.plane_id != plane_id) + continue; + + if (it.start_frame <= frame_count_l && it.endFrame() >= frame_count_r) + { + Vector3d a = Vector3d::Zero(), b = Vector3d::Zero(); + int idx_l = frame_count_l - it.start_frame; + int idx_r = frame_count_r - it.start_frame; + + a = it.feature_per_frame[idx_l].point; + + b = it.feature_per_frame[idx_r].point; + + corres.push_back(make_pair(a, b)); + } + } + return corres; +} + void FeatureManager::setDepth(const VectorXd &x) { int feature_index = -1; diff --git a/vins_estimator/src/feature_manager.h b/vins_estimator/src/feature_manager.h index ca94634..a7f46fc 100644 --- a/vins_estimator/src/feature_manager.h +++ b/vins_estimator/src/feature_manager.h @@ -79,6 +79,8 @@ class FeatureManager bool addFeatureCheckParallax(int frame_count, const map>>> &image, double td); void debugShow(); map>> getCorresponding(int frame_count_l, int frame_count_r); + vector> getCorrespondingByPlane(int frame_count_l, int frame_count_r, int plane_id); + set plane_ids; //void updateDepth(const VectorXd &x); void setDepth(const VectorXd &x); -- 2.17.1 From 41b63dbda66a5cef24d407c3ae2952ff8d876bca Mon Sep 17 00:00:00 2001 From: Karnik Ram Date: Fri, 22 Jan 2021 19:33:33 +0530 Subject: [PATCH 27/42] Call add points only when there are new points --- feature_tracker/src/feature_tracker.cpp | 10 +++++----- feature_tracker/src/feature_tracker_node.cpp | 2 +- 2 files changed, 6 insertions(+), 6 deletions(-) diff --git a/feature_tracker/src/feature_tracker.cpp 
b/feature_tracker/src/feature_tracker.cpp index 0de4262..c42426c 100644 --- a/feature_tracker/src/feature_tracker.cpp +++ b/feature_tracker/src/feature_tracker.cpp @@ -183,15 +183,15 @@ void FeatureTracker::readImage(const cv::Mat &_img, const cv::Mat &_mask, double if (mask.size() != forw_img.size()) cout << "wrong size " << endl; cv::goodFeaturesToTrack(forw_img, n_pts, MAX_CNT - forw_pts.size(), 0.01, MIN_DIST, mask); + ROS_DEBUG("add feature begins"); + TicToc t_a; + addPoints(_mask); + ROS_DEBUG("selectFeature costs: %fms", t_a.toc()); } + else n_pts.clear(); ROS_DEBUG("detect feature costs: %fms", t_t.toc()); - - ROS_DEBUG("add feature begins"); - TicToc t_a; - addPoints(_mask); - ROS_DEBUG("selectFeature costs: %fms", t_a.toc()); } prev_img = cur_img; prev_pts = cur_pts; diff --git a/feature_tracker/src/feature_tracker_node.cpp b/feature_tracker/src/feature_tracker_node.cpp index b9de94a..ba64fed 100644 --- a/feature_tracker/src/feature_tracker_node.cpp +++ b/feature_tracker/src/feature_tracker_node.cpp @@ -86,7 +86,7 @@ void callback(const sensor_msgs::ImageConstPtr &img_msg, const sensor_msgs::Imag { ROS_DEBUG("processing camera %d", i); if (i != 1 || !STEREO_TRACK) - trackerData[i].readImage(ptr->image.rowRange(ROW * i, ROW * (i + 1)), mask_ptr->image.rowRange(ROW*i,ROW*(i+1)), img_msg->header.stamp.toSec()); + trackerData[i].readImage(ptr->image.rowRange(ROW * i, ROW * (i + 1)), mask_ptr->image.rowRange(ROW * i,ROW * (i + 1)), img_msg->header.stamp.toSec()); else { if (EQUALIZE) -- 2.17.1 From b8a80ed0a1ce5187c6eaafc514f0f75b6a724d11 Mon Sep 17 00:00:00 2001 From: Karnik Ram Date: Fri, 22 Jan 2021 20:35:06 +0530 Subject: [PATCH 28/42] Revert to F rejection --- feature_tracker/src/feature_tracker.cpp | 6 +++--- 1 file changed, 3 insertions(+), 3 deletions(-) diff --git a/feature_tracker/src/feature_tracker.cpp b/feature_tracker/src/feature_tracker.cpp index c42426c..6548c36 100644 --- a/feature_tracker/src/feature_tracker.cpp +++ 
b/feature_tracker/src/feature_tracker.cpp @@ -204,7 +204,7 @@ void FeatureTracker::readImage(const cv::Mat &_img, const cv::Mat &_mask, double void FeatureTracker::rejectWithF() { - if (forw_pts.size() >= 4) + if (forw_pts.size() >= 8) { ROS_DEBUG("FM ransac begins"); TicToc t_f; @@ -224,8 +224,8 @@ void FeatureTracker::rejectWithF() } vector status; - //cv::findFundamentalMat(un_cur_pts, un_forw_pts, cv::FM_RANSAC, F_THRESHOLD, 0.99, status); - cv::findHomography(un_cur_pts, un_forw_pts, cv::RANSAC, 3, status); + cv::findFundamentalMat(un_cur_pts, un_forw_pts, cv::FM_RANSAC, F_THRESHOLD, 0.99, status); + //cv::findHomography(un_cur_pts, un_forw_pts, cv::RANSAC, 3, status); int size_a = cur_pts.size(); reduceVector(prev_pts, status); reduceVector(cur_pts, status); -- 2.17.1 From 9756346e6b551d799f7610884250094d83da9488 Mon Sep 17 00:00:00 2001 From: Karnik Ram Date: Sun, 24 Jan 2021 09:26:00 +0530 Subject: [PATCH 29/42] Clear plane parameters upon failure, fix multiple initialization bug --- vins_estimator/src/estimator.cpp | 13 ++++++++++--- vins_estimator/src/estimator.h | 2 +- vins_estimator/src/feature_manager.cpp | 1 + 3 files changed, 12 insertions(+), 4 deletions(-) diff --git a/vins_estimator/src/estimator.cpp b/vins_estimator/src/estimator.cpp index eee27b5..88e0942 100644 --- a/vins_estimator/src/estimator.cpp +++ b/vins_estimator/src/estimator.cpp @@ -52,6 +52,9 @@ void Estimator::clearState() } } + para_N.clear(); + para_d.clear(); + solver_flag = INITIAL; first_imu = false, sum_of_back = 0; @@ -186,7 +189,7 @@ void Estimator::processImage(const map &relative_T, Eigen: para_d[lplane_id] = {s}; else para_d[t.first] = {s*lt.norm()/t.second.norm()}; + + ROS_INFO("Initialized plane %d features", t.first); } Matrix3d R0 = Utility::g2R(g); @@ -558,7 +565,6 @@ bool Estimator::relativeHPose(map &relative_R, map relative_R[corres.first] = est_R; relative_t[corres.first] = est_t; n[corres.first] = est_n; - ROS_INFO("Initialized plane %d features", corres.first); 
} } return true; @@ -568,7 +574,7 @@ bool Estimator::relativeHPose(map &relative_R, map return false; } -void Estimator::initializePlanes() +void Estimator::initializeNewPlanes() { // check for new plane ids for(auto &plane_id : f_manager.plane_ids) @@ -617,6 +623,7 @@ void Estimator::initializePlanes() vi_t = Rs[i].transpose() * (Ps[WINDOW_SIZE - 1] - Ps[i]); para_d[plane_id] = {(1.0*vi_t.norm()/est_t.norm())}; ROS_INFO("Initialized plane %d features", plane_id); + break; } } } diff --git a/vins_estimator/src/estimator.h b/vins_estimator/src/estimator.h index 7c54363..35aaff8 100644 --- a/vins_estimator/src/estimator.h +++ b/vins_estimator/src/estimator.h @@ -42,7 +42,7 @@ class Estimator bool visualInitialAlign(map &relative_T, Eigen::Vector3d <); bool relativePose(Matrix3d &relative_R, Vector3d &relative_T, int &l); bool relativeHPose(map &relative_R, map &relative_T, map &n, int &l, int &lplane_id); - void initializePlanes(); + void initializeNewPlanes(); void slideWindow(); void solveOdometry(); void slideWindowNew(); diff --git a/vins_estimator/src/feature_manager.cpp b/vins_estimator/src/feature_manager.cpp index 64828ee..e82c694 100644 --- a/vins_estimator/src/feature_manager.cpp +++ b/vins_estimator/src/feature_manager.cpp @@ -23,6 +23,7 @@ void FeatureManager::setRic(Matrix3d _ric[]) void FeatureManager::clearState() { feature.clear(); + plane_ids.clear(); } int FeatureManager::getFeatureCount() -- 2.17.1 From 6ae97fe2b8e72f6e48bd9e857e6ee3e6d21bce44 Mon Sep 17 00:00:00 2001 From: Karnik Ram Date: Sun, 24 Jan 2021 09:35:02 +0530 Subject: [PATCH 30/42] Add airsim config, launch files --- config/airsim_config.yaml | 83 +++++++++++++++++++ .../launch/playback_vins_airsim.launch | 29 +++++++ 2 files changed, 112 insertions(+) create mode 100644 config/airsim_config.yaml create mode 100644 vins_estimator/launch/playback_vins_airsim.launch diff --git a/config/airsim_config.yaml b/config/airsim_config.yaml new file mode 100644 index 0000000..f7dda5c --- 
/dev/null
+++ b/config/airsim_config.yaml
@@ -0,0 +1,83 @@
+%YAML:1.0
+
+#common parameters
+imu_topic: "/imu"
+image_topic: "/image"
+mask_topic: "/mask"
+output_path: "/home/karnik/output/"
+
+#camera calibration
+model_type: PINHOLE
+camera_name: camera
+image_width: 640
+image_height: 480
+distortion_parameters:
+   k1: 0
+   k2: 0
+   p1: 0
+   p2: 0
+projection_parameters:
+   fx: 320
+   fy: 320
+   cx: 320
+   cy: 240
+
+# Extrinsic parameter between IMU and Camera.
+estimate_extrinsic: 0   # 0  Have accurate extrinsic parameters. We will trust the following imu^R_cam, imu^T_cam; don't change it.
+                        # 1  Have an initial guess about extrinsic parameters. We will optimize around your initial guess.
+                        # 2  Don't know anything about extrinsic parameters. You don't need to give R,T. We will try to calibrate it. Do some rotation movement at the beginning.
+#If you choose 0 or 1, you should write down the following matrix.
+
+#Rotation from camera frame to imu frame, imu^R_cam
+extrinsicRotation: !!opencv-matrix
+   rows: 3
+   cols: 3
+   dt: d
+   data: [-0.00051157,-0.00020788,0.99999985,0.99999972,-0.00054877,0.00051145,0.00054867,0.99999983,0.00020816] #[0,0,1,1,0,0,0,1,0]
+
+#Translation from camera frame to imu frame, imu^T_cam
+extrinsicTranslation: !!opencv-matrix
+   rows: 3
+   cols: 1
+   dt: d
+   data: [0.53937284, 0.02825352, 0.02064487] #[0.50,0,0]
+
+#feature tracker parameters
+max_cnt: 300            # max feature number in feature tracking
+min_dist: 30            # min distance between two features
+freq: 10                # frequency (Hz) of publishing the tracking result. At least 10 Hz for good estimation. If set to 0, the frequency will be the same as the raw image
+F_threshold: 1.0        # ransac threshold (pixel)
+show_track: 1           # publish tracking image as topic
+equalize: 1             # if the image is too dark or too bright, turn on equalize to find enough features
+fisheye: 0              # if using a fisheye lens, turn this on; a circle mask will be loaded to remove noisy points near the edges
+
+#optimization parameters
+max_solver_time: 0.04   # max solver iteration time (ms), to guarantee real time
+max_num_iterations: 8   # max solver iterations, to guarantee real time
+keyframe_parallax: 10.0 # keyframe selection threshold (pixel)
+
+#imu parameters         The more accurate these parameters are, the better the performance
+acc_n: 0.1              # accelerometer measurement noise standard deviation.  #0.2  0.04
+gyr_n: 0.001            # gyroscope measurement noise standard deviation.      #0.05 0.004
+acc_w: 0.0002           # accelerometer bias random walk noise standard deviation.  #0.02
+gyr_w: 2.0e-5           # gyroscope bias random walk noise standard deviation.      #4.0e-5
+g_norm: 9.8067          # gravity magnitude
+
+#loop closure parameters
+loop_closure: 1                 # start loop closure
+load_previous_pose_graph: 0     # load and reuse a previous pose graph; load from 'pose_graph_save_path'
+fast_relocalization: 0          # useful in real-time, large-scale projects
+pose_graph_save_path: "/home/karnik/" # save and load path
+
+#unsynchronization parameters
+estimate_td: 0                  # online estimate of the time offset between camera and imu
+td: -0.051134656900876006       # -0.04109  # initial value of the time offset, unit: s. read image clock + td = real image clock (IMU clock)
+
+#rolling shutter parameters
+rolling_shutter: 0              # 0: global shutter camera, 1: rolling shutter camera
+rolling_shutter_tr: 0           # unit: s. rolling shutter readout time per frame (from data sheet).
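The keyframe_parallax entry above is given in pixels; the estimator converts between pixels and normalized image coordinates with a nominal focal length of 460 px, which is why the initializeNewPlanes patch earlier in this series gates plane initialization on `average_parallax * 460 > 30`. A rough sketch of that check, with simplified types (the patch uses Eigen::Vector2d and vectors of point pairs):

```cpp
#include <cassert>
#include <cmath>
#include <utility>
#include <vector>

struct Point2 { double x, y; };

// Mean parallax of correspondences between two frames, measured in
// normalized image coordinates and converted to pixels with a nominal
// focal length (460 px is VINS-Mono's hard-coded nominal value).
double averageParallaxPx(const std::vector<std::pair<Point2, Point2>> &corres,
                         double focal_px = 460.0)
{
    if (corres.empty()) return 0.0;
    double sum = 0.0;
    for (const auto &c : corres)
        sum += std::hypot(c.first.x - c.second.x, c.first.y - c.second.y);
    return focal_px * sum / corres.size();
}

// A frame pair has enough baseline for plane initialization when the
// average parallax exceeds ~30 px, mirroring the patch's threshold.
bool enoughParallax(const std::vector<std::pair<Point2, Point2>> &corres)
{
    return averageParallaxPx(corres) > 30.0;
}
```

The same pixel-to-normalized conversion underlies keyframe selection: 10 px of average parallax at a 460 px focal length is roughly 0.022 in normalized coordinates.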
+ +#visualization parameters +save_image: 1 # save image in pose graph for visualization prupose; you can close this function by setting 0 +visualize_imu_forward: 0 # output imu forward propogation to achieve low latency and high frequence results +visualize_camera_size: 0.4 # size of camera marker in RVIZ diff --git a/vins_estimator/launch/playback_vins_airsim.launch b/vins_estimator/launch/playback_vins_airsim.launch new file mode 100644 index 0000000..82ed3a8 --- /dev/null +++ b/vins_estimator/launch/playback_vins_airsim.launch @@ -0,0 +1,29 @@ + + + + + + + + + + + + + + + + + + + + + + + + + + + + + -- 2.17.1 From 3e110dc2125a47fd1f0bdad80065a0a2f1d0cf4f Mon Sep 17 00:00:00 2001 From: Karnik Ram Date: Tue, 2 Feb 2021 09:19:45 +0530 Subject: [PATCH 31/42] Use only largest plane inside estimator --- config/airsim_config.yaml | 2 +- .../launch/playback_vins_airsim.launch | 4 ++-- vins_estimator/src/estimator.cpp | 8 +++++-- vins_estimator/src/estimator.h | 2 ++ vins_estimator/src/feature_manager.cpp | 24 +++++++++++++++++++ vins_estimator/src/feature_manager.h | 1 + 6 files changed, 36 insertions(+), 5 deletions(-) diff --git a/config/airsim_config.yaml b/config/airsim_config.yaml index f7dda5c..ed9c012 100644 --- a/config/airsim_config.yaml +++ b/config/airsim_config.yaml @@ -43,7 +43,7 @@ extrinsicTranslation: !!opencv-matrix data: [0.53937284, 0.02825352, 0.02064487] #[0.50,0,0] #feature traker parameters -max_cnt: 300 # max feature number in feature tracking +max_cnt: 250 # max feature number in feature tracking min_dist: 30 # min distance between two features freq: 10 # frequence (Hz) of publish tracking result. At least 10Hz for good estimation. 
If set 0, the frequence will be same as raw image F_threshold: 1.0 # ransac threshold (pixel) diff --git a/vins_estimator/launch/playback_vins_airsim.launch b/vins_estimator/launch/playback_vins_airsim.launch index 82ed3a8..e86876a 100644 --- a/vins_estimator/launch/playback_vins_airsim.launch +++ b/vins_estimator/launch/playback_vins_airsim.launch @@ -1,8 +1,8 @@ - - + + diff --git a/vins_estimator/src/estimator.cpp b/vins_estimator/src/estimator.cpp index 88e0942..89d5b15 100644 --- a/vins_estimator/src/estimator.cpp +++ b/vins_estimator/src/estimator.cpp @@ -54,6 +54,7 @@ void Estimator::clearState() para_N.clear(); para_d.clear(); + init_pids.clear(); solver_flag = INITIAL; first_imu = false, @@ -453,6 +454,7 @@ bool Estimator::visualInitialAlign(map &relative_T, Eigen: else para_d[t.first] = {s*lt.norm()/t.second.norm()}; + init_pids.push_back(t.first); ROS_INFO("Initialized plane %d features", t.first); } @@ -622,6 +624,7 @@ void Estimator::initializeNewPlanes() // Estimate plane d using known metric t from vio vi_t = Rs[i].transpose() * (Ps[WINDOW_SIZE - 1] - Ps[i]); para_d[plane_id] = {(1.0*vi_t.norm()/est_t.norm())}; + init_pids.push_back(plane_id); ROS_INFO("Initialized plane %d features", plane_id); break; } @@ -886,6 +889,8 @@ void Estimator::optimization() } int f_m_cnt = 0; int feature_index = -1; + int largest_pid = f_manager.getLargestPlaneId(init_pids); + ROS_INFO("Using features from plane %d", largest_pid); for (auto &it_per_id : f_manager.feature) { it_per_id.used_num = it_per_id.feature_per_frame.size(); @@ -899,7 +904,7 @@ void Estimator::optimization() Vector3d pts_i = it_per_id.feature_per_frame[0].point; int pid = it_per_id.plane_id; - if(!(para_N.count(pid) > 0 && para_d.count(pid) > 0)) + if(pid != largest_pid) continue; for (auto &it_per_frame : it_per_id.feature_per_frame) @@ -974,7 +979,6 @@ void Estimator::optimization() } } } - } ceres::Solver::Options options; diff --git a/vins_estimator/src/estimator.h 
b/vins_estimator/src/estimator.h index 35aaff8..49cd656 100644 --- a/vins_estimator/src/estimator.h +++ b/vins_estimator/src/estimator.h @@ -120,6 +120,8 @@ class Estimator map> para_N; map> para_d; + vector init_pids; + int loop_window_index; MarginalizationInfo *last_marginalization_info; diff --git a/vins_estimator/src/feature_manager.cpp b/vins_estimator/src/feature_manager.cpp index e82c694..16cf5e5 100644 --- a/vins_estimator/src/feature_manager.cpp +++ b/vins_estimator/src/feature_manager.cpp @@ -42,6 +42,30 @@ int FeatureManager::getFeatureCount() return cnt; } +int FeatureManager::getLargestPlaneId(std::vector &init_pids) +{ + std::map plane_counts; + for(auto &pid : init_pids) + plane_counts[pid] = 0; + + for(auto &it : feature) + { + if(plane_counts.count(it.plane_id) > 0) + plane_counts[it.plane_id]++; + } + + int largest_count = 0, largest_pid; + for(auto &plane_count : plane_counts) + { + if(plane_count.second > largest_count) + { + largest_count = plane_count.second; + largest_pid = plane_count.first; + } + } + + return largest_pid; +} bool FeatureManager::addFeatureCheckParallax(int frame_count, const map>>> &image, double td) { diff --git a/vins_estimator/src/feature_manager.h b/vins_estimator/src/feature_manager.h index a7f46fc..1080e7f 100644 --- a/vins_estimator/src/feature_manager.h +++ b/vins_estimator/src/feature_manager.h @@ -75,6 +75,7 @@ class FeatureManager void clearState(); int getFeatureCount(); + int getLargestPlaneId(std::vector &init_pids); bool addFeatureCheckParallax(int frame_count, const map>>> &image, double td); void debugShow(); -- 2.17.1 From 20033975cd339f20def81dab5904dd2843d692cc Mon Sep 17 00:00:00 2001 From: Karnik Ram Date: Thu, 4 Feb 2021 09:46:20 +0530 Subject: [PATCH 32/42] Enforce unit N using homo parameterization --- .../launch/playback_vins_airsim.launch | 3 ++- vins_estimator/src/estimator.cpp | 8 ++++---- vins_estimator/src/estimator.h | 2 +- vins_estimator/src/factor/homography_factor.h | 7 +++---- 
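The getLargestPlaneId hunk above counts, among the initialized plane ids, how many tracked features each plane owns and returns the most populated one, which the optimization then uses exclusively. A condensed sketch of that selection (names and types simplified; note the original leaves largest_pid uninitialized when no feature matches any initialized plane, which this sketch guards against by returning -1):

```cpp
#include <cassert>
#include <map>
#include <vector>

// Pick the plane id, from the set of initialized planes, that owns the
// most tracked features. Features are represented here just by their
// plane ids; the repository iterates FeatureManager::feature instead.
int largestPlaneId(const std::vector<int> &init_pids,
                   const std::vector<int> &feature_plane_ids)
{
    std::map<int, int> plane_counts;
    for (int pid : init_pids)
        plane_counts[pid] = 0;

    for (int fpid : feature_plane_ids)
        if (plane_counts.count(fpid) > 0)
            plane_counts[fpid]++;

    int largest_count = -1, largest_pid = -1;  // -1 signals "no plane found"
    for (const auto &pc : plane_counts)
        if (pc.second > largest_count)
        {
            largest_count = pc.second;
            largest_pid = pc.first;
        }
    return largest_pid;
}
```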
vins_estimator/src/initial/initial_sfm.cpp | 20 +++++++++---------- vins_estimator/src/initial/initial_sfm.h | 4 ++-- 6 files changed, 22 insertions(+), 22 deletions(-) diff --git a/vins_estimator/launch/playback_vins_airsim.launch b/vins_estimator/launch/playback_vins_airsim.launch index e86876a..61d8cac 100644 --- a/vins_estimator/launch/playback_vins_airsim.launch +++ b/vins_estimator/launch/playback_vins_airsim.launch @@ -2,7 +2,7 @@ - + @@ -26,4 +26,5 @@ + diff --git a/vins_estimator/src/estimator.cpp b/vins_estimator/src/estimator.cpp index 89d5b15..d4ec043 100644 --- a/vins_estimator/src/estimator.cpp +++ b/vins_estimator/src/estimator.cpp @@ -302,7 +302,7 @@ bool Estimator::initialStructure() // Save estimated normal variables for(auto &est_n : n) { - para_N[est_n.first] = {0, est_n.second(0), est_n.second(1), est_n.second(2)}; + para_N[est_n.first] = {est_n.second(0), est_n.second(1), est_n.second(2)}; } //solve pnp for all frame @@ -619,7 +619,7 @@ void Estimator::initializeNewPlanes() { // Transform est_n to global frame est_n = Rs[i] * est_n; - para_N[plane_id] = {0, est_n(0), est_n(1), est_n(2)}; + para_N[plane_id] = {est_n(0), est_n(1), est_n(2)}; // Estimate plane d using known metric t from vio vi_t = Rs[i].transpose() * (Ps[WINDOW_SIZE - 1] - Ps[i]); @@ -846,8 +846,8 @@ void Estimator::optimization() for(auto &N : para_N) { - ceres::LocalParameterization *local_n_parameterization = new ceres::QuaternionParameterization(); - problem.AddParameterBlock(N.second.data(), 4, local_n_parameterization); + ceres::LocalParameterization *local_n_parameterization = new ceres::HomogeneousVectorParameterization(3); + problem.AddParameterBlock(N.second.data(), 3, local_n_parameterization); } for (int i = 0; i < NUM_OF_CAM; i++) diff --git a/vins_estimator/src/estimator.h b/vins_estimator/src/estimator.h index 49cd656..e794866 100644 --- a/vins_estimator/src/estimator.h +++ b/vins_estimator/src/estimator.h @@ -117,7 +117,7 @@ class Estimator double 
para_Retrive_Pose[SIZE_POSE]; double para_Td[1][1]; double para_Tr[1][1]; - map> para_N; + map> para_N; map> para_d; vector init_pids; diff --git a/vins_estimator/src/factor/homography_factor.h b/vins_estimator/src/factor/homography_factor.h index ce6612b..adb6397 100644 --- a/vins_estimator/src/factor/homography_factor.h +++ b/vins_estimator/src/factor/homography_factor.h @@ -22,10 +22,9 @@ struct HomographyFactor Eigen::Quaternion qic; qic.coeffs() << ex_pose[3], ex_pose[4], ex_pose[5], ex_pose[6]; - Eigen::Map> q_n(para_n); Eigen::Matrix n; - n << q_n[1], q_n[2], q_n[3]; - Eigen::Matrix n_imu_1 = qic*(n.normalized()) + tic; + n << para_n[0], para_n[1], para_n[2]; + Eigen::Matrix n_imu_1 = qic*n + tic; Eigen::Matrix n_imu_i = qi.inverse()*n_imu_1 - qi.inverse()*pi; Eigen::Map> inv_depth(para_inv_depth); @@ -47,7 +46,7 @@ struct HomographyFactor static ceres::CostFunction* Create(const Eigen::Vector3d &_pts_i, const Eigen::Vector3d &_pts_j) { - return (new ceres::AutoDiffCostFunction + return (new ceres::AutoDiffCostFunction (new HomographyFactor(_pts_i, _pts_j))); } diff --git a/vins_estimator/src/initial/initial_sfm.cpp b/vins_estimator/src/initial/initial_sfm.cpp index ad9f729..7726410 100644 --- a/vins_estimator/src/initial/initial_sfm.cpp +++ b/vins_estimator/src/initial/initial_sfm.cpp @@ -342,7 +342,7 @@ bool GlobalSFM::constructH(int frame_num, Quaterniond* q, Vector3d* T, int l, Quaterniond c_Quat[frame_num]; double c_rotation[frame_num][4]; double c_translation[frame_num][3]; - double c_normal[4]; + double c_normal[3]; Eigen::Matrix Pose[frame_num]; c_Quat[l] = q[l].inverse(); @@ -420,7 +420,7 @@ bool GlobalSFM::constructH(int frame_num, Quaterniond* q, Vector3d* T, int l, sfm_f[j].image[1] = point1[1]; sfm_f[j].image[2] = 1; //cout << "trangulated : " << frame_0 << " " << frame_1 << " 3d point : " << j << " " << point_3d.transpose() << endl; - } + } } /* @@ -439,6 +439,7 @@ bool GlobalSFM::constructH(int frame_num, Quaterniond* q, Vector3d* T, int l, 
//full BA ceres::Problem problem; ceres::LocalParameterization* local_parameterization = new ceres::QuaternionParameterization(); + ceres::LocalParameterization* local_n_parameterization = new ceres::HomogeneousVectorParameterization(3); //cout << " begin full BA " << endl; for (int i = 0; i < frame_num; i++) { @@ -462,11 +463,10 @@ bool GlobalSFM::constructH(int frame_num, Quaterniond* q, Vector3d* T, int l, } } - c_normal[0] = 0; - c_normal[1] = n[0]; - c_normal[2] = n[1]; - c_normal[3] = n[2]; - problem.AddParameterBlock(c_normal, 4, local_parameterization); + c_normal[0] = n[0]; + c_normal[1] = n[1]; + c_normal[2] = n[2]; + problem.AddParameterBlock(c_normal, 3, local_n_parameterization); for (int i = 0; i < feature_num; i++) { @@ -527,9 +527,9 @@ bool GlobalSFM::constructH(int frame_num, Quaterniond* q, Vector3d* T, int l, sfm_tracked_points[sfm_f[i].id] = Vector3d(sfm_f[i].position[0], sfm_f[i].position[1], sfm_f[i].position[2]); } - n[0] = c_normal[1]; - n[1] = c_normal[2]; - n[2] = c_normal[3]; + n[0] = c_normal[0]; + n[1] = c_normal[1]; + n[2] = c_normal[2]; return true; diff --git a/vins_estimator/src/initial/initial_sfm.h b/vins_estimator/src/initial/initial_sfm.h index 3c51e44..2ee3418 100644 --- a/vins_estimator/src/initial/initial_sfm.h +++ b/vins_estimator/src/initial/initial_sfm.h @@ -67,7 +67,7 @@ struct ReprojectionErrorH { T rp[3]; ceres::QuaternionRotatePoint(R, point, rp); - T np = n[1] * point[0] + n[2] * point[1] + n[3] * point[2]; + T np = n[0] * point[0] + n[1] * point[1] + n[2] * point[2]; rp[0] += t[0]*np; rp[1] += t[1]*np; rp[2] += t[2]*np; T xp = rp[0] / rp[2]; T yp = rp[1] / rp[2]; @@ -80,7 +80,7 @@ struct ReprojectionErrorH const double observed_y) { return (new ceres::AutoDiffCostFunction< - ReprojectionErrorH, 2, 4, 3, 4, 3>( + ReprojectionErrorH, 2, 4, 3, 3, 3>( new ReprojectionErrorH(observed_x,observed_y))); } -- 2.17.1 From 8e05539a6ebdaa1ebe9e867551124d7cd7867ee3 Mon Sep 17 00:00:00 2001 From: Karnik Ram Date: Tue, 9 Feb 2021 
09:22:40 +0530 Subject: [PATCH 33/42] Fix bug in normal transformation --- vins_estimator/src/factor/homography_factor.h | 4 ++-- vins_estimator/src/initial/initial_sfm.cpp | 2 +- vins_estimator/src/initial/initial_sfm.h | 7 +++---- 3 files changed, 6 insertions(+), 7 deletions(-) diff --git a/vins_estimator/src/factor/homography_factor.h b/vins_estimator/src/factor/homography_factor.h index adb6397..7107036 100644 --- a/vins_estimator/src/factor/homography_factor.h +++ b/vins_estimator/src/factor/homography_factor.h @@ -24,8 +24,8 @@ struct HomographyFactor Eigen::Matrix n; n << para_n[0], para_n[1], para_n[2]; - Eigen::Matrix n_imu_1 = qic*n + tic; - Eigen::Matrix n_imu_i = qi.inverse()*n_imu_1 - qi.inverse()*pi; + Eigen::Matrix n_imu_1 = qic*n;// + tic; + Eigen::Matrix n_imu_i = qi.inverse()*n_imu_1;// - qi.inverse()*pi; Eigen::Map> inv_depth(para_inv_depth); diff --git a/vins_estimator/src/initial/initial_sfm.cpp b/vins_estimator/src/initial/initial_sfm.cpp index 7726410..f319bd9 100644 --- a/vins_estimator/src/initial/initial_sfm.cpp +++ b/vins_estimator/src/initial/initial_sfm.cpp @@ -481,7 +481,7 @@ bool GlobalSFM::constructH(int frame_num, Quaterniond* q, Vector3d* T, int l, sfm_f[i].observation[j].second.y()); problem.AddResidualBlock(b_cost_function, NULL, c_rotation[l], c_translation[l], - sfm_f[i].position); + sfm_f[i].position); ceres::CostFunction* h_cost_function = ReprojectionErrorH::Create( sfm_f[i].observation[j].second.x(), diff --git a/vins_estimator/src/initial/initial_sfm.h b/vins_estimator/src/initial/initial_sfm.h index 2ee3418..9699356 100644 --- a/vins_estimator/src/initial/initial_sfm.h +++ b/vins_estimator/src/initial/initial_sfm.h @@ -11,8 +11,6 @@ using namespace Eigen; using namespace std; - - struct SFMFeature { bool state; @@ -66,8 +64,9 @@ struct ReprojectionErrorH bool operator()(const T* const R, const T* const t, const T* const n, const T* point, T* residuals) const { T rp[3]; - ceres::QuaternionRotatePoint(R, point, rp); - T np 
= n[0] * point[0] + n[1] * point[1] + n[2] * point[2]; + T norm_point[3] = {point[0]/point[2], point[1]/point[2], point[2]/point[2]}; + ceres::QuaternionRotatePoint(R, norm_point, rp); + T np = n[0] * norm_point[0] + n[1] * norm_point[1] + n[2] * norm_point[2]; rp[0] += t[0]*np; rp[1] += t[1]*np; rp[2] += t[2]*np; T xp = rp[0] / rp[2]; T yp = rp[1] / rp[2]; -- 2.17.1 From e5f5255debd246caf27345f43b185b5bc58d968b Mon Sep 17 00:00:00 2001 From: Karnik Ram Date: Tue, 9 Feb 2021 09:23:13 +0530 Subject: [PATCH 34/42] Modify config --- config/hvins_airsim_config.yaml | 83 +++++++++++++++++++ .../launch/playback_vins_airsim.launch | 4 +- 2 files changed, 85 insertions(+), 2 deletions(-) create mode 100644 config/hvins_airsim_config.yaml diff --git a/config/hvins_airsim_config.yaml b/config/hvins_airsim_config.yaml new file mode 100644 index 0000000..5af6480 --- /dev/null +++ b/config/hvins_airsim_config.yaml @@ -0,0 +1,83 @@ +%YAML:1.0 + +#common parameters +imu_topic: "/imu_throttled" +image_topic: "/image" +mask_topic: "/mask" +output_path: "/home/karnik/output/" + +#camera calibration +model_type: PINHOLE +camera_name: camera +image_width: 640 +image_height: 480 +distortion_parameters: + k1: 0 + k2: 0 + p1: 0 + p2: 0 +projection_parameters: + fx: 320 + fy: 320 + cx: 320 + cy: 240 + +# Extrinsic parameter between IMU and Camera. +estimate_extrinsic: 0 # 0 Have an accurate extrinsic parameters. We will trust the following imu^R_cam, imu^T_cam, don't change it. + # 1 Have an initial guess about extrinsic parameters. We will optimize around your initial guess. + # 2 Don't know anything about extrinsic parameters. You don't need to give R,T. We will try to calibrate it. Do some rotation movement at beginning. +#If you choose 0 or 1, you should write down the following matrix. 
+ +#Rotation from camera frame to imu frame, imu^R_cam +extrinsicRotation: !!opencv-matrix + rows: 3 + cols: 3 + dt: d + data: [-0.00017913,-0.00042042,0.9999999,0.99999996,-0.00023746, 0.00017903,0.00023739,0.99999988,0.00042046] #[0,0,1,1,0,0,0,1,0] + +#Translation from camera frame to imu frame, imu^T_cam +extrinsicTranslation: !!opencv-matrix + rows: 3 + cols: 1 + dt: d + data: [0.49118794, 0.02176473, 0.01374164] #[0.50,0,0] + +#feature traker parameters +max_cnt: 150 # max feature number in feature tracking +min_dist: 30 # min distance between two features +freq: 10 # frequence (Hz) of publish tracking result. At least 10Hz for good estimation. If set 0, the frequence will be same as raw image +F_threshold: 1.0 # ransac threshold (pixel) +show_track: 1 # publish tracking image as topic +equalize: 1 # if image is too dark or light, trun on equalize to find enough features +fisheye: 0 # if using fisheye, trun on it. A circle mask will be loaded to remove edge noisy points + +#optimization parameters +max_solver_time: 0.04 # max solver itration time (ms), to guarantee real time +max_num_iterations: 8 # max solver itrations, to guarantee real time +keyframe_parallax: 10.0 # keyframe selection threshold (pixel) + +#imu parameters The more accurate parameters you provide, the better performance +acc_n: 0.074427240 # accelerometer measurement noise standard deviation. #0.2 0.04 +gyr_n: 0.002759607 # gyroscope measurement noise standard deviation. #0.05 0.004 +acc_w: 3.9471004e-7 # accelerometer bias random work noise standard deviation. #0.02 +gyr_w: 3.1538983e-8 # gyroscope bias random work noise standard deviation. 
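Patches 32 and 33 earlier in this series parameterize the plane normal as a unit 3-vector (via HomogeneousVectorParameterization) and evaluate a plane-induced homography in ReprojectionErrorH: for points on a plane n^T X = d, two views related by (R, t) satisfy x' ~ (R + t n^T / d) x. A small dependency-free numeric sketch of that mapping (values and helper names are illustrative, not from the repository; the factor folds 1/d into an inverse-depth variable):

```cpp
#include <array>
#include <cassert>
#include <cmath>

using Vec3 = std::array<double, 3>;
using Mat3 = std::array<std::array<double, 3>, 3>;

// Apply the plane-induced homography H = R + t * n^T / d to a point x.
// n is the unit plane normal and d the plane offset, both expressed in
// the reference view's frame.
Vec3 applyPlaneHomography(const Mat3 &R, const Vec3 &t,
                          const Vec3 &n, double d, const Vec3 &x)
{
    double np = (n[0]*x[0] + n[1]*x[1] + n[2]*x[2]) / d;  // n^T x / d
    Vec3 y{};
    for (int i = 0; i < 3; i++)
        y[i] = R[i][0]*x[0] + R[i][1]*x[1] + R[i][2]*x[2] + t[i]*np;
    return y;
}
```

Dividing y by its third component gives the predicted normalized image point in the second view, which is what the factor compares against the observation.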
#4.0e-5 +g_norm: 9.80665 # gravity magnitude + +#loop closure parameters +loop_closure: 1 # start loop closure +load_previous_pose_graph: 0 # load and reuse previous pose graph; load from 'pose_graph_save_path' +fast_relocalization: 0 # useful in real-time and large project +pose_graph_save_path: "/home/karnik/" # save and load path + +#unsynchronization parameters +estimate_td: 0 # online estimate time offset between camera and imu +td: -0.03079132514112524 # -0.04109 # initial value of time offset. unit: s. readed image clock + td = real image clock (IMU clock) + +#rolling shutter parameters +rolling_shutter: 0 # 0: global shutter camera, 1: rolling shutter camera +rolling_shutter_tr: 0 # unit: s. rolling shutter read out time per frame (from data sheet). + +#visualization parameters +save_image: 1 # save image in pose graph for visualization prupose; you can close this function by setting 0 +visualize_imu_forward: 0 # output imu forward propogation to achieve low latency and high frequence results +visualize_camera_size: 0.4 # size of camera marker in RVIZ diff --git a/vins_estimator/launch/playback_vins_airsim.launch b/vins_estimator/launch/playback_vins_airsim.launch index 61d8cac..9925a21 100644 --- a/vins_estimator/launch/playback_vins_airsim.launch +++ b/vins_estimator/launch/playback_vins_airsim.launch @@ -8,10 +8,10 @@ - + - + -- 2.17.1 From 2de4f06206bb221405ffa3e10f99766d6e0847e0 Mon Sep 17 00:00:00 2001 From: Karnik Ram Date: Thu, 11 Feb 2021 11:13:51 +0530 Subject: [PATCH 35/42] Modify config, use only H inliers for mean vector --- config/hvins_airsim_config.yaml | 2 +- config/vins_rviz_config.rviz | 10 +++++----- vins_estimator/src/initial/solve_5pts.cpp | 18 +++++++++++------- 3 files changed, 17 insertions(+), 13 deletions(-) diff --git a/config/hvins_airsim_config.yaml b/config/hvins_airsim_config.yaml index 5af6480..55546be 100644 --- a/config/hvins_airsim_config.yaml +++ b/config/hvins_airsim_config.yaml @@ -43,7 +43,7 @@ extrinsicTranslation: 
!!opencv-matrix data: [0.49118794, 0.02176473, 0.01374164] #[0.50,0,0] #feature traker parameters -max_cnt: 150 # max feature number in feature tracking +max_cnt: 200 # max feature number in feature tracking min_dist: 30 # min distance between two features freq: 10 # frequence (Hz) of publish tracking result. At least 10Hz for good estimation. If set 0, the frequence will be same as raw image F_threshold: 1.0 # ransac threshold (pixel) diff --git a/config/vins_rviz_config.rviz b/config/vins_rviz_config.rviz index 55ed131..e0c3a51 100644 --- a/config/vins_rviz_config.rviz +++ b/config/vins_rviz_config.rviz @@ -6,7 +6,7 @@ Panels: Expanded: - /VIO1 Splitter Ratio: 0.4651159942150116 - Tree Height: 151 + Tree Height: 140 - Class: rviz/Selection Name: Selection - Class: rviz/Tool Properties @@ -295,7 +295,7 @@ Visualization Manager: Queue Size: 100 Value: true - Class: rviz/Image - Enabled: true + Enabled: false Image Topic: /pose_graph/match_image Max Value: 1 Median window: 5 @@ -305,7 +305,7 @@ Visualization Manager: Queue Size: 2 Transport Hint: raw Unreliable: false - Value: true + Value: false - Class: rviz/Marker Enabled: true Marker Topic: /pose_graph/key_odometrys @@ -506,7 +506,7 @@ Window Geometry: Height: 734 Hide Left Dock: false Hide Right Dock: true - QMainWindow State: 
000000ff00000000fd00000004000000000000015600000243fc0200000010fb0000000a0049006d00610067006501000000280000013d0000000000000000fb0000001200530065006c0065006300740069006f006e00000001e10000009b0000005c00fffffffb0000001e0054006f006f006c002000500072006f007000650072007400690065007302000001ed000001df00000185000000a3fb000000120056006900650077007300200054006f006f02000001df000002110000018500000122fb000000200054006f006f006c002000500072006f0070006500720074006900650073003203000002880000011d000002210000017afb0000002000730065006c0065006300740069006f006e00200062007500660066006500720200000138000000aa0000023a00000294fb00000014005700690064006500530074006500720065006f02000000e6000000d2000003ee0000030bfb0000000c004b0069006e0065006300740200000186000001060000030c00000261fb0000001200720061007700200049006d0061006700650000000028000000300000000000000000fb00000012007200610077005f0069006d0061006700650000000028000000f90000001600fffffffb0000001a0074007200610063006b0065006400200069006d006100670065010000003d000000de0000001600fffffffb00000020006c006f006f0070005f006d0061007400630068005f0069006d0061006700650100000121000000850000001600fffffffb000000100044006900730070006c00610079007301000001ac000000d4000000c900fffffffc000000280000011e0000000000fffffffa000000000100000002fb0000001200720061007700200049006d0061006700650000000000ffffffff0000000000000000fb0000001a0074007200610063006b0065006400200069006d0061006700650100000000000002370000000000000000fb0000001000410052005f0069006d0061006700650100000373000000160000000000000000fb0000001200720061007700200069006d006100670065010000038f000000160000000000000000000000010000020800000399fc0200000003fb0000001e0054006f006f006c002000500072006f00700065007200740069006500730100000041000000780000000000000000fc00000028000003990000000000fffffffaffffffff0100000002fb000000100044006900730070006c0061007900730000000000ffffffff0000015600fffffffb0000000a00560069006500770073000000023f0000016a0000010000fffffffb0000001200530065006c0065006300740069006f006e010000025a000000b2000000000000000000
00000200000490000000a9fc0100000001fb0000000a00560069006500770073030000004e00000080000002e10000019700000003000002a30000003bfc0100000002fb0000000800540069006d00650100000000000002a3000002eb00fffffffb0000000800540069006d00650100000000000004500000000000000000000001470000024300000004000000040000000800000008fc0000000100000002000000010000000a0054006f006f006c00730100000000ffffffff0000000000000000 + QMainWindow State: 000000ff00000000fd00000004000000000000015600000243fc0200000010fb0000000a0049006d00610067006501000000280000013d0000000000000000fb0000001200530065006c0065006300740069006f006e00000001e10000009b0000005c00fffffffb0000001e0054006f006f006c002000500072006f007000650072007400690065007302000001ed000001df00000185000000a3fb000000120056006900650077007300200054006f006f02000001df000002110000018500000122fb000000200054006f006f006c002000500072006f0070006500720074006900650073003203000002880000011d000002210000017afb0000002000730065006c0065006300740069006f006e00200062007500660066006500720200000138000000aa0000023a00000294fb00000014005700690064006500530074006500720065006f02000000e6000000d2000003ee0000030bfb0000000c004b0069006e0065006300740200000186000001060000030c00000261fb0000001200720061007700200049006d0061006700650000000028000000300000000000000000fb00000012007200610077005f0069006d0061006700650000000028000000f90000001600fffffffb0000001a0074007200610063006b0065006400200069006d006100670065010000003d000001740000001600fffffffb00000020006c006f006f0070005f006d0061007400630068005f0069006d0061006700650000000121000000850000001600fffffffb000000100044006900730070006c00610079007301000001b7000000c9000000c900fffffffc000000280000011e0000000000fffffffa000000000100000002fb0000001200720061007700200049006d0061006700650000000000ffffffff0000000000000000fb0000001a0074007200610063006b0065006400200069006d0061006700650100000000000002370000000000000000fb0000001000410052005f0069006d0061006700650100000373000000160000000000000000fb0000001200720061007700200069006d006100670065010000038f0000001600000000000000000000
00010000020800000399fc0200000003fb0000001e0054006f006f006c002000500072006f00700065007200740069006500730100000041000000780000000000000000fc00000028000003990000000000fffffffaffffffff0100000002fb000000100044006900730070006c0061007900730000000000ffffffff0000015600fffffffb0000000a00560069006500770073000000023f0000016a0000010000fffffffb0000001200530065006c0065006300740069006f006e010000025a000000b200000000000000000000000200000490000000a9fc0100000001fb0000000a00560069006500770073030000004e00000080000002e10000019700000003000002eb0000003bfc0100000002fb0000000800540069006d00650100000000000002eb000002eb00fffffffb0000000800540069006d006501000000000000045000000000000000000000018f0000024300000004000000040000000800000008fc0000000100000002000000010000000a0054006f006f006c00730100000000ffffffff0000000000000000 Selection: collapsed: false Time: @@ -515,7 +515,7 @@ Window Geometry: collapsed: false Views: collapsed: true - Width: 675 + Width: 747 X: 685 Y: 4 loop_match_image: diff --git a/vins_estimator/src/initial/solve_5pts.cpp b/vins_estimator/src/initial/solve_5pts.cpp index a14ec1c..f7aa380 100644 --- a/vins_estimator/src/initial/solve_5pts.cpp +++ b/vins_estimator/src/initial/solve_5pts.cpp @@ -240,19 +240,23 @@ bool MotionEstimator::solveRelativeHRT(const vector> &c } cv::Mat mask; //cv::Mat E = cv::findFundamentalMat(ll, rr, cv::FM_RANSAC, 0.3 / 460, 0.99, mask); - cv::Mat H = cv::findHomography(ll, rr, cv::RANSAC, 0.3/480); + cv::Mat H = cv::findHomography(ll, rr, cv::RANSAC, 0.3/460, mask, 2000, 0.99); cv::Mat K = (cv::Mat_(3, 3) << 1, 0, 0, 0, 1, 0, 0, 0, 1); // Compute mean point vector Eigen::Vector3d mean_l(0, 0, 1); - for(cv::Point2f &point : ll) + int inlier_count = 0; + for(int i = 0; i < ll.size(); i++) { - mean_l(0) += point.x; - mean_l(1) += point.y; + if(mask.at(i,0) != 0) + { + mean_l(0) += ll[i].x; + mean_l(1) += ll[i].y; + inlier_count++; + } } - - mean_l(0) /= int(ll.size()); - mean_l(1) /= int(ll.size()); + mean_l(0) /= int(inlier_count); + mean_l(1) /= 
int(inlier_count); Eigen::Matrix4d est_Tr; Eigen::Vector3d est_n; -- 2.17.1 From 518208852f13b10d7ebc3efaa254564c09f35a46 Mon Sep 17 00:00:00 2001 From: Karnik Ram Date: Thu, 11 Feb 2021 20:22:00 +0530 Subject: [PATCH 36/42] Consider mask during feature tracking as well, use H for feature rejection instead of F --- config/hvins_airsim_config.yaml | 4 +-- feature_tracker/src/feature_tracker.cpp | 45 +++++++++++++++++-------- feature_tracker/src/feature_tracker.h | 4 +-- 3 files changed, 35 insertions(+), 18 deletions(-) diff --git a/config/hvins_airsim_config.yaml b/config/hvins_airsim_config.yaml index 55546be..7847070 100644 --- a/config/hvins_airsim_config.yaml +++ b/config/hvins_airsim_config.yaml @@ -43,10 +43,10 @@ extrinsicTranslation: !!opencv-matrix data: [0.49118794, 0.02176473, 0.01374164] #[0.50,0,0] #feature traker parameters -max_cnt: 200 # max feature number in feature tracking +max_cnt: 250 # max feature number in feature tracking min_dist: 30 # min distance between two features freq: 10 # frequence (Hz) of publish tracking result. At least 10Hz for good estimation. If set 0, the frequence will be same as raw image -F_threshold: 1.0 # ransac threshold (pixel) +H_threshold: 1.0 # ransac threshold (pixel) show_track: 1 # publish tracking image as topic equalize: 1 # if image is too dark or light, trun on equalize to find enough features fisheye: 0 # if using fisheye, trun on it. 
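Patch 35 above changes `solveRelativeHRT` to average only the correspondences that the homography RANSAC flagged as inliers, instead of all of `ll`. A minimal Python sketch of that inlier-aware mean (hypothetical `points`/`mask` data; in the C++ code the mask comes from `cv::findHomography`, where a nonzero entry marks an inlier):

```python
def masked_mean(points, mask):
    # Average only the correspondences flagged as RANSAC inliers
    # (nonzero mask entry), mirroring the patched mean_l computation.
    inliers = [p for p, m in zip(points, mask) if m != 0]
    if not inliers:
        return None  # no inliers: caller should reject this hypothesis
    n = len(inliers)
    return (sum(x for x, _ in inliers) / n, sum(y for _, y in inliers) / n)
```

Averaging over inliers only keeps a handful of gross outliers from dragging the mean point (used later to seed the homography decomposition) off the plane.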
A circle mask will be loaded to remove edge noisy points diff --git a/feature_tracker/src/feature_tracker.cpp b/feature_tracker/src/feature_tracker.cpp index 6548c36..5537c6b 100644 --- a/feature_tracker/src/feature_tracker.cpp +++ b/feature_tracker/src/feature_tracker.cpp @@ -32,13 +32,12 @@ FeatureTracker::FeatureTracker() { } -void FeatureTracker::setMask() +void FeatureTracker::setMask(const cv::Mat &_mask) { - if(FISHEYE) - mask = fisheye_mask.clone(); - else - mask = cv::Mat(ROW, COL, CV_8UC1, cv::Scalar(255)); - + //mask = cv::Mat(ROW, COL, CV_8UC1, cv::Scalar(255)); + mask = _mask.clone(); + //cv::Mat erosion_mat = cv::getStructuringElement(cv::MORPH_RECT, cv::Size(21,21)); + cv::erode(mask, mask, cv::Mat(), cv::Point(-1, -1), 2, 1, 1); // prefer to keep features that are tracked for long time vector>>> cnt_pts_id_pid; @@ -61,7 +60,7 @@ void FeatureTracker::setMask() for (auto &it : cnt_pts_id_pid) { - if (mask.at(it.second.first) == 255) + if (mask.at(it.second.first) != 0) { forw_pts.push_back(it.second.first); ids.push_back(it.second.second.first); @@ -165,10 +164,10 @@ void FeatureTracker::readImage(const cv::Mat &_img, const cv::Mat &_mask, double if (PUB_THIS_FRAME) { - rejectWithF(); + rejectWithH(); ROS_DEBUG("set mask begins"); TicToc t_m; - setMask(); + setMask(_mask); ROS_DEBUG("set mask costs %fms", t_m.toc()); ROS_DEBUG("detect feature begins"); @@ -202,11 +201,11 @@ void FeatureTracker::readImage(const cv::Mat &_img, const cv::Mat &_mask, double prev_time = cur_time; } -void FeatureTracker::rejectWithF() +void FeatureTracker::rejectWithH() { - if (forw_pts.size() >= 8) + if (forw_pts.size() >= 4) { - ROS_DEBUG("FM ransac begins"); + ROS_DEBUG("HM ransac begins"); TicToc t_f; vector un_cur_pts(cur_pts.size()), un_forw_pts(forw_pts.size()); for (unsigned int i = 0; i < cur_pts.size(); i++) @@ -223,9 +222,27 @@ void FeatureTracker::rejectWithF() un_forw_pts[i] = cv::Point2f(tmp_p.x(), tmp_p.y()); } - vector status; - 
cv::findFundamentalMat(un_cur_pts, un_forw_pts, cv::FM_RANSAC, F_THRESHOLD, 0.99, status); + vector status(un_cur_pts.size()); + //cv::findFundamentalMat(un_cur_pts, un_forw_pts, cv::FM_RANSAC, F_THRESHOLD, 0.99, status); //cv::findHomography(un_cur_pts, un_forw_pts, cv::RANSAC, 3, status); + + map> pid_un_cur_pts, pid_un_forw_pts; + map, int> new_ids_to_old; + for(uint i = 0; i < un_cur_pts.size(); i++) + { + pid_un_cur_pts[plane_ids[i]].push_back(un_cur_pts[i]); + pid_un_forw_pts[plane_ids[i]].push_back(un_forw_pts[i]); + new_ids_to_old[make_pair(plane_ids[i], pid_un_cur_pts[plane_ids[i]].size()-1)] = i; + } + + for(auto &imap : pid_un_cur_pts) + { + vector pid_status; + cv::findHomography(pid_un_cur_pts[imap.first], pid_un_forw_pts[imap.first], cv::RANSAC, 1.0, pid_status, 2000, 0.99); + for(int i = 0; i < pid_status.size(); i++) + status[new_ids_to_old[make_pair(imap.first, i)]] = pid_status[i]; + } + int size_a = cur_pts.size(); reduceVector(prev_pts, status); reduceVector(cur_pts, status); diff --git a/feature_tracker/src/feature_tracker.h b/feature_tracker/src/feature_tracker.h index 5db3810..72cc8d8 100644 --- a/feature_tracker/src/feature_tracker.h +++ b/feature_tracker/src/feature_tracker.h @@ -32,7 +32,7 @@ class FeatureTracker void readImage(const cv::Mat &_img, const cv::Mat &_mask, double _cur_time); - void setMask(); + void setMask(const cv::Mat &_mask); void addPoints(const cv::Mat &_mask); @@ -42,7 +42,7 @@ class FeatureTracker void showUndistortion(const string &name); - void rejectWithF(); + void rejectWithH(); void undistortedPoints(); -- 2.17.1 From 911c4c709cd356869c9a3394bb3a2cf4602de65c Mon Sep 17 00:00:00 2001 From: Karnik Ram Date: Fri, 12 Feb 2021 10:58:47 +0530 Subject: [PATCH 37/42] Revert to airsim spatial extrinsics --- config/hvins_airsim_config.yaml | 4 ++-- 1 file changed, 2 insertions(+), 2 deletions(-) diff --git a/config/hvins_airsim_config.yaml b/config/hvins_airsim_config.yaml index 7847070..f8e2908 100644 --- 
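Patch 36's `rejectWithH` splits the tracked correspondences by plane id, fits a homography per plane with RANSAC, and scatters each per-plane inlier status back into the global `status` vector via the `new_ids_to_old` map. A small Python sketch of that grouping-and-scatter step (the `fit_fn` callback is a stand-in for the RANSAC inlier mask returned by `cv::findHomography`):

```python
def reject_per_plane(pts_cur, pts_forw, plane_ids, fit_fn):
    # Group correspondence indices by plane id (pid_un_cur_pts in the
    # patch), remembering each point's global index (new_ids_to_old).
    groups = {}
    for i, pid in enumerate(plane_ids):
        groups.setdefault(pid, []).append(i)
    status = [0] * len(pts_cur)
    for pid, idxs in groups.items():
        a = [pts_cur[i] for i in idxs]
        b = [pts_forw[i] for i in idxs]
        # fit_fn stands in for cv::findHomography's per-group RANSAC mask.
        for global_i, ok in zip(idxs, fit_fn(a, b)):
            status[global_i] = 1 if ok else 0
    return status
```

Fitting one homography per plane is what lets the tracker keep features on several planes at once: a single global model would reject every plane but the dominant one.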
a/config/hvins_airsim_config.yaml +++ b/config/hvins_airsim_config.yaml @@ -33,14 +33,14 @@ extrinsicRotation: !!opencv-matrix rows: 3 cols: 3 dt: d - data: [-0.00017913,-0.00042042,0.9999999,0.99999996,-0.00023746, 0.00017903,0.00023739,0.99999988,0.00042046] #[0,0,1,1,0,0,0,1,0] + data: [0,0,1,1,0,0,0,1,0] #[-0.00017913,-0.00042042,0.9999999,0.99999996,-0.00023746, 0.00017903,0.00023739,0.99999988,0.00042046] #[0,0,1,1,0,0,0,1,0] #Translation from camera frame to imu frame, imu^T_cam extrinsicTranslation: !!opencv-matrix rows: 3 cols: 1 dt: d - data: [0.49118794, 0.02176473, 0.01374164] #[0.50,0,0] + data: [0.50, 0, 0] #[0.49118794, 0.02176473, 0.01374164] #[0.50,0,0] #feature traker parameters max_cnt: 250 # max feature number in feature tracking -- 2.17.1 From 592768ea351956ff4a573292e78bc67449438db2 Mon Sep 17 00:00:00 2001 From: Karnik Ram Date: Sun, 14 Feb 2021 12:48:22 +0530 Subject: [PATCH 38/42] Handle possible decomposeH exception --- vins_estimator/src/initial/solve_5pts.cpp | 29 +++++++++++++---------- 1 file changed, 17 insertions(+), 12 deletions(-) diff --git a/vins_estimator/src/initial/solve_5pts.cpp b/vins_estimator/src/initial/solve_5pts.cpp index f7aa380..bfba3f3 100644 --- a/vins_estimator/src/initial/solve_5pts.cpp +++ b/vins_estimator/src/initial/solve_5pts.cpp @@ -259,11 +259,14 @@ bool MotionEstimator::solveRelativeHRT(const vector> &c mean_l(1) /= int(inlier_count); Eigen::Matrix4d est_Tr; - Eigen::Vector3d est_n; + Eigen::Vector3d est_n(0,0,0); decomposeH(H, K, R_imu, TrIC, mean_l, est_Tr, est_n); Rotation = est_Tr.block(0,0,3,3); Translation = est_Tr.block(0,3,3,1); n = est_n; + if(n.isZero()) + return false; + return true; } return false; @@ -295,7 +298,6 @@ void MotionEstimator::decomposeH(const cv::Mat &H, const cv::Mat &K, const Matri //Tr = TrIC * Tr * TrIC.inverse(); Tr = Tr.inverse().eval(); - Vector3d e3(0, 0, 1); Vector3d n; cv::cv2eigen(cv_ns[i], n); n.normalize(); @@ -306,18 +308,21 @@ void MotionEstimator::decomposeH(const 
cv::Mat &H, const cv::Mat &K, const Matri } } - vector rot_diff; - for(size_t i = 0; i < positive_depth_transforms.size(); i++) + if(positive_depth_transforms.size() > 0) { - Eigen::Matrix4d Tr = TrIC * positive_depth_transforms[i] * TrIC.inverse(); - Eigen::Matrix3d R = Tr.block(0,0,3,3); - double f = (R.transpose()*R_imu - MatrixXd::Identity(3,3)).norm(); - rot_diff.push_back(f); - } + vector rot_diff; + for(size_t i = 0; i < positive_depth_transforms.size(); i++) + { + Eigen::Matrix4d Tr = TrIC * positive_depth_transforms[i] * TrIC.inverse(); + Eigen::Matrix3d R = Tr.block(0,0,3,3); + double f = (R.transpose()*R_imu - MatrixXd::Identity(3,3)).norm(); + rot_diff.push_back(f); + } - int min_index = std::min_element(rot_diff.begin(), rot_diff.end()) - rot_diff.begin(); - est_Tr = positive_depth_transforms[min_index]; - est_n = positive_depth_normals[min_index]; + int min_index = std::min_element(rot_diff.begin(), rot_diff.end()) - rot_diff.begin(); + est_Tr = positive_depth_transforms[min_index]; + est_n = positive_depth_normals[min_index]; + } } else -- 2.17.1 From 1d0449b500e822224680d90503fcf785a2b28ab2 Mon Sep 17 00:00:00 2001 From: Karnik Ram Date: Sun, 14 Feb 2021 12:48:49 +0530 Subject: [PATCH 39/42] Move plane variables to camera frame properly --- vins_estimator/src/estimator.cpp | 4 ++-- vins_estimator/src/estimator.h | 2 ++ 2 files changed, 4 insertions(+), 2 deletions(-) diff --git a/vins_estimator/src/estimator.cpp b/vins_estimator/src/estimator.cpp index d4ec043..86033e9 100644 --- a/vins_estimator/src/estimator.cpp +++ b/vins_estimator/src/estimator.cpp @@ -618,11 +618,11 @@ void Estimator::initializeNewPlanes() if(average_parallax * 460 > 30 && m_estimator.solveRelativeHRT(corres, R_imu, TrIC, est_R, est_t, est_n)) { // Transform est_n to global frame - est_n = Rs[i] * est_n; + est_n = ric[0].transpose() * Rs[i] * ric[0] * est_n; para_N[plane_id] = {est_n(0), est_n(1), est_n(2)}; // Estimate plane d using known metric t from vio - vi_t = 
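Patch 38 guards `decomposeH` against the case where no decomposition has positive plane depth; when candidates do exist, it keeps the one whose rotation best agrees with the IMU rotation prior by minimizing ||R^T R_imu - I||_F. A pure-Python sketch of that selection rule (3x3 matrices as nested lists; the candidate normals here are placeholder tags, not real plane normals):

```python
def pick_rotation_consistent(candidates, R_imu):
    # Choose the (R, n) candidate minimizing ||R^T * R_imu - I||_F,
    # as decomposeH does after filtering for positive plane depth.
    def matmul(A, B):
        return [[sum(A[i][k] * B[k][j] for k in range(3)) for j in range(3)]
                for i in range(3)]
    def transpose(A):
        return [[A[j][i] for j in range(3)] for i in range(3)]
    def dist_to_identity(M):
        return sum((M[i][j] - (1.0 if i == j else 0.0)) ** 2
                   for i in range(3) for j in range(3)) ** 0.5
    scores = [dist_to_identity(matmul(transpose(R), R_imu))
              for R, _n in candidates]
    return candidates[scores.index(min(scores))]
```

The homography decomposition is ambiguous (up to four R, t, n solutions), so an external rotation estimate, here from IMU pre-integration, is the natural tie-breaker.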
Rs[i].transpose() * (Ps[WINDOW_SIZE - 1] - Ps[i]); + vi_t = ric[0].transpose() * Rs[i].transpose() * (Ps[WINDOW_SIZE - 1] - Ps[i]); para_d[plane_id] = {(1.0*vi_t.norm()/est_t.norm())}; init_pids.push_back(plane_id); ROS_INFO("Initialized plane %d features", plane_id); diff --git a/vins_estimator/src/estimator.h b/vins_estimator/src/estimator.h index e794866..6295758 100644 --- a/vins_estimator/src/estimator.h +++ b/vins_estimator/src/estimator.h @@ -117,6 +117,8 @@ class Estimator double para_Retrive_Pose[SIZE_POSE]; double para_Td[1][1]; double para_Tr[1][1]; + + // In global camera frame (c_0) map> para_N; map> para_d; -- 2.17.1 From fe8857b092935da13daeee267612bc1a6e6c11bf Mon Sep 17 00:00:00 2001 From: Karnik Ram Date: Sun, 14 Feb 2021 19:50:16 +0530 Subject: [PATCH 40/42] Transform d variable between frames properly --- vins_estimator/src/estimator.cpp | 7 +++--- vins_estimator/src/factor/homography_factor.h | 24 ++++++++++++++----- 2 files changed, 22 insertions(+), 9 deletions(-) diff --git a/vins_estimator/src/estimator.cpp b/vins_estimator/src/estimator.cpp index 86033e9..26836fc 100644 --- a/vins_estimator/src/estimator.cpp +++ b/vins_estimator/src/estimator.cpp @@ -621,9 +621,10 @@ void Estimator::initializeNewPlanes() est_n = ric[0].transpose() * Rs[i] * ric[0] * est_n; para_N[plane_id] = {est_n(0), est_n(1), est_n(2)}; - // Estimate plane d using known metric t from vio - vi_t = ric[0].transpose() * Rs[i].transpose() * (Ps[WINDOW_SIZE - 1] - Ps[i]); - para_d[plane_id] = {(1.0*vi_t.norm()/est_t.norm())}; + // Estimate plane d (in global frame) using known metric t from vio + vi_t = (Ps[WINDOW_SIZE - 1] - Ps[i]); + double di = (1.0*vi_t.norm())/est_t.norm(); + para_d[plane_id] = {di + (ric[0].transpose() * Ps[i]).dot(est_n)}; init_pids.push_back(plane_id); ROS_INFO("Initialized plane %d features", plane_id); break; diff --git a/vins_estimator/src/factor/homography_factor.h b/vins_estimator/src/factor/homography_factor.h index 7107036..f5eb3dc 100644 --- 
a/vins_estimator/src/factor/homography_factor.h +++ b/vins_estimator/src/factor/homography_factor.h @@ -7,7 +7,7 @@ struct HomographyFactor HomographyFactor(const Eigen::Vector3d &_pts_i, const Eigen::Vector3d &_pts_j) : pts_i(_pts_i), pts_j(_pts_j) {} template - bool operator()(const T* const pose_i, const T* const pose_j, const T* const para_n, const T* const para_inv_depth, const T* const ex_pose, + bool operator()(const T* const pose_i, const T* const pose_j, const T* const para_n, const T* const para_depth, const T* const ex_pose, T* residuals) const { Eigen::Map> pi(pose_i); @@ -24,17 +24,29 @@ struct HomographyFactor Eigen::Matrix n; n << para_n[0], para_n[1], para_n[2]; - Eigen::Matrix n_imu_1 = qic*n;// + tic; - Eigen::Matrix n_imu_i = qi.inverse()*n_imu_1;// - qi.inverse()*pi; - Eigen::Map> inv_depth(para_inv_depth); + // transform camera normal to imu normal + Eigen::Matrix n_imu_0 = qic*n;// + tic; + + // transform imu 0 normal to imu i normal + Eigen::Matrix n_imu_i = qi.inverse()*n_imu_0;// - qi.inverse()*pi; + + Eigen::Map> depth(para_depth); Eigen::Quaternion qji = qj.inverse() * qi; Eigen::Matrix tji = qj.inverse() * (pi - pj); - Eigen::Matrix di; - di(0,0) = inv_depth(0,0) - pi.dot(n_imu_i); + Eigen::Matrix di, di0; + + // convert camera depth to imu frame + di0(0,0) = depth(0,0) + tic.dot(n_imu_0); + // convert imu 0 depth to imu i depth + di(0,0) = di0(0,0) - pi.dot(n_imu_0); + Eigen::Matrix pts_imu_i = qic * pts_i.cast() + tic; + + // homography mapping Eigen::Matrix pts_imu_j = qji * pts_imu_i + (tji*(1.0/di(0,0)) * n_imu_i.transpose()) * pts_imu_i; + Eigen::Matrix pts_cam_j = qic.inverse() * (pts_imu_j - tic); pts_cam_j = (pts_cam_j / pts_cam_j[2]); -- 2.17.1 From c5fa6e1369311dc5f762802c62a201590b9a5a34 Mon Sep 17 00:00:00 2001 From: Karnik Ram Date: Sun, 21 Feb 2021 17:07:23 +0530 Subject: [PATCH 41/42] Modify benchmark publisher --- benchmark_publisher/launch/publish.launch | 4 +- .../src/benchmark_publisher_node.cpp | 27 +++------- 
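The `HomographyFactor` reworked in patch 40 maps a point between frames through the plane-induced homography H = R + t n^T / d applied to normalized image coordinates, then renormalizes. A standalone sketch of that transfer (rotation as a nested-list 3x3 matrix; all inputs hypothetical):

```python
def homography_transfer(x_i, R, t, n, d):
    # Apply H = R + t * n^T / d to the normalized point x_i = (u, v, 1)
    # and renormalize, as the factor does with pts_imu_i / pts_cam_j.
    Hx = [sum((R[r][c] + t[r] * n[c] / d) * x_i[c] for c in range(3))
          for r in range(3)]
    return (Hx[0] / Hx[2], Hx[1] / Hx[2])
```

For a 3D point on the plane n.X = d, this transfer reproduces the direct projection of the rigidly moved point, which is what makes the residual against the observed feature meaningful.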
config/vins_rviz_config.rviz | 49 ++++++++++++++----- .../launch/playback_vins_airsim.launch | 3 ++ 4 files changed, 49 insertions(+), 34 deletions(-) diff --git a/benchmark_publisher/launch/publish.launch b/benchmark_publisher/launch/publish.launch index 5e066fa..5d52c78 100644 --- a/benchmark_publisher/launch/publish.launch +++ b/benchmark_publisher/launch/publish.launch @@ -1,9 +1,9 @@ - + - + -- 2.17.1 ================================================ FILE: rpvio_estimator/CMakeLists.txt ================================================ cmake_minimum_required(VERSION 2.8.3) project(rpvio_estimator) set(CMAKE_BUILD_TYPE "Release") set(CMAKE_CXX_FLAGS "-std=c++11") #-DEIGEN_USE_MKL_ALL") set(CMAKE_CXX_FLAGS_RELEASE "-O3 -Wall -g") find_package(catkin REQUIRED COMPONENTS roscpp std_msgs geometry_msgs nav_msgs tf cv_bridge ) find_package(OpenCV REQUIRED) # message(WARNING "OpenCV_VERSION: ${OpenCV_VERSION}") find_package(Ceres REQUIRED) include_directories(${catkin_INCLUDE_DIRS} ${CERES_INCLUDE_DIRS}) set(CMAKE_MODULE_PATH ${PROJECT_SOURCE_DIR}/cmake) find_package(Eigen3) include_directories( ${catkin_INCLUDE_DIRS} ${EIGEN3_INCLUDE_DIR} ) catkin_package() add_executable(rpvio_estimator src/estimator_node.cpp src/parameters.cpp src/estimator.cpp src/feature_manager.cpp src/factor/pose_local_parameterization.cpp src/factor/projection_factor.cpp src/factor/projection_td_factor.cpp src/factor/marginalization_factor.cpp src/utility/utility.cpp src/utility/visualization.cpp src/utility/CameraPoseVisualization.cpp src/initial/solve_5pts.cpp src/initial/initial_aligment.cpp src/initial/initial_sfm.cpp src/initial/initial_ex_rotation.cpp ) target_link_libraries(rpvio_estimator ${catkin_LIBRARIES} ${OpenCV_LIBS} ${CERES_LIBRARIES}) ================================================ FILE: rpvio_estimator/cmake/FindEigen.cmake ================================================ # Ceres Solver - A fast non-linear least squares minimizer # Copyright 2015 Google Inc. 
All rights reserved. # http://ceres-solver.org/ # # Redistribution and use in source and binary forms, with or without # modification, are permitted provided that the following conditions are met: # # * Redistributions of source code must retain the above copyright notice, # this list of conditions and the following disclaimer. # * Redistributions in binary form must reproduce the above copyright notice, # this list of conditions and the following disclaimer in the documentation # and/or other materials provided with the distribution. # * Neither the name of Google Inc. nor the names of its contributors may be # used to endorse or promote products derived from this software without # specific prior written permission. # # THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS" # AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE # IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE # ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE # LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR # CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF # SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS # INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN # CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) # ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE # POSSIBILITY OF SUCH DAMAGE. # # Author: alexs.mac@gmail.com (Alex Stewart) # # FindEigen.cmake - Find Eigen library, version >= 3. # # This module defines the following variables: # # EIGEN_FOUND: TRUE iff Eigen is found. # EIGEN_INCLUDE_DIRS: Include directories for Eigen. 
# # EIGEN_VERSION: Extracted from Eigen/src/Core/util/Macros.h # EIGEN_WORLD_VERSION: Equal to 3 if EIGEN_VERSION = 3.2.0 # EIGEN_MAJOR_VERSION: Equal to 2 if EIGEN_VERSION = 3.2.0 # EIGEN_MINOR_VERSION: Equal to 0 if EIGEN_VERSION = 3.2.0 # # The following variables control the behaviour of this module: # # EIGEN_INCLUDE_DIR_HINTS: List of additional directories in which to # search for eigen includes, e.g: /timbuktu/eigen3. # # The following variables are also defined by this module, but in line with # CMake recommended FindPackage() module style should NOT be referenced directly # by callers (use the plural variables detailed above instead). These variables # do however affect the behaviour of the module via FIND_[PATH/LIBRARY]() which # are NOT re-called (i.e. search for library is not repeated) if these variables # are set with valid values _in the CMake cache_. This means that if these # variables are set directly in the cache, either by the user in the CMake GUI, # or by the user passing -DVAR=VALUE directives to CMake when called (which # explicitly defines a cache variable), then they will be used verbatim, # bypassing the HINTS variables and other hard-coded search locations. # # EIGEN_INCLUDE_DIR: Include directory for Eigen, not including the # include directory of any dependencies. # Called if we failed to find Eigen or any of its required dependencies, # unsets all public (designed to be used externally) variables and reports # error message at priority depending upon [REQUIRED/QUIET/] argument. macro(EIGEN_REPORT_NOT_FOUND REASON_MSG) unset(EIGEN_FOUND) unset(EIGEN_INCLUDE_DIRS) # Make results of search visible in the CMake GUI if Eigen has not # been found so that user does not have to toggle to advanced view. mark_as_advanced(CLEAR EIGEN_INCLUDE_DIR) # Note _FIND_[REQUIRED/QUIETLY] variables defined by FindPackage() # use the camelcase library name, not uppercase.
if (Eigen_FIND_QUIETLY) message(STATUS "Failed to find Eigen - " ${REASON_MSG} ${ARGN}) elseif (Eigen_FIND_REQUIRED) message(FATAL_ERROR "Failed to find Eigen - " ${REASON_MSG} ${ARGN}) else() # Neither QUIETLY nor REQUIRED, use no priority which emits a message # but continues configuration and allows generation. message("-- Failed to find Eigen - " ${REASON_MSG} ${ARGN}) endif () return() endmacro(EIGEN_REPORT_NOT_FOUND) # Protect against any alternative find_package scripts for this library having # been called previously (in a client project) which set EIGEN_FOUND, but not # the other variables we require / set here which could cause the search logic # here to fail. unset(EIGEN_FOUND) # Search user-installed locations first, so that we prefer user installs # to system installs where both exist. list(APPEND EIGEN_CHECK_INCLUDE_DIRS /usr/local/include /usr/local/homebrew/include # Mac OS X /opt/local/var/macports/software # Mac OS X. /opt/local/include /usr/include) # Additional suffixes to try appending to each search path. list(APPEND EIGEN_CHECK_PATH_SUFFIXES eigen3 # Default root directory for Eigen. Eigen/include/eigen3 # Windows (for C:/Program Files prefix) < 3.3 Eigen3/include/eigen3 ) # Windows (for C:/Program Files prefix) >= 3.3 # Search supplied hint directories first if supplied. find_path(EIGEN_INCLUDE_DIR NAMES Eigen/Core PATHS ${EIGEN_INCLUDE_DIR_HINTS} ${EIGEN_CHECK_INCLUDE_DIRS} PATH_SUFFIXES ${EIGEN_CHECK_PATH_SUFFIXES}) if (NOT EIGEN_INCLUDE_DIR OR NOT EXISTS ${EIGEN_INCLUDE_DIR}) eigen_report_not_found( "Could not find eigen3 include directory, set EIGEN_INCLUDE_DIR to " "path to eigen3 include directory, e.g. /usr/local/include/eigen3.") endif (NOT EIGEN_INCLUDE_DIR OR NOT EXISTS ${EIGEN_INCLUDE_DIR}) # Mark internally as found, then verify. EIGEN_REPORT_NOT_FOUND() unsets # if called. 
set(EIGEN_FOUND TRUE) # Extract Eigen version from Eigen/src/Core/util/Macros.h if (EIGEN_INCLUDE_DIR) set(EIGEN_VERSION_FILE ${EIGEN_INCLUDE_DIR}/Eigen/src/Core/util/Macros.h) if (NOT EXISTS ${EIGEN_VERSION_FILE}) eigen_report_not_found( "Could not find file: ${EIGEN_VERSION_FILE} " "containing version information in Eigen install located at: " "${EIGEN_INCLUDE_DIR}.") else (NOT EXISTS ${EIGEN_VERSION_FILE}) file(READ ${EIGEN_VERSION_FILE} EIGEN_VERSION_FILE_CONTENTS) string(REGEX MATCH "#define EIGEN_WORLD_VERSION [0-9]+" EIGEN_WORLD_VERSION "${EIGEN_VERSION_FILE_CONTENTS}") string(REGEX REPLACE "#define EIGEN_WORLD_VERSION ([0-9]+)" "\\1" EIGEN_WORLD_VERSION "${EIGEN_WORLD_VERSION}") string(REGEX MATCH "#define EIGEN_MAJOR_VERSION [0-9]+" EIGEN_MAJOR_VERSION "${EIGEN_VERSION_FILE_CONTENTS}") string(REGEX REPLACE "#define EIGEN_MAJOR_VERSION ([0-9]+)" "\\1" EIGEN_MAJOR_VERSION "${EIGEN_MAJOR_VERSION}") string(REGEX MATCH "#define EIGEN_MINOR_VERSION [0-9]+" EIGEN_MINOR_VERSION "${EIGEN_VERSION_FILE_CONTENTS}") string(REGEX REPLACE "#define EIGEN_MINOR_VERSION ([0-9]+)" "\\1" EIGEN_MINOR_VERSION "${EIGEN_MINOR_VERSION}") # This is on a single line s/t CMake does not interpret it as a list of # elements and insert ';' separators which would result in 3.;2.;0 nonsense. set(EIGEN_VERSION "${EIGEN_WORLD_VERSION}.${EIGEN_MAJOR_VERSION}.${EIGEN_MINOR_VERSION}") endif (NOT EXISTS ${EIGEN_VERSION_FILE}) endif (EIGEN_INCLUDE_DIR) # Set standard CMake FindPackage variables if found. if (EIGEN_FOUND) set(EIGEN_INCLUDE_DIRS ${EIGEN_INCLUDE_DIR}) endif (EIGEN_FOUND) # Handle REQUIRED / QUIET optional arguments and version. include(FindPackageHandleStandardArgs) find_package_handle_standard_args(Eigen REQUIRED_VARS EIGEN_INCLUDE_DIRS VERSION_VAR EIGEN_VERSION) # Only mark internal variables as advanced if we found Eigen, otherwise # leave it visible in the standard GUI for the user to set manually. 
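FindEigen.cmake above extracts the Eigen version by regex-matching the three `#define`'d components in `Macros.h` and joining them on a single line so CMake does not insert `;` list separators. The same parsing logic, sketched in Python:

```python
import re

def parse_eigen_version(macros_text):
    # Pull EIGEN_WORLD/MAJOR/MINOR_VERSION out of Macros.h, as the
    # three REGEX MATCH / REGEX REPLACE pairs in FindEigen.cmake do.
    parts = []
    for name in ("WORLD", "MAJOR", "MINOR"):
        m = re.search(r"#define EIGEN_%s_VERSION ([0-9]+)" % name, macros_text)
        if m is None:
            return None  # malformed header: mirrors eigen_report_not_found
        parts.append(m.group(1))
    return ".".join(parts)  # joined in one step, no list semantics
```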
if (EIGEN_FOUND) mark_as_advanced(FORCE EIGEN_INCLUDE_DIR) endif (EIGEN_FOUND) ================================================ FILE: rpvio_estimator/launch/advio_12.launch ================================================ ================================================ FILE: rpvio_estimator/launch/ol_market1.launch ================================================ ================================================ FILE: rpvio_estimator/launch/rpvio_rviz.launch ================================================ ================================================ FILE: rpvio_estimator/launch/rpvio_sim.launch ================================================ ================================================ FILE: rpvio_estimator/package.xml ================================================ rpvio_estimator 0.0.0 The vins_estimator package karnikram GPLv3 catkin roscpp roscpp ================================================ FILE: rpvio_estimator/src/estimator.cpp ================================================ #include "estimator.h" Estimator::Estimator(): f_manager{Rs} { ROS_INFO("init begins"); clearState(); } void Estimator::setParameter() { for (int i = 0; i < NUM_OF_CAM; i++) { tic[i] = TIC[i]; ric[i] = RIC[i]; } f_manager.setRic(ric); ProjectionFactor::sqrt_info = FOCAL_LENGTH / 1.5 * Matrix2d::Identity(); ProjectionTdFactor::sqrt_info = FOCAL_LENGTH / 1.5 * Matrix2d::Identity(); td = TD; } void Estimator::clearState() { for (int i = 0; i < WINDOW_SIZE + 1; i++) { Rs[i].setIdentity(); Ps[i].setZero(); Vs[i].setZero(); Bas[i].setZero(); Bgs[i].setZero(); dt_buf[i].clear(); linear_acceleration_buf[i].clear(); angular_velocity_buf[i].clear(); if (pre_integrations[i] != nullptr) delete pre_integrations[i]; pre_integrations[i] = nullptr; } for (int i = 0; i < NUM_OF_CAM; i++) { tic[i] = Vector3d::Zero(); ric[i] = Matrix3d::Identity(); } for (auto &it : all_image_frame) { if (it.second.pre_integration != nullptr) { delete it.second.pre_integration; it.second.pre_integration = 
nullptr; } } para_N.clear(); para_d.clear(); init_pids.clear(); solver_flag = INITIAL; first_imu = false, sum_of_back = 0; sum_of_front = 0; frame_count = 0; solver_flag = INITIAL; initial_timestamp = 0; all_image_frame.clear(); td = TD; if (tmp_pre_integration != nullptr) delete tmp_pre_integration; if (last_marginalization_info != nullptr) delete last_marginalization_info; tmp_pre_integration = nullptr; last_marginalization_info = nullptr; last_marginalization_parameter_blocks.clear(); f_manager.clearState(); failure_occur = 0; relocalization_info = 0; drift_correct_r = Matrix3d::Identity(); drift_correct_t = Vector3d::Zero(); } void Estimator::processIMU(double dt, const Vector3d &linear_acceleration, const Vector3d &angular_velocity) { if (!first_imu) { first_imu = true; acc_0 = linear_acceleration; gyr_0 = angular_velocity; } if (!pre_integrations[frame_count]) { pre_integrations[frame_count] = new IntegrationBase{acc_0, gyr_0, Bas[frame_count], Bgs[frame_count]}; } if (frame_count != 0) { pre_integrations[frame_count]->push_back(dt, linear_acceleration, angular_velocity); //if(solver_flag != NON_LINEAR) tmp_pre_integration->push_back(dt, linear_acceleration, angular_velocity); dt_buf[frame_count].push_back(dt); linear_acceleration_buf[frame_count].push_back(linear_acceleration); angular_velocity_buf[frame_count].push_back(angular_velocity); int j = frame_count; Vector3d un_acc_0 = Rs[j] * (acc_0 - Bas[j]) - g; Vector3d un_gyr = 0.5 * (gyr_0 + angular_velocity) - Bgs[j]; Rs[j] *= Utility::deltaQ(un_gyr * dt).toRotationMatrix(); Vector3d un_acc_1 = Rs[j] * (linear_acceleration - Bas[j]) - g; Vector3d un_acc = 0.5 * (un_acc_0 + un_acc_1); Ps[j] += dt * Vs[j] + 0.5 * dt * dt * un_acc; Vs[j] += dt * un_acc; } acc_0 = linear_acceleration; gyr_0 = angular_velocity; } void Estimator::processImage(const map>>> &image, const std_msgs::Header &header) { ROS_DEBUG("new image coming ------------------------------------------"); ROS_DEBUG("Adding feature points %lu", 
image.size()); if (f_manager.addFeatureCheckParallax(frame_count, image, td)) marginalization_flag = MARGIN_OLD; else marginalization_flag = MARGIN_SECOND_NEW; ROS_DEBUG("this frame is--------------------%s", marginalization_flag ? "reject" : "accept"); ROS_DEBUG("%s", marginalization_flag ? "Non-keyframe" : "Keyframe"); ROS_DEBUG("Solving %d", frame_count); ROS_DEBUG("number of feature: %d", f_manager.getFeatureCount()); Headers[frame_count] = header; ImageFrame imageframe(image, header.stamp.toSec()); imageframe.pre_integration = tmp_pre_integration; all_image_frame.insert(make_pair(header.stamp.toSec(), imageframe)); tmp_pre_integration = new IntegrationBase{acc_0, gyr_0, Bas[frame_count], Bgs[frame_count]}; //if(ESTIMATE_EXTRINSIC == 2) //{ // ROS_INFO("calibrating extrinsic param, rotation movement is needed"); // if (frame_count != 0) // { // vector> corres = f_manager.getCorresponding(frame_count - 1, frame_count); // Matrix3d calib_ric; // if (initial_ex_rotation.CalibrationExRotation(corres, pre_integrations[frame_count]->delta_q, calib_ric)) // { // ROS_WARN("initial extrinsic rotation calib success"); // ROS_WARN_STREAM("initial extrinsic rotation: " << endl << calib_ric); // ric[0] = calib_ric; // RIC[0] = calib_ric; // ESTIMATE_EXTRINSIC = 1; // } // } //} if (solver_flag == INITIAL) { if (frame_count == WINDOW_SIZE) { bool result = false; if( ESTIMATE_EXTRINSIC != 2 && (header.stamp.toSec() - initial_timestamp) > 0.1) { result = initialStructure(); initial_timestamp = header.stamp.toSec(); } if(result) { solver_flag = NON_LINEAR; solveOdometry(); slideWindow(); f_manager.removeFailures(); ROS_INFO("Initialization finish!"); last_R = Rs[WINDOW_SIZE]; last_P = Ps[WINDOW_SIZE]; last_R0 = Rs[0]; last_P0 = Ps[0]; } else slideWindow(); } else frame_count++; } else { TicToc t_solve; initializeNewPlanes(); solveOdometry(); ROS_DEBUG("solver costs: %fms", t_solve.toc()); if (failureDetection()) { ROS_WARN("failure detection!"); failure_occur = 1; clearState(); 
                setParameter();
                ROS_WARN("system reboot!");
                return;
            }

            TicToc t_margin;
            slideWindow();
            f_manager.removeFailures();
            ROS_DEBUG("marginalization costs: %fms", t_margin.toc());
            // prepare output of VINS
            key_poses.clear();
            for (int i = 0; i <= WINDOW_SIZE; i++)
                key_poses.push_back(Ps[i]);

            last_R = Rs[WINDOW_SIZE];
            last_P = Ps[WINDOW_SIZE];
            last_R0 = Rs[0];
            last_P0 = Ps[0];
        }
    }

bool Estimator::initialStructure()
{
    TicToc t_sfm;
    // check imu observability
    {
        map<double, ImageFrame>::iterator frame_it;
        Vector3d sum_g = Vector3d::Zero(); // zero-initialize; Eigen vectors are uninitialized by default
        for (frame_it = all_image_frame.begin(), frame_it++; frame_it != all_image_frame.end(); frame_it++)
        {
            double dt = frame_it->second.pre_integration->sum_dt;
            Vector3d tmp_g = frame_it->second.pre_integration->delta_v / dt;
            sum_g += tmp_g;
        }
        Vector3d aver_g;
        aver_g = sum_g * 1.0 / ((int)all_image_frame.size() - 1);
        double var = 0;
        for (frame_it = all_image_frame.begin(), frame_it++; frame_it != all_image_frame.end(); frame_it++)
        {
            double dt = frame_it->second.pre_integration->sum_dt;
            Vector3d tmp_g = frame_it->second.pre_integration->delta_v / dt;
            var += (tmp_g - aver_g).transpose() * (tmp_g - aver_g);
            //cout << "frame g " << tmp_g.transpose() << endl;
        }
        var = sqrt(var / ((int)all_image_frame.size() - 1));
        //ROS_WARN("IMU variation %f!", var);
        if (var < 0.25)
        {
            ROS_INFO("IMU excitation not enough!");
            //return false;
        }
    }
    // Bootstrap SfM
    map<int, Matrix3d> relative_R;
    map<int, Vector3d> relative_T;
    map<int, Vector3d> n;
    int l;
    if (!relativeHPose(relative_R, relative_T, n, l, lplane_id))
    {
        ROS_INFO("Not enough features or parallax; Move device around");
        return false;
    }
    // global sfm
    Quaterniond Q[frame_count + 1];
    Vector3d T[frame_count + 1];
    map<int, Vector3d> sfm_tracked_points;
    vector<SFMFeature> sfm_f;
    for (auto &it_per_id : f_manager.feature)
    {
        // Consider only features from the largest plane
        if (it_per_id.plane_id != lplane_id)
            continue;
        int imu_j = it_per_id.start_frame - 1;
        SFMFeature tmp_feature;
        tmp_feature.state = false;
        tmp_feature.id = it_per_id.feature_id;
        tmp_feature.plane_id = it_per_id.plane_id;
        for (auto &it_per_frame :
             it_per_id.feature_per_frame)
        {
            imu_j++;
            Vector3d pts_j = it_per_frame.point;
            tmp_feature.observation.push_back(make_pair(imu_j, Eigen::Vector2d{pts_j.x(), pts_j.y()}));
        }
        sfm_f.push_back(tmp_feature);
    }
    GlobalSFM sfm;
    if (!sfm.constructH(frame_count + 1, Q, T, l,
                        relative_R[lplane_id], relative_T[lplane_id], n[lplane_id],
                        sfm_f, sfm_tracked_points))
    {
        ROS_DEBUG("global SFM failed!");
        marginalization_flag = MARGIN_OLD;
        return false;
    }
    // Save estimated normal variables
    for (auto &est_n : n)
    {
        para_N[est_n.first] = {est_n.second(0), est_n.second(1), est_n.second(2)};
    }
    //solve pnp for all frame
    map<double, ImageFrame>::iterator frame_it;
    map<int, Vector3d>::iterator it;
    frame_it = all_image_frame.begin();
    for (int i = 0; frame_it != all_image_frame.end(); frame_it++)
    {
        // provide initial guess
        cv::Mat r, rvec, t, D, tmp_r;
        if ((frame_it->first) == Headers[i].stamp.toSec())
        {
            frame_it->second.is_key_frame = true;
            frame_it->second.R = Q[i].toRotationMatrix() * RIC[0].transpose();
            frame_it->second.T = T[i];
            i++;
            continue;
        }
        if ((frame_it->first) > Headers[i].stamp.toSec())
        {
            i++;
        }
        Matrix3d R_inital = (Q[i].inverse()).toRotationMatrix();
        Vector3d P_inital = -R_inital * T[i];
        cv::eigen2cv(R_inital, tmp_r);
        cv::Rodrigues(tmp_r, rvec);
        cv::eigen2cv(P_inital, t);
        frame_it->second.is_key_frame = false;
        vector<cv::Point3f> pts_3_vector;
        vector<cv::Point2f> pts_2_vector;
        for (auto &id_pts : frame_it->second.points)
        {
            int feature_id = id_pts.first;
            for (auto &i_p : id_pts.second)
            {
                it = sfm_tracked_points.find(feature_id);
                if (it != sfm_tracked_points.end())
                {
                    Vector3d world_pts = it->second;
                    cv::Point3f pts_3(world_pts(0), world_pts(1), world_pts(2));
                    pts_3_vector.push_back(pts_3);
                    Vector2d img_pts = i_p.second.head<2>();
                    cv::Point2f pts_2(img_pts(0), img_pts(1));
                    pts_2_vector.push_back(pts_2);
                }
            }
        }
        cv::Mat K = (cv::Mat_<double>(3, 3) << 1, 0, 0, 0, 1, 0, 0, 0, 1);
        if (pts_3_vector.size() < 6)
        {
            cout << "pts_3_vector size " << pts_3_vector.size() << endl;
            ROS_DEBUG("Not enough points for solve pnp !");
            return false;
        }
        if (!
            cv::solvePnP(pts_3_vector, pts_2_vector, K, D, rvec, t, 1))
        {
            ROS_DEBUG("solve pnp fail!");
            return false;
        }
        cv::Rodrigues(rvec, r);
        MatrixXd R_pnp, tmp_R_pnp;
        cv::cv2eigen(r, tmp_R_pnp);
        R_pnp = tmp_R_pnp.transpose();
        MatrixXd T_pnp;
        cv::cv2eigen(t, T_pnp);
        T_pnp = R_pnp * (-T_pnp);
        frame_it->second.R = R_pnp * RIC[0].transpose();
        frame_it->second.T = T_pnp;
    }
    if (visualInitialAlign(relative_T, T[frame_count]))
        return true;
    else
    {
        para_N.clear();
        para_d.clear();
        ROS_INFO("misalign visual structure with IMU");
        return false;
    }
}

bool Estimator::visualInitialAlign(map<int, Vector3d> &relative_T, Eigen::Vector3d &lt)
{
    TicToc t_g;
    VectorXd x;
    //solve scale
    bool result = VisualIMUAlignment(all_image_frame, Bgs, g, x);
    if (!result)
    {
        ROS_DEBUG("solve g failed!");
        return false;
    }
    // change state
    for (int i = 0; i <= frame_count; i++)
    {
        Matrix3d Ri = all_image_frame[Headers[i].stamp.toSec()].R;
        Vector3d Pi = all_image_frame[Headers[i].stamp.toSec()].T;
        Ps[i] = Pi;
        Rs[i] = Ri;
        all_image_frame[Headers[i].stamp.toSec()].is_key_frame = true;
    }
    VectorXd dep = f_manager.getDepthVector();
    for (int i = 0; i < dep.size(); i++)
        dep[i] = -1;
    f_manager.clearDepth(dep);
    // triangulate on cam pose, no tic
    Vector3d TIC_TMP[NUM_OF_CAM];
    for (int i = 0; i < NUM_OF_CAM; i++)
        TIC_TMP[i].setZero();
    ric[0] = RIC[0];
    f_manager.setRic(ric);
    f_manager.triangulate(Ps, &(TIC_TMP[0]), &(RIC[0]));
    s = (x.tail<1>())(0);
    for (int i = 0; i <= WINDOW_SIZE; i++)
    {
        pre_integrations[i]->repropagate(Vector3d::Zero(), Bgs[i]);
    }
    for (int i = frame_count; i >= 0; i--)
        Ps[i] = s * Ps[i] - Rs[i] * TIC[0] - (s * Ps[0] - Rs[0] * TIC[0]);
    int kv = -1;
    map<double, ImageFrame>::iterator frame_i;
    for (frame_i = all_image_frame.begin(); frame_i != all_image_frame.end(); frame_i++)
    {
        if (frame_i->second.is_key_frame)
        {
            kv++;
            Vs[kv] = frame_i->second.R * x.segment<3>(kv * 3);
        }
    }
    for (auto &it_per_id : f_manager.feature)
    {
        it_per_id.used_num = it_per_id.feature_per_frame.size();
        if (!(it_per_id.used_num >= 2 && it_per_id.start_frame < WINDOW_SIZE - 2))
            continue;
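        // Sketch of the plane-distance initialization performed just below
        // (notation assumed here, not from the original comments): the
        // homography decomposition for plane k returns a translation t_k that
        // is only known up to the plane distance. Once the metric scale s of
        // the largest plane l is fixed by the visual-inertial alignment, the
        // remaining plane distances follow from translation-norm ratios:
        //
        //     d_l = s,    d_k = s * ||t_l|| / ||t_k||   for k != l
        //
        // where t_l (the function argument lt) and t_k are the relative
        // translations estimated for plane l and plane k, respectively.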
        it_per_id.estimated_depth *= s;
    }
    // estimate plane d variables
    for (auto &t : relative_T)
    {
        if (t.first == lplane_id)
            para_d[lplane_id] = {s};
        else
            para_d[t.first] = {s * lt.norm() / t.second.norm()};
        init_pids.push_back(t.first);
        ROS_INFO("Initialized plane %d features", t.first);
    }

    Matrix3d R0 = Utility::g2R(g);
    double yaw = Utility::R2ypr(R0 * Rs[0]).x();
    R0 = Utility::ypr2R(Eigen::Vector3d{-yaw, 0, 0}) * R0;
    g = R0 * g;
    //Matrix3d rot_diff = R0 * Rs[0].transpose();
    Matrix3d rot_diff = R0;
    for (int i = 0; i <= frame_count; i++)
    {
        Ps[i] = rot_diff * Ps[i];
        Rs[i] = rot_diff * Rs[i];
        Vs[i] = rot_diff * Vs[i];
    }
    ROS_DEBUG_STREAM("g0 " << g.transpose());
    ROS_DEBUG_STREAM("my R0 " << Utility::R2ypr(Rs[0]).transpose());

    return true;
}

bool Estimator::relativeHPose(map<int, Matrix3d> &relative_R, map<int, Vector3d> &relative_t, map<int, Vector3d> &n, int &l, int &lplane_id)
{
    // find previous frame which contains enough correspondence and parallax with newest frame
    for (int i = 0; i < WINDOW_SIZE; i++)
    {
        int lplane_size = 0;
        vector<pair<Vector3d, Vector3d>> lplane_corres;
        map<int, vector<pair<Vector3d, Vector3d>>> plane_corres = f_manager.getCorresponding(i, WINDOW_SIZE);
        for (auto &corres : plane_corres)
        {
            if ((int)corres.second.size() > lplane_size)
            {
                lplane_id = corres.first;
                lplane_size = (int)corres.second.size();
                lplane_corres = corres.second;
            }
        }
        if (lplane_size > 20)
        {
            double sum_parallax = 0;
            double average_parallax;
            for (int j = 0; j < int(lplane_corres.size()); j++)
            {
                Vector2d pts_0(lplane_corres[j].first(0), lplane_corres[j].first(1));
                Vector2d pts_1(lplane_corres[j].second(0), lplane_corres[j].second(1));
                double parallax = (pts_0 - pts_1).norm();
                sum_parallax = sum_parallax + parallax;
            }
            average_parallax = 1.0 * sum_parallax / int(lplane_corres.size());

            Matrix4d TrIC = Matrix4d::Identity();
            TrIC.block(0, 0, 3, 3) = ric[0];
            TrIC.block(0, 3, 3, 1) = tic[0];

            // compute corresponding preintegrated rotation
            Matrix3d R_imu = Matrix3d::Identity();
            for (int k = WINDOW_SIZE - 1; k > i; k--)
                R_imu = R_imu * (pre_integrations[k]->delta_q).toRotationMatrix();

            Eigen::Matrix3d est_R;
            Eigen::Vector3d
est_t, est_n; if(average_parallax * 460 > 30 && m_estimator.solveRelativeHRT(lplane_corres, R_imu, TrIC, est_R, est_t, est_n)) { l = i; ROS_DEBUG("average_parallax %f choose l %d and newest frame to triangulate the whole structure", average_parallax * 460, l); relative_R[lplane_id] = est_R; relative_t[lplane_id] = est_t; n[lplane_id] = est_n; for(auto &corres : plane_corres) { if(corres.first == lplane_id) continue; if(m_estimator.solveRelativeHRT(corres.second, R_imu, TrIC, est_R, est_t, est_n)) { relative_R[corres.first] = est_R; relative_t[corres.first] = est_t; n[corres.first] = est_n; } } return true; } } } return false; } void Estimator::initializeNewPlanes() { // check for new plane ids for(auto &plane_id : f_manager.plane_ids) { if(para_N.count(plane_id) > 0) continue; else { for(int i = 0; i < WINDOW_SIZE - 1; i++) { vector> corres = f_manager.getCorrespondingByPlane( i, WINDOW_SIZE-1, plane_id); if(corres.size() > 20) { double sum_parallax = 0; double average_parallax; for (int j = 0; j < int(corres.size()); j++) { Vector2d pts_0(corres[j].first(0), corres[j].first(1)); Vector2d pts_1(corres[j].second(0), corres[j].second(1)); double parallax = (pts_0 - pts_1).norm(); sum_parallax = sum_parallax + parallax; } average_parallax = 1.0 * sum_parallax / int(corres.size()); Matrix4d TrIC = Matrix4d::Identity(); TrIC.block(0,0,3,3) = ric[0]; TrIC.block(0,3,3,1) = tic[0]; //compute corresponding preintegrated rotation Matrix3d R_imu = Matrix3d::Identity(); for(int k = WINDOW_SIZE - 2; k > i; k--) R_imu = R_imu * (pre_integrations[k]->delta_q).toRotationMatrix(); Eigen::Matrix3d est_R; Eigen::Vector3d est_t, est_n, vi_t; if(average_parallax * 460 > 30 && m_estimator.solveRelativeHRT(corres, R_imu, TrIC, est_R, est_t, est_n)) { // Transform est_n to global frame est_n = ric[0].transpose() * Rs[i] * ric[0] * est_n; para_N[plane_id] = {est_n(0), est_n(1), est_n(2)}; // Estimate plane d (in global frame) using known metric t from vio vi_t = (Ps[WINDOW_SIZE - 1] - 
Ps[i]); double di = (1.0*vi_t.norm())/est_t.norm(); para_d[plane_id] = {di + (ric[0].transpose() * Ps[i]).dot(est_n)}; init_pids.push_back(plane_id); ROS_INFO("Initialized plane %d features", plane_id); break; } } } } } } void Estimator::solveOdometry() { if (frame_count < WINDOW_SIZE) return; if (solver_flag == NON_LINEAR) { TicToc t_tri; f_manager.triangulate(Ps, tic, ric); ROS_DEBUG("triangulation costs %f", t_tri.toc()); optimization(); } } void Estimator::vector2double() { for (int i = 0; i <= WINDOW_SIZE; i++) { para_Pose[i][0] = Ps[i].x(); para_Pose[i][1] = Ps[i].y(); para_Pose[i][2] = Ps[i].z(); Quaterniond q{Rs[i]}; para_Pose[i][3] = q.x(); para_Pose[i][4] = q.y(); para_Pose[i][5] = q.z(); para_Pose[i][6] = q.w(); para_SpeedBias[i][0] = Vs[i].x(); para_SpeedBias[i][1] = Vs[i].y(); para_SpeedBias[i][2] = Vs[i].z(); para_SpeedBias[i][3] = Bas[i].x(); para_SpeedBias[i][4] = Bas[i].y(); para_SpeedBias[i][5] = Bas[i].z(); para_SpeedBias[i][6] = Bgs[i].x(); para_SpeedBias[i][7] = Bgs[i].y(); para_SpeedBias[i][8] = Bgs[i].z(); } for (int i = 0; i < NUM_OF_CAM; i++) { para_Ex_Pose[i][0] = tic[i].x(); para_Ex_Pose[i][1] = tic[i].y(); para_Ex_Pose[i][2] = tic[i].z(); Quaterniond q{ric[i]}; para_Ex_Pose[i][3] = q.x(); para_Ex_Pose[i][4] = q.y(); para_Ex_Pose[i][5] = q.z(); para_Ex_Pose[i][6] = q.w(); } VectorXd dep = f_manager.getDepthVector(); for (int i = 0; i < f_manager.getFeatureCount(); i++) para_Feature[i][0] = dep(i); if (ESTIMATE_TD) para_Td[0][0] = td; } void Estimator::double2vector() { Vector3d origin_R0 = Utility::R2ypr(Rs[0]); Vector3d origin_P0 = Ps[0]; if (failure_occur) { origin_R0 = Utility::R2ypr(last_R0); origin_P0 = last_P0; failure_occur = 0; } Vector3d origin_R00 = Utility::R2ypr(Quaterniond(para_Pose[0][6], para_Pose[0][3], para_Pose[0][4], para_Pose[0][5]).toRotationMatrix()); double y_diff = origin_R0.x() - origin_R00.x(); //TODO Matrix3d rot_diff = Utility::ypr2R(Vector3d(y_diff, 0, 0)); if (abs(abs(origin_R0.y()) - 90) < 1.0 || 
abs(abs(origin_R00.y()) - 90) < 1.0) { ROS_DEBUG("euler singular point!"); rot_diff = Rs[0] * Quaterniond(para_Pose[0][6], para_Pose[0][3], para_Pose[0][4], para_Pose[0][5]).toRotationMatrix().transpose(); } for (int i = 0; i <= WINDOW_SIZE; i++) { Rs[i] = rot_diff * Quaterniond(para_Pose[i][6], para_Pose[i][3], para_Pose[i][4], para_Pose[i][5]).normalized().toRotationMatrix(); Ps[i] = rot_diff * Vector3d(para_Pose[i][0] - para_Pose[0][0], para_Pose[i][1] - para_Pose[0][1], para_Pose[i][2] - para_Pose[0][2]) + origin_P0; Vs[i] = rot_diff * Vector3d(para_SpeedBias[i][0], para_SpeedBias[i][1], para_SpeedBias[i][2]); Bas[i] = Vector3d(para_SpeedBias[i][3], para_SpeedBias[i][4], para_SpeedBias[i][5]); Bgs[i] = Vector3d(para_SpeedBias[i][6], para_SpeedBias[i][7], para_SpeedBias[i][8]); } for (int i = 0; i < NUM_OF_CAM; i++) { tic[i] = Vector3d(para_Ex_Pose[i][0], para_Ex_Pose[i][1], para_Ex_Pose[i][2]); ric[i] = Quaterniond(para_Ex_Pose[i][6], para_Ex_Pose[i][3], para_Ex_Pose[i][4], para_Ex_Pose[i][5]).toRotationMatrix(); } VectorXd dep = f_manager.getDepthVector(); for (int i = 0; i < f_manager.getFeatureCount(); i++) dep(i) = para_Feature[i][0]; f_manager.setDepth(dep); if (ESTIMATE_TD) td = para_Td[0][0]; // relative info between two loop frame if(relocalization_info) { Matrix3d relo_r; Vector3d relo_t; relo_r = rot_diff * Quaterniond(relo_Pose[6], relo_Pose[3], relo_Pose[4], relo_Pose[5]).normalized().toRotationMatrix(); relo_t = rot_diff * Vector3d(relo_Pose[0] - para_Pose[0][0], relo_Pose[1] - para_Pose[0][1], relo_Pose[2] - para_Pose[0][2]) + origin_P0; double drift_correct_yaw; drift_correct_yaw = Utility::R2ypr(prev_relo_r).x() - Utility::R2ypr(relo_r).x(); drift_correct_r = Utility::ypr2R(Vector3d(drift_correct_yaw, 0, 0)); drift_correct_t = prev_relo_t - drift_correct_r * relo_t; relo_relative_t = relo_r.transpose() * (Ps[relo_frame_local_index] - relo_t); relo_relative_q = relo_r.transpose() * Rs[relo_frame_local_index]; relo_relative_yaw = 
        Utility::normalizeAngle(Utility::R2ypr(Rs[relo_frame_local_index]).x() - Utility::R2ypr(relo_r).x());
        //cout << "vins relo " << endl;
        //cout << "vins relative_t " << relo_relative_t.transpose() << endl;
        //cout << "vins relative_yaw " << relo_relative_yaw << endl;
        relocalization_info = 0;
    }
}

bool Estimator::failureDetection()
{
    if (f_manager.last_track_num < 2)
    {
        ROS_INFO(" little feature %d", f_manager.last_track_num);
        //return true;
    }
    if (Bas[WINDOW_SIZE].norm() > 2.5)
    {
        ROS_INFO(" big IMU acc bias estimation %f", Bas[WINDOW_SIZE].norm());
        return true;
    }
    if (Bgs[WINDOW_SIZE].norm() > 1.0)
    {
        ROS_INFO(" big IMU gyr bias estimation %f", Bgs[WINDOW_SIZE].norm());
        return true;
    }
    /*
    if (tic(0) > 1)
    {
        ROS_INFO(" big extri param estimation %d", tic(0) > 1);
        return true;
    }
    */
    Vector3d tmp_P = Ps[WINDOW_SIZE];
    if ((tmp_P - last_P).norm() > 5)
    {
        ROS_INFO(" big translation");
        return true;
    }
    if (abs(tmp_P.z() - last_P.z()) > 1)
    {
        ROS_INFO(" big z translation");
        return true;
    }
    Matrix3d tmp_R = Rs[WINDOW_SIZE];
    Matrix3d delta_R = tmp_R.transpose() * last_R;
    Quaterniond delta_Q(delta_R);
    double delta_angle;
    delta_angle = acos(delta_Q.w()) * 2.0 / 3.14 * 180.0;
    if (delta_angle > 50)
    {
        ROS_INFO(" big delta_angle ");
        //return true;
    }
    return false;
}

void Estimator::optimization()
{
    ceres::Problem problem;
    ceres::LossFunction *loss_function;
    //loss_function = new ceres::HuberLoss(1.0);
    loss_function = new ceres::CauchyLoss(1.0);
    for (int i = 0; i < WINDOW_SIZE + 1; i++)
    {
        ceres::LocalParameterization *local_parameterization = new PoseLocalParameterization();
        problem.AddParameterBlock(para_Pose[i], SIZE_POSE, local_parameterization);
        problem.AddParameterBlock(para_SpeedBias[i], SIZE_SPEEDBIAS);
    }
    for (auto &N : para_N)
    {
        ceres::LocalParameterization *local_n_parameterization = new ceres::HomogeneousVectorParameterization(3);
        problem.AddParameterBlock(N.second.data(), 3, local_n_parameterization);
    }
    for (int i = 0; i < NUM_OF_CAM; i++)
    {
        ceres::LocalParameterization *local_parameterization = new PoseLocalParameterization();
        problem.AddParameterBlock(para_Ex_Pose[i], SIZE_POSE, local_parameterization);
        if (!ESTIMATE_EXTRINSIC)
        {
            ROS_DEBUG("fix extrinsic param");
            problem.SetParameterBlockConstant(para_Ex_Pose[i]);
        }
else ROS_DEBUG("estimate extinsic param"); } if (ESTIMATE_TD) { problem.AddParameterBlock(para_Td[0], 1); //problem.SetParameterBlockConstant(para_Td[0]); } TicToc t_whole, t_prepare; vector2double(); if (last_marginalization_info) { // construct new marginlization_factor MarginalizationFactor *marginalization_factor = new MarginalizationFactor(last_marginalization_info); problem.AddResidualBlock(marginalization_factor, NULL, last_marginalization_parameter_blocks); } for (int i = 0; i < WINDOW_SIZE; i++) { int j = i + 1; if (pre_integrations[j]->sum_dt > 10.0) continue; IMUFactor* imu_factor = new IMUFactor(pre_integrations[j]); problem.AddResidualBlock(imu_factor, NULL, para_Pose[i], para_SpeedBias[i], para_Pose[j], para_SpeedBias[j]); } int f_m_cnt = 0; int feature_index = -1; int largest_pid = f_manager.getLargestPlaneId(init_pids); ROS_INFO("Using features from plane %d", largest_pid); for (auto &it_per_id : f_manager.feature) { it_per_id.used_num = it_per_id.feature_per_frame.size(); if (!(it_per_id.used_num >= 2 && it_per_id.start_frame < WINDOW_SIZE - 2)) continue; ++feature_index; int imu_i = it_per_id.start_frame, imu_j = imu_i - 1; Vector3d pts_i = it_per_id.feature_per_frame[0].point; int pid = it_per_id.plane_id; if(pid != largest_pid) continue; for (auto &it_per_frame : it_per_id.feature_per_frame) { imu_j++; if (imu_i == imu_j) { continue; } Vector3d pts_j = it_per_frame.point; if (ESTIMATE_TD) { ProjectionTdFactor *f_td = new ProjectionTdFactor(pts_i, pts_j, it_per_id.feature_per_frame[0].velocity, it_per_frame.velocity, it_per_id.feature_per_frame[0].cur_td, it_per_frame.cur_td, it_per_id.feature_per_frame[0].uv.y(), it_per_frame.uv.y()); problem.AddResidualBlock(f_td, loss_function, para_Pose[imu_i], para_Pose[imu_j], para_Ex_Pose[0], para_Feature[feature_index], para_Td[0]); /* double **para = new double *[5]; para[0] = para_Pose[imu_i]; para[1] = para_Pose[imu_j]; para[2] = para_Ex_Pose[0]; para[3] = para_Feature[feature_index]; para[4] = 
para_Td[0]; f_td->check(para); */ } else { ProjectionFactor *f = new ProjectionFactor(pts_i, pts_j); problem.AddResidualBlock(f, loss_function, para_Pose[imu_i], para_Pose[imu_j], para_Ex_Pose[0], para_Feature[feature_index]); ceres::CostFunction *h_cost_function = HomographyFactor::Create(pts_i, pts_j); problem.AddResidualBlock(h_cost_function, loss_function, para_Pose[imu_i], para_Pose[imu_j], para_N[pid].data(), para_d[pid].data(), para_Ex_Pose[0]); } f_m_cnt++; } } ROS_DEBUG("visual measurement count: %d", f_m_cnt); ROS_DEBUG("prepare for ceres: %f", t_prepare.toc()); if(relocalization_info) { //printf("set relocalization factor! \n"); ceres::LocalParameterization *local_parameterization = new PoseLocalParameterization(); problem.AddParameterBlock(relo_Pose, SIZE_POSE, local_parameterization); int retrive_feature_index = 0; int feature_index = -1; for (auto &it_per_id : f_manager.feature) { it_per_id.used_num = it_per_id.feature_per_frame.size(); if (!(it_per_id.used_num >= 2 && it_per_id.start_frame < WINDOW_SIZE - 2)) continue; ++feature_index; int start = it_per_id.start_frame; if(start <= relo_frame_local_index) { while((int)match_points[retrive_feature_index].z() < it_per_id.feature_id) { retrive_feature_index++; } if((int)match_points[retrive_feature_index].z() == it_per_id.feature_id) { Vector3d pts_j = Vector3d(match_points[retrive_feature_index].x(), match_points[retrive_feature_index].y(), 1.0); Vector3d pts_i = it_per_id.feature_per_frame[0].point; ProjectionFactor *f = new ProjectionFactor(pts_i, pts_j); problem.AddResidualBlock(f, loss_function, para_Pose[start], relo_Pose, para_Ex_Pose[0], para_Feature[feature_index]); retrive_feature_index++; } } } } ceres::Solver::Options options; options.linear_solver_type = ceres::DENSE_SCHUR; //options.num_threads = 2; options.trust_region_strategy_type = ceres::DOGLEG; options.max_num_iterations = NUM_ITERATIONS; //options.use_explicit_schur_complement = true; //options.minimizer_progress_to_stdout = true; 
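    // Sketch of the prior constructed by the marginalization code below
    // (standard sliding-window formulation; the notation is assumed, not from
    // the original comments). Stacking all residuals that touch the
    // marginalized states x_m and the remaining states x_r gives a
    // Gauss-Newton system
    //
    //     [ H_mm  H_mr ] [ dx_m ]   [ b_m ]
    //     [ H_rm  H_rr ] [ dx_r ] = [ b_r ]
    //
    // and eliminating dx_m via the Schur complement leaves the prior kept in
    // last_marginalization_info:
    //
    //     H' = H_rr - H_rm * H_mm^{-1} * H_mr
    //     b' = b_r  - H_rm * H_mm^{-1} * b_m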
//options.use_nonmonotonic_steps = true; if (marginalization_flag == MARGIN_OLD) options.max_solver_time_in_seconds = SOLVER_TIME * 4.0 / 5.0; else options.max_solver_time_in_seconds = SOLVER_TIME; TicToc t_solver; ceres::Solver::Summary summary; ceres::Solve(options, &problem, &summary); //cout << summary.BriefReport() << endl; ROS_DEBUG("Iterations : %d", static_cast(summary.iterations.size())); ROS_DEBUG("solver costs: %f", t_solver.toc()); double2vector(); TicToc t_whole_marginalization; if (marginalization_flag == MARGIN_OLD) { MarginalizationInfo *marginalization_info = new MarginalizationInfo(); vector2double(); if (last_marginalization_info) { vector drop_set; for (int i = 0; i < static_cast(last_marginalization_parameter_blocks.size()); i++) { if (last_marginalization_parameter_blocks[i] == para_Pose[0] || last_marginalization_parameter_blocks[i] == para_SpeedBias[0]) drop_set.push_back(i); } // construct new marginlization_factor MarginalizationFactor *marginalization_factor = new MarginalizationFactor(last_marginalization_info); ResidualBlockInfo *residual_block_info = new ResidualBlockInfo(marginalization_factor, NULL, last_marginalization_parameter_blocks, drop_set); marginalization_info->addResidualBlockInfo(residual_block_info); } { if (pre_integrations[1]->sum_dt < 10.0) { IMUFactor* imu_factor = new IMUFactor(pre_integrations[1]); ResidualBlockInfo *residual_block_info = new ResidualBlockInfo(imu_factor, NULL, vector{para_Pose[0], para_SpeedBias[0], para_Pose[1], para_SpeedBias[1]}, vector{0, 1}); marginalization_info->addResidualBlockInfo(residual_block_info); } } { int feature_index = -1; for (auto &it_per_id : f_manager.feature) { it_per_id.used_num = it_per_id.feature_per_frame.size(); if (!(it_per_id.used_num >= 2 && it_per_id.start_frame < WINDOW_SIZE - 2)) continue; ++feature_index; int imu_i = it_per_id.start_frame, imu_j = imu_i - 1; if (imu_i != 0) continue; Vector3d pts_i = it_per_id.feature_per_frame[0].point; for (auto &it_per_frame : 
it_per_id.feature_per_frame) { imu_j++; if (imu_i == imu_j) continue; Vector3d pts_j = it_per_frame.point; if (ESTIMATE_TD) { ProjectionTdFactor *f_td = new ProjectionTdFactor(pts_i, pts_j, it_per_id.feature_per_frame[0].velocity, it_per_frame.velocity, it_per_id.feature_per_frame[0].cur_td, it_per_frame.cur_td, it_per_id.feature_per_frame[0].uv.y(), it_per_frame.uv.y()); ResidualBlockInfo *residual_block_info = new ResidualBlockInfo(f_td, loss_function, vector{para_Pose[imu_i], para_Pose[imu_j], para_Ex_Pose[0], para_Feature[feature_index], para_Td[0]}, vector{0, 3}); marginalization_info->addResidualBlockInfo(residual_block_info); } else { ProjectionFactor *f = new ProjectionFactor(pts_i, pts_j); ResidualBlockInfo *residual_block_info = new ResidualBlockInfo(f, loss_function, vector{para_Pose[imu_i], para_Pose[imu_j], para_Ex_Pose[0], para_Feature[feature_index]}, vector{0, 3}); marginalization_info->addResidualBlockInfo(residual_block_info); } } } } TicToc t_pre_margin; marginalization_info->preMarginalize(); ROS_DEBUG("pre marginalization %f ms", t_pre_margin.toc()); TicToc t_margin; marginalization_info->marginalize(); ROS_DEBUG("marginalization %f ms", t_margin.toc()); std::unordered_map addr_shift; for (int i = 1; i <= WINDOW_SIZE; i++) { addr_shift[reinterpret_cast(para_Pose[i])] = para_Pose[i - 1]; addr_shift[reinterpret_cast(para_SpeedBias[i])] = para_SpeedBias[i - 1]; } for (int i = 0; i < NUM_OF_CAM; i++) addr_shift[reinterpret_cast(para_Ex_Pose[i])] = para_Ex_Pose[i]; if (ESTIMATE_TD) { addr_shift[reinterpret_cast(para_Td[0])] = para_Td[0]; } vector parameter_blocks = marginalization_info->getParameterBlocks(addr_shift); if (last_marginalization_info) delete last_marginalization_info; last_marginalization_info = marginalization_info; last_marginalization_parameter_blocks = parameter_blocks; } else { if (last_marginalization_info && std::count(std::begin(last_marginalization_parameter_blocks), std::end(last_marginalization_parameter_blocks), 
para_Pose[WINDOW_SIZE - 1])) { MarginalizationInfo *marginalization_info = new MarginalizationInfo(); vector2double(); if (last_marginalization_info) { vector drop_set; for (int i = 0; i < static_cast(last_marginalization_parameter_blocks.size()); i++) { ROS_ASSERT(last_marginalization_parameter_blocks[i] != para_SpeedBias[WINDOW_SIZE - 1]); if (last_marginalization_parameter_blocks[i] == para_Pose[WINDOW_SIZE - 1]) drop_set.push_back(i); } // construct new marginlization_factor MarginalizationFactor *marginalization_factor = new MarginalizationFactor(last_marginalization_info); ResidualBlockInfo *residual_block_info = new ResidualBlockInfo(marginalization_factor, NULL, last_marginalization_parameter_blocks, drop_set); marginalization_info->addResidualBlockInfo(residual_block_info); } TicToc t_pre_margin; ROS_DEBUG("begin marginalization"); marginalization_info->preMarginalize(); ROS_DEBUG("end pre marginalization, %f ms", t_pre_margin.toc()); TicToc t_margin; ROS_DEBUG("begin marginalization"); marginalization_info->marginalize(); ROS_DEBUG("end marginalization, %f ms", t_margin.toc()); std::unordered_map addr_shift; for (int i = 0; i <= WINDOW_SIZE; i++) { if (i == WINDOW_SIZE - 1) continue; else if (i == WINDOW_SIZE) { addr_shift[reinterpret_cast(para_Pose[i])] = para_Pose[i - 1]; addr_shift[reinterpret_cast(para_SpeedBias[i])] = para_SpeedBias[i - 1]; } else { addr_shift[reinterpret_cast(para_Pose[i])] = para_Pose[i]; addr_shift[reinterpret_cast(para_SpeedBias[i])] = para_SpeedBias[i]; } } for (int i = 0; i < NUM_OF_CAM; i++) addr_shift[reinterpret_cast(para_Ex_Pose[i])] = para_Ex_Pose[i]; if (ESTIMATE_TD) { addr_shift[reinterpret_cast(para_Td[0])] = para_Td[0]; } vector parameter_blocks = marginalization_info->getParameterBlocks(addr_shift); if (last_marginalization_info) delete last_marginalization_info; last_marginalization_info = marginalization_info; last_marginalization_parameter_blocks = parameter_blocks; } } ROS_DEBUG("whole marginalization costs: %f", 
t_whole_marginalization.toc()); ROS_DEBUG("whole time for ceres: %f", t_whole.toc()); } void Estimator::slideWindow() { TicToc t_margin; if (marginalization_flag == MARGIN_OLD) { double t_0 = Headers[0].stamp.toSec(); back_R0 = Rs[0]; back_P0 = Ps[0]; if (frame_count == WINDOW_SIZE) { for (int i = 0; i < WINDOW_SIZE; i++) { Rs[i].swap(Rs[i + 1]); std::swap(pre_integrations[i], pre_integrations[i + 1]); dt_buf[i].swap(dt_buf[i + 1]); linear_acceleration_buf[i].swap(linear_acceleration_buf[i + 1]); angular_velocity_buf[i].swap(angular_velocity_buf[i + 1]); Headers[i] = Headers[i + 1]; Ps[i].swap(Ps[i + 1]); Vs[i].swap(Vs[i + 1]); Bas[i].swap(Bas[i + 1]); Bgs[i].swap(Bgs[i + 1]); } Headers[WINDOW_SIZE] = Headers[WINDOW_SIZE - 1]; Ps[WINDOW_SIZE] = Ps[WINDOW_SIZE - 1]; Vs[WINDOW_SIZE] = Vs[WINDOW_SIZE - 1]; Rs[WINDOW_SIZE] = Rs[WINDOW_SIZE - 1]; Bas[WINDOW_SIZE] = Bas[WINDOW_SIZE - 1]; Bgs[WINDOW_SIZE] = Bgs[WINDOW_SIZE - 1]; delete pre_integrations[WINDOW_SIZE]; pre_integrations[WINDOW_SIZE] = new IntegrationBase{acc_0, gyr_0, Bas[WINDOW_SIZE], Bgs[WINDOW_SIZE]}; dt_buf[WINDOW_SIZE].clear(); linear_acceleration_buf[WINDOW_SIZE].clear(); angular_velocity_buf[WINDOW_SIZE].clear(); if (true || solver_flag == INITIAL) { map::iterator it_0; it_0 = all_image_frame.find(t_0); delete it_0->second.pre_integration; it_0->second.pre_integration = nullptr; for (map::iterator it = all_image_frame.begin(); it != it_0; ++it) { if (it->second.pre_integration) delete it->second.pre_integration; it->second.pre_integration = NULL; } all_image_frame.erase(all_image_frame.begin(), it_0); all_image_frame.erase(t_0); } slideWindowOld(); } } else { if (frame_count == WINDOW_SIZE) { for (unsigned int i = 0; i < dt_buf[frame_count].size(); i++) { double tmp_dt = dt_buf[frame_count][i]; Vector3d tmp_linear_acceleration = linear_acceleration_buf[frame_count][i]; Vector3d tmp_angular_velocity = angular_velocity_buf[frame_count][i]; pre_integrations[frame_count - 1]->push_back(tmp_dt, 
tmp_linear_acceleration, tmp_angular_velocity); dt_buf[frame_count - 1].push_back(tmp_dt); linear_acceleration_buf[frame_count - 1].push_back(tmp_linear_acceleration); angular_velocity_buf[frame_count - 1].push_back(tmp_angular_velocity); } Headers[frame_count - 1] = Headers[frame_count]; Ps[frame_count - 1] = Ps[frame_count]; Vs[frame_count - 1] = Vs[frame_count]; Rs[frame_count - 1] = Rs[frame_count]; Bas[frame_count - 1] = Bas[frame_count]; Bgs[frame_count - 1] = Bgs[frame_count]; delete pre_integrations[WINDOW_SIZE]; pre_integrations[WINDOW_SIZE] = new IntegrationBase{acc_0, gyr_0, Bas[WINDOW_SIZE], Bgs[WINDOW_SIZE]}; dt_buf[WINDOW_SIZE].clear(); linear_acceleration_buf[WINDOW_SIZE].clear(); angular_velocity_buf[WINDOW_SIZE].clear(); slideWindowNew(); } } } // real marginalization is removed in solve_ceres() void Estimator::slideWindowNew() { sum_of_front++; f_manager.removeFront(frame_count); } // real marginalization is removed in solve_ceres() void Estimator::slideWindowOld() { sum_of_back++; bool shift_depth = solver_flag == NON_LINEAR ? 
                       true : false;
    if (shift_depth)
    {
        Matrix3d R0, R1;
        Vector3d P0, P1;
        R0 = back_R0 * ric[0];
        R1 = Rs[0] * ric[0];
        P0 = back_P0 + back_R0 * tic[0];
        P1 = Ps[0] + Rs[0] * tic[0];
        f_manager.removeBackShiftDepth(R0, P0, R1, P1);
    }
    else
        f_manager.removeBack();
}

void Estimator::setReloFrame(double _frame_stamp, int _frame_index, vector<Vector3d> &_match_points, Vector3d _relo_t, Matrix3d _relo_r)
{
    relo_frame_stamp = _frame_stamp;
    relo_frame_index = _frame_index;
    match_points.clear();
    match_points = _match_points;
    prev_relo_t = _relo_t;
    prev_relo_r = _relo_r;
    for (int i = 0; i < WINDOW_SIZE; i++)
    {
        if (relo_frame_stamp == Headers[i].stamp.toSec())
        {
            relo_frame_local_index = i;
            relocalization_info = 1;
            for (int j = 0; j < SIZE_POSE; j++)
                relo_Pose[j] = para_Pose[i][j];
        }
    }
}

================================================
FILE: rpvio_estimator/src/estimator.h
================================================
#pragma once

#include "parameters.h"
#include "feature_manager.h"
#include "utility/utility.h"
#include "utility/tic_toc.h"
#include "initial/solve_5pts.h"
#include "initial/initial_sfm.h"
#include "initial/initial_alignment.h"
#include "initial/initial_ex_rotation.h"
#include <std_msgs/Header.h>
#include <std_msgs/Float32.h>
#include <ceres/ceres.h>
#include "factor/imu_factor.h"
#include "factor/pose_local_parameterization.h"
#include "factor/projection_factor.h"
#include "factor/projection_td_factor.h"
#include "factor/marginalization_factor.h"
#include "factor/homography_factor.h"
#include <unordered_map>
#include <queue>
#include <opencv2/core/eigen.hpp>

class Estimator
{
  public:
    Estimator();
    void setParameter();

    // interface
    void processIMU(double t, const Vector3d &linear_acceleration, const Vector3d &angular_velocity);
    void processImage(const map<int, vector<pair<int, Eigen::Matrix<double, 7, 1>>>> &image, const std_msgs::Header &header);
    void setReloFrame(double _frame_stamp, int _frame_index, vector<Vector3d> &_match_points, Vector3d _relo_t, Matrix3d _relo_r);

    // internal
    void clearState();
    bool initialStructure();
    bool visualInitialAlign(map<int, Vector3d> &relative_T, Eigen::Vector3d &lt);
    bool relativePose(Matrix3d &relative_R, Vector3d
                      &relative_T, int &l);
    bool relativeHPose(map<int, Matrix3d> &relative_R, map<int, Vector3d> &relative_T, map<int, Vector3d> &n, int &l, int &lplane_id);
    void initializeNewPlanes();
    void slideWindow();
    void solveOdometry();
    void slideWindowNew();
    void slideWindowOld();
    void optimization();
    void vector2double();
    void double2vector();
    bool failureDetection();

    enum SolverFlag
    {
        INITIAL,
        NON_LINEAR
    };
    enum MarginalizationFlag
    {
        MARGIN_OLD = 0,
        MARGIN_SECOND_NEW = 1
    };

    SolverFlag solver_flag;
    MarginalizationFlag marginalization_flag;
    Vector3d g;
    MatrixXd Ap[2], backup_A;
    VectorXd bp[2], backup_b;

    Matrix3d ric[NUM_OF_CAM];
    Vector3d tic[NUM_OF_CAM];

    Vector3d Ps[(WINDOW_SIZE + 1)];
    Vector3d Vs[(WINDOW_SIZE + 1)];
    Matrix3d Rs[(WINDOW_SIZE + 1)];
    Vector3d Bas[(WINDOW_SIZE + 1)];
    Vector3d Bgs[(WINDOW_SIZE + 1)];
    double td;
    double s;
    int lplane_id;

    Matrix3d back_R0, last_R, last_R0;
    Vector3d back_P0, last_P, last_P0;
    std_msgs::Header Headers[(WINDOW_SIZE + 1)];

    IntegrationBase *pre_integrations[(WINDOW_SIZE + 1)];
    Vector3d acc_0, gyr_0;

    vector<double> dt_buf[(WINDOW_SIZE + 1)];
    vector<Vector3d> linear_acceleration_buf[(WINDOW_SIZE + 1)];
    vector<Vector3d> angular_velocity_buf[(WINDOW_SIZE + 1)];

    int frame_count;
    int sum_of_outlier, sum_of_back, sum_of_front, sum_of_invalid;

    FeatureManager f_manager;
    MotionEstimator m_estimator;
    InitialEXRotation initial_ex_rotation;

    bool first_imu;
    bool is_valid, is_key;
    bool failure_occur;

    vector<Vector3d> point_cloud;
    vector<Vector3d> margin_cloud;
    vector<Vector3d> key_poses;
    double initial_timestamp;

    double para_Pose[WINDOW_SIZE + 1][SIZE_POSE];
    double para_SpeedBias[WINDOW_SIZE + 1][SIZE_SPEEDBIAS];
    double para_Feature[NUM_OF_F][SIZE_FEATURE];
    double para_Ex_Pose[NUM_OF_CAM][SIZE_POSE];
    double para_Retrive_Pose[SIZE_POSE];
    double para_Td[1][1];
    double para_Tr[1][1];

    // In global camera frame (c_0)
    map<int, vector<double>> para_N;
    map<int, vector<double>> para_d;
    vector<int> init_pids;

    int loop_window_index;

    MarginalizationInfo *last_marginalization_info;
    vector<double *> last_marginalization_parameter_blocks;

    map<double, ImageFrame> all_image_frame;
    IntegrationBase *tmp_pre_integration;

    // relocalization variables
bool relocalization_info; double relo_frame_stamp; double relo_frame_index; int relo_frame_local_index; vector match_points; double relo_Pose[SIZE_POSE]; Matrix3d drift_correct_r; Vector3d drift_correct_t; Vector3d prev_relo_t; Matrix3d prev_relo_r; Vector3d relo_relative_t; Quaterniond relo_relative_q; double relo_relative_yaw; }; ================================================ FILE: rpvio_estimator/src/estimator_node.cpp ================================================ #include #include #include #include #include #include #include #include #include #include "estimator.h" #include "parameters.h" #include "utility/visualization.h" Estimator estimator; std::condition_variable con; double current_time = -1; queue imu_buf; queue feature_buf; queue relo_buf; int sum_of_wait = 0; std::mutex m_buf; std::mutex m_state; std::mutex i_buf; std::mutex m_estimator; double latest_time; Eigen::Vector3d tmp_P; Eigen::Quaterniond tmp_Q; Eigen::Vector3d tmp_V; Eigen::Vector3d tmp_Ba; Eigen::Vector3d tmp_Bg; Eigen::Vector3d acc_0; Eigen::Vector3d gyr_0; bool init_feature = 0; bool init_imu = 1; double last_imu_t = 0; void predict(const sensor_msgs::ImuConstPtr &imu_msg) { double t = imu_msg->header.stamp.toSec(); if (init_imu) { latest_time = t; init_imu = 0; return; } double dt = t - latest_time; latest_time = t; double dx = imu_msg->linear_acceleration.x; double dy = imu_msg->linear_acceleration.y; double dz = imu_msg->linear_acceleration.z; Eigen::Vector3d linear_acceleration{dx, dy, dz}; double rx = imu_msg->angular_velocity.x; double ry = imu_msg->angular_velocity.y; double rz = imu_msg->angular_velocity.z; Eigen::Vector3d angular_velocity{rx, ry, rz}; Eigen::Vector3d un_acc_0 = tmp_Q * (acc_0 - tmp_Ba) - estimator.g; Eigen::Vector3d un_gyr = 0.5 * (gyr_0 + angular_velocity) - tmp_Bg; tmp_Q = tmp_Q * Utility::deltaQ(un_gyr * dt); Eigen::Vector3d un_acc_1 = tmp_Q * (linear_acceleration - tmp_Ba) - estimator.g; Eigen::Vector3d un_acc = 0.5 * (un_acc_0 + un_acc_1); tmp_P = tmp_P 
+ dt * tmp_V + 0.5 * dt * dt * un_acc; tmp_V = tmp_V + dt * un_acc; acc_0 = linear_acceleration; gyr_0 = angular_velocity; } void update() { TicToc t_predict; latest_time = current_time; tmp_P = estimator.Ps[WINDOW_SIZE]; tmp_Q = estimator.Rs[WINDOW_SIZE]; tmp_V = estimator.Vs[WINDOW_SIZE]; tmp_Ba = estimator.Bas[WINDOW_SIZE]; tmp_Bg = estimator.Bgs[WINDOW_SIZE]; acc_0 = estimator.acc_0; gyr_0 = estimator.gyr_0; queue<sensor_msgs::ImuConstPtr> tmp_imu_buf = imu_buf; for (sensor_msgs::ImuConstPtr tmp_imu_msg; !tmp_imu_buf.empty(); tmp_imu_buf.pop()) predict(tmp_imu_buf.front()); } std::vector<std::pair<std::vector<sensor_msgs::ImuConstPtr>, sensor_msgs::PointCloudConstPtr>> getMeasurements() { std::vector<std::pair<std::vector<sensor_msgs::ImuConstPtr>, sensor_msgs::PointCloudConstPtr>> measurements; while (true) { if (imu_buf.empty() || feature_buf.empty()) return measurements; if (!(imu_buf.back()->header.stamp.toSec() > feature_buf.front()->header.stamp.toSec() + estimator.td)) { //ROS_WARN("wait for imu, only should happen at the beginning"); sum_of_wait++; return measurements; } if (!(imu_buf.front()->header.stamp.toSec() < feature_buf.front()->header.stamp.toSec() + estimator.td)) { ROS_WARN("throw img, only should happen at the beginning"); feature_buf.pop(); continue; } sensor_msgs::PointCloudConstPtr img_msg = feature_buf.front(); feature_buf.pop(); std::vector<sensor_msgs::ImuConstPtr> IMUs; while (imu_buf.front()->header.stamp.toSec() < img_msg->header.stamp.toSec() + estimator.td) { IMUs.emplace_back(imu_buf.front()); imu_buf.pop(); } IMUs.emplace_back(imu_buf.front()); if (IMUs.empty()) ROS_WARN("no imu between two image"); measurements.emplace_back(IMUs, img_msg); } return measurements; } void imu_callback(const sensor_msgs::ImuConstPtr &imu_msg) { if (imu_msg->header.stamp.toSec() <= last_imu_t) { ROS_WARN("imu message in disorder!"); return; } last_imu_t = imu_msg->header.stamp.toSec(); m_buf.lock(); imu_buf.push(imu_msg); m_buf.unlock(); con.notify_one(); last_imu_t = imu_msg->header.stamp.toSec(); { std::lock_guard<std::mutex> lg(m_state); predict(imu_msg); std_msgs::Header header = imu_msg->header; header.frame_id = "world"; if (estimator.solver_flag == Estimator::SolverFlag::NON_LINEAR) pubLatestOdometry(tmp_P, tmp_Q, tmp_V, header); } } void feature_callback(const sensor_msgs::PointCloudConstPtr &feature_msg) { if (!init_feature) { //skip the first detected feature, which doesn't contain optical flow speed init_feature = 1; return; } m_buf.lock(); feature_buf.push(feature_msg); m_buf.unlock(); con.notify_one(); } void restart_callback(const std_msgs::BoolConstPtr &restart_msg) { if (restart_msg->data == true) { ROS_WARN("restart the estimator!"); m_buf.lock(); while(!feature_buf.empty()) feature_buf.pop(); while(!imu_buf.empty()) imu_buf.pop(); m_buf.unlock(); m_estimator.lock(); estimator.clearState(); estimator.setParameter(); m_estimator.unlock(); current_time = -1; last_imu_t = 0; } return; } void relocalization_callback(const sensor_msgs::PointCloudConstPtr &points_msg) { //printf("relocalization callback! \n"); m_buf.lock(); relo_buf.push(points_msg); m_buf.unlock(); } // thread: visual-inertial odometry void process() { while (true) { std::vector<std::pair<std::vector<sensor_msgs::ImuConstPtr>, sensor_msgs::PointCloudConstPtr>> measurements; std::unique_lock<std::mutex> lk(m_buf); con.wait(lk, [&] { return (measurements = getMeasurements()).size() != 0; }); lk.unlock(); m_estimator.lock(); for (auto &measurement : measurements) { auto img_msg = measurement.second; double dx = 0, dy = 0, dz = 0, rx = 0, ry = 0, rz = 0; for (auto &imu_msg : measurement.first) { double t = imu_msg->header.stamp.toSec(); double img_t = img_msg->header.stamp.toSec() + estimator.td; if (t <= img_t) { if (current_time < 0) current_time = t; double dt = t - current_time; ROS_ASSERT(dt >= 0); current_time = t; dx = imu_msg->linear_acceleration.x; dy = imu_msg->linear_acceleration.y; dz = imu_msg->linear_acceleration.z; rx = imu_msg->angular_velocity.x; ry = imu_msg->angular_velocity.y; rz = imu_msg->angular_velocity.z; estimator.processIMU(dt, Vector3d(dx, dy, dz), Vector3d(rx, ry, rz)); //printf("imu: dt:%f a: %f %f %f w: %f %f %f\n",dt, dx, dy, dz, rx, ry, rz); } else { double dt_1 = img_t - current_time; double dt_2 = t - img_t; current_time = img_t; ROS_ASSERT(dt_1 >= 0); ROS_ASSERT(dt_2 >= 0); ROS_ASSERT(dt_1 + dt_2 > 0); double w1 = dt_2 / (dt_1 + dt_2); double w2 = dt_1 / (dt_1 + dt_2); dx = w1 * dx + w2 * imu_msg->linear_acceleration.x; dy = w1 * dy + w2 * imu_msg->linear_acceleration.y; dz = w1 * dz + w2 * imu_msg->linear_acceleration.z; rx = w1 * rx + w2 * imu_msg->angular_velocity.x; ry = w1 * ry + w2 * imu_msg->angular_velocity.y; rz = w1 * rz + w2 * imu_msg->angular_velocity.z; estimator.processIMU(dt_1, Vector3d(dx, dy, dz), Vector3d(rx, ry, rz)); //printf("dimu: dt:%f a: %f %f %f w: %f %f %f\n",dt_1, dx, dy, dz, rx, ry, rz); } } // set relocalization frame sensor_msgs::PointCloudConstPtr relo_msg = NULL; while (!relo_buf.empty()) { relo_msg = relo_buf.front(); relo_buf.pop(); } if (relo_msg != NULL) { vector<Vector3d> match_points; double frame_stamp = relo_msg->header.stamp.toSec(); for (unsigned int i = 0; i < relo_msg->points.size(); i++) { Vector3d u_v_id; u_v_id.x() = relo_msg->points[i].x; u_v_id.y() = relo_msg->points[i].y; u_v_id.z() = relo_msg->points[i].z; match_points.push_back(u_v_id); } Vector3d relo_t(relo_msg->channels[0].values[0], relo_msg->channels[0].values[1], relo_msg->channels[0].values[2]); Quaterniond relo_q(relo_msg->channels[0].values[3], relo_msg->channels[0].values[4], relo_msg->channels[0].values[5], relo_msg->channels[0].values[6]); Matrix3d relo_r = relo_q.toRotationMatrix(); int frame_index; frame_index = relo_msg->channels[0].values[7]; estimator.setReloFrame(frame_stamp, frame_index, match_points, relo_t, relo_r); } ROS_DEBUG("processing vision data with stamp %f \n", img_msg->header.stamp.toSec()); TicToc t_s; map<int, vector<pair<int, Eigen::Matrix<double, 8, 1>>>> image; for (unsigned int i = 0; i < img_msg->points.size(); i++) { int v = img_msg->channels[0].values[i] + 0.5; int feature_id = v / NUM_OF_CAM; int camera_id = v % NUM_OF_CAM; int plane_id = img_msg->channels[1].values[i]; double x = img_msg->points[i].x; double y = img_msg->points[i].y; double z = img_msg->points[i].z; double p_u = img_msg->channels[2].values[i]; double p_v = img_msg->channels[3].values[i]; double velocity_x = img_msg->channels[4].values[i]; double velocity_y = img_msg->channels[5].values[i]; ROS_ASSERT(z == 1); Eigen::Matrix<double, 8, 1> xyz_uv_velocity_pid; xyz_uv_velocity_pid << x, y, z, p_u, p_v, velocity_x, velocity_y, plane_id; image[feature_id].emplace_back(camera_id, xyz_uv_velocity_pid); } estimator.processImage(image, img_msg->header); double whole_t = t_s.toc(); printStatistics(estimator, whole_t); std_msgs::Header header = img_msg->header; header.frame_id = "world"; pubOdometry(estimator, header); pubKeyPoses(estimator, header); pubCameraPose(estimator, header); pubPointCloud(estimator, header); pubTF(estimator, header); pubKeyframe(estimator); if (relo_msg != NULL) pubRelocalization(estimator); //ROS_ERROR("end: %f, at %f", img_msg->header.stamp.toSec(), ros::Time::now().toSec()); } m_estimator.unlock(); m_buf.lock(); m_state.lock(); if (estimator.solver_flag == Estimator::SolverFlag::NON_LINEAR) update(); m_state.unlock(); m_buf.unlock(); } } int main(int argc, char **argv) { ros::init(argc, argv, "rpvio_estimator"); ros::NodeHandle n("~"); ros::console::set_logger_level(ROSCONSOLE_DEFAULT_NAME, ros::console::levels::Info); readParameters(n); estimator.setParameter(); #ifdef EIGEN_DONT_PARALLELIZE ROS_DEBUG("EIGEN_DONT_PARALLELIZE"); #endif ROS_WARN("waiting for image and imu..."); registerPub(n); ros::Subscriber sub_imu = n.subscribe(IMU_TOPIC, 2000, imu_callback, ros::TransportHints().tcpNoDelay()); ros::Subscriber sub_image = n.subscribe("/rpvio_feature_tracker/feature", 2000, feature_callback); ros::Subscriber sub_restart = n.subscribe("/rpvio_feature_tracker/restart", 2000, restart_callback); ros::Subscriber sub_relo_points = n.subscribe("/pose_graph/match_points", 2000, relocalization_callback); std::thread measurement_process{process}; ros::spin(); return 0; }
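The `else` branch in `process()` above splits the IMU sample that straddles the image timestamp by linear interpolation, weighting the two bracketing samples by their proximity to the image time (`w1 = dt_2 / (dt_1 + dt_2)`, `w2 = dt_1 / (dt_1 + dt_2)`). A minimal standalone sketch of that weighting, with a hypothetical helper name not present in the repository:

```python
def interpolate_imu(t_prev, prev, t_next, nxt, t_img):
    """Interpolate an IMU reading at the image timestamp t_img, given the
    last sample (t_prev, prev) before it and the straddling sample
    (t_next, nxt) after it. Mirrors the w1/w2 weighting in process():
    the sample closer in time to the image receives the larger weight."""
    dt_1 = t_img - t_prev   # image time minus previous IMU sample time
    dt_2 = t_next - t_img   # straddling IMU sample time minus image time
    assert dt_1 >= 0 and dt_2 >= 0 and dt_1 + dt_2 > 0
    w1 = dt_2 / (dt_1 + dt_2)
    w2 = dt_1 / (dt_1 + dt_2)
    return tuple(w1 * a + w2 * b for a, b in zip(prev, nxt))

# Midway between the two samples both weights are 0.5, i.e. a simple average.
mid = interpolate_imu(0.0, (0.0, 0.0, 9.8), 0.2, (1.0, 0.0, 9.8), 0.1)
```

The estimator then calls `processIMU` with `dt_1` and this interpolated measurement, so the preintegration interval ends exactly at the (time-offset-corrected) image timestamp.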
================================================ FILE: rpvio_estimator/src/factor/homography_factor.h ================================================ #pragma once #include <ceres/ceres.h> #include <Eigen/Dense> struct HomographyFactor { HomographyFactor(const Eigen::Vector3d &_pts_i, const Eigen::Vector3d &_pts_j) : pts_i(_pts_i), pts_j(_pts_j) {} template <typename T> bool operator()(const T* const pose_i, const T* const pose_j, const T* const para_n, const T* const para_depth, const T* const ex_pose, T* residuals) const { Eigen::Map<const Eigen::Matrix<T, 3, 1>> pi(pose_i); Eigen::Quaternion<T> qi; qi.coeffs() << pose_i[3], pose_i[4], pose_i[5], pose_i[6]; Eigen::Map<const Eigen::Matrix<T, 3, 1>> pj(pose_j); Eigen::Quaternion<T> qj; qj.coeffs() << pose_j[3], pose_j[4], pose_j[5], pose_j[6]; Eigen::Map<const Eigen::Matrix<T, 3, 1>> tic(ex_pose); Eigen::Quaternion<T> qic; qic.coeffs() << ex_pose[3], ex_pose[4], ex_pose[5], ex_pose[6]; Eigen::Matrix<T, 3, 1> n; n << para_n[0], para_n[1], para_n[2]; // transform camera normal to imu normal Eigen::Matrix<T, 3, 1> n_imu_0 = qic*n;// + tic; // transform imu 0 normal to imu i normal Eigen::Matrix<T, 3, 1> n_imu_i = qi.inverse()*n_imu_0;// - qi.inverse()*pi; Eigen::Map<const Eigen::Matrix<T, 1, 1>> depth(para_depth); Eigen::Quaternion<T> qji = qj.inverse() * qi; Eigen::Matrix<T, 3, 1> tji = qj.inverse() * (pi - pj); Eigen::Matrix<T, 1, 1> di, di0; // convert camera depth to imu frame di0(0,0) = depth(0,0) + tic.dot(n_imu_0); // convert imu 0 depth to imu i depth di(0,0) = di0(0,0) - pi.dot(n_imu_0); Eigen::Matrix<T, 3, 1> pts_imu_i = qic * pts_i.cast<T>() + tic; // homography mapping Eigen::Matrix<T, 3, 1> pts_imu_j = qji * pts_imu_i + (tji*(1.0/di(0,0)) * n_imu_i.transpose()) * pts_imu_i; Eigen::Matrix<T, 3, 1> pts_cam_j = qic.inverse() * (pts_imu_j - tic); pts_cam_j = (pts_cam_j / pts_cam_j[2]); residuals[0] = pts_cam_j[0] - pts_j[0]; residuals[1] = pts_cam_j[1] - pts_j[1]; return true; } static ceres::CostFunction* Create(const Eigen::Vector3d &_pts_i, const Eigen::Vector3d &_pts_j) { return (new ceres::AutoDiffCostFunction<HomographyFactor, 2, 7, 7, 3, 1, 7> (new HomographyFactor(_pts_i, _pts_j))); } Eigen::Vector3d pts_i, pts_j; }; ================================================ FILE:
rpvio_estimator/src/factor/imu_factor.h ================================================ #pragma once #include <ros/assert.h> #include <iostream> #include <eigen3/Eigen/Dense> #include "../utility/utility.h" #include "../parameters.h" #include "integration_base.h" #include <ceres/ceres.h> class IMUFactor : public ceres::SizedCostFunction<15, 7, 9, 7, 9> { public: IMUFactor() = delete; IMUFactor(IntegrationBase* _pre_integration):pre_integration(_pre_integration) { } virtual bool Evaluate(double const *const *parameters, double *residuals, double **jacobians) const { Eigen::Vector3d Pi(parameters[0][0], parameters[0][1], parameters[0][2]); Eigen::Quaterniond Qi(parameters[0][6], parameters[0][3], parameters[0][4], parameters[0][5]); Eigen::Vector3d Vi(parameters[1][0], parameters[1][1], parameters[1][2]); Eigen::Vector3d Bai(parameters[1][3], parameters[1][4], parameters[1][5]); Eigen::Vector3d Bgi(parameters[1][6], parameters[1][7], parameters[1][8]); Eigen::Vector3d Pj(parameters[2][0], parameters[2][1], parameters[2][2]); Eigen::Quaterniond Qj(parameters[2][6], parameters[2][3], parameters[2][4], parameters[2][5]); Eigen::Vector3d Vj(parameters[3][0], parameters[3][1], parameters[3][2]); Eigen::Vector3d Baj(parameters[3][3], parameters[3][4], parameters[3][5]); Eigen::Vector3d Bgj(parameters[3][6], parameters[3][7], parameters[3][8]); //Eigen::Matrix<double, 15, 15> Fd; //Eigen::Matrix<double, 15, 12> Gd; //Eigen::Vector3d pPj = Pi + Vi * sum_t - 0.5 * g * sum_t * sum_t + corrected_delta_p; //Eigen::Quaterniond pQj = Qi * delta_q; //Eigen::Vector3d pVj = Vi - g * sum_t + corrected_delta_v; //Eigen::Vector3d pBaj = Bai; //Eigen::Vector3d pBgj = Bgi; //Vi + Qi * delta_v - g * sum_dt = Vj; //Qi * delta_q = Qj; //delta_p = Qi.inverse() * (0.5 * g * sum_dt * sum_dt + Pj - Pi); //delta_v = Qi.inverse() * (g * sum_dt + Vj - Vi); //delta_q = Qi.inverse() * Qj; #if 0 if ((Bai - pre_integration->linearized_ba).norm() > 0.10 || (Bgi - pre_integration->linearized_bg).norm() > 0.01) { pre_integration->repropagate(Bai, Bgi); } #endif Eigen::Map<Eigen::Matrix<double, 15, 1>> residual(residuals); residual = pre_integration->evaluate(Pi, Qi, Vi, Bai, Bgi, Pj, Qj, Vj, Baj, Bgj); Eigen::Matrix<double, 15, 15> sqrt_info = Eigen::LLT<Eigen::Matrix<double, 15, 15>>(pre_integration->covariance.inverse()).matrixL().transpose(); //sqrt_info.setIdentity(); residual = sqrt_info * residual; if (jacobians) { double sum_dt = pre_integration->sum_dt; Eigen::Matrix3d dp_dba = pre_integration->jacobian.template block<3, 3>(O_P, O_BA); Eigen::Matrix3d dp_dbg = pre_integration->jacobian.template block<3, 3>(O_P, O_BG); Eigen::Matrix3d dq_dbg = pre_integration->jacobian.template block<3, 3>(O_R, O_BG); Eigen::Matrix3d dv_dba = pre_integration->jacobian.template block<3, 3>(O_V, O_BA); Eigen::Matrix3d dv_dbg = pre_integration->jacobian.template block<3, 3>(O_V, O_BG); if (pre_integration->jacobian.maxCoeff() > 1e8 || pre_integration->jacobian.minCoeff() < -1e8) { ROS_WARN("numerical unstable in preintegration"); //std::cout << pre_integration->jacobian << std::endl; /// ROS_BREAK(); } if (jacobians[0]) { Eigen::Map<Eigen::Matrix<double, 15, 7, Eigen::RowMajor>> jacobian_pose_i(jacobians[0]); jacobian_pose_i.setZero(); jacobian_pose_i.block<3, 3>(O_P, O_P) = -Qi.inverse().toRotationMatrix(); jacobian_pose_i.block<3, 3>(O_P, O_R) = Utility::skewSymmetric(Qi.inverse() * (0.5 * G * sum_dt * sum_dt + Pj - Pi - Vi * sum_dt)); #if 0 jacobian_pose_i.block<3, 3>(O_R, O_R) = -(Qj.inverse() * Qi).toRotationMatrix(); #else Eigen::Quaterniond corrected_delta_q = pre_integration->delta_q * Utility::deltaQ(dq_dbg * (Bgi - pre_integration->linearized_bg)); jacobian_pose_i.block<3, 3>(O_R, O_R) = -(Utility::Qleft(Qj.inverse() * Qi) * Utility::Qright(corrected_delta_q)).bottomRightCorner<3, 3>(); #endif jacobian_pose_i.block<3, 3>(O_V, O_R) = Utility::skewSymmetric(Qi.inverse() * (G * sum_dt + Vj - Vi)); jacobian_pose_i = sqrt_info * jacobian_pose_i; if (jacobian_pose_i.maxCoeff() > 1e8 || jacobian_pose_i.minCoeff() < -1e8) { ROS_WARN("numerical unstable in preintegration"); //std::cout << sqrt_info << std::endl; //ROS_BREAK(); } } if (jacobians[1]) { Eigen::Map<Eigen::Matrix<double, 15, 9, Eigen::RowMajor>> jacobian_speedbias_i(jacobians[1]); jacobian_speedbias_i.setZero(); jacobian_speedbias_i.block<3, 3>(O_P, O_V - O_V) = -Qi.inverse().toRotationMatrix() * sum_dt; jacobian_speedbias_i.block<3, 3>(O_P, O_BA - O_V) = -dp_dba; jacobian_speedbias_i.block<3, 3>(O_P, O_BG - O_V) = -dp_dbg; #if 0 jacobian_speedbias_i.block<3, 3>(O_R, O_BG - O_V) = -dq_dbg; #else //Eigen::Quaterniond corrected_delta_q = pre_integration->delta_q * Utility::deltaQ(dq_dbg * (Bgi - pre_integration->linearized_bg)); //jacobian_speedbias_i.block<3, 3>(O_R, O_BG - O_V) = -Utility::Qleft(Qj.inverse() * Qi * corrected_delta_q).bottomRightCorner<3, 3>() * dq_dbg; jacobian_speedbias_i.block<3, 3>(O_R, O_BG - O_V) = -Utility::Qleft(Qj.inverse() * Qi * pre_integration->delta_q).bottomRightCorner<3, 3>() * dq_dbg; #endif jacobian_speedbias_i.block<3, 3>(O_V, O_V - O_V) = -Qi.inverse().toRotationMatrix(); jacobian_speedbias_i.block<3, 3>(O_V, O_BA - O_V) = -dv_dba; jacobian_speedbias_i.block<3, 3>(O_V, O_BG - O_V) = -dv_dbg; jacobian_speedbias_i.block<3, 3>(O_BA, O_BA - O_V) = -Eigen::Matrix3d::Identity(); jacobian_speedbias_i.block<3, 3>(O_BG, O_BG - O_V) = -Eigen::Matrix3d::Identity(); jacobian_speedbias_i = sqrt_info * jacobian_speedbias_i; //ROS_ASSERT(fabs(jacobian_speedbias_i.maxCoeff()) < 1e8); //ROS_ASSERT(fabs(jacobian_speedbias_i.minCoeff()) < 1e8); } if (jacobians[2]) { Eigen::Map<Eigen::Matrix<double, 15, 7, Eigen::RowMajor>> jacobian_pose_j(jacobians[2]); jacobian_pose_j.setZero(); jacobian_pose_j.block<3, 3>(O_P, O_P) = Qi.inverse().toRotationMatrix(); #if 0 jacobian_pose_j.block<3, 3>(O_R, O_R) = Eigen::Matrix3d::Identity(); #else Eigen::Quaterniond corrected_delta_q = pre_integration->delta_q * Utility::deltaQ(dq_dbg * (Bgi - pre_integration->linearized_bg)); jacobian_pose_j.block<3, 3>(O_R, O_R) = Utility::Qleft(corrected_delta_q.inverse() * Qi.inverse() * Qj).bottomRightCorner<3, 3>(); #endif jacobian_pose_j = sqrt_info * jacobian_pose_j; //ROS_ASSERT(fabs(jacobian_pose_j.maxCoeff()) < 1e8); //ROS_ASSERT(fabs(jacobian_pose_j.minCoeff()) < 1e8); } if (jacobians[3]) { Eigen::Map<Eigen::Matrix<double, 15, 9, Eigen::RowMajor>> jacobian_speedbias_j(jacobians[3]); jacobian_speedbias_j.setZero(); jacobian_speedbias_j.block<3, 3>(O_V, O_V - O_V) = Qi.inverse().toRotationMatrix(); jacobian_speedbias_j.block<3, 3>(O_BA, O_BA - O_V) = Eigen::Matrix3d::Identity(); jacobian_speedbias_j.block<3, 3>(O_BG, O_BG - O_V) = Eigen::Matrix3d::Identity(); jacobian_speedbias_j = sqrt_info * jacobian_speedbias_j; //ROS_ASSERT(fabs(jacobian_speedbias_j.maxCoeff()) < 1e8); //ROS_ASSERT(fabs(jacobian_speedbias_j.minCoeff()) < 1e8); } } return true; } //bool Evaluate_Direct(double const *const *parameters, Eigen::Matrix<double, 15, 1> &residuals, Eigen::Matrix<double, 15, 30> &jacobians); //void checkCorrection(); //void checkTransition(); //void checkJacobian(double **parameters); IntegrationBase* pre_integration; }; ================================================ FILE: rpvio_estimator/src/factor/integration_base.h ================================================ #pragma once #include "../utility/utility.h" #include "../parameters.h" #include <ceres/ceres.h> using namespace Eigen; class IntegrationBase { public: IntegrationBase() = delete; IntegrationBase(const Eigen::Vector3d &_acc_0, const Eigen::Vector3d &_gyr_0, const Eigen::Vector3d &_linearized_ba, const Eigen::Vector3d &_linearized_bg) : acc_0{_acc_0}, gyr_0{_gyr_0}, linearized_acc{_acc_0}, linearized_gyr{_gyr_0}, linearized_ba{_linearized_ba}, linearized_bg{_linearized_bg}, jacobian{Eigen::Matrix<double, 15, 15>::Identity()}, covariance{Eigen::Matrix<double, 15, 15>::Zero()}, sum_dt{0.0}, delta_p{Eigen::Vector3d::Zero()}, delta_q{Eigen::Quaterniond::Identity()}, delta_v{Eigen::Vector3d::Zero()} { noise = Eigen::Matrix<double, 18, 18>::Zero(); noise.block<3, 3>(0, 0) = (ACC_N * ACC_N) * Eigen::Matrix3d::Identity(); noise.block<3, 3>(3, 3) = (GYR_N * GYR_N) * Eigen::Matrix3d::Identity(); noise.block<3, 3>(6, 6) = (ACC_N * ACC_N) * Eigen::Matrix3d::Identity(); noise.block<3, 3>(9, 9) = (GYR_N * GYR_N) * Eigen::Matrix3d::Identity(); noise.block<3, 3>(12, 12) =
(ACC_W * ACC_W) * Eigen::Matrix3d::Identity(); noise.block<3, 3>(15, 15) = (GYR_W * GYR_W) * Eigen::Matrix3d::Identity(); } void push_back(double dt, const Eigen::Vector3d &acc, const Eigen::Vector3d &gyr) { dt_buf.push_back(dt); acc_buf.push_back(acc); gyr_buf.push_back(gyr); propagate(dt, acc, gyr); } void repropagate(const Eigen::Vector3d &_linearized_ba, const Eigen::Vector3d &_linearized_bg) { sum_dt = 0.0; acc_0 = linearized_acc; gyr_0 = linearized_gyr; delta_p.setZero(); delta_q.setIdentity(); delta_v.setZero(); linearized_ba = _linearized_ba; linearized_bg = _linearized_bg; jacobian.setIdentity(); covariance.setZero(); for (int i = 0; i < static_cast<int>(dt_buf.size()); i++) propagate(dt_buf[i], acc_buf[i], gyr_buf[i]); } void midPointIntegration(double _dt, const Eigen::Vector3d &_acc_0, const Eigen::Vector3d &_gyr_0, const Eigen::Vector3d &_acc_1, const Eigen::Vector3d &_gyr_1, const Eigen::Vector3d &delta_p, const Eigen::Quaterniond &delta_q, const Eigen::Vector3d &delta_v, const Eigen::Vector3d &linearized_ba, const Eigen::Vector3d &linearized_bg, Eigen::Vector3d &result_delta_p, Eigen::Quaterniond &result_delta_q, Eigen::Vector3d &result_delta_v, Eigen::Vector3d &result_linearized_ba, Eigen::Vector3d &result_linearized_bg, bool update_jacobian) { //ROS_INFO("midpoint integration"); Vector3d un_acc_0 = delta_q * (_acc_0 - linearized_ba); Vector3d un_gyr = 0.5 * (_gyr_0 + _gyr_1) - linearized_bg; result_delta_q = delta_q * Quaterniond(1, un_gyr(0) * _dt / 2, un_gyr(1) * _dt / 2, un_gyr(2) * _dt / 2); Vector3d un_acc_1 = result_delta_q * (_acc_1 - linearized_ba); Vector3d un_acc = 0.5 * (un_acc_0 + un_acc_1); result_delta_p = delta_p + delta_v * _dt + 0.5 * un_acc * _dt * _dt; result_delta_v = delta_v + un_acc * _dt; result_linearized_ba = linearized_ba; result_linearized_bg = linearized_bg; if(update_jacobian) { Vector3d w_x = 0.5 * (_gyr_0 + _gyr_1) - linearized_bg; Vector3d a_0_x = _acc_0 - linearized_ba; Vector3d a_1_x = _acc_1 - linearized_ba; Matrix3d
R_w_x, R_a_0_x, R_a_1_x; R_w_x<<0, -w_x(2), w_x(1), w_x(2), 0, -w_x(0), -w_x(1), w_x(0), 0; R_a_0_x<<0, -a_0_x(2), a_0_x(1), a_0_x(2), 0, -a_0_x(0), -a_0_x(1), a_0_x(0), 0; R_a_1_x<<0, -a_1_x(2), a_1_x(1), a_1_x(2), 0, -a_1_x(0), -a_1_x(1), a_1_x(0), 0; MatrixXd F = MatrixXd::Zero(15, 15); F.block<3, 3>(0, 0) = Matrix3d::Identity(); F.block<3, 3>(0, 3) = -0.25 * delta_q.toRotationMatrix() * R_a_0_x * _dt * _dt + -0.25 * result_delta_q.toRotationMatrix() * R_a_1_x * (Matrix3d::Identity() - R_w_x * _dt) * _dt * _dt; F.block<3, 3>(0, 6) = MatrixXd::Identity(3,3) * _dt; F.block<3, 3>(0, 9) = -0.25 * (delta_q.toRotationMatrix() + result_delta_q.toRotationMatrix()) * _dt * _dt; F.block<3, 3>(0, 12) = -0.25 * result_delta_q.toRotationMatrix() * R_a_1_x * _dt * _dt * -_dt; F.block<3, 3>(3, 3) = Matrix3d::Identity() - R_w_x * _dt; F.block<3, 3>(3, 12) = -1.0 * MatrixXd::Identity(3,3) * _dt; F.block<3, 3>(6, 3) = -0.5 * delta_q.toRotationMatrix() * R_a_0_x * _dt + -0.5 * result_delta_q.toRotationMatrix() * R_a_1_x * (Matrix3d::Identity() - R_w_x * _dt) * _dt; F.block<3, 3>(6, 6) = Matrix3d::Identity(); F.block<3, 3>(6, 9) = -0.5 * (delta_q.toRotationMatrix() + result_delta_q.toRotationMatrix()) * _dt; F.block<3, 3>(6, 12) = -0.5 * result_delta_q.toRotationMatrix() * R_a_1_x * _dt * -_dt; F.block<3, 3>(9, 9) = Matrix3d::Identity(); F.block<3, 3>(12, 12) = Matrix3d::Identity(); //cout<<"A"<<endl<<A<<endl; MatrixXd V = MatrixXd::Zero(15,18); V.block<3, 3>(0, 0) = 0.25 * delta_q.toRotationMatrix() * _dt * _dt; V.block<3, 3>(0, 3) = 0.25 * -result_delta_q.toRotationMatrix() * R_a_1_x * _dt * _dt * 0.5 * _dt; V.block<3, 3>(0, 6) = 0.25 * result_delta_q.toRotationMatrix() * _dt * _dt; V.block<3, 3>(0, 9) = V.block<3, 3>(0, 3); V.block<3, 3>(3, 3) = 0.5 * MatrixXd::Identity(3,3) * _dt; V.block<3, 3>(3, 9) = 0.5 * MatrixXd::Identity(3,3) * _dt; V.block<3, 3>(6, 0) = 0.5 * delta_q.toRotationMatrix() * _dt; V.block<3, 3>(6, 3) = 0.5 * -result_delta_q.toRotationMatrix() * R_a_1_x * _dt * 0.5 * _dt; V.block<3, 3>(6, 6) = 0.5 *
result_delta_q.toRotationMatrix() * _dt; V.block<3, 3>(6, 9) = V.block<3, 3>(6, 3); V.block<3, 3>(9, 12) = MatrixXd::Identity(3,3) * _dt; V.block<3, 3>(12, 15) = MatrixXd::Identity(3,3) * _dt; //step_jacobian = F; //step_V = V; jacobian = F * jacobian; covariance = F * covariance * F.transpose() + V * noise * V.transpose(); } } void propagate(double _dt, const Eigen::Vector3d &_acc_1, const Eigen::Vector3d &_gyr_1) { dt = _dt; acc_1 = _acc_1; gyr_1 = _gyr_1; Vector3d result_delta_p; Quaterniond result_delta_q; Vector3d result_delta_v; Vector3d result_linearized_ba; Vector3d result_linearized_bg; midPointIntegration(_dt, acc_0, gyr_0, _acc_1, _gyr_1, delta_p, delta_q, delta_v, linearized_ba, linearized_bg, result_delta_p, result_delta_q, result_delta_v, result_linearized_ba, result_linearized_bg, 1); //checkJacobian(_dt, acc_0, gyr_0, acc_1, gyr_1, delta_p, delta_q, delta_v, // linearized_ba, linearized_bg); delta_p = result_delta_p; delta_q = result_delta_q; delta_v = result_delta_v; linearized_ba = result_linearized_ba; linearized_bg = result_linearized_bg; delta_q.normalize(); sum_dt += dt; acc_0 = acc_1; gyr_0 = gyr_1; } Eigen::Matrix<double, 15, 1> evaluate(const Eigen::Vector3d &Pi, const Eigen::Quaterniond &Qi, const Eigen::Vector3d &Vi, const Eigen::Vector3d &Bai, const Eigen::Vector3d &Bgi, const Eigen::Vector3d &Pj, const Eigen::Quaterniond &Qj, const Eigen::Vector3d &Vj, const Eigen::Vector3d &Baj, const Eigen::Vector3d &Bgj) { Eigen::Matrix<double, 15, 1> residuals; Eigen::Matrix3d dp_dba = jacobian.block<3, 3>(O_P, O_BA); Eigen::Matrix3d dp_dbg = jacobian.block<3, 3>(O_P, O_BG); Eigen::Matrix3d dq_dbg = jacobian.block<3, 3>(O_R, O_BG); Eigen::Matrix3d dv_dba = jacobian.block<3, 3>(O_V, O_BA); Eigen::Matrix3d dv_dbg = jacobian.block<3, 3>(O_V, O_BG); Eigen::Vector3d dba = Bai - linearized_ba; Eigen::Vector3d dbg = Bgi - linearized_bg; Eigen::Quaterniond corrected_delta_q = delta_q * Utility::deltaQ(dq_dbg * dbg); Eigen::Vector3d corrected_delta_v = delta_v + dv_dba * dba + dv_dbg * dbg; Eigen::Vector3d corrected_delta_p = delta_p + dp_dba * dba + dp_dbg * dbg; residuals.block<3, 1>(O_P, 0) = Qi.inverse() * (0.5 * G * sum_dt * sum_dt + Pj - Pi - Vi * sum_dt) - corrected_delta_p; residuals.block<3, 1>(O_R, 0) = 2 * (corrected_delta_q.inverse() * (Qi.inverse() * Qj)).vec(); residuals.block<3, 1>(O_V, 0) = Qi.inverse() * (G * sum_dt + Vj - Vi) - corrected_delta_v; residuals.block<3, 1>(O_BA, 0) = Baj - Bai; residuals.block<3, 1>(O_BG, 0) = Bgj - Bgi; return residuals; } double dt; Eigen::Vector3d acc_0, gyr_0; Eigen::Vector3d acc_1, gyr_1; const Eigen::Vector3d linearized_acc, linearized_gyr; Eigen::Vector3d linearized_ba, linearized_bg; Eigen::Matrix<double, 15, 15> jacobian, covariance; Eigen::Matrix<double, 15, 15> step_jacobian; Eigen::Matrix<double, 15, 18> step_V; Eigen::Matrix<double, 18, 18> noise; double sum_dt; Eigen::Vector3d delta_p; Eigen::Quaterniond delta_q; Eigen::Vector3d delta_v; std::vector<double> dt_buf; std::vector<Eigen::Vector3d> acc_buf; std::vector<Eigen::Vector3d> gyr_buf; }; /* void eulerIntegration(double _dt, const Eigen::Vector3d &_acc_0, const Eigen::Vector3d &_gyr_0, const Eigen::Vector3d &_acc_1, const Eigen::Vector3d &_gyr_1, const Eigen::Vector3d &delta_p, const Eigen::Quaterniond &delta_q, const Eigen::Vector3d &delta_v, const Eigen::Vector3d &linearized_ba, const Eigen::Vector3d &linearized_bg, Eigen::Vector3d &result_delta_p, Eigen::Quaterniond &result_delta_q, Eigen::Vector3d &result_delta_v, Eigen::Vector3d &result_linearized_ba, Eigen::Vector3d &result_linearized_bg, bool update_jacobian) { result_delta_p = delta_p + delta_v * _dt + 0.5 * (delta_q * (_acc_1 - linearized_ba)) * _dt * _dt; result_delta_v = delta_v + delta_q * (_acc_1 - linearized_ba) * _dt; Vector3d omg = _gyr_1 - linearized_bg; omg = omg * _dt / 2; Quaterniond dR(1, omg(0), omg(1), omg(2)); result_delta_q = (delta_q * dR); result_linearized_ba = linearized_ba; result_linearized_bg = linearized_bg; if(update_jacobian) { Vector3d w_x = _gyr_1 - linearized_bg; Vector3d a_x = _acc_1 - linearized_ba; Matrix3d R_w_x, R_a_x; R_w_x<<0, -w_x(2), w_x(1), w_x(2), 0, -w_x(0), -w_x(1), w_x(0), 0; R_a_x<<0, -a_x(2), a_x(1), a_x(2), 0, -a_x(0), -a_x(1), a_x(0), 0; MatrixXd A = MatrixXd::Zero(15, 15); // one step euler 0.5 A.block<3, 3>(0, 3) = 0.5 * (-1 * delta_q.toRotationMatrix()) * R_a_x * _dt; A.block<3, 3>(0, 6) = MatrixXd::Identity(3,3); A.block<3, 3>(0, 9) = 0.5 * (-1 * delta_q.toRotationMatrix()) * _dt; A.block<3, 3>(3, 3) = -R_w_x; A.block<3, 3>(3, 12) = -1 * MatrixXd::Identity(3,3); A.block<3, 3>(6, 3) = (-1 * delta_q.toRotationMatrix()) * R_a_x; A.block<3, 3>(6, 9) = (-1 * delta_q.toRotationMatrix()); //cout<<"A"<<endl<<A<<endl; MatrixXd U = MatrixXd::Zero(15,12); U.block<3, 3>(0, 0) = 0.5 * delta_q.toRotationMatrix() * _dt; U.block<3, 3>(3, 3) = MatrixXd::Identity(3,3); U.block<3, 3>(6, 0) = delta_q.toRotationMatrix(); U.block<3, 3>(9, 6) = MatrixXd::Identity(3,3); U.block<3, 3>(12, 9) = MatrixXd::Identity(3,3); // put outside Eigen::Matrix<double, 12, 12> noise = Eigen::Matrix<double, 12, 12>::Zero(); noise.block<3, 3>(0, 0) = (ACC_N * ACC_N) * Eigen::Matrix3d::Identity(); noise.block<3, 3>(3, 3) = (GYR_N * GYR_N) * Eigen::Matrix3d::Identity(); noise.block<3, 3>(6, 6) = (ACC_W * ACC_W) * Eigen::Matrix3d::Identity(); noise.block<3, 3>(9, 9) = (GYR_W * GYR_W) * Eigen::Matrix3d::Identity(); //write F directly MatrixXd F, V; F = (MatrixXd::Identity(15,15) + _dt * A); V = _dt * U; step_jacobian = F; step_V = V; jacobian = F * jacobian; covariance = F * covariance * F.transpose() + V * noise * V.transpose(); } } void checkJacobian(double _dt, const Eigen::Vector3d &_acc_0, const Eigen::Vector3d &_gyr_0, const Eigen::Vector3d &_acc_1, const Eigen::Vector3d &_gyr_1, const Eigen::Vector3d &delta_p, const Eigen::Quaterniond &delta_q, const Eigen::Vector3d &delta_v, const Eigen::Vector3d &linearized_ba, const Eigen::Vector3d &linearized_bg) { Vector3d result_delta_p; Quaterniond result_delta_q; Vector3d result_delta_v; Vector3d result_linearized_ba; Vector3d result_linearized_bg; midPointIntegration(_dt, _acc_0, _gyr_0, _acc_1, _gyr_1, delta_p, delta_q, delta_v, linearized_ba, linearized_bg,
result_delta_p, result_delta_q, result_delta_v, result_linearized_ba, result_linearized_bg, 0); Vector3d turb_delta_p; Quaterniond turb_delta_q; Vector3d turb_delta_v; Vector3d turb_linearized_ba; Vector3d turb_linearized_bg; Vector3d turb(0.0001, -0.003, 0.003); midPointIntegration(_dt, _acc_0, _gyr_0, _acc_1, _gyr_1, delta_p + turb, delta_q, delta_v, linearized_ba, linearized_bg, turb_delta_p, turb_delta_q, turb_delta_v, turb_linearized_ba, turb_linearized_bg, 0); cout << "turb p " << endl; cout << "p diff " << (turb_delta_p - result_delta_p).transpose() << endl; cout << "p jacob diff " << (step_jacobian.block<3, 3>(0, 0) * turb).transpose() << endl; cout << "q diff " << ((result_delta_q.inverse() * turb_delta_q).vec() * 2).transpose() << endl; cout << "q jacob diff " << (step_jacobian.block<3, 3>(3, 0) * turb).transpose() << endl; cout << "v diff " << (turb_delta_v - result_delta_v).transpose() << endl; cout << "v jacob diff " << (step_jacobian.block<3, 3>(6, 0) * turb).transpose() << endl; cout << "ba diff " << (turb_linearized_ba - result_linearized_ba).transpose() << endl; cout << "ba jacob diff" << (step_jacobian.block<3, 3>(9, 0) * turb).transpose() << endl; cout << "bg diff " << (turb_linearized_bg - result_linearized_bg).transpose() << endl; cout << "bg jacob diff " << (step_jacobian.block<3, 3>(12, 0) * turb).transpose() << endl; midPointIntegration(_dt, _acc_0, _gyr_0, _acc_1, _gyr_1, delta_p, delta_q * Quaterniond(1, turb(0) / 2, turb(1) / 2, turb(2) / 2), delta_v, linearized_ba, linearized_bg, turb_delta_p, turb_delta_q, turb_delta_v, turb_linearized_ba, turb_linearized_bg, 0); cout << "turb q " << endl; cout << "p diff " << (turb_delta_p - result_delta_p).transpose() << endl; cout << "p jacob diff " << (step_jacobian.block<3, 3>(0, 3) * turb).transpose() << endl; cout << "q diff " << ((result_delta_q.inverse() * turb_delta_q).vec() * 2).transpose() << endl; cout << "q jacob diff " << (step_jacobian.block<3, 3>(3, 3) * turb).transpose() << endl; cout 
<< "v diff " << (turb_delta_v - result_delta_v).transpose() << endl; cout << "v jacob diff " << (step_jacobian.block<3, 3>(6, 3) * turb).transpose() << endl; cout << "ba diff " << (turb_linearized_ba - result_linearized_ba).transpose() << endl; cout << "ba jacob diff" << (step_jacobian.block<3, 3>(9, 3) * turb).transpose() << endl; cout << "bg diff " << (turb_linearized_bg - result_linearized_bg).transpose() << endl; cout << "bg jacob diff" << (step_jacobian.block<3, 3>(12, 3) * turb).transpose() << endl; midPointIntegration(_dt, _acc_0, _gyr_0, _acc_1, _gyr_1, delta_p, delta_q, delta_v + turb, linearized_ba, linearized_bg, turb_delta_p, turb_delta_q, turb_delta_v, turb_linearized_ba, turb_linearized_bg, 0); cout << "turb v " << endl; cout << "p diff " << (turb_delta_p - result_delta_p).transpose() << endl; cout << "p jacob diff " << (step_jacobian.block<3, 3>(0, 6) * turb).transpose() << endl; cout << "q diff " << ((result_delta_q.inverse() * turb_delta_q).vec() * 2).transpose() << endl; cout << "q jacob diff " << (step_jacobian.block<3, 3>(3, 6) * turb).transpose() << endl; cout << "v diff " << (turb_delta_v - result_delta_v).transpose() << endl; cout << "v jacob diff " << (step_jacobian.block<3, 3>(6, 6) * turb).transpose() << endl; cout << "ba diff " << (turb_linearized_ba - result_linearized_ba).transpose() << endl; cout << "ba jacob diff" << (step_jacobian.block<3, 3>(9, 6) * turb).transpose() << endl; cout << "bg diff " << (turb_linearized_bg - result_linearized_bg).transpose() << endl; cout << "bg jacob diff" << (step_jacobian.block<3, 3>(12, 6) * turb).transpose() << endl; midPointIntegration(_dt, _acc_0, _gyr_0, _acc_1, _gyr_1, delta_p, delta_q, delta_v, linearized_ba + turb, linearized_bg, turb_delta_p, turb_delta_q, turb_delta_v, turb_linearized_ba, turb_linearized_bg, 0); cout << "turb ba " << endl; cout << "p diff " << (turb_delta_p - result_delta_p).transpose() << endl; cout << "p jacob diff " << (step_jacobian.block<3, 3>(0, 9) * turb).transpose() 
<< endl; cout << "q diff " << ((result_delta_q.inverse() * turb_delta_q).vec() * 2).transpose() << endl; cout << "q jacob diff " << (step_jacobian.block<3, 3>(3, 9) * turb).transpose() << endl; cout << "v diff " << (turb_delta_v - result_delta_v).transpose() << endl; cout << "v jacob diff " << (step_jacobian.block<3, 3>(6, 9) * turb).transpose() << endl; cout << "ba diff " << (turb_linearized_ba - result_linearized_ba).transpose() << endl; cout << "ba jacob diff" << (step_jacobian.block<3, 3>(9, 9) * turb).transpose() << endl; cout << "bg diff " << (turb_linearized_bg - result_linearized_bg).transpose() << endl; cout << "bg jacob diff" << (step_jacobian.block<3, 3>(12, 9) * turb).transpose() << endl; midPointIntegration(_dt, _acc_0, _gyr_0, _acc_1, _gyr_1, delta_p, delta_q, delta_v, linearized_ba, linearized_bg + turb, turb_delta_p, turb_delta_q, turb_delta_v, turb_linearized_ba, turb_linearized_bg, 0); cout << "turb bg " << endl; cout << "p diff " << (turb_delta_p - result_delta_p).transpose() << endl; cout << "p jacob diff " << (step_jacobian.block<3, 3>(0, 12) * turb).transpose() << endl; cout << "q diff " << ((result_delta_q.inverse() * turb_delta_q).vec() * 2).transpose() << endl; cout << "q jacob diff " << (step_jacobian.block<3, 3>(3, 12) * turb).transpose() << endl; cout << "v diff " << (turb_delta_v - result_delta_v).transpose() << endl; cout << "v jacob diff " << (step_jacobian.block<3, 3>(6, 12) * turb).transpose() << endl; cout << "ba diff " << (turb_linearized_ba - result_linearized_ba).transpose() << endl; cout << "ba jacob diff" << (step_jacobian.block<3, 3>(9, 12) * turb).transpose() << endl; cout << "bg diff " << (turb_linearized_bg - result_linearized_bg).transpose() << endl; cout << "bg jacob diff" << (step_jacobian.block<3, 3>(12, 12) * turb).transpose() << endl; midPointIntegration(_dt, _acc_0 + turb, _gyr_0, _acc_1 , _gyr_1, delta_p, delta_q, delta_v, linearized_ba, linearized_bg, turb_delta_p, turb_delta_q, turb_delta_v, turb_linearized_ba, 
turb_linearized_bg, 0); cout << "turb acc_0 " << endl; cout << "p diff " << (turb_delta_p - result_delta_p).transpose() << endl; cout << "p jacob diff " << (step_V.block<3, 3>(0, 0) * turb).transpose() << endl; cout << "q diff " << ((result_delta_q.inverse() * turb_delta_q).vec() * 2).transpose() << endl; cout << "q jacob diff " << (step_V.block<3, 3>(3, 0) * turb).transpose() << endl; cout << "v diff " << (turb_delta_v - result_delta_v).transpose() << endl; cout << "v jacob diff " << (step_V.block<3, 3>(6, 0) * turb).transpose() << endl; cout << "ba diff " << (turb_linearized_ba - result_linearized_ba).transpose() << endl; cout << "ba jacob diff" << (step_V.block<3, 3>(9, 0) * turb).transpose() << endl; cout << "bg diff " << (turb_linearized_bg - result_linearized_bg).transpose() << endl; cout << "bg jacob diff" << (step_V.block<3, 3>(12, 0) * turb).transpose() << endl; midPointIntegration(_dt, _acc_0, _gyr_0 + turb, _acc_1 , _gyr_1, delta_p, delta_q, delta_v, linearized_ba, linearized_bg, turb_delta_p, turb_delta_q, turb_delta_v, turb_linearized_ba, turb_linearized_bg, 0); cout << "turb _gyr_0 " << endl; cout << "p diff " << (turb_delta_p - result_delta_p).transpose() << endl; cout << "p jacob diff " << (step_V.block<3, 3>(0, 3) * turb).transpose() << endl; cout << "q diff " << ((result_delta_q.inverse() * turb_delta_q).vec() * 2).transpose() << endl; cout << "q jacob diff " << (step_V.block<3, 3>(3, 3) * turb).transpose() << endl; cout << "v diff " << (turb_delta_v - result_delta_v).transpose() << endl; cout << "v jacob diff " << (step_V.block<3, 3>(6, 3) * turb).transpose() << endl; cout << "ba diff " << (turb_linearized_ba - result_linearized_ba).transpose() << endl; cout << "ba jacob diff" << (step_V.block<3, 3>(9, 3) * turb).transpose() << endl; cout << "bg diff " << (turb_linearized_bg - result_linearized_bg).transpose() << endl; cout << "bg jacob diff" << (step_V.block<3, 3>(12, 3) * turb).transpose() << endl; midPointIntegration(_dt, _acc_0, _gyr_0, 
_acc_1 + turb, _gyr_1, delta_p, delta_q, delta_v, linearized_ba, linearized_bg, turb_delta_p, turb_delta_q, turb_delta_v, turb_linearized_ba, turb_linearized_bg, 0); cout << "turb acc_1 " << endl; cout << "p diff " << (turb_delta_p - result_delta_p).transpose() << endl; cout << "p jacob diff " << (step_V.block<3, 3>(0, 6) * turb).transpose() << endl; cout << "q diff " << ((result_delta_q.inverse() * turb_delta_q).vec() * 2).transpose() << endl; cout << "q jacob diff " << (step_V.block<3, 3>(3, 6) * turb).transpose() << endl; cout << "v diff " << (turb_delta_v - result_delta_v).transpose() << endl; cout << "v jacob diff " << (step_V.block<3, 3>(6, 6) * turb).transpose() << endl; cout << "ba diff " << (turb_linearized_ba - result_linearized_ba).transpose() << endl; cout << "ba jacob diff" << (step_V.block<3, 3>(9, 6) * turb).transpose() << endl; cout << "bg diff " << (turb_linearized_bg - result_linearized_bg).transpose() << endl; cout << "bg jacob diff" << (step_V.block<3, 3>(12, 6) * turb).transpose() << endl; midPointIntegration(_dt, _acc_0, _gyr_0, _acc_1 , _gyr_1 + turb, delta_p, delta_q, delta_v, linearized_ba, linearized_bg, turb_delta_p, turb_delta_q, turb_delta_v, turb_linearized_ba, turb_linearized_bg, 0); cout << "turb _gyr_1 " << endl; cout << "p diff " << (turb_delta_p - result_delta_p).transpose() << endl; cout << "p jacob diff " << (step_V.block<3, 3>(0, 9) * turb).transpose() << endl; cout << "q diff " << ((result_delta_q.inverse() * turb_delta_q).vec() * 2).transpose() << endl; cout << "q jacob diff " << (step_V.block<3, 3>(3, 9) * turb).transpose() << endl; cout << "v diff " << (turb_delta_v - result_delta_v).transpose() << endl; cout << "v jacob diff " << (step_V.block<3, 3>(6, 9) * turb).transpose() << endl; cout << "ba diff " << (turb_linearized_ba - result_linearized_ba).transpose() << endl; cout << "ba jacob diff" << (step_V.block<3, 3>(9, 9) * turb).transpose() << endl; cout << "bg diff " << (turb_linearized_bg - 
result_linearized_bg).transpose() << endl; cout << "bg jacob diff" << (step_V.block<3, 3>(12, 9) * turb).transpose() << endl; }
*/

================================================ FILE: rpvio_estimator/src/factor/marginalization_factor.cpp ================================================
#include "marginalization_factor.h"

void ResidualBlockInfo::Evaluate()
{
    residuals.resize(cost_function->num_residuals());

    std::vector<int> block_sizes = cost_function->parameter_block_sizes();
    raw_jacobians = new double *[block_sizes.size()];
    jacobians.resize(block_sizes.size());

    for (int i = 0; i < static_cast<int>(block_sizes.size()); i++)
    {
        jacobians[i].resize(cost_function->num_residuals(), block_sizes[i]);
        raw_jacobians[i] = jacobians[i].data();
        //dim += block_sizes[i] == 7 ? 6 : block_sizes[i];
    }
    cost_function->Evaluate(parameter_blocks.data(), residuals.data(), raw_jacobians);

    //std::vector<int> tmp_idx(block_sizes.size());
    //Eigen::MatrixXd tmp(dim, dim);
    //for (int i = 0; i < static_cast<int>(parameter_blocks.size()); i++)
    //{
    //    int size_i = localSize(block_sizes[i]);
    //    Eigen::MatrixXd jacobian_i = jacobians[i].leftCols(size_i);
    //    for (int j = 0, sub_idx = 0; j < static_cast<int>(parameter_blocks.size()); sub_idx += block_sizes[j] == 7 ?
//        6 : block_sizes[j], j++)
    //    {
    //        int size_j = localSize(block_sizes[j]);
    //        Eigen::MatrixXd jacobian_j = jacobians[j].leftCols(size_j);
    //        tmp_idx[j] = sub_idx;
    //        tmp.block(tmp_idx[i], tmp_idx[j], size_i, size_j) = jacobian_i.transpose() * jacobian_j;
    //    }
    //}
    //Eigen::SelfAdjointEigenSolver<Eigen::MatrixXd> saes(tmp);
    //std::cout << saes.eigenvalues() << std::endl;
    //ROS_ASSERT(saes.eigenvalues().minCoeff() >= -1e-6);

    if (loss_function)
    {
        double residual_scaling_, alpha_sq_norm_;

        double sq_norm, rho[3];

        sq_norm = residuals.squaredNorm();
        loss_function->Evaluate(sq_norm, rho);
        //printf("sq_norm: %f, rho[0]: %f, rho[1]: %f, rho[2]: %f\n", sq_norm, rho[0], rho[1], rho[2]);

        double sqrt_rho1_ = sqrt(rho[1]);

        if ((sq_norm == 0.0) || (rho[2] <= 0.0))
        {
            residual_scaling_ = sqrt_rho1_;
            alpha_sq_norm_ = 0.0;
        }
        else
        {
            const double D = 1.0 + 2.0 * sq_norm * rho[2] / rho[1];
            const double alpha = 1.0 - sqrt(D);
            residual_scaling_ = sqrt_rho1_ / (1 - alpha);
            alpha_sq_norm_ = alpha / sq_norm;
        }

        for (int i = 0; i < static_cast<int>(parameter_blocks.size()); i++)
        {
            jacobians[i] = sqrt_rho1_ * (jacobians[i] - alpha_sq_norm_ * residuals * (residuals.transpose() * jacobians[i]));
        }

        residuals *= residual_scaling_;
    }
}

MarginalizationInfo::~MarginalizationInfo()
{
    //ROS_WARN("release marginlizationinfo");

    for (auto it = parameter_block_data.begin(); it != parameter_block_data.end(); ++it)
        delete[] it->second;

    for (int i = 0; i < (int)factors.size(); i++)
    {
        delete[] factors[i]->raw_jacobians;
        delete factors[i]->cost_function;
        delete factors[i];
    }
}

void MarginalizationInfo::addResidualBlockInfo(ResidualBlockInfo *residual_block_info)
{
    factors.emplace_back(residual_block_info);

    std::vector<double *> &parameter_blocks = residual_block_info->parameter_blocks;
    std::vector<int> parameter_block_sizes = residual_block_info->cost_function->parameter_block_sizes();

    for (int i = 0; i < static_cast<int>(residual_block_info->parameter_blocks.size()); i++)
    {
        double *addr = parameter_blocks[i];
        int size = parameter_block_sizes[i];
        parameter_block_size[reinterpret_cast<long>(addr)] = size;
    }

    for (int i = 0; i < static_cast<int>(residual_block_info->drop_set.size()); i++)
    {
        double *addr = parameter_blocks[residual_block_info->drop_set[i]];
        parameter_block_idx[reinterpret_cast<long>(addr)] = 0;
    }
}

void MarginalizationInfo::preMarginalize()
{
    for (auto it : factors)
    {
        it->Evaluate();

        std::vector<int> block_sizes = it->cost_function->parameter_block_sizes();
        for (int i = 0; i < static_cast<int>(block_sizes.size()); i++)
        {
            long addr = reinterpret_cast<long>(it->parameter_blocks[i]);
            int size = block_sizes[i];
            if (parameter_block_data.find(addr) == parameter_block_data.end())
            {
                double *data = new double[size];
                memcpy(data, it->parameter_blocks[i], sizeof(double) * size);
                parameter_block_data[addr] = data;
            }
        }
    }
}

int MarginalizationInfo::localSize(int size) const
{
    return size == 7 ? 6 : size;
}

int MarginalizationInfo::globalSize(int size) const
{
    return size == 6 ? 7 : size;
}

void* ThreadsConstructA(void* threadsstruct)
{
    ThreadsStruct* p = ((ThreadsStruct*)threadsstruct);
    for (auto it : p->sub_factors)
    {
        for (int i = 0; i < static_cast<int>(it->parameter_blocks.size()); i++)
        {
            int idx_i = p->parameter_block_idx[reinterpret_cast<long>(it->parameter_blocks[i])];
            int size_i = p->parameter_block_size[reinterpret_cast<long>(it->parameter_blocks[i])];
            if (size_i == 7)
                size_i = 6;
            Eigen::MatrixXd jacobian_i = it->jacobians[i].leftCols(size_i);
            for (int j = i; j < static_cast<int>(it->parameter_blocks.size()); j++)
            {
                int idx_j = p->parameter_block_idx[reinterpret_cast<long>(it->parameter_blocks[j])];
                int size_j = p->parameter_block_size[reinterpret_cast<long>(it->parameter_blocks[j])];
                if (size_j == 7)
                    size_j = 6;
                Eigen::MatrixXd jacobian_j = it->jacobians[j].leftCols(size_j);
                if (i == j)
                    p->A.block(idx_i, idx_j, size_i, size_j) += jacobian_i.transpose() * jacobian_j;
                else
                {
                    p->A.block(idx_i, idx_j, size_i, size_j) += jacobian_i.transpose() * jacobian_j;
                    p->A.block(idx_j, idx_i, size_j, size_i) = p->A.block(idx_i, idx_j, size_i, size_j).transpose();
                }
            }
            p->b.segment(idx_i, size_i) += jacobian_i.transpose() * it->residuals;
        }
    }
    return threadsstruct;
}

void MarginalizationInfo::marginalize()
{
    int pos = 0;
    for (auto &it : parameter_block_idx)
    {
        it.second = pos;
        pos += localSize(parameter_block_size[it.first]);
    }

    m = pos;

    for (const auto &it : parameter_block_size)
    {
        if (parameter_block_idx.find(it.first) == parameter_block_idx.end())
        {
            parameter_block_idx[it.first] = pos;
            pos += localSize(it.second);
        }
    }

    n = pos - m;

    //ROS_DEBUG("marginalization, pos: %d, m: %d, n: %d, size: %d", pos, m, n, (int)parameter_block_idx.size());

    TicToc t_summing;
    Eigen::MatrixXd A(pos, pos);
    Eigen::VectorXd b(pos);
    A.setZero();
    b.setZero();
    /*
    for (auto it : factors)
    {
        for (int i = 0; i < static_cast<int>(it->parameter_blocks.size()); i++)
        {
            int idx_i = parameter_block_idx[reinterpret_cast<long>(it->parameter_blocks[i])];
            int size_i = localSize(parameter_block_size[reinterpret_cast<long>(it->parameter_blocks[i])]);
            Eigen::MatrixXd jacobian_i = it->jacobians[i].leftCols(size_i);
            for (int j = i; j < static_cast<int>(it->parameter_blocks.size()); j++)
            {
                int idx_j = parameter_block_idx[reinterpret_cast<long>(it->parameter_blocks[j])];
                int size_j = localSize(parameter_block_size[reinterpret_cast<long>(it->parameter_blocks[j])]);
                Eigen::MatrixXd jacobian_j = it->jacobians[j].leftCols(size_j);
                if (i == j)
                    A.block(idx_i, idx_j, size_i, size_j) += jacobian_i.transpose() * jacobian_j;
                else
                {
                    A.block(idx_i, idx_j, size_i, size_j) += jacobian_i.transpose() * jacobian_j;
                    A.block(idx_j, idx_i, size_j, size_i) = A.block(idx_i, idx_j, size_i, size_j).transpose();
                }
            }
            b.segment(idx_i, size_i) += jacobian_i.transpose() * it->residuals;
        }
    }
    ROS_INFO("summing up costs %f ms", t_summing.toc());
    */

    //multi thread
    TicToc t_thread_summing;
    pthread_t tids[NUM_THREADS];
    ThreadsStruct threadsstruct[NUM_THREADS];
    int i = 0;
    for (auto it : factors)
    {
        threadsstruct[i].sub_factors.push_back(it);
        i++;
        i = i % NUM_THREADS;
    }
    for (int i = 0; i < NUM_THREADS; i++)
    {
        TicToc zero_matrix;
        threadsstruct[i].A
= Eigen::MatrixXd::Zero(pos,pos);
        threadsstruct[i].b = Eigen::VectorXd::Zero(pos);
        threadsstruct[i].parameter_block_size = parameter_block_size;
        threadsstruct[i].parameter_block_idx = parameter_block_idx;
        int ret = pthread_create( &tids[i], NULL, ThreadsConstructA ,(void*)&(threadsstruct[i]));
        if (ret != 0)
        {
            ROS_WARN("pthread_create error");
            ROS_BREAK();
        }
    }
    for( int i = NUM_THREADS - 1; i >= 0; i--)
    {
        pthread_join( tids[i], NULL );
        A += threadsstruct[i].A;
        b += threadsstruct[i].b;
    }
    //ROS_DEBUG("thread summing up costs %f ms", t_thread_summing.toc());
    //ROS_INFO("A diff %f , b diff %f ", (A - tmp_A).sum(), (b - tmp_b).sum());

    //TODO
    Eigen::MatrixXd Amm = 0.5 * (A.block(0, 0, m, m) + A.block(0, 0, m, m).transpose());
    Eigen::SelfAdjointEigenSolver<Eigen::MatrixXd> saes(Amm);

    //ROS_ASSERT_MSG(saes.eigenvalues().minCoeff() >= -1e-4, "min eigenvalue %f", saes.eigenvalues().minCoeff());

    Eigen::MatrixXd Amm_inv = saes.eigenvectors() * Eigen::VectorXd((saes.eigenvalues().array() > eps).select(saes.eigenvalues().array().inverse(), 0)).asDiagonal() * saes.eigenvectors().transpose();
    //printf("error1: %f\n", (Amm * Amm_inv - Eigen::MatrixXd::Identity(m, m)).sum());

    Eigen::VectorXd bmm = b.segment(0, m);
    Eigen::MatrixXd Amr = A.block(0, m, m, n);
    Eigen::MatrixXd Arm = A.block(m, 0, n, m);
    Eigen::MatrixXd Arr = A.block(m, m, n, n);
    Eigen::VectorXd brr = b.segment(m, n);
    A = Arr - Arm * Amm_inv * Amr;
    b = brr - Arm * Amm_inv * bmm;

    Eigen::SelfAdjointEigenSolver<Eigen::MatrixXd> saes2(A);
    Eigen::VectorXd S = Eigen::VectorXd((saes2.eigenvalues().array() > eps).select(saes2.eigenvalues().array(), 0));
    Eigen::VectorXd S_inv = Eigen::VectorXd((saes2.eigenvalues().array() > eps).select(saes2.eigenvalues().array().inverse(), 0));

    Eigen::VectorXd S_sqrt = S.cwiseSqrt();
    Eigen::VectorXd S_inv_sqrt = S_inv.cwiseSqrt();

    linearized_jacobians = S_sqrt.asDiagonal() * saes2.eigenvectors().transpose();
    linearized_residuals = S_inv_sqrt.asDiagonal() * saes2.eigenvectors().transpose() * b;
    //std::cout << A << std::endl
    //          <<
// std::endl;
    //std::cout << linearized_jacobians << std::endl;
    //printf("error2: %f %f\n", (linearized_jacobians.transpose() * linearized_jacobians - A).sum(),
    //      (linearized_jacobians.transpose() * linearized_residuals - b).sum());
}

std::vector<double *> MarginalizationInfo::getParameterBlocks(std::unordered_map<long, double *> &addr_shift)
{
    std::vector<double *> keep_block_addr;
    keep_block_size.clear();
    keep_block_idx.clear();
    keep_block_data.clear();

    for (const auto &it : parameter_block_idx)
    {
        if (it.second >= m)
        {
            keep_block_size.push_back(parameter_block_size[it.first]);
            keep_block_idx.push_back(parameter_block_idx[it.first]);
            keep_block_data.push_back(parameter_block_data[it.first]);
            keep_block_addr.push_back(addr_shift[it.first]);
        }
    }
    sum_block_size = std::accumulate(std::begin(keep_block_size), std::end(keep_block_size), 0);

    return keep_block_addr;
}

MarginalizationFactor::MarginalizationFactor(MarginalizationInfo* _marginalization_info):marginalization_info(_marginalization_info)
{
    int cnt = 0;
    for (auto it : marginalization_info->keep_block_size)
    {
        mutable_parameter_block_sizes()->push_back(it);
        cnt += it;
    }
    //printf("residual size: %d, %d\n", cnt, n);
    set_num_residuals(marginalization_info->n);
};

bool MarginalizationFactor::Evaluate(double const *const *parameters, double *residuals, double **jacobians) const
{
    //printf("internal addr,%d, %d\n", (int)parameter_block_sizes().size(), num_residuals());
    //for (int i = 0; i < static_cast<int>(keep_block_size.size()); i++)
    //{
    //    //printf("unsigned %x\n", reinterpret_cast<unsigned long>(parameters[i]));
    //    //printf("signed %x\n", reinterpret_cast<long>(parameters[i]));
    //printf("jacobian %x\n", reinterpret_cast<long>(jacobians));
    //printf("residual %x\n", reinterpret_cast<long>(residuals));
    //}
    int n = marginalization_info->n;
    int m = marginalization_info->m;
    Eigen::VectorXd dx(n);
    for (int i = 0; i < static_cast<int>(marginalization_info->keep_block_size.size()); i++)
    {
        int size = marginalization_info->keep_block_size[i];
        int idx = marginalization_info->keep_block_idx[i] - m;
        Eigen::VectorXd x = Eigen::Map<const Eigen::VectorXd>(parameters[i], size);
        Eigen::VectorXd x0 = Eigen::Map<const Eigen::VectorXd>(marginalization_info->keep_block_data[i], size);
        if (size != 7)
            dx.segment(idx, size) = x - x0;
        else
        {
            dx.segment<3>(idx + 0) = x.head<3>() - x0.head<3>();
            dx.segment<3>(idx + 3) = 2.0 * Utility::positify(Eigen::Quaterniond(x0(6), x0(3), x0(4), x0(5)).inverse() * Eigen::Quaterniond(x(6), x(3), x(4), x(5))).vec();
            if (!((Eigen::Quaterniond(x0(6), x0(3), x0(4), x0(5)).inverse() * Eigen::Quaterniond(x(6), x(3), x(4), x(5))).w() >= 0))
            {
                dx.segment<3>(idx + 3) = 2.0 * -Utility::positify(Eigen::Quaterniond(x0(6), x0(3), x0(4), x0(5)).inverse() * Eigen::Quaterniond(x(6), x(3), x(4), x(5))).vec();
            }
        }
    }
    Eigen::Map<Eigen::VectorXd>(residuals, n) = marginalization_info->linearized_residuals + marginalization_info->linearized_jacobians * dx;
    if (jacobians)
    {
        for (int i = 0; i < static_cast<int>(marginalization_info->keep_block_size.size()); i++)
        {
            if (jacobians[i])
            {
                int size = marginalization_info->keep_block_size[i], local_size = marginalization_info->localSize(size);
                int idx = marginalization_info->keep_block_idx[i] - m;
                Eigen::Map<Eigen::Matrix<double, Eigen::Dynamic, Eigen::Dynamic, Eigen::RowMajor>> jacobian(jacobians[i], n, size);
                jacobian.setZero();
                jacobian.leftCols(local_size) = marginalization_info->linearized_jacobians.middleCols(idx, local_size);
            }
        }
    }
    return true;
}

================================================ FILE: rpvio_estimator/src/factor/marginalization_factor.h ================================================
#pragma once

#include <ros/ros.h>
#include <ros/console.h>
#include <cstdlib>
#include <pthread.h>
#include <ceres/ceres.h>
#include <unordered_map>

#include "../utility/utility.h"
#include "../utility/tic_toc.h"

const int NUM_THREADS = 4;

struct ResidualBlockInfo
{
    ResidualBlockInfo(ceres::CostFunction *_cost_function, ceres::LossFunction *_loss_function, std::vector<double *> _parameter_blocks, std::vector<int> _drop_set)
        : cost_function(_cost_function), loss_function(_loss_function), parameter_blocks(_parameter_blocks), drop_set(_drop_set) {}

    void Evaluate();

    ceres::CostFunction *cost_function;
    ceres::LossFunction *loss_function;
    std::vector<double *>
parameter_blocks;
    std::vector<int> drop_set;

    double **raw_jacobians;
    std::vector<Eigen::Matrix<double, Eigen::Dynamic, Eigen::Dynamic, Eigen::RowMajor>> jacobians;
    Eigen::VectorXd residuals;

    int localSize(int size)
    {
        return size == 7 ? 6 : size;
    }
};

struct ThreadsStruct
{
    std::vector<ResidualBlockInfo *> sub_factors;
    Eigen::MatrixXd A;
    Eigen::VectorXd b;
    std::unordered_map<long, int> parameter_block_size; //global size
    std::unordered_map<long, int> parameter_block_idx; //local size
};

class MarginalizationInfo
{
  public:
    ~MarginalizationInfo();
    int localSize(int size) const;
    int globalSize(int size) const;
    void addResidualBlockInfo(ResidualBlockInfo *residual_block_info);
    void preMarginalize();
    void marginalize();
    std::vector<double *> getParameterBlocks(std::unordered_map<long, double *> &addr_shift);

    std::vector<ResidualBlockInfo *> factors;
    int m, n;
    std::unordered_map<long, int> parameter_block_size; //global size
    int sum_block_size;
    std::unordered_map<long, int> parameter_block_idx; //local size
    std::unordered_map<long, double *> parameter_block_data;

    std::vector<int> keep_block_size; //global size
    std::vector<int> keep_block_idx; //local size
    std::vector<double *> keep_block_data;

    Eigen::MatrixXd linearized_jacobians;
    Eigen::VectorXd linearized_residuals;
    const double eps = 1e-8;
};

class MarginalizationFactor : public ceres::CostFunction
{
  public:
    MarginalizationFactor(MarginalizationInfo* _marginalization_info);
    virtual bool Evaluate(double const *const *parameters, double *residuals, double **jacobians) const;

    MarginalizationInfo* marginalization_info;
};

================================================ FILE: rpvio_estimator/src/factor/pose_local_parameterization.cpp ================================================
#include "pose_local_parameterization.h"

bool PoseLocalParameterization::Plus(const double *x, const double *delta, double *x_plus_delta) const
{
    Eigen::Map<const Eigen::Vector3d> _p(x);
    Eigen::Map<const Eigen::Quaterniond> _q(x + 3);

    Eigen::Map<const Eigen::Vector3d> dp(delta);

    Eigen::Quaterniond dq = Utility::deltaQ(Eigen::Map<const Eigen::Vector3d>(delta + 3));

    Eigen::Map<Eigen::Vector3d> p(x_plus_delta);
    Eigen::Map<Eigen::Quaterniond> q(x_plus_delta + 3);

    p = _p + dp;
    q = (_q * dq).normalized();

    return true;
}

bool PoseLocalParameterization::ComputeJacobian(const double *x, double
*jacobian) const
{
    Eigen::Map<Eigen::Matrix<double, 7, 6, Eigen::RowMajor>> j(jacobian);
    j.topRows<6>().setIdentity();
    j.bottomRows<1>().setZero();

    return true;
}

================================================ FILE: rpvio_estimator/src/factor/pose_local_parameterization.h ================================================
#pragma once

#include <eigen3/Eigen/Dense>
#include <ceres/ceres.h>
#include "../utility/utility.h"

class PoseLocalParameterization : public ceres::LocalParameterization
{
    virtual bool Plus(const double *x, const double *delta, double *x_plus_delta) const;
    virtual bool ComputeJacobian(const double *x, double *jacobian) const;
    virtual int GlobalSize() const { return 7; };
    virtual int LocalSize() const { return 6; };
};

================================================ FILE: rpvio_estimator/src/factor/projection_factor.cpp ================================================
#include "projection_factor.h"

Eigen::Matrix2d ProjectionFactor::sqrt_info;
double ProjectionFactor::sum_t;

ProjectionFactor::ProjectionFactor(const Eigen::Vector3d &_pts_i, const Eigen::Vector3d &_pts_j) : pts_i(_pts_i), pts_j(_pts_j)
{
#ifdef UNIT_SPHERE_ERROR
    Eigen::Vector3d b1, b2;
    Eigen::Vector3d a = pts_j.normalized();
    Eigen::Vector3d tmp(0, 0, 1);
    if(a == tmp)
        tmp << 1, 0, 0;
    b1 = (tmp - a * (a.transpose() * tmp)).normalized();
    b2 = a.cross(b1);
    tangent_base.block<1, 3>(0, 0) = b1.transpose();
    tangent_base.block<1, 3>(1, 0) = b2.transpose();
#endif
};

bool ProjectionFactor::Evaluate(double const *const *parameters, double *residuals, double **jacobians) const
{
    TicToc tic_toc;
    Eigen::Vector3d Pi(parameters[0][0], parameters[0][1], parameters[0][2]);
    Eigen::Quaterniond Qi(parameters[0][6], parameters[0][3], parameters[0][4], parameters[0][5]);

    Eigen::Vector3d Pj(parameters[1][0], parameters[1][1], parameters[1][2]);
    Eigen::Quaterniond Qj(parameters[1][6], parameters[1][3], parameters[1][4], parameters[1][5]);

    Eigen::Vector3d tic(parameters[2][0], parameters[2][1], parameters[2][2]);
    Eigen::Quaterniond qic(parameters[2][6], parameters[2][3], parameters[2][4],
parameters[2][5]);

    double inv_dep_i = parameters[3][0];

    Eigen::Vector3d pts_camera_i = pts_i / inv_dep_i;
    Eigen::Vector3d pts_imu_i = qic * pts_camera_i + tic;
    Eigen::Vector3d pts_w = Qi * pts_imu_i + Pi;
    Eigen::Vector3d pts_imu_j = Qj.inverse() * (pts_w - Pj);
    Eigen::Vector3d pts_camera_j = qic.inverse() * (pts_imu_j - tic);
    Eigen::Map<Eigen::Vector2d> residual(residuals);

#ifdef UNIT_SPHERE_ERROR
    residual = tangent_base * (pts_camera_j.normalized() - pts_j.normalized());
#else
    double dep_j = pts_camera_j.z();
    residual = (pts_camera_j / dep_j).head<2>() - pts_j.head<2>();
#endif

    residual = sqrt_info * residual;

    if (jacobians)
    {
        Eigen::Matrix3d Ri = Qi.toRotationMatrix();
        Eigen::Matrix3d Rj = Qj.toRotationMatrix();
        Eigen::Matrix3d ric = qic.toRotationMatrix();
        Eigen::Matrix<double, 2, 3> reduce(2, 3);
#ifdef UNIT_SPHERE_ERROR
        double norm = pts_camera_j.norm();
        Eigen::Matrix3d norm_jaco;
        double x1, x2, x3;
        x1 = pts_camera_j(0);
        x2 = pts_camera_j(1);
        x3 = pts_camera_j(2);
        norm_jaco << 1.0 / norm - x1 * x1 / pow(norm, 3), - x1 * x2 / pow(norm, 3),            - x1 * x3 / pow(norm, 3),
                     - x1 * x2 / pow(norm, 3),            1.0 / norm - x2 * x2 / pow(norm, 3), - x2 * x3 / pow(norm, 3),
                     - x1 * x3 / pow(norm, 3),            - x2 * x3 / pow(norm, 3),            1.0 / norm - x3 * x3 / pow(norm, 3);
        reduce = tangent_base * norm_jaco;
#else
        reduce << 1. / dep_j, 0, -pts_camera_j(0) / (dep_j * dep_j),
            0, 1.
/ dep_j, -pts_camera_j(1) / (dep_j * dep_j);
#endif
        reduce = sqrt_info * reduce;

        if (jacobians[0])
        {
            Eigen::Map<Eigen::Matrix<double, 2, 7, Eigen::RowMajor>> jacobian_pose_i(jacobians[0]);

            Eigen::Matrix<double, 2, 6> jaco_i;
            jaco_i.leftCols<3>() = ric.transpose() * Rj.transpose();
            jaco_i.rightCols<3>() = ric.transpose() * Rj.transpose() * Ri * -Utility::skewSymmetric(pts_imu_i);

            jacobian_pose_i.leftCols<6>() = reduce * jaco_i;
            jacobian_pose_i.rightCols<1>().setZero();
        }

        if (jacobians[1])
        {
            Eigen::Map<Eigen::Matrix<double, 2, 7, Eigen::RowMajor>> jacobian_pose_j(jacobians[1]);

            Eigen::Matrix<double, 2, 6> jaco_j;
            jaco_j.leftCols<3>() = ric.transpose() * -Rj.transpose();
            jaco_j.rightCols<3>() = ric.transpose() * Utility::skewSymmetric(pts_imu_j);

            jacobian_pose_j.leftCols<6>() = reduce * jaco_j;
            jacobian_pose_j.rightCols<1>().setZero();
        }

        if (jacobians[2])
        {
            Eigen::Map<Eigen::Matrix<double, 2, 7, Eigen::RowMajor>> jacobian_ex_pose(jacobians[2]);

            Eigen::Matrix<double, 2, 6> jaco_ex;
            jaco_ex.leftCols<3>() = ric.transpose() * (Rj.transpose() * Ri - Eigen::Matrix3d::Identity());
            Eigen::Matrix3d tmp_r = ric.transpose() * Rj.transpose() * Ri * ric;
            jaco_ex.rightCols<3>() = -tmp_r * Utility::skewSymmetric(pts_camera_i) + Utility::skewSymmetric(tmp_r * pts_camera_i) +
                                     Utility::skewSymmetric(ric.transpose() * (Rj.transpose() * (Ri * tic + Pi - Pj) - tic));

            jacobian_ex_pose.leftCols<6>() = reduce * jaco_ex;
            jacobian_ex_pose.rightCols<1>().setZero();
        }

        if (jacobians[3])
        {
            Eigen::Map<Eigen::Vector2d> jacobian_feature(jacobians[3]);
#if 1
            jacobian_feature = reduce * ric.transpose() * Rj.transpose() * Ri * ric * pts_i * -1.0 / (inv_dep_i * inv_dep_i);
#else
            jacobian_feature = reduce * ric.transpose() * Rj.transpose() * Ri * ric * pts_i;
#endif
        }
    }
    sum_t += tic_toc.toc();

    return true;
}

void ProjectionFactor::check(double **parameters)
{
    double *res = new double[15];
    double **jaco = new double *[4];
    jaco[0] = new double[2 * 7];
    jaco[1] = new double[2 * 7];
    jaco[2] = new double[2 * 7];
    jaco[3] = new double[2 * 1];
    Evaluate(parameters, res, jaco);
    puts("check begins");

    puts("my");

    std::cout << Eigen::Map<Eigen::Matrix<double, 2, 1>>(res).transpose() << std::endl
              << std::endl;
    std::cout <<
Eigen::Map<Eigen::Matrix<double, 2, 7, Eigen::RowMajor>>(jaco[0]) << std::endl
              << std::endl;
    std::cout << Eigen::Map<Eigen::Matrix<double, 2, 7, Eigen::RowMajor>>(jaco[1]) << std::endl
              << std::endl;
    std::cout << Eigen::Map<Eigen::Matrix<double, 2, 7, Eigen::RowMajor>>(jaco[2]) << std::endl
              << std::endl;
    std::cout << Eigen::Map<Eigen::Vector2d>(jaco[3]) << std::endl
              << std::endl;

    Eigen::Vector3d Pi(parameters[0][0], parameters[0][1], parameters[0][2]);
    Eigen::Quaterniond Qi(parameters[0][6], parameters[0][3], parameters[0][4], parameters[0][5]);

    Eigen::Vector3d Pj(parameters[1][0], parameters[1][1], parameters[1][2]);
    Eigen::Quaterniond Qj(parameters[1][6], parameters[1][3], parameters[1][4], parameters[1][5]);

    Eigen::Vector3d tic(parameters[2][0], parameters[2][1], parameters[2][2]);
    Eigen::Quaterniond qic(parameters[2][6], parameters[2][3], parameters[2][4], parameters[2][5]);
    double inv_dep_i = parameters[3][0];

    Eigen::Vector3d pts_camera_i = pts_i / inv_dep_i;
    Eigen::Vector3d pts_imu_i = qic * pts_camera_i + tic;
    Eigen::Vector3d pts_w = Qi * pts_imu_i + Pi;
    Eigen::Vector3d pts_imu_j = Qj.inverse() * (pts_w - Pj);
    Eigen::Vector3d pts_camera_j = qic.inverse() * (pts_imu_j - tic);

    Eigen::Vector2d residual;
#ifdef UNIT_SPHERE_ERROR
    residual = tangent_base * (pts_camera_j.normalized() - pts_j.normalized());
#else
    double dep_j = pts_camera_j.z();
    residual = (pts_camera_j / dep_j).head<2>() - pts_j.head<2>();
#endif
    residual = sqrt_info * residual;

    puts("num");
    std::cout << residual.transpose() << std::endl;

    const double eps = 1e-6;
    Eigen::Matrix<double, 2, 19> num_jacobian;
    for (int k = 0; k < 19; k++)
    {
        Eigen::Vector3d Pi(parameters[0][0], parameters[0][1], parameters[0][2]);
        Eigen::Quaterniond Qi(parameters[0][6], parameters[0][3], parameters[0][4], parameters[0][5]);

        Eigen::Vector3d Pj(parameters[1][0], parameters[1][1], parameters[1][2]);
        Eigen::Quaterniond Qj(parameters[1][6], parameters[1][3], parameters[1][4], parameters[1][5]);

        Eigen::Vector3d tic(parameters[2][0], parameters[2][1], parameters[2][2]);
        Eigen::Quaterniond qic(parameters[2][6], parameters[2][3], parameters[2][4], parameters[2][5]);
        double inv_dep_i =
parameters[3][0];

        int a = k / 3, b = k % 3;
        Eigen::Vector3d delta = Eigen::Vector3d(b == 0, b == 1, b == 2) * eps;

        if (a == 0)
            Pi += delta;
        else if (a == 1)
            Qi = Qi * Utility::deltaQ(delta);
        else if (a == 2)
            Pj += delta;
        else if (a == 3)
            Qj = Qj * Utility::deltaQ(delta);
        else if (a == 4)
            tic += delta;
        else if (a == 5)
            qic = qic * Utility::deltaQ(delta);
        else if (a == 6)
            inv_dep_i += delta.x();

        Eigen::Vector3d pts_camera_i = pts_i / inv_dep_i;
        Eigen::Vector3d pts_imu_i = qic * pts_camera_i + tic;
        Eigen::Vector3d pts_w = Qi * pts_imu_i + Pi;
        Eigen::Vector3d pts_imu_j = Qj.inverse() * (pts_w - Pj);
        Eigen::Vector3d pts_camera_j = qic.inverse() * (pts_imu_j - tic);

        Eigen::Vector2d tmp_residual;
#ifdef UNIT_SPHERE_ERROR
        tmp_residual = tangent_base * (pts_camera_j.normalized() - pts_j.normalized());
#else
        double dep_j = pts_camera_j.z();
        tmp_residual = (pts_camera_j / dep_j).head<2>() - pts_j.head<2>();
#endif
        tmp_residual = sqrt_info * tmp_residual;
        num_jacobian.col(k) = (tmp_residual - residual) / eps;
    }
    std::cout << num_jacobian << std::endl;
}

================================================ FILE: rpvio_estimator/src/factor/projection_factor.h ================================================
#pragma once

#include <ros/assert.h>
#include <ceres/ceres.h>
#include <Eigen/Dense>
#include "../utility/utility.h"
#include "../utility/tic_toc.h"
#include "../parameters.h"

class ProjectionFactor : public ceres::SizedCostFunction<2, 7, 7, 7, 1>
{
  public:
    ProjectionFactor(const Eigen::Vector3d &_pts_i, const Eigen::Vector3d &_pts_j);
    virtual bool Evaluate(double const *const *parameters, double *residuals, double **jacobians) const;
    void check(double **parameters);

    Eigen::Vector3d pts_i, pts_j;
    Eigen::Matrix<double, 2, 3> tangent_base;
    static Eigen::Matrix2d sqrt_info;
    static double sum_t;
};

================================================ FILE: rpvio_estimator/src/factor/projection_td_factor.cpp ================================================
#include "projection_td_factor.h"

Eigen::Matrix2d ProjectionTdFactor::sqrt_info;
double
ProjectionTdFactor::sum_t;

ProjectionTdFactor::ProjectionTdFactor(const Eigen::Vector3d &_pts_i, const Eigen::Vector3d &_pts_j,
                                       const Eigen::Vector2d &_velocity_i, const Eigen::Vector2d &_velocity_j,
                                       const double _td_i, const double _td_j, const double _row_i, const double _row_j) :
                                       pts_i(_pts_i), pts_j(_pts_j), td_i(_td_i), td_j(_td_j)
{
    velocity_i.x() = _velocity_i.x();
    velocity_i.y() = _velocity_i.y();
    velocity_i.z() = 0;
    velocity_j.x() = _velocity_j.x();
    velocity_j.y() = _velocity_j.y();
    velocity_j.z() = 0;
    row_i = _row_i - ROW / 2;
    row_j = _row_j - ROW / 2;

#ifdef UNIT_SPHERE_ERROR
    Eigen::Vector3d b1, b2;
    Eigen::Vector3d a = pts_j.normalized();
    Eigen::Vector3d tmp(0, 0, 1);
    if(a == tmp)
        tmp << 1, 0, 0;
    b1 = (tmp - a * (a.transpose() * tmp)).normalized();
    b2 = a.cross(b1);
    tangent_base.block<1, 3>(0, 0) = b1.transpose();
    tangent_base.block<1, 3>(1, 0) = b2.transpose();
#endif
};

bool ProjectionTdFactor::Evaluate(double const *const *parameters, double *residuals, double **jacobians) const
{
    TicToc tic_toc;
    Eigen::Vector3d Pi(parameters[0][0], parameters[0][1], parameters[0][2]);
    Eigen::Quaterniond Qi(parameters[0][6], parameters[0][3], parameters[0][4], parameters[0][5]);

    Eigen::Vector3d Pj(parameters[1][0], parameters[1][1], parameters[1][2]);
    Eigen::Quaterniond Qj(parameters[1][6], parameters[1][3], parameters[1][4], parameters[1][5]);

    Eigen::Vector3d tic(parameters[2][0], parameters[2][1], parameters[2][2]);
    Eigen::Quaterniond qic(parameters[2][6], parameters[2][3], parameters[2][4], parameters[2][5]);

    double inv_dep_i = parameters[3][0];

    double td = parameters[4][0];

    Eigen::Vector3d pts_i_td, pts_j_td;
    pts_i_td = pts_i - (td - td_i + TR / ROW * row_i) * velocity_i;
    pts_j_td = pts_j - (td - td_j + TR / ROW * row_j) * velocity_j;
    Eigen::Vector3d pts_camera_i = pts_i_td / inv_dep_i;
    Eigen::Vector3d pts_imu_i = qic * pts_camera_i + tic;
    Eigen::Vector3d pts_w = Qi * pts_imu_i + Pi;
    Eigen::Vector3d pts_imu_j = Qj.inverse() * (pts_w - Pj);
    Eigen::Vector3d
pts_camera_j = qic.inverse() * (pts_imu_j - tic); Eigen::Map residual(residuals); #ifdef UNIT_SPHERE_ERROR residual = tangent_base * (pts_camera_j.normalized() - pts_j_td.normalized()); #else double dep_j = pts_camera_j.z(); residual = (pts_camera_j / dep_j).head<2>() - pts_j_td.head<2>(); #endif residual = sqrt_info * residual; if (jacobians) { Eigen::Matrix3d Ri = Qi.toRotationMatrix(); Eigen::Matrix3d Rj = Qj.toRotationMatrix(); Eigen::Matrix3d ric = qic.toRotationMatrix(); Eigen::Matrix reduce(2, 3); #ifdef UNIT_SPHERE_ERROR double norm = pts_camera_j.norm(); Eigen::Matrix3d norm_jaco; double x1, x2, x3; x1 = pts_camera_j(0); x2 = pts_camera_j(1); x3 = pts_camera_j(2); norm_jaco << 1.0 / norm - x1 * x1 / pow(norm, 3), - x1 * x2 / pow(norm, 3), - x1 * x3 / pow(norm, 3), - x1 * x2 / pow(norm, 3), 1.0 / norm - x2 * x2 / pow(norm, 3), - x2 * x3 / pow(norm, 3), - x1 * x3 / pow(norm, 3), - x2 * x3 / pow(norm, 3), 1.0 / norm - x3 * x3 / pow(norm, 3); reduce = tangent_base * norm_jaco; #else reduce << 1. / dep_j, 0, -pts_camera_j(0) / (dep_j * dep_j), 0, 1. 
/ dep_j, -pts_camera_j(1) / (dep_j * dep_j); #endif reduce = sqrt_info * reduce; if (jacobians[0]) { Eigen::Map> jacobian_pose_i(jacobians[0]); Eigen::Matrix jaco_i; jaco_i.leftCols<3>() = ric.transpose() * Rj.transpose(); jaco_i.rightCols<3>() = ric.transpose() * Rj.transpose() * Ri * -Utility::skewSymmetric(pts_imu_i); jacobian_pose_i.leftCols<6>() = reduce * jaco_i; jacobian_pose_i.rightCols<1>().setZero(); } if (jacobians[1]) { Eigen::Map> jacobian_pose_j(jacobians[1]); Eigen::Matrix jaco_j; jaco_j.leftCols<3>() = ric.transpose() * -Rj.transpose(); jaco_j.rightCols<3>() = ric.transpose() * Utility::skewSymmetric(pts_imu_j); jacobian_pose_j.leftCols<6>() = reduce * jaco_j; jacobian_pose_j.rightCols<1>().setZero(); } if (jacobians[2]) { Eigen::Map> jacobian_ex_pose(jacobians[2]); Eigen::Matrix jaco_ex; jaco_ex.leftCols<3>() = ric.transpose() * (Rj.transpose() * Ri - Eigen::Matrix3d::Identity()); Eigen::Matrix3d tmp_r = ric.transpose() * Rj.transpose() * Ri * ric; jaco_ex.rightCols<3>() = -tmp_r * Utility::skewSymmetric(pts_camera_i) + Utility::skewSymmetric(tmp_r * pts_camera_i) + Utility::skewSymmetric(ric.transpose() * (Rj.transpose() * (Ri * tic + Pi - Pj) - tic)); jacobian_ex_pose.leftCols<6>() = reduce * jaco_ex; jacobian_ex_pose.rightCols<1>().setZero(); } if (jacobians[3]) { Eigen::Map jacobian_feature(jacobians[3]); jacobian_feature = reduce * ric.transpose() * Rj.transpose() * Ri * ric * pts_i_td * -1.0 / (inv_dep_i * inv_dep_i); } if (jacobians[4]) { Eigen::Map jacobian_td(jacobians[4]); jacobian_td = reduce * ric.transpose() * Rj.transpose() * Ri * ric * velocity_i / inv_dep_i * -1.0 + sqrt_info * velocity_j.head(2); } } sum_t += tic_toc.toc(); return true; } void ProjectionTdFactor::check(double **parameters) { double *res = new double[2]; double **jaco = new double *[5]; jaco[0] = new double[2 * 7]; jaco[1] = new double[2 * 7]; jaco[2] = new double[2 * 7]; jaco[3] = new double[2 * 1]; jaco[4] = new double[2 * 1]; Evaluate(parameters, res, jaco); 
    puts("check begins");
    puts("my");

    std::cout << Eigen::Map<Eigen::Matrix<double, 2, 1>>(res).transpose() << std::endl << std::endl;
    std::cout << Eigen::Map<Eigen::Matrix<double, 2, 7, Eigen::RowMajor>>(jaco[0]) << std::endl << std::endl;
    std::cout << Eigen::Map<Eigen::Matrix<double, 2, 7, Eigen::RowMajor>>(jaco[1]) << std::endl << std::endl;
    std::cout << Eigen::Map<Eigen::Matrix<double, 2, 7, Eigen::RowMajor>>(jaco[2]) << std::endl << std::endl;
    std::cout << Eigen::Map<Eigen::Vector2d>(jaco[3]) << std::endl << std::endl;
    std::cout << Eigen::Map<Eigen::Vector2d>(jaco[4]) << std::endl << std::endl;

    Eigen::Vector3d Pi(parameters[0][0], parameters[0][1], parameters[0][2]);
    Eigen::Quaterniond Qi(parameters[0][6], parameters[0][3], parameters[0][4], parameters[0][5]);

    Eigen::Vector3d Pj(parameters[1][0], parameters[1][1], parameters[1][2]);
    Eigen::Quaterniond Qj(parameters[1][6], parameters[1][3], parameters[1][4], parameters[1][5]);

    Eigen::Vector3d tic(parameters[2][0], parameters[2][1], parameters[2][2]);
    Eigen::Quaterniond qic(parameters[2][6], parameters[2][3], parameters[2][4], parameters[2][5]);
    double inv_dep_i = parameters[3][0];
    double td = parameters[4][0];

    Eigen::Vector3d pts_i_td, pts_j_td;
    pts_i_td = pts_i - (td - td_i + TR / ROW * row_i) * velocity_i;
    pts_j_td = pts_j - (td - td_j + TR / ROW * row_j) * velocity_j;
    Eigen::Vector3d pts_camera_i = pts_i_td / inv_dep_i;
    Eigen::Vector3d pts_imu_i = qic * pts_camera_i + tic;
    Eigen::Vector3d pts_w = Qi * pts_imu_i + Pi;
    Eigen::Vector3d pts_imu_j = Qj.inverse() * (pts_w - Pj);
    Eigen::Vector3d pts_camera_j = qic.inverse() * (pts_imu_j - tic);
    Eigen::Vector2d residual;

#ifdef UNIT_SPHERE_ERROR
    residual = tangent_base * (pts_camera_j.normalized() - pts_j_td.normalized());
#else
    double dep_j = pts_camera_j.z();
    residual = (pts_camera_j / dep_j).head<2>() - pts_j_td.head<2>();
#endif
    residual = sqrt_info * residual;

    puts("num");
    std::cout << residual.transpose() << std::endl;

    const double eps = 1e-6;
    Eigen::Matrix<double, 2, 20> num_jacobian;
    for (int k = 0; k < 20; k++)
    {
        Eigen::Vector3d Pi(parameters[0][0], parameters[0][1], parameters[0][2]);
        Eigen::Quaterniond Qi(parameters[0][6], parameters[0][3], parameters[0][4], parameters[0][5]);

        Eigen::Vector3d Pj(parameters[1][0], parameters[1][1], parameters[1][2]);
        Eigen::Quaterniond Qj(parameters[1][6], parameters[1][3], parameters[1][4], parameters[1][5]);

        Eigen::Vector3d tic(parameters[2][0], parameters[2][1], parameters[2][2]);
        Eigen::Quaterniond qic(parameters[2][6], parameters[2][3], parameters[2][4], parameters[2][5]);
        double inv_dep_i = parameters[3][0];
        double td = parameters[4][0];

        int a = k / 3, b = k % 3;
        Eigen::Vector3d delta = Eigen::Vector3d(b == 0, b == 1, b == 2) * eps;

        if (a == 0)
            Pi += delta;
        else if (a == 1)
            Qi = Qi * Utility::deltaQ(delta);
        else if (a == 2)
            Pj += delta;
        else if (a == 3)
            Qj = Qj * Utility::deltaQ(delta);
        else if (a == 4)
            tic += delta;
        else if (a == 5)
            qic = qic * Utility::deltaQ(delta);
        else if (a == 6 && b == 0)
            inv_dep_i += delta.x();
        else if (a == 6 && b == 1)
            td += delta.y();

        Eigen::Vector3d pts_i_td, pts_j_td;
        pts_i_td = pts_i - (td - td_i + TR / ROW * row_i) * velocity_i;
        pts_j_td = pts_j - (td - td_j + TR / ROW * row_j) * velocity_j;
        Eigen::Vector3d pts_camera_i = pts_i_td / inv_dep_i;
        Eigen::Vector3d pts_imu_i = qic * pts_camera_i + tic;
        Eigen::Vector3d pts_w = Qi * pts_imu_i + Pi;
        Eigen::Vector3d pts_imu_j = Qj.inverse() * (pts_w - Pj);
        Eigen::Vector3d pts_camera_j = qic.inverse() * (pts_imu_j - tic);
        Eigen::Vector2d tmp_residual;

#ifdef UNIT_SPHERE_ERROR
        tmp_residual = tangent_base * (pts_camera_j.normalized() - pts_j_td.normalized());
#else
        double dep_j = pts_camera_j.z();
        tmp_residual = (pts_camera_j / dep_j).head<2>() - pts_j_td.head<2>();
#endif
        tmp_residual = sqrt_info * tmp_residual;
        num_jacobian.col(k) = (tmp_residual - residual) / eps;
    }
    std::cout << num_jacobian << std::endl;
}


================================================
FILE: rpvio_estimator/src/factor/projection_td_factor.h
================================================
#pragma once

#include <ros/assert.h>
#include <ceres/ceres.h>
#include <Eigen/Dense>
#include "../utility/utility.h"
#include "../utility/tic_toc.h"
#include "../parameters.h"

class ProjectionTdFactor : public ceres::SizedCostFunction<2, 7, 7, 7, 1, 1>
{
  public:
    ProjectionTdFactor(const Eigen::Vector3d &_pts_i, const Eigen::Vector3d &_pts_j,
                       const Eigen::Vector2d &_velocity_i, const Eigen::Vector2d &_velocity_j,
                       const double _td_i, const double _td_j, const double _row_i, const double _row_j);
    virtual bool Evaluate(double const *const *parameters, double *residuals, double **jacobians) const;
    void check(double **parameters);

    Eigen::Vector3d pts_i, pts_j;
    Eigen::Vector3d velocity_i, velocity_j;
    double td_i, td_j;
    Eigen::Matrix<double, 2, 3> tangent_base;
    double row_i, row_j;
    static Eigen::Matrix2d sqrt_info;
    static double sum_t;
};


================================================
FILE: rpvio_estimator/src/feature_manager.cpp
================================================
#include "feature_manager.h"

int FeaturePerId::endFrame()
{
    return start_frame + feature_per_frame.size() - 1;
}

FeatureManager::FeatureManager(Matrix3d _Rs[])
    : Rs(_Rs)
{
    for (int i = 0; i < NUM_OF_CAM; i++)
        ric[i].setIdentity();
}

void FeatureManager::setRic(Matrix3d _ric[])
{
    for (int i = 0; i < NUM_OF_CAM; i++)
    {
        ric[i] = _ric[i];
    }
}

void FeatureManager::clearState()
{
    feature.clear();
    plane_ids.clear();
}

int FeatureManager::getFeatureCount()
{
    int cnt = 0;
    for (auto &it : feature)
    {
        it.used_num = it.feature_per_frame.size();
        if (it.used_num >= 2 && it.start_frame < WINDOW_SIZE - 2)
        {
            cnt++;
        }
    }
    return cnt;
}

int FeatureManager::getLargestPlaneId(std::vector<int> &init_pids)
{
    std::map<int, int> plane_counts;
    for (auto &pid : init_pids)
        plane_counts[pid] = 0;

    for (auto &it : feature)
    {
        if (plane_counts.count(it.plane_id) > 0)
            plane_counts[it.plane_id]++;
    }

    int largest_count = 0, largest_pid = -1;
    for (auto &plane_count : plane_counts)
    {
        if (plane_count.second > largest_count)
        {
            largest_count = plane_count.second;
            largest_pid = plane_count.first;
        }
    }

    return largest_pid;
}

bool FeatureManager::addFeatureCheckParallax(int frame_count, const map<int, vector<pair<int, Eigen::Matrix<double, 8, 1>>>> &image, double td)
{
    ROS_DEBUG("input feature: %d",
              (int)image.size());
    ROS_DEBUG("num of feature: %d", getFeatureCount());
    double parallax_sum = 0;
    int parallax_num = 0;
    last_track_num = 0;
    for (auto &id_pts : image)
    {
        FeaturePerFrame f_per_fra(id_pts.second[0].second, td);

        int feature_id = id_pts.first;
        auto it = find_if(feature.begin(), feature.end(), [feature_id](const FeaturePerId &it)
                          {
                              return it.feature_id == feature_id;
                          });

        if (it == feature.end())
        {
            int plane_id = id_pts.second[0].second(7);
            plane_ids.insert(plane_id);
            feature.push_back(FeaturePerId(feature_id, plane_id, frame_count));
            feature.back().feature_per_frame.push_back(f_per_fra);
        }
        else if (it->feature_id == feature_id)
        {
            it->feature_per_frame.push_back(f_per_fra);
            last_track_num++;
        }
    }

    if (frame_count < 2 || last_track_num < 20)
        return true;

    for (auto &it_per_id : feature)
    {
        if (it_per_id.start_frame <= frame_count - 2 &&
            it_per_id.start_frame + int(it_per_id.feature_per_frame.size()) - 1 >= frame_count - 1)
        {
            parallax_sum += compensatedParallax2(it_per_id, frame_count);
            parallax_num++;
        }
    }

    if (parallax_num == 0)
    {
        return true;
    }
    else
    {
        ROS_DEBUG("parallax_sum: %lf, parallax_num: %d", parallax_sum, parallax_num);
        ROS_DEBUG("current parallax: %lf", parallax_sum / parallax_num * FOCAL_LENGTH);
        return parallax_sum / parallax_num >= MIN_PARALLAX;
    }
}

void FeatureManager::debugShow()
{
    ROS_DEBUG("debug show");
    for (auto &it : feature)
    {
        ROS_ASSERT(it.feature_per_frame.size() != 0);
        ROS_ASSERT(it.start_frame >= 0);
        ROS_ASSERT(it.used_num >= 0);

        ROS_DEBUG("%d,%d,%d ", it.feature_id, it.used_num, it.start_frame);
        int sum = 0;
        for (auto &j : it.feature_per_frame)
        {
            ROS_DEBUG("%d,", int(j.is_used));
            sum += j.is_used;
            printf("(%lf,%lf) ", j.point(0), j.point(1));
        }
        ROS_ASSERT(it.used_num == sum);
    }
}

map<int, vector<pair<Vector3d, Vector3d>>> FeatureManager::getCorresponding(int frame_count_l, int frame_count_r)
{
    map<int, vector<pair<Vector3d, Vector3d>>> corres;
    for (auto &it : feature)
    {
        if (it.start_frame <= frame_count_l && it.endFrame() >= frame_count_r)
        {
            Vector3d a = Vector3d::Zero(), b = Vector3d::Zero();
            int idx_l = frame_count_l - it.start_frame;
            int idx_r = frame_count_r - it.start_frame;

            a = it.feature_per_frame[idx_l].point;
            b = it.feature_per_frame[idx_r].point;

            corres[it.plane_id].push_back(make_pair(a, b));
        }
    }
    return corres;
}

vector<pair<Vector3d, Vector3d>> FeatureManager::getCorrespondingByPlane(int frame_count_l, int frame_count_r, int plane_id)
{
    vector<pair<Vector3d, Vector3d>> corres;
    for (auto &it : feature)
    {
        if (it.plane_id != plane_id)
            continue;
        if (it.start_frame <= frame_count_l && it.endFrame() >= frame_count_r)
        {
            Vector3d a = Vector3d::Zero(), b = Vector3d::Zero();
            int idx_l = frame_count_l - it.start_frame;
            int idx_r = frame_count_r - it.start_frame;

            a = it.feature_per_frame[idx_l].point;
            b = it.feature_per_frame[idx_r].point;

            corres.push_back(make_pair(a, b));
        }
    }
    return corres;
}

void FeatureManager::setDepth(const VectorXd &x)
{
    int feature_index = -1;
    for (auto &it_per_id : feature)
    {
        it_per_id.used_num = it_per_id.feature_per_frame.size();
        if (!(it_per_id.used_num >= 2 && it_per_id.start_frame < WINDOW_SIZE - 2))
            continue;

        it_per_id.estimated_depth = 1.0 / x(++feature_index);
        //ROS_INFO("feature id %d , start_frame %d, depth %f ", it_per_id->feature_id, it_per_id->start_frame, it_per_id->estimated_depth);
        if (it_per_id.estimated_depth < 0)
        {
            it_per_id.solve_flag = 2;
        }
        else
            it_per_id.solve_flag = 1;
    }
}

void FeatureManager::removeFailures()
{
    for (auto it = feature.begin(), it_next = feature.begin();
         it != feature.end(); it = it_next)
    {
        it_next++;
        if (it->solve_flag == 2)
            feature.erase(it);
    }
}

void FeatureManager::clearDepth(const VectorXd &x)
{
    int feature_index = -1;
    for (auto &it_per_id : feature)
    {
        it_per_id.used_num = it_per_id.feature_per_frame.size();
        if (!(it_per_id.used_num >= 2 && it_per_id.start_frame < WINDOW_SIZE - 2))
            continue;
        it_per_id.estimated_depth = 1.0 / x(++feature_index);
    }
}

VectorXd FeatureManager::getDepthVector()
{
    VectorXd dep_vec(getFeatureCount());
    int feature_index = -1;
    for (auto &it_per_id : feature)
    {
        it_per_id.used_num = it_per_id.feature_per_frame.size();
        if (!(it_per_id.used_num >= 2 && it_per_id.start_frame < WINDOW_SIZE - 2))
            continue;
#if 1
        dep_vec(++feature_index) = 1. / it_per_id.estimated_depth;
#else
        dep_vec(++feature_index) = it_per_id->estimated_depth;
#endif
    }
    return dep_vec;
}

void FeatureManager::triangulate(Vector3d Ps[], Vector3d tic[], Matrix3d ric[])
{
    for (auto &it_per_id : feature)
    {
        it_per_id.used_num = it_per_id.feature_per_frame.size();
        if (!(it_per_id.used_num >= 2 && it_per_id.start_frame < WINDOW_SIZE - 2))
            continue;

        if (it_per_id.estimated_depth > 0)
            continue;
        int imu_i = it_per_id.start_frame, imu_j = imu_i - 1;

        ROS_ASSERT(NUM_OF_CAM == 1);
        Eigen::MatrixXd svd_A(2 * it_per_id.feature_per_frame.size(), 4);
        int svd_idx = 0;

        Eigen::Matrix<double, 3, 4> P0;
        Eigen::Vector3d t0 = Ps[imu_i] + Rs[imu_i] * tic[0];
        Eigen::Matrix3d R0 = Rs[imu_i] * ric[0];
        P0.leftCols<3>() = Eigen::Matrix3d::Identity();
        P0.rightCols<1>() = Eigen::Vector3d::Zero();

        for (auto &it_per_frame : it_per_id.feature_per_frame)
        {
            imu_j++;

            Eigen::Vector3d t1 = Ps[imu_j] + Rs[imu_j] * tic[0];
            Eigen::Matrix3d R1 = Rs[imu_j] * ric[0];
            Eigen::Vector3d t = R0.transpose() * (t1 - t0);
            Eigen::Matrix3d R = R0.transpose() * R1;
            Eigen::Matrix<double, 3, 4> P;
            P.leftCols<3>() = R.transpose();
            P.rightCols<1>() = -R.transpose() * t;
            Eigen::Vector3d f = it_per_frame.point.normalized();
            svd_A.row(svd_idx++) = f[0] * P.row(2) - f[2] * P.row(0);
            svd_A.row(svd_idx++) = f[1] * P.row(2) - f[2] * P.row(1);

            if (imu_i == imu_j)
                continue;
        }
        ROS_ASSERT(svd_idx == svd_A.rows());
        Eigen::Vector4d svd_V = Eigen::JacobiSVD<Eigen::MatrixXd>(svd_A, Eigen::ComputeThinV).matrixV().rightCols<1>();
        double svd_method = svd_V[2] / svd_V[3];
        //it_per_id->estimated_depth = -b / A;
        //it_per_id->estimated_depth = svd_V[2] / svd_V[3];

        it_per_id.estimated_depth = svd_method;
        //it_per_id->estimated_depth = INIT_DEPTH;

        if (it_per_id.estimated_depth < 0.1)
        {
            it_per_id.estimated_depth = INIT_DEPTH;
        }
    }
}

void FeatureManager::removeOutlier()
{
    ROS_BREAK();
    int i = -1;
    for (auto it = feature.begin(), it_next =
         feature.begin(); it != feature.end(); it = it_next)
    {
        it_next++;
        i += it->used_num != 0;
        if (it->used_num != 0 && it->is_outlier == true)
        {
            feature.erase(it);
        }
    }
}

void FeatureManager::removeBackShiftDepth(Eigen::Matrix3d marg_R, Eigen::Vector3d marg_P, Eigen::Matrix3d new_R, Eigen::Vector3d new_P)
{
    for (auto it = feature.begin(), it_next = feature.begin();
         it != feature.end(); it = it_next)
    {
        it_next++;

        if (it->start_frame != 0)
            it->start_frame--;
        else
        {
            Eigen::Vector3d uv_i = it->feature_per_frame[0].point;
            it->feature_per_frame.erase(it->feature_per_frame.begin());
            if (it->feature_per_frame.size() < 2)
            {
                feature.erase(it);
                continue;
            }
            else
            {
                Eigen::Vector3d pts_i = uv_i * it->estimated_depth;
                Eigen::Vector3d w_pts_i = marg_R * pts_i + marg_P;
                Eigen::Vector3d pts_j = new_R.transpose() * (w_pts_i - new_P);
                double dep_j = pts_j(2);
                if (dep_j > 0)
                    it->estimated_depth = dep_j;
                else
                    it->estimated_depth = INIT_DEPTH;
            }
        }
        // remove tracking-lost feature after marginalization
        /*
        if (it->endFrame() < WINDOW_SIZE - 1)
        {
            feature.erase(it);
        }
        */
    }
}

void FeatureManager::removeBack()
{
    for (auto it = feature.begin(), it_next = feature.begin();
         it != feature.end(); it = it_next)
    {
        it_next++;

        if (it->start_frame != 0)
            it->start_frame--;
        else
        {
            it->feature_per_frame.erase(it->feature_per_frame.begin());
            if (it->feature_per_frame.size() == 0)
                feature.erase(it);
        }
    }
}

void FeatureManager::removeFront(int frame_count)
{
    for (auto it = feature.begin(), it_next = feature.begin(); it != feature.end(); it = it_next)
    {
        it_next++;

        if (it->start_frame == frame_count)
        {
            it->start_frame--;
        }
        else
        {
            int j = WINDOW_SIZE - 1 - it->start_frame;
            if (it->endFrame() < frame_count - 1)
                continue;
            it->feature_per_frame.erase(it->feature_per_frame.begin() + j);
            if (it->feature_per_frame.size() == 0)
                feature.erase(it);
        }
    }
}

double FeatureManager::compensatedParallax2(const FeaturePerId &it_per_id, int frame_count)
{
    // check whether the second-to-last frame is a keyframe:
    // parallax between the second-to-last frame and the third-to-last frame
    const FeaturePerFrame &frame_i = it_per_id.feature_per_frame[frame_count - 2 - it_per_id.start_frame];
    const FeaturePerFrame &frame_j = it_per_id.feature_per_frame[frame_count - 1 - it_per_id.start_frame];

    double ans = 0;
    Vector3d p_j = frame_j.point;

    double u_j = p_j(0);
    double v_j = p_j(1);

    Vector3d p_i = frame_i.point;
    Vector3d p_i_comp;

    //int r_i = frame_count - 2;
    //int r_j = frame_count - 1;
    //p_i_comp = ric[camera_id_j].transpose() * Rs[r_j].transpose() * Rs[r_i] * ric[camera_id_i] * p_i;
    p_i_comp = p_i;
    double dep_i = p_i(2);
    double u_i = p_i(0) / dep_i;
    double v_i = p_i(1) / dep_i;
    double du = u_i - u_j, dv = v_i - v_j;

    double dep_i_comp = p_i_comp(2);
    double u_i_comp = p_i_comp(0) / dep_i_comp;
    double v_i_comp = p_i_comp(1) / dep_i_comp;
    double du_comp = u_i_comp - u_j, dv_comp = v_i_comp - v_j;

    ans = max(ans, sqrt(min(du * du + dv * dv, du_comp * du_comp + dv_comp * dv_comp)));

    return ans;
}


================================================
FILE: rpvio_estimator/src/feature_manager.h
================================================
#ifndef FEATURE_MANAGER_H
#define FEATURE_MANAGER_H

#include <list>
#include <algorithm>
#include <vector>
#include <numeric>
using namespace std;

#include <eigen3/Eigen/Dense>
using namespace Eigen;

#include <ros/console.h>
#include <ros/assert.h>

#include "parameters.h"

class FeaturePerFrame
{
  public:
    FeaturePerFrame(const Eigen::Matrix<double, 8, 1> &_point, double td)
    {
        point.x() = _point(0);
        point.y() = _point(1);
        point.z() = _point(2);
        uv.x() = _point(3);
        uv.y() = _point(4);
        velocity.x() = _point(5);
        velocity.y() = _point(6);
        cur_td = td;
    }
    double cur_td;
    Vector3d point;
    Vector2d uv;
    Vector2d velocity;
    double z;
    bool is_used;
    double parallax;
    MatrixXd A;
    VectorXd b;
    double dep_gradient;
};

class FeaturePerId
{
  public:
    const int feature_id, plane_id;
    int start_frame;
    vector<FeaturePerFrame> feature_per_frame;

    int used_num;
    bool is_outlier;
    bool is_margin;
    double estimated_depth;
    int solve_flag; // 0: not solved yet; 1: solve succeeded; 2: solve failed
    Vector3d gt_p;

    FeaturePerId(int _feature_id, int _plane_id, int _start_frame)
        : feature_id(_feature_id), plane_id(_plane_id), start_frame(_start_frame),
          used_num(0), estimated_depth(-1.0), solve_flag(0)
    {
    }

    int endFrame();
};

class FeatureManager
{
  public:
    FeatureManager(Matrix3d _Rs[]);

    void setRic(Matrix3d _ric[]);
    void clearState();
    int getFeatureCount();
    int getLargestPlaneId(std::vector<int> &init_pids);
    bool addFeatureCheckParallax(int frame_count, const map<int, vector<pair<int, Eigen::Matrix<double, 8, 1>>>> &image, double td);
    void debugShow();
    map<int, vector<pair<Vector3d, Vector3d>>> getCorresponding(int frame_count_l, int frame_count_r);
    vector<pair<Vector3d, Vector3d>> getCorrespondingByPlane(int frame_count_l, int frame_count_r, int plane_id);
    set<int> plane_ids;

    //void updateDepth(const VectorXd &x);
    void setDepth(const VectorXd &x);
    void removeFailures();
    void clearDepth(const VectorXd &x);
    VectorXd getDepthVector();
    void triangulate(Vector3d Ps[], Vector3d tic[], Matrix3d ric[]);
    void removeBackShiftDepth(Eigen::Matrix3d marg_R, Eigen::Vector3d marg_P, Eigen::Matrix3d new_R, Eigen::Vector3d new_P);
    void removeBack();
    void removeFront(int frame_count);
    void removeOutlier();
    list<FeaturePerId> feature;
    int last_track_num;

  private:
    double compensatedParallax2(const FeaturePerId &it_per_id, int frame_count);
    const Matrix3d *Rs;
    Matrix3d ric[NUM_OF_CAM];
};

#endif


================================================
FILE: rpvio_estimator/src/initial/initial_aligment.cpp
================================================
#include "initial_alignment.h"

void solveGyroscopeBias(map<double, ImageFrame> &all_image_frame, Vector3d* Bgs)
{
    Matrix3d A;
    Vector3d b;
    Vector3d delta_bg;
    A.setZero();
    b.setZero();
    map<double, ImageFrame>::iterator frame_i;
    map<double, ImageFrame>::iterator frame_j;
    for (frame_i = all_image_frame.begin(); next(frame_i) != all_image_frame.end(); frame_i++)
    {
        frame_j = next(frame_i);
        MatrixXd tmp_A(3, 3);
        tmp_A.setZero();
        VectorXd tmp_b(3);
        tmp_b.setZero();
        Eigen::Quaterniond q_ij(frame_i->second.R.transpose() * frame_j->second.R);
        tmp_A = frame_j->second.pre_integration->jacobian.template block<3, 3>(O_R, O_BG);
        tmp_b = 2 * (frame_j->second.pre_integration->delta_q.inverse() * q_ij).vec();
        A +=
             tmp_A.transpose() * tmp_A;
        b += tmp_A.transpose() * tmp_b;
    }
    delta_bg = A.ldlt().solve(b);
    ROS_WARN_STREAM("gyroscope bias initial calibration " << delta_bg.transpose());

    for (int i = 0; i <= WINDOW_SIZE; i++)
        Bgs[i] += delta_bg;

    for (frame_i = all_image_frame.begin(); next(frame_i) != all_image_frame.end(); frame_i++)
    {
        frame_j = next(frame_i);
        frame_j->second.pre_integration->repropagate(Vector3d::Zero(), Bgs[0]);
    }
}

MatrixXd TangentBasis(Vector3d &g0)
{
    Vector3d b, c;
    Vector3d a = g0.normalized();
    Vector3d tmp(0, 0, 1);
    if (a == tmp)
        tmp << 1, 0, 0;
    b = (tmp - a * (a.transpose() * tmp)).normalized();
    c = a.cross(b);
    MatrixXd bc(3, 2);
    bc.block<3, 1>(0, 0) = b;
    bc.block<3, 1>(0, 1) = c;
    return bc;
}

void RefineGravity(map<double, ImageFrame> &all_image_frame, Vector3d &g, VectorXd &x)
{
    Vector3d g0 = g.normalized() * G.norm();
    Vector3d lx, ly;
    //VectorXd x;
    int all_frame_count = all_image_frame.size();
    int n_state = all_frame_count * 3 + 2 + 1;

    MatrixXd A{n_state, n_state};
    A.setZero();
    VectorXd b{n_state};
    b.setZero();

    map<double, ImageFrame>::iterator frame_i;
    map<double, ImageFrame>::iterator frame_j;
    for (int k = 0; k < 4; k++)
    {
        MatrixXd lxly(3, 2);
        lxly = TangentBasis(g0);
        int i = 0;
        for (frame_i = all_image_frame.begin(); next(frame_i) != all_image_frame.end(); frame_i++, i++)
        {
            frame_j = next(frame_i);

            MatrixXd tmp_A(6, 9);
            tmp_A.setZero();
            VectorXd tmp_b(6);
            tmp_b.setZero();

            double dt = frame_j->second.pre_integration->sum_dt;

            tmp_A.block<3, 3>(0, 0) = -dt * Matrix3d::Identity();
            tmp_A.block<3, 2>(0, 6) = frame_i->second.R.transpose() * dt * dt / 2 * Matrix3d::Identity() * lxly;
            tmp_A.block<3, 1>(0, 8) = frame_i->second.R.transpose() * (frame_j->second.T - frame_i->second.T) / 100.0;
            tmp_b.block<3, 1>(0, 0) = frame_j->second.pre_integration->delta_p + frame_i->second.R.transpose() * frame_j->second.R * TIC[0] - TIC[0] - frame_i->second.R.transpose() * dt * dt / 2 * g0;

            tmp_A.block<3, 3>(3, 0) = -Matrix3d::Identity();
            tmp_A.block<3, 3>(3, 3) = frame_i->second.R.transpose() * frame_j->second.R;
            tmp_A.block<3, 2>(3, 6) = frame_i->second.R.transpose() * dt * Matrix3d::Identity() * lxly;
            tmp_b.block<3, 1>(3, 0) = frame_j->second.pre_integration->delta_v - frame_i->second.R.transpose() * dt * Matrix3d::Identity() * g0;

            Matrix<double, 6, 6> cov_inv = Matrix<double, 6, 6>::Zero();
            //cov.block<6, 6>(0, 0) = IMU_cov[i + 1];
            //MatrixXd cov_inv = cov.inverse();
            cov_inv.setIdentity();

            MatrixXd r_A = tmp_A.transpose() * cov_inv * tmp_A;
            VectorXd r_b = tmp_A.transpose() * cov_inv * tmp_b;

            A.block<6, 6>(i * 3, i * 3) += r_A.topLeftCorner<6, 6>();
            b.segment<6>(i * 3) += r_b.head<6>();

            A.bottomRightCorner<3, 3>() += r_A.bottomRightCorner<3, 3>();
            b.tail<3>() += r_b.tail<3>();

            A.block<6, 3>(i * 3, n_state - 3) += r_A.topRightCorner<6, 3>();
            A.block<3, 6>(n_state - 3, i * 3) += r_A.bottomLeftCorner<3, 6>();
        }
        A = A * 1000.0;
        b = b * 1000.0;
        x = A.ldlt().solve(b);
        VectorXd dg = x.segment<2>(n_state - 3);
        g0 = (g0 + lxly * dg).normalized() * G.norm();
        //double s = x(n_state - 1);
    }
    g = g0;
}

bool LinearAlignment(map<double, ImageFrame> &all_image_frame, Vector3d &g, VectorXd &x)
{
    int all_frame_count = all_image_frame.size();
    int n_state = all_frame_count * 3 + 3 + 1;

    MatrixXd A{n_state, n_state};
    A.setZero();
    VectorXd b{n_state};
    b.setZero();

    map<double, ImageFrame>::iterator frame_i;
    map<double, ImageFrame>::iterator frame_j;
    int i = 0;
    for (frame_i = all_image_frame.begin(); next(frame_i) != all_image_frame.end(); frame_i++, i++)
    {
        frame_j = next(frame_i);

        MatrixXd tmp_A(6, 10);
        tmp_A.setZero();
        VectorXd tmp_b(6);
        tmp_b.setZero();

        double dt = frame_j->second.pre_integration->sum_dt;

        tmp_A.block<3, 3>(0, 0) = -dt * Matrix3d::Identity();
        tmp_A.block<3, 3>(0, 6) = frame_i->second.R.transpose() * dt * dt / 2 * Matrix3d::Identity();
        tmp_A.block<3, 1>(0, 9) = frame_i->second.R.transpose() * (frame_j->second.T - frame_i->second.T) / 100.0;
        tmp_b.block<3, 1>(0, 0) = frame_j->second.pre_integration->delta_p + frame_i->second.R.transpose() * frame_j->second.R * TIC[0] - TIC[0];
        //cout << "delta_p " << frame_j->second.pre_integration->delta_p.transpose() << endl;
        tmp_A.block<3, 3>(3, 0) = -Matrix3d::Identity();
        tmp_A.block<3, 3>(3, 3) = frame_i->second.R.transpose() * frame_j->second.R;
        tmp_A.block<3, 3>(3, 6) = frame_i->second.R.transpose() * dt * Matrix3d::Identity();
        tmp_b.block<3, 1>(3, 0) = frame_j->second.pre_integration->delta_v;
        //cout << "delta_v " << frame_j->second.pre_integration->delta_v.transpose() << endl;

        Matrix<double, 6, 6> cov_inv = Matrix<double, 6, 6>::Zero();
        //cov.block<6, 6>(0, 0) = IMU_cov[i + 1];
        //MatrixXd cov_inv = cov.inverse();
        cov_inv.setIdentity();

        MatrixXd r_A = tmp_A.transpose() * cov_inv * tmp_A;
        VectorXd r_b = tmp_A.transpose() * cov_inv * tmp_b;

        A.block<6, 6>(i * 3, i * 3) += r_A.topLeftCorner<6, 6>();
        b.segment<6>(i * 3) += r_b.head<6>();

        A.bottomRightCorner<4, 4>() += r_A.bottomRightCorner<4, 4>();
        b.tail<4>() += r_b.tail<4>();

        A.block<6, 4>(i * 3, n_state - 4) += r_A.topRightCorner<6, 4>();
        A.block<4, 6>(n_state - 4, i * 3) += r_A.bottomLeftCorner<4, 6>();
    }
    A = A * 1000.0;
    b = b * 1000.0;
    x = A.ldlt().solve(b);
    double s = x(n_state - 1) / 100.0;
    ROS_DEBUG("estimated scale: %f", s);
    g = x.segment<3>(n_state - 4);
    ROS_DEBUG_STREAM(" result g " << g.norm() << " " << g.transpose());
    if (fabs(g.norm() - G.norm()) > 1.0 || s < 0)
    {
        return false;
    }

    RefineGravity(all_image_frame, g, x);
    s = (x.tail<1>())(0) / 100.0;
    (x.tail<1>())(0) = s;
    ROS_DEBUG_STREAM(" refine " << g.norm() << " " << g.transpose());
    if (s < 0.0)
        return false;
    else
        return true;
}

bool VisualIMUAlignment(map<double, ImageFrame> &all_image_frame, Vector3d* Bgs, Vector3d &g, VectorXd &x)
{
    solveGyroscopeBias(all_image_frame, Bgs);

    if (LinearAlignment(all_image_frame, g, x))
        return true;
    else
        return false;
}


================================================
FILE: rpvio_estimator/src/initial/initial_alignment.h
================================================
#pragma once

#include <eigen3/Eigen/Dense>
#include <iostream>
#include "../factor/imu_factor.h"
#include "../utility/utility.h"
#include <ros/ros.h>
#include <map>
#include "../feature_manager.h"

using namespace Eigen;
using namespace std;

class ImageFrame
{
  public:
    ImageFrame(){};
    ImageFrame(const map<int, vector<pair<int, Eigen::Matrix<double, 8, 1>>>>& _points, double _t)
        : t{_t}, is_key_frame{false}
    {
        points = _points;
    };
    map<int, vector<pair<int, Eigen::Matrix<double, 8, 1>>>> points;
    double t;
    Matrix3d R;
    Vector3d T;
    IntegrationBase *pre_integration;
    bool is_key_frame;
};

bool VisualIMUAlignment(map<double, ImageFrame> &all_image_frame, Vector3d* Bgs, Vector3d &g, VectorXd &x);


================================================
FILE: rpvio_estimator/src/initial/initial_ex_rotation.cpp
================================================
#include "initial_ex_rotation.h"

InitialEXRotation::InitialEXRotation()
{
    frame_count = 0;
    Rc.push_back(Matrix3d::Identity());
    Rc_g.push_back(Matrix3d::Identity());
    Rimu.push_back(Matrix3d::Identity());
    ric = Matrix3d::Identity();
}

bool InitialEXRotation::CalibrationExRotation(vector<pair<Vector3d, Vector3d>> corres, Quaterniond delta_q_imu, Matrix3d &calib_ric_result)
{
    frame_count++;
    Rc.push_back(solveRelativeR(corres));
    Rimu.push_back(delta_q_imu.toRotationMatrix());
    Rc_g.push_back(ric.inverse() * delta_q_imu * ric);

    Eigen::MatrixXd A(frame_count * 4, 4);
    A.setZero();
    int sum_ok = 0;
    for (int i = 1; i <= frame_count; i++)
    {
        Quaterniond r1(Rc[i]);
        Quaterniond r2(Rc_g[i]);

        double angular_distance = 180 / M_PI * r1.angularDistance(r2);
        ROS_DEBUG("%d %f", i, angular_distance);

        double huber = angular_distance > 5.0 ? 5.0 / angular_distance : 1.0;
        ++sum_ok;
        Matrix4d L, R;

        double w = Quaterniond(Rc[i]).w();
        Vector3d q = Quaterniond(Rc[i]).vec();
        L.block<3, 3>(0, 0) = w * Matrix3d::Identity() + Utility::skewSymmetric(q);
        L.block<3, 1>(0, 3) = q;
        L.block<1, 3>(3, 0) = -q.transpose();
        L(3, 3) = w;

        Quaterniond R_ij(Rimu[i]);
        w = R_ij.w();
        q = R_ij.vec();
        R.block<3, 3>(0, 0) = w * Matrix3d::Identity() - Utility::skewSymmetric(q);
        R.block<3, 1>(0, 3) = q;
        R.block<1, 3>(3, 0) = -q.transpose();
        R(3, 3) = w;

        A.block<4, 4>((i - 1) * 4, 0) = huber * (L - R);
    }

    JacobiSVD<MatrixXd> svd(A, ComputeFullU | ComputeFullV);
    Matrix<double, 4, 1> x = svd.matrixV().col(3);
    Quaterniond estimated_R(x);
    ric = estimated_R.toRotationMatrix().inverse();
    //cout << svd.singularValues().transpose() << endl;
    //cout << ric << endl;
    Vector3d ric_cov;
    ric_cov = svd.singularValues().tail<3>();
    if (frame_count >= WINDOW_SIZE && ric_cov(1) > 0.25)
    {
        calib_ric_result = ric;
        return true;
    }
    else
        return false;
}

Matrix3d InitialEXRotation::solveRelativeR(const vector<pair<Vector3d, Vector3d>> &corres)
{
    if (corres.size() >= 9)
    {
        vector<cv::Point2f> ll, rr;
        for (int i = 0; i < int(corres.size()); i++)
        {
            ll.push_back(cv::Point2f(corres[i].first(0), corres[i].first(1)));
            rr.push_back(cv::Point2f(corres[i].second(0), corres[i].second(1)));
        }
        cv::Mat E = cv::findFundamentalMat(ll, rr);
        cv::Mat_<double> R1, R2, t1, t2;
        decomposeE(E, R1, R2, t1, t2);

        if (determinant(R1) + 1.0 < 1e-09)
        {
            E = -E;
            decomposeE(E, R1, R2, t1, t2);
        }
        double ratio1 = max(testTriangulation(ll, rr, R1, t1), testTriangulation(ll, rr, R1, t2));
        double ratio2 = max(testTriangulation(ll, rr, R2, t1), testTriangulation(ll, rr, R2, t2));
        cv::Mat_<double> ans_R_cv = ratio1 > ratio2 ? R1 : R2;

        Matrix3d ans_R_eigen;
        for (int i = 0; i < 3; i++)
            for (int j = 0; j < 3; j++)
                ans_R_eigen(j, i) = ans_R_cv(i, j);
        return ans_R_eigen;
    }
    return Matrix3d::Identity();
}

double InitialEXRotation::testTriangulation(const vector<cv::Point2f> &l,
                                            const vector<cv::Point2f> &r,
                                            cv::Mat_<double> R, cv::Mat_<double> t)
{
    cv::Mat pointcloud;
    cv::Matx34f P = cv::Matx34f(1, 0, 0, 0,
                                0, 1, 0, 0,
                                0, 0, 1, 0);
    cv::Matx34f P1 = cv::Matx34f(R(0, 0), R(0, 1), R(0, 2), t(0),
                                 R(1, 0), R(1, 1), R(1, 2), t(1),
                                 R(2, 0), R(2, 1), R(2, 2), t(2));
    cv::triangulatePoints(P, P1, l, r, pointcloud);
    int front_count = 0;
    for (int i = 0; i < pointcloud.cols; i++)
    {
        double normal_factor = pointcloud.col(i).at<float>(3);

        cv::Mat_<float> p_3d_l = cv::Mat(P) * (pointcloud.col(i) / normal_factor);
        cv::Mat_<float> p_3d_r = cv::Mat(P1) * (pointcloud.col(i) / normal_factor);
        if (p_3d_l(2) > 0 && p_3d_r(2) > 0)
            front_count++;
    }
    ROS_DEBUG("MotionEstimator: %f", 1.0 * front_count / pointcloud.cols);
    return 1.0 * front_count / pointcloud.cols;
}

void InitialEXRotation::decomposeE(cv::Mat E,
                                   cv::Mat_<double> &R1, cv::Mat_<double> &R2,
                                   cv::Mat_<double> &t1, cv::Mat_<double> &t2)
{
    cv::SVD svd(E, cv::SVD::MODIFY_A);
    cv::Matx33d W(0, -1, 0,
                  1, 0, 0,
                  0, 0, 1);
    cv::Matx33d Wt(0, 1, 0,
                   -1, 0, 0,
                   0, 0, 1);
    R1 = svd.u * cv::Mat(W) * svd.vt;
    R2 = svd.u * cv::Mat(Wt) * svd.vt;
    t1 = svd.u.col(2);
    t2 = -svd.u.col(2);
}


================================================
FILE: rpvio_estimator/src/initial/initial_ex_rotation.h
================================================
#pragma once

#include <vector>
#include "../parameters.h"
using namespace std;

#include <opencv2/opencv.hpp>
#include <eigen3/Eigen/Dense>
using namespace Eigen;
#include <ros/console.h>

/* This class helps you calibrate the extrinsic rotation between the IMU and the camera
   when the extrinsic parameters are completely unknown. */
class InitialEXRotation
{
  public:
    InitialEXRotation();
    bool CalibrationExRotation(vector<pair<Vector3d, Vector3d>> corres, Quaterniond delta_q_imu, Matrix3d &calib_ric_result);

  private:
    Matrix3d solveRelativeR(const vector<pair<Vector3d, Vector3d>> &corres);

    double testTriangulation(const vector<cv::Point2f> &l,
                             const vector<cv::Point2f> &r,
                             cv::Mat_<double> R, cv::Mat_<double> t);
void decomposeE(cv::Mat E, cv::Mat_ &R1, cv::Mat_ &R2, cv::Mat_ &t1, cv::Mat_ &t2); int frame_count; vector< Matrix3d > Rc; vector< Matrix3d > Rimu; vector< Matrix3d > Rc_g; Matrix3d ric; }; ================================================ FILE: rpvio_estimator/src/initial/initial_sfm.cpp ================================================ #include "initial_sfm.h" GlobalSFM::GlobalSFM(){} void GlobalSFM::triangulatePoint(Eigen::Matrix &Pose0, Eigen::Matrix &Pose1, Vector2d &point0, Vector2d &point1, Vector3d &point_3d) { Matrix4d design_matrix = Matrix4d::Zero(); design_matrix.row(0) = point0[0] * Pose0.row(2) - Pose0.row(0); design_matrix.row(1) = point0[1] * Pose0.row(2) - Pose0.row(1); design_matrix.row(2) = point1[0] * Pose1.row(2) - Pose1.row(0); design_matrix.row(3) = point1[1] * Pose1.row(2) - Pose1.row(1); Vector4d triangulated_point; triangulated_point = design_matrix.jacobiSvd(Eigen::ComputeFullV).matrixV().rightCols<1>(); point_3d(0) = triangulated_point(0) / triangulated_point(3); point_3d(1) = triangulated_point(1) / triangulated_point(3); point_3d(2) = triangulated_point(2) / triangulated_point(3); } bool GlobalSFM::solveFrameByPnP(Matrix3d &R_initial, Vector3d &P_initial, int i, vector &sfm_f) { vector pts_2_vector; vector pts_3_vector; for (int j = 0; j < feature_num; j++) { if (sfm_f[j].state != true) continue; Vector2d point2d; for (int k = 0; k < (int)sfm_f[j].observation.size(); k++) { if (sfm_f[j].observation[k].first == i) { Vector2d img_pts = sfm_f[j].observation[k].second; cv::Point2f pts_2(img_pts(0), img_pts(1)); pts_2_vector.push_back(pts_2); cv::Point3f pts_3(sfm_f[j].position[0], sfm_f[j].position[1], sfm_f[j].position[2]); pts_3_vector.push_back(pts_3); break; } } } if (int(pts_2_vector.size()) < 15) { printf("unstable features tracking, please slowly move you device!\n"); if (int(pts_2_vector.size()) < 10) return false; } cv::Mat r, rvec, t, D, tmp_r; cv::eigen2cv(R_initial, tmp_r); cv::Rodrigues(tmp_r, rvec); cv::eigen2cv(P_initial, t); 
    // Features are in normalized camera coordinates, so the intrinsic matrix is identity.
    cv::Mat K = (cv::Mat_<double>(3, 3) << 1, 0, 0, 0, 1, 0, 0, 0, 1);
    bool pnp_succ;
    pnp_succ = cv::solvePnP(pts_3_vector, pts_2_vector, K, D, rvec, t, 1);
    if(!pnp_succ)
    {
        return false;
    }
    cv::Rodrigues(rvec, r);
    //cout << "r " << endl << r << endl;
    MatrixXd R_pnp;
    cv::cv2eigen(r, R_pnp);
    MatrixXd T_pnp;
    cv::cv2eigen(t, T_pnp);
    R_initial = R_pnp;
    P_initial = T_pnp;
    return true;
}

void GlobalSFM::triangulateTwoFrames(int frame0, Eigen::Matrix<double, 3, 4> &Pose0,
                                     int frame1, Eigen::Matrix<double, 3, 4> &Pose1,
                                     vector<SFMFeature> &sfm_f)
{
    assert(frame0 != frame1);
    for (int j = 0; j < feature_num; j++)
    {
        if (sfm_f[j].state == true)
            continue;
        bool has_0 = false, has_1 = false;
        Vector2d point0;
        Vector2d point1;
        for (int k = 0; k < (int)sfm_f[j].observation.size(); k++)
        {
            if (sfm_f[j].observation[k].first == frame0)
            {
                point0 = sfm_f[j].observation[k].second;
                has_0 = true;
            }
            if (sfm_f[j].observation[k].first == frame1)
            {
                point1 = sfm_f[j].observation[k].second;
                has_1 = true;
            }
        }
        if (has_0 && has_1)
        {
            Vector3d point_3d;
            triangulatePoint(Pose0, Pose1, point0, point1, point_3d);
            sfm_f[j].state = true;
            sfm_f[j].position[0] = point_3d(0);
            sfm_f[j].position[1] = point_3d(1);
            sfm_f[j].position[2] = point_3d(2);
            sfm_f[j].image[0] = point1[0];
            sfm_f[j].image[1] = point1[1];
            sfm_f[j].image[2] = 1;
            //cout << "triangulated : " << frame1 << " 3d point : " << j << " " << point_3d.transpose() << endl;
        }
    }
}

// q w_R_cam t w_R_cam
// c_rotation cam_R_w
// c_translation cam_R_w
// relative_q[i][j] j_q_i
// relative_t[i][j] j_t_ji (j < i)
bool GlobalSFM::construct(int frame_num, Quaterniond* q, Vector3d* T, int l,
                          const Matrix3d relative_R, const Vector3d relative_T,
                          vector<SFMFeature> &sfm_f, map<int, Vector3d> &sfm_tracked_points)
{
    feature_num = sfm_f.size();
    //cout << "set 0 and " << l << " as known " << endl;
    // have relative_r relative_t
    // initial two view
    q[l].w() = 1;
    q[l].x() = 0;
    q[l].y() = 0;
    q[l].z() = 0;
    T[l].setZero();
    q[frame_num - 1] = q[l] * Quaterniond(relative_R);
    T[frame_num - 1] = relative_T;
    //cout << "init q_l " << q[l].w() << " " <<
q[l].vec().transpose() << endl;
    //cout << "init t_l " << T[l].transpose() << endl;

    //rotate to cam frame
    Matrix3d c_Rotation[frame_num];
    Vector3d c_Translation[frame_num];
    Quaterniond c_Quat[frame_num];
    double c_rotation[frame_num][4];
    double c_translation[frame_num][3];
    Eigen::Matrix<double, 3, 4> Pose[frame_num];

    c_Quat[l] = q[l].inverse();
    c_Rotation[l] = c_Quat[l].toRotationMatrix();
    c_Translation[l] = -1 * (c_Rotation[l] * T[l]);
    Pose[l].block<3, 3>(0, 0) = c_Rotation[l];
    Pose[l].block<3, 1>(0, 3) = c_Translation[l];

    c_Quat[frame_num - 1] = q[frame_num - 1].inverse();
    c_Rotation[frame_num - 1] = c_Quat[frame_num - 1].toRotationMatrix();
    c_Translation[frame_num - 1] = -1 * (c_Rotation[frame_num - 1] * T[frame_num - 1]);
    Pose[frame_num - 1].block<3, 3>(0, 0) = c_Rotation[frame_num - 1];
    Pose[frame_num - 1].block<3, 1>(0, 3) = c_Translation[frame_num - 1];

    //1: triangulate between l ----- frame_num - 1
    //2: solve pnp l + 1; triangulate l + 1 ------- frame_num - 1;
    for (int i = l; i < frame_num - 1 ; i++)
    {
        // solve pnp
        if (i > l)
        {
            Matrix3d R_initial = c_Rotation[i - 1];
            Vector3d P_initial = c_Translation[i - 1];
            if(!solveFrameByPnP(R_initial, P_initial, i, sfm_f))
                return false;
            c_Rotation[i] = R_initial;
            c_Translation[i] = P_initial;
            c_Quat[i] = c_Rotation[i];
            Pose[i].block<3, 3>(0, 0) = c_Rotation[i];
            Pose[i].block<3, 1>(0, 3) = c_Translation[i];
        }
        // triangulate point based on the solved pnp result
        triangulateTwoFrames(i, Pose[i], frame_num - 1, Pose[frame_num - 1], sfm_f);
    }
    //3: triangulate l ----- l+1 l+2 ...
frame_num - 2
    for (int i = l + 1; i < frame_num - 1; i++)
        triangulateTwoFrames(l, Pose[l], i, Pose[i], sfm_f);
    //4: solve pnp l-1; triangulate l-1 ----- l
    //             l-2              l-2 ----- l
    for (int i = l - 1; i >= 0; i--)
    {
        //solve pnp
        Matrix3d R_initial = c_Rotation[i + 1];
        Vector3d P_initial = c_Translation[i + 1];
        if(!solveFrameByPnP(R_initial, P_initial, i, sfm_f))
            return false;
        c_Rotation[i] = R_initial;
        c_Translation[i] = P_initial;
        c_Quat[i] = c_Rotation[i];
        Pose[i].block<3, 3>(0, 0) = c_Rotation[i];
        Pose[i].block<3, 1>(0, 3) = c_Translation[i];
        //triangulate
        triangulateTwoFrames(i, Pose[i], l, Pose[l], sfm_f);
    }
    //5: triangulate all other points
    for (int j = 0; j < feature_num; j++)
    {
        if (sfm_f[j].state == true)
            continue;
        if ((int)sfm_f[j].observation.size() >= 2)
        {
            Vector2d point0, point1;
            int frame_0 = sfm_f[j].observation[0].first;
            point0 = sfm_f[j].observation[0].second;
            int frame_1 = sfm_f[j].observation.back().first;
            point1 = sfm_f[j].observation.back().second;
            Vector3d point_3d;
            triangulatePoint(Pose[frame_0], Pose[frame_1], point0, point1, point_3d);
            sfm_f[j].state = true;
            sfm_f[j].position[0] = point_3d(0);
            sfm_f[j].position[1] = point_3d(1);
            sfm_f[j].position[2] = point_3d(2);
            sfm_f[j].image[0] = point1[0];
            sfm_f[j].image[1] = point1[1];
            sfm_f[j].image[2] = 1;
            //cout << "triangulated : " << frame_0 << " " << frame_1 << " 3d point : " << j << " " << point_3d.transpose() << endl;
        }
    }
    /*
    for (int i = 0; i < frame_num; i++)
    {
        q[i] = c_Rotation[i].transpose();
        cout << "solvePnP q" << " i " << i << " " << q[i].w() << " " << q[i].vec().transpose() << endl;
    }
    */
    // ... (remainder of construct, including the full bundle adjustment, missing from this copy)
}

bool GlobalSFM::constructH(int frame_num, Quaterniond* q, Vector3d* T, int l,
                           const Matrix3d relative_R, const Vector3d relative_T, Vector3d &n,
                           vector<SFMFeature> &sfm_f, map<int, Vector3d> &sfm_tracked_points)
{
    feature_num = sfm_f.size();
    //cout << "set 0 and " << l << " as known " << endl;
    // have relative_r relative_t
    // initial two view
    q[l].w() = 1;
    q[l].x() = 0;
    q[l].y() = 0;
    q[l].z() = 0;
    T[l].setZero();
    q[frame_num - 1] = q[l] * Quaterniond(relative_R);
    T[frame_num - 1] = relative_T;
    //cout << "init q_l " << q[l].w() << " " << q[l].vec().transpose() << endl;
    //cout << "init t_l " << T[l].transpose() << endl;

    //rotate to cam frame
    Matrix3d c_Rotation[frame_num];
    Vector3d c_Translation[frame_num];
    Quaterniond c_Quat[frame_num];
    double c_rotation[frame_num][4];
    double c_translation[frame_num][3];
    double c_normal[3];
    Eigen::Matrix<double, 3, 4> Pose[frame_num];

    c_Quat[l] = q[l].inverse();
    c_Rotation[l] = c_Quat[l].toRotationMatrix();
    c_Translation[l] = -1 * (c_Rotation[l] * T[l]);
    Pose[l].block<3, 3>(0, 0) = c_Rotation[l];
    Pose[l].block<3, 1>(0, 3) = c_Translation[l];

    c_Quat[frame_num - 1] = q[frame_num - 1].inverse();
    c_Rotation[frame_num - 1] = c_Quat[frame_num - 1].toRotationMatrix();
    c_Translation[frame_num - 1] = -1 * (c_Rotation[frame_num - 1] * T[frame_num - 1]);
    Pose[frame_num - 1].block<3, 3>(0, 0) = c_Rotation[frame_num - 1];
    Pose[frame_num - 1].block<3, 1>(0, 3) = c_Translation[frame_num - 1];

    //1: triangulate between l ----- frame_num - 1
    //2: solve pnp l + 1; triangulate l + 1 ------- frame_num - 1;
    for (int i = l; i < frame_num - 1 ; i++)
    {
        // solve pnp
        if (i > l)
        {
            Matrix3d R_initial = c_Rotation[i - 1];
            Vector3d P_initial = c_Translation[i - 1];
            if(!solveFrameByPnP(R_initial, P_initial, i, sfm_f))
                return false;
            c_Rotation[i] = R_initial;
            c_Translation[i] = P_initial;
            c_Quat[i] = c_Rotation[i];
            Pose[i].block<3, 3>(0, 0) = c_Rotation[i];
            Pose[i].block<3, 1>(0, 3) = c_Translation[i];
        }
        // triangulate point based on the solved pnp result
        triangulateTwoFrames(i, Pose[i], frame_num - 1, Pose[frame_num - 1], sfm_f);
    }
    //3: triangulate l ----- l+1 l+2 ... frame_num - 2
    for (int i = l + 1; i < frame_num - 1; i++)
        triangulateTwoFrames(l, Pose[l], i, Pose[i], sfm_f);
    //4: solve pnp l-1; triangulate l-1 ----- l
    //             l-2              l-2 ----- l
    for (int i = l - 1; i >= 0; i--)
    {
        //solve pnp
        Matrix3d R_initial = c_Rotation[i + 1];
        Vector3d P_initial = c_Translation[i + 1];
        if(!solveFrameByPnP(R_initial, P_initial, i, sfm_f))
            return false;
        c_Rotation[i] = R_initial;
        c_Translation[i] = P_initial;
        c_Quat[i] = c_Rotation[i];
        Pose[i].block<3, 3>(0, 0) = c_Rotation[i];
        Pose[i].block<3, 1>(0, 3) = c_Translation[i];
        //triangulate
        triangulateTwoFrames(i, Pose[i], l, Pose[l], sfm_f);
    }
    //5: triangulate all other points
    for (int j = 0; j < feature_num; j++)
    {
        if (sfm_f[j].state == true)
            continue;
        if ((int)sfm_f[j].observation.size() >= 2)
        {
            Vector2d point0, point1;
            int frame_0 = sfm_f[j].observation[0].first;
            point0 = sfm_f[j].observation[0].second;
            int frame_1 = sfm_f[j].observation.back().first;
            point1 = sfm_f[j].observation.back().second;
            Vector3d point_3d;
            triangulatePoint(Pose[frame_0], Pose[frame_1], point0, point1, point_3d);
            sfm_f[j].state = true;
            sfm_f[j].position[0] = point_3d(0);
            sfm_f[j].position[1] = point_3d(1);
            sfm_f[j].position[2] = point_3d(2);
            sfm_f[j].image[0] = point1[0];
            sfm_f[j].image[1] = point1[1];
            sfm_f[j].image[2] = 1;
            //cout << "triangulated : " << frame_0 << " " << frame_1 << " 3d point : " << j << " " << point_3d.transpose() << endl;
        }
    }
    /*
    for (int i = 0; i < frame_num; i++)
    {
        q[i] = c_Rotation[i].transpose();
        cout << "solvePnP q" << " i " << i << " " << q[i].w() << " " << q[i].vec().transpose() << endl;
    }
    */
    // ... (remainder of constructH, including the full bundle adjustment, missing from this copy)
}

================================================ FILE: rpvio_estimator/src/initial/initial_sfm.h ================================================

#pragma once

#include <ceres/ceres.h>
#include <ceres/rotation.h>
#include <eigen3/Eigen/Dense>
#include <iostream>
#include <cstdlib>
#include <deque>
#include <map>
#include <opencv2/core/eigen.hpp>
#include <opencv2/opencv.hpp>
using namespace Eigen;
using namespace std;

struct SFMFeature
{
    bool state;
    int id;
    int plane_id;
    vector<pair<int, Vector2d>> observation;
    double position[3];
    double image[3]; // u v 1
    double depth;
};

struct ReprojectionError3D
{
    ReprojectionError3D(double observed_u, double observed_v)
        : observed_u(observed_u), observed_v(observed_v)
    {}

    template <typename T>
    bool operator()(const T* const camera_R, const T* const camera_T, const T*
point, T* residuals) const
    {
        T p[3];
        ceres::QuaternionRotatePoint(camera_R, point, p);
        p[0] += camera_T[0];
        p[1] += camera_T[1];
        p[2] += camera_T[2];
        T xp = p[0] / p[2];
        T yp = p[1] / p[2];
        residuals[0] = xp - T(observed_u);
        residuals[1] = yp - T(observed_v);
        return true;
    }

    static ceres::CostFunction* Create(const double observed_x, const double observed_y)
    {
        return (new ceres::AutoDiffCostFunction<ReprojectionError3D, 2, 4, 3, 3>(
            new ReprojectionError3D(observed_x, observed_y)));
    }

    double observed_u;
    double observed_v;
};

struct ReprojectionErrorH
{
    ReprojectionErrorH(double observed_u, double observed_v)
        : observed_u(observed_u), observed_v(observed_v)
    {}

    template <typename T>
    bool operator()(const T* const R, const T* const t, const T* const n, const T* point, T* residuals) const
    {
        // Homography-based reprojection: warp the normalized point with R + t * n^T.
        T rp[3];
        T norm_point[3] = {point[0]/point[2], point[1]/point[2], point[2]/point[2]};
        ceres::QuaternionRotatePoint(R, norm_point, rp);
        T np = n[0] * norm_point[0] + n[1] * norm_point[1] + n[2] * norm_point[2];
        rp[0] += t[0]*np;
        rp[1] += t[1]*np;
        rp[2] += t[2]*np;
        T xp = rp[0] / rp[2];
        T yp = rp[1] / rp[2];
        residuals[0] = xp - T(observed_u);
        residuals[1] = yp - T(observed_v);
        return true;
    }

    static ceres::CostFunction* Create(const double observed_x, const double observed_y)
    {
        return (new ceres::AutoDiffCostFunction<ReprojectionErrorH, 2, 4, 3, 3, 3>(
            new ReprojectionErrorH(observed_x, observed_y)));
    }

    double observed_u;
    double observed_v;
};

class GlobalSFM
{
public:
    GlobalSFM();
    bool construct(int frame_num, Quaterniond* q, Vector3d* T, int l,
                   const Matrix3d relative_R, const Vector3d relative_T,
                   vector<SFMFeature> &sfm_f, map<int, Vector3d> &sfm_tracked_points);
    bool constructH(int frame_num, Quaterniond* q, Vector3d* T, int l,
                    const Matrix3d relative_R, const Vector3d relative_T, Vector3d &n,
                    vector<SFMFeature> &sfm_f, map<int, Vector3d> &sfm_tracked_points);

private:
    bool solveFrameByPnP(Matrix3d &R_initial, Vector3d &P_initial, int i, vector<SFMFeature> &sfm_f);

    void triangulatePoint(Eigen::Matrix<double, 3, 4> &Pose0, Eigen::Matrix<double, 3, 4> &Pose1,
                          Vector2d &point0, Vector2d
&point1, Vector3d &point_3d); void triangulateTwoFrames(int frame0, Eigen::Matrix &Pose0, int frame1, Eigen::Matrix &Pose1, vector &sfm_f); int feature_num; }; ================================================ FILE: rpvio_estimator/src/initial/solve_5pts.cpp ================================================ #include "solve_5pts.h" #include namespace cv { void decomposeEssentialMat( InputArray _E, OutputArray _R1, OutputArray _R2, OutputArray _t ) { Mat E = _E.getMat().reshape(1, 3); CV_Assert(E.cols == 3 && E.rows == 3); Mat D, U, Vt; SVD::compute(E, D, U, Vt); if (determinant(U) < 0) U *= -1.; if (determinant(Vt) < 0) Vt *= -1.; Mat W = (Mat_(3, 3) << 0, 1, 0, -1, 0, 0, 0, 0, 1); W.convertTo(W, E.type()); Mat R1, R2, t; R1 = U * W * Vt; R2 = U * W.t() * Vt; t = U.col(2) * 1.0; R1.copyTo(_R1); R2.copyTo(_R2); t.copyTo(_t); } int recoverPose( InputArray E, InputArray _points1, InputArray _points2, InputArray _cameraMatrix, OutputArray _R, OutputArray _t, InputOutputArray _mask) { Mat points1, points2, cameraMatrix; _points1.getMat().convertTo(points1, CV_64F); _points2.getMat().convertTo(points2, CV_64F); _cameraMatrix.getMat().convertTo(cameraMatrix, CV_64F); int npoints = points1.checkVector(2); CV_Assert( npoints >= 0 && points2.checkVector(2) == npoints && points1.type() == points2.type()); CV_Assert(cameraMatrix.rows == 3 && cameraMatrix.cols == 3 && cameraMatrix.channels() == 1); if (points1.channels() > 1) { points1 = points1.reshape(1, npoints); points2 = points2.reshape(1, npoints); } double fx = cameraMatrix.at(0,0); double fy = cameraMatrix.at(1,1); double cx = cameraMatrix.at(0,2); double cy = cameraMatrix.at(1,2); points1.col(0) = (points1.col(0) - cx) / fx; points2.col(0) = (points2.col(0) - cx) / fx; points1.col(1) = (points1.col(1) - cy) / fy; points2.col(1) = (points2.col(1) - cy) / fy; points1 = points1.t(); points2 = points2.t(); Mat R1, R2, t; decomposeEssentialMat(E, R1, R2, t); Mat P0 = Mat::eye(3, 4, R1.type()); Mat P1(3, 4, R1.type()), P2(3, 4, 
R1.type()), P3(3, 4, R1.type()), P4(3, 4, R1.type()); P1(Range::all(), Range(0, 3)) = R1 * 1.0; P1.col(3) = t * 1.0; P2(Range::all(), Range(0, 3)) = R2 * 1.0; P2.col(3) = t * 1.0; P3(Range::all(), Range(0, 3)) = R1 * 1.0; P3.col(3) = -t * 1.0; P4(Range::all(), Range(0, 3)) = R2 * 1.0; P4.col(3) = -t * 1.0; // Do the cheirality check. // Notice here a threshold dist is used to filter // out far away points (i.e. infinite points) since // there depth may vary between postive and negtive. double dist = 50.0; Mat Q; triangulatePoints(P0, P1, points1, points2, Q); Mat mask1 = Q.row(2).mul(Q.row(3)) > 0; Q.row(0) /= Q.row(3); Q.row(1) /= Q.row(3); Q.row(2) /= Q.row(3); Q.row(3) /= Q.row(3); mask1 = (Q.row(2) < dist) & mask1; Q = P1 * Q; mask1 = (Q.row(2) > 0) & mask1; mask1 = (Q.row(2) < dist) & mask1; triangulatePoints(P0, P2, points1, points2, Q); Mat mask2 = Q.row(2).mul(Q.row(3)) > 0; Q.row(0) /= Q.row(3); Q.row(1) /= Q.row(3); Q.row(2) /= Q.row(3); Q.row(3) /= Q.row(3); mask2 = (Q.row(2) < dist) & mask2; Q = P2 * Q; mask2 = (Q.row(2) > 0) & mask2; mask2 = (Q.row(2) < dist) & mask2; triangulatePoints(P0, P3, points1, points2, Q); Mat mask3 = Q.row(2).mul(Q.row(3)) > 0; Q.row(0) /= Q.row(3); Q.row(1) /= Q.row(3); Q.row(2) /= Q.row(3); Q.row(3) /= Q.row(3); mask3 = (Q.row(2) < dist) & mask3; Q = P3 * Q; mask3 = (Q.row(2) > 0) & mask3; mask3 = (Q.row(2) < dist) & mask3; triangulatePoints(P0, P4, points1, points2, Q); Mat mask4 = Q.row(2).mul(Q.row(3)) > 0; Q.row(0) /= Q.row(3); Q.row(1) /= Q.row(3); Q.row(2) /= Q.row(3); Q.row(3) /= Q.row(3); mask4 = (Q.row(2) < dist) & mask4; Q = P4 * Q; mask4 = (Q.row(2) > 0) & mask4; mask4 = (Q.row(2) < dist) & mask4; mask1 = mask1.t(); mask2 = mask2.t(); mask3 = mask3.t(); mask4 = mask4.t(); // If _mask is given, then use it to filter outliers. 
if (!_mask.empty()) { Mat mask = _mask.getMat(); CV_Assert(mask.size() == mask1.size()); bitwise_and(mask, mask1, mask1); bitwise_and(mask, mask2, mask2); bitwise_and(mask, mask3, mask3); bitwise_and(mask, mask4, mask4); } if (_mask.empty() && _mask.needed()) { _mask.create(mask1.size(), CV_8U); } CV_Assert(_R.needed() && _t.needed()); _R.create(3, 3, R1.type()); _t.create(3, 1, t.type()); int good1 = countNonZero(mask1); int good2 = countNonZero(mask2); int good3 = countNonZero(mask3); int good4 = countNonZero(mask4); if (good1 >= good2 && good1 >= good3 && good1 >= good4) { R1.copyTo(_R); t.copyTo(_t); if (_mask.needed()) mask1.copyTo(_mask); return good1; } else if (good2 >= good1 && good2 >= good3 && good2 >= good4) { R2.copyTo(_R); t.copyTo(_t); if (_mask.needed()) mask2.copyTo(_mask); return good2; } else if (good3 >= good1 && good3 >= good2 && good3 >= good4) { t = -t; R1.copyTo(_R); t.copyTo(_t); if (_mask.needed()) mask3.copyTo(_mask); return good3; } else { t = -t; R2.copyTo(_R); t.copyTo(_t); if (_mask.needed()) mask4.copyTo(_mask); return good4; } } int recoverPose( InputArray E, InputArray _points1, InputArray _points2, OutputArray _R, OutputArray _t, double focal, Point2d pp, InputOutputArray _mask) { Mat cameraMatrix = (Mat_(3,3) << focal, 0, pp.x, 0, focal, pp.y, 0, 0, 1); return cv::recoverPose(E, _points1, _points2, cameraMatrix, _R, _t, _mask); } } bool MotionEstimator::solveRelativeRT(const vector> &corres, Matrix3d &Rotation, Vector3d &Translation) { if (corres.size() >= 15) { vector ll, rr; for (int i = 0; i < int(corres.size()); i++) { ll.push_back(cv::Point2f(corres[i].first(0), corres[i].first(1))); rr.push_back(cv::Point2f(corres[i].second(0), corres[i].second(1))); } cv::Mat mask; cv::Mat E = cv::findFundamentalMat(ll, rr, cv::FM_RANSAC, 0.3 / 460, 0.99, mask); cv::Mat cameraMatrix = (cv::Mat_(3, 3) << 1, 0, 0, 0, 1, 0, 0, 0, 1); cv::Mat rot, trans; int inlier_cnt = cv::recoverPose(E, ll, rr, cameraMatrix, rot, trans, mask); //cout << 
"inlier_cnt " << inlier_cnt << endl;
        Eigen::Matrix3d R;
        Eigen::Vector3d T;
        for (int i = 0; i < 3; i++)
        {
            T(i) = trans.at<double>(i, 0);
            for (int j = 0; j < 3; j++)
                R(i, j) = rot.at<double>(i, j);
        }

        Rotation = R.transpose();
        Translation = -R.transpose() * T;
        if(inlier_cnt > 12)
            return true;
        else
            return false;
    }
    return false;
}

bool MotionEstimator::solveRelativeHRT(const vector<pair<Vector3d, Vector3d>> &corres, const Matrix3d &R_imu,
                                       const Matrix4d &TrIC, Matrix3d &Rotation, Vector3d &Translation, Vector3d &n)
{
    if (corres.size() >= 15)
    {
        vector<cv::Point2f> ll, rr;
        for (int i = 0; i < int(corres.size()); i++)
        {
            ll.push_back(cv::Point2f(corres[i].first(0), corres[i].first(1)));
            rr.push_back(cv::Point2f(corres[i].second(0), corres[i].second(1)));
        }
        cv::Mat mask;
        //cv::Mat E = cv::findFundamentalMat(ll, rr, cv::FM_RANSAC, 0.3 / 460, 0.99, mask);
        cv::Mat H = cv::findHomography(ll, rr, cv::RANSAC, 0.3/460, mask, 2000, 0.99);
        cv::Mat K = (cv::Mat_<double>(3, 3) << 1, 0, 0, 0, 1, 0, 0, 0, 1);

        // Compute mean point vector over the RANSAC inliers in the first view.
        Eigen::Vector3d mean_l(0, 0, 1);
        int inlier_count = 0;
        for(int i = 0; i < (int)ll.size(); i++)
        {
            if(mask.at<uchar>(i,0) != 0)
            {
                mean_l(0) += ll[i].x;
                mean_l(1) += ll[i].y;
                inlier_count++;
            }
        }
        mean_l(0) /= int(inlier_count);
        mean_l(1) /= int(inlier_count);

        Eigen::Matrix4d est_Tr;
        Eigen::Vector3d est_n(0,0,0);
        decomposeH(H, K, R_imu, TrIC, mean_l, est_Tr, est_n);
        Rotation = est_Tr.block(0,0,3,3);
        Translation = est_Tr.block(0,3,3,1);
        n = est_n;

        if(n.isZero())
            return false;

        return true;
    }
    return false;
}

void MotionEstimator::decomposeH(const cv::Mat &H, const cv::Mat &K, const Matrix3d &R_imu, const Matrix4d &TrIC,
                                 const Vector3d &mean_l, Matrix4d &est_Tr, Vector3d &est_n)
{
    vector<cv::Mat> cv_Rs, cv_ts, cv_ns;
    int n_sols = cv::decomposeHomographyMat(H, K, cv_Rs, cv_ts, cv_ns);

    vector<Matrix4d> positive_depth_transforms;
    vector<Vector3d> positive_depth_normals;

    if(n_sols > 1)
    {
        for(int i = 0; i < n_sols; i++)
        {
            Matrix4d Tr = Matrix4d::Identity();
            Matrix3d R;
            cv::cv2eigen(cv_Rs[i], R);
            Tr.block(0,0,3,3) = R;
            Vector3d t;
            cv::cv2eigen(cv_ts[i], t);
            Tr.block(0,3,3,1) = t;
            //Tr = TrIC * Tr * TrIC.inverse();
            Tr = Tr.inverse().eval();
            Vector3d n;
            cv::cv2eigen(cv_ns[i], n);
            n.normalize();
            // keep only candidates whose plane lies in front of the camera
            if(n.dot(mean_l) > 0)
            {
                positive_depth_transforms.push_back(Tr);
                positive_depth_normals.push_back(n);
            }
        }
        if(positive_depth_transforms.size() > 0)
        {
            // choose the candidate whose rotation best matches the IMU rotation prior
            vector<double> rot_diff;
            for(size_t i = 0; i < positive_depth_transforms.size(); i++)
            {
                Eigen::Matrix4d Tr = TrIC * positive_depth_transforms[i] * TrIC.inverse();
                Eigen::Matrix3d R = Tr.block(0,0,3,3);
                double f = (R.transpose()*R_imu - MatrixXd::Identity(3,3)).norm();
                rot_diff.push_back(f);
            }
            int min_index = std::min_element(rot_diff.begin(), rot_diff.end()) - rot_diff.begin();
            est_Tr = positive_depth_transforms[min_index];
            est_n = positive_depth_normals[min_index];
        }
    }
    else
    {
        Matrix4d Tr = Matrix4d::Identity();
        Matrix3d R;
        cv::cv2eigen(cv_Rs[0], R);
        Tr.block(0,0,3,3) = R;
        Vector3d t;
        cv::cv2eigen(cv_ts[0], t);
        Tr.block(0,3,3,1) = t;
        Vector3d n;
        cv::cv2eigen(cv_ns[0], n);
        //Tr = TrIC * Tr * TrIC.inverse();
        est_Tr = Tr.inverse().eval();
        est_n = n.normalized();
    }
}

================================================ FILE: rpvio_estimator/src/initial/solve_5pts.h ================================================

#pragma once

#include <vector>
using namespace std;

#include <opencv2/opencv.hpp>
#include <opencv2/core/eigen.hpp>
#include <eigen3/Eigen/Dense>
using namespace Eigen;

#include <ros/console.h>

class MotionEstimator
{
public:
    bool solveRelativeRT(const vector<pair<Vector3d, Vector3d>> &corres, Matrix3d &R, Vector3d &T);
    bool solveRelativeHRT(const vector<pair<Vector3d, Vector3d>> &corres, const Matrix3d &R_imu,
                          const Matrix4d &TrIC, Matrix3d &R, Vector3d &T, Vector3d &n);
    void decomposeH(const cv::Mat &H, const cv::Mat &K, const Matrix3d &R_imu, const Matrix4d &TrIC,
                    const Vector3d &mean_l, Matrix4d &est_Tr, Vector3d &est_n);

private:
    double testTriangulation(const vector<cv::Point2f> &l, const vector<cv::Point2f> &r,
                             cv::Mat_<double> R, cv::Mat_<double> t);
    void decomposeE(cv::Mat E, cv::Mat_<double> &R1, cv::Mat_<double> &R2,
                    cv::Mat_<double> &t1, cv::Mat_<double> &t2);
};

================================================ FILE: rpvio_estimator/src/parameters.cpp
================================================

#include "parameters.h"

double INIT_DEPTH;
double MIN_PARALLAX;
double ACC_N, ACC_W;
double GYR_N, GYR_W;

std::vector<Eigen::Matrix3d> RIC;
std::vector<Eigen::Vector3d> TIC;

Eigen::Vector3d G{0.0, 0.0, 9.8};

double BIAS_ACC_THRESHOLD;
double BIAS_GYR_THRESHOLD;
double SOLVER_TIME;
int NUM_ITERATIONS;
int ESTIMATE_EXTRINSIC;
int ESTIMATE_TD;
int ROLLING_SHUTTER;
std::string EX_CALIB_RESULT_PATH;
std::string RPVIO_RESULT_PATH;
std::string IMU_TOPIC;
double ROW, COL;
double TD, TR;

template <typename T>
T readParam(ros::NodeHandle &n, std::string name)
{
    T ans;
    if (n.getParam(name, ans))
    {
        ROS_INFO_STREAM("Loaded " << name << ": " << ans);
    }
    else
    {
        ROS_ERROR_STREAM("Failed to load " << name);
        n.shutdown();
    }
    return ans;
}

void readParameters(ros::NodeHandle &n)
{
    std::string config_file;
    config_file = readParam<std::string>(n, "config_file");
    cv::FileStorage fsSettings(config_file, cv::FileStorage::READ);
    if(!fsSettings.isOpened())
    {
        std::cerr << "ERROR: Wrong path to settings" << std::endl;
    }

    fsSettings["imu_topic"] >> IMU_TOPIC;

    SOLVER_TIME = fsSettings["max_solver_time"];
    NUM_ITERATIONS = fsSettings["max_num_iterations"];
    MIN_PARALLAX = fsSettings["keyframe_parallax"];
    MIN_PARALLAX = MIN_PARALLAX / FOCAL_LENGTH;

    std::string OUTPUT_PATH;
    fsSettings["output_path"] >> OUTPUT_PATH;
    RPVIO_RESULT_PATH = OUTPUT_PATH + "/rpvio_result_no_loop.csv";
    std::cout << "result path " << RPVIO_RESULT_PATH << std::endl;

    // create folder if not exists
    FileSystemHelper::createDirectoryIfNotExists(OUTPUT_PATH.c_str());

    std::ofstream fout(RPVIO_RESULT_PATH, std::ios::out);
    fout.close();

    ACC_N = fsSettings["acc_n"];
    ACC_W = fsSettings["acc_w"];
    GYR_N = fsSettings["gyr_n"];
    GYR_W = fsSettings["gyr_w"];
    G.z() = fsSettings["g_norm"];
    ROW = fsSettings["image_height"];
    COL = fsSettings["image_width"];
    ROS_INFO("ROW: %f COL: %f ", ROW, COL);

    ESTIMATE_EXTRINSIC = fsSettings["estimate_extrinsic"];
    if (ESTIMATE_EXTRINSIC == 2)
    {
        ROS_WARN("have no prior about extrinsic param, calibrate extrinsic param");
        RIC.push_back(Eigen::Matrix3d::Identity());
        TIC.push_back(Eigen::Vector3d::Zero());
        EX_CALIB_RESULT_PATH = OUTPUT_PATH + "/extrinsic_parameter.csv";
    }
    else
    {
        if ( ESTIMATE_EXTRINSIC == 1)
        {
            ROS_WARN(" Optimize extrinsic param around initial guess!");
            EX_CALIB_RESULT_PATH = OUTPUT_PATH + "/extrinsic_parameter.csv";
        }
        if (ESTIMATE_EXTRINSIC == 0)
            ROS_WARN(" fix extrinsic param ");

        cv::Mat cv_R, cv_T;
        fsSettings["extrinsicRotation"] >> cv_R;
        fsSettings["extrinsicTranslation"] >> cv_T;
        Eigen::Matrix3d eigen_R;
        Eigen::Vector3d eigen_T;
        cv::cv2eigen(cv_R, eigen_R);
        cv::cv2eigen(cv_T, eigen_T);
        Eigen::Quaterniond Q(eigen_R);
        eigen_R = Q.normalized();
        RIC.push_back(eigen_R);
        TIC.push_back(eigen_T);
        ROS_INFO_STREAM("Extrinsic_R : " << std::endl << RIC[0]);
        ROS_INFO_STREAM("Extrinsic_T : " << std::endl << TIC[0].transpose());
    }

    INIT_DEPTH = 5.0;
    BIAS_ACC_THRESHOLD = 0.1;
    BIAS_GYR_THRESHOLD = 0.1;

    TD = fsSettings["td"];
    ESTIMATE_TD = fsSettings["estimate_td"];
    if (ESTIMATE_TD)
        ROS_INFO_STREAM("Unsynchronized sensors, online estimate time offset, initial td: " << TD);
    else
        ROS_INFO_STREAM("Synchronized sensors, fix time offset: " << TD);

    ROLLING_SHUTTER = fsSettings["rolling_shutter"];
    if (ROLLING_SHUTTER)
    {
        TR = fsSettings["rolling_shutter_tr"];
        ROS_INFO_STREAM("rolling shutter camera, read out time per line: " << TR);
    }
    else
    {
        TR = 0;
    }

    fsSettings.release();
}

================================================ FILE: rpvio_estimator/src/parameters.h ================================================

#pragma once

#include <ros/ros.h>
#include <vector>
#include <eigen3/Eigen/Dense>
#include "utility/utility.h"
#include <opencv2/opencv.hpp>
#include <opencv2/core/eigen.hpp>
#include <fstream>

const double FOCAL_LENGTH = 460.0;
const int WINDOW_SIZE = 10;
const int NUM_OF_CAM = 1;
const int NUM_OF_F = 1000;
//#define UNIT_SPHERE_ERROR

extern double INIT_DEPTH;
extern double MIN_PARALLAX;
extern int ESTIMATE_EXTRINSIC;

extern double ACC_N, ACC_W;
extern double GYR_N, GYR_W;

extern std::vector<Eigen::Matrix3d> RIC;
extern std::vector<Eigen::Vector3d> TIC;
extern Eigen::Vector3d G;

extern double BIAS_ACC_THRESHOLD;
extern double BIAS_GYR_THRESHOLD; extern double SOLVER_TIME; extern int NUM_ITERATIONS; extern std::string EX_CALIB_RESULT_PATH; extern std::string RPVIO_RESULT_PATH; extern std::string IMU_TOPIC; extern double TD; extern double TR; extern int ESTIMATE_TD; extern int ROLLING_SHUTTER; extern double ROW, COL; void readParameters(ros::NodeHandle &n); enum SIZE_PARAMETERIZATION { SIZE_POSE = 7, SIZE_SPEEDBIAS = 9, SIZE_FEATURE = 1 }; enum StateOrder { O_P = 0, O_R = 3, O_V = 6, O_BA = 9, O_BG = 12 }; enum NoiseOrder { O_AN = 0, O_GN = 3, O_AW = 6, O_GW = 9 }; ================================================ FILE: rpvio_estimator/src/utility/CameraPoseVisualization.cpp ================================================ #include "CameraPoseVisualization.h" const Eigen::Vector3d CameraPoseVisualization::imlt = Eigen::Vector3d(-1.0, -0.5, 1.0); const Eigen::Vector3d CameraPoseVisualization::imrt = Eigen::Vector3d( 1.0, -0.5, 1.0); const Eigen::Vector3d CameraPoseVisualization::imlb = Eigen::Vector3d(-1.0, 0.5, 1.0); const Eigen::Vector3d CameraPoseVisualization::imrb = Eigen::Vector3d( 1.0, 0.5, 1.0); const Eigen::Vector3d CameraPoseVisualization::lt0 = Eigen::Vector3d(-0.7, -0.5, 1.0); const Eigen::Vector3d CameraPoseVisualization::lt1 = Eigen::Vector3d(-0.7, -0.2, 1.0); const Eigen::Vector3d CameraPoseVisualization::lt2 = Eigen::Vector3d(-1.0, -0.2, 1.0); const Eigen::Vector3d CameraPoseVisualization::oc = Eigen::Vector3d(0.0, 0.0, 0.0); void Eigen2Point(const Eigen::Vector3d& v, geometry_msgs::Point& p) { p.x = v.x(); p.y = v.y(); p.z = v.z(); } CameraPoseVisualization::CameraPoseVisualization(float r, float g, float b, float a) : m_marker_ns("CameraPoseVisualization"), m_scale(0.2), m_line_width(0.01) { m_image_boundary_color.r = r; m_image_boundary_color.g = g; m_image_boundary_color.b = b; m_image_boundary_color.a = a; m_optical_center_connector_color.r = r; m_optical_center_connector_color.g = g; m_optical_center_connector_color.b = b; 
m_optical_center_connector_color.a = a; } void CameraPoseVisualization::setImageBoundaryColor(float r, float g, float b, float a) { m_image_boundary_color.r = r; m_image_boundary_color.g = g; m_image_boundary_color.b = b; m_image_boundary_color.a = a; } void CameraPoseVisualization::setOpticalCenterConnectorColor(float r, float g, float b, float a) { m_optical_center_connector_color.r = r; m_optical_center_connector_color.g = g; m_optical_center_connector_color.b = b; m_optical_center_connector_color.a = a; } void CameraPoseVisualization::setScale(double s) { m_scale = s; } void CameraPoseVisualization::setLineWidth(double width) { m_line_width = width; } void CameraPoseVisualization::add_edge(const Eigen::Vector3d& p0, const Eigen::Vector3d& p1){ visualization_msgs::Marker marker; marker.ns = m_marker_ns; marker.id = m_markers.size() + 1; marker.type = visualization_msgs::Marker::LINE_LIST; marker.action = visualization_msgs::Marker::ADD; marker.scale.x = 0.005; marker.color.g = 1.0f; marker.color.a = 1.0; geometry_msgs::Point point0, point1; Eigen2Point(p0, point0); Eigen2Point(p1, point1); marker.points.push_back(point0); marker.points.push_back(point1); m_markers.push_back(marker); } void CameraPoseVisualization::add_loopedge(const Eigen::Vector3d& p0, const Eigen::Vector3d& p1){ visualization_msgs::Marker marker; marker.ns = m_marker_ns; marker.id = m_markers.size() + 1; marker.type = visualization_msgs::Marker::LINE_LIST; marker.action = visualization_msgs::Marker::ADD; marker.scale.x = 0.04; //marker.scale.x = 0.3; marker.color.r = 1.0f; marker.color.b = 1.0f; marker.color.a = 1.0; geometry_msgs::Point point0, point1; Eigen2Point(p0, point0); Eigen2Point(p1, point1); marker.points.push_back(point0); marker.points.push_back(point1); m_markers.push_back(marker); } void CameraPoseVisualization::add_pose(const Eigen::Vector3d& p, const Eigen::Quaterniond& q) { visualization_msgs::Marker marker; marker.ns = m_marker_ns; marker.id = m_markers.size() + 1; 
marker.type = visualization_msgs::Marker::LINE_STRIP; marker.action = visualization_msgs::Marker::ADD; marker.scale.x = m_line_width; marker.pose.position.x = 0.0; marker.pose.position.y = 0.0; marker.pose.position.z = 0.0; marker.pose.orientation.w = 1.0; marker.pose.orientation.x = 0.0; marker.pose.orientation.y = 0.0; marker.pose.orientation.z = 0.0; geometry_msgs::Point pt_lt, pt_lb, pt_rt, pt_rb, pt_oc, pt_lt0, pt_lt1, pt_lt2; Eigen2Point(q * (m_scale *imlt) + p, pt_lt); Eigen2Point(q * (m_scale *imlb) + p, pt_lb); Eigen2Point(q * (m_scale *imrt) + p, pt_rt); Eigen2Point(q * (m_scale *imrb) + p, pt_rb); Eigen2Point(q * (m_scale *lt0 ) + p, pt_lt0); Eigen2Point(q * (m_scale *lt1 ) + p, pt_lt1); Eigen2Point(q * (m_scale *lt2 ) + p, pt_lt2); Eigen2Point(q * (m_scale *oc ) + p, pt_oc); // image boundaries marker.points.push_back(pt_lt); marker.points.push_back(pt_lb); marker.colors.push_back(m_image_boundary_color); marker.colors.push_back(m_image_boundary_color); marker.points.push_back(pt_lb); marker.points.push_back(pt_rb); marker.colors.push_back(m_image_boundary_color); marker.colors.push_back(m_image_boundary_color); marker.points.push_back(pt_rb); marker.points.push_back(pt_rt); marker.colors.push_back(m_image_boundary_color); marker.colors.push_back(m_image_boundary_color); marker.points.push_back(pt_rt); marker.points.push_back(pt_lt); marker.colors.push_back(m_image_boundary_color); marker.colors.push_back(m_image_boundary_color); // top-left indicator marker.points.push_back(pt_lt0); marker.points.push_back(pt_lt1); marker.colors.push_back(m_image_boundary_color); marker.colors.push_back(m_image_boundary_color); marker.points.push_back(pt_lt1); marker.points.push_back(pt_lt2); marker.colors.push_back(m_image_boundary_color); marker.colors.push_back(m_image_boundary_color); // optical center connector marker.points.push_back(pt_lt); marker.points.push_back(pt_oc); marker.colors.push_back(m_optical_center_connector_color); 
marker.colors.push_back(m_optical_center_connector_color);

    marker.points.push_back(pt_lb);
    marker.points.push_back(pt_oc);
    marker.colors.push_back(m_optical_center_connector_color);
    marker.colors.push_back(m_optical_center_connector_color);

    marker.points.push_back(pt_rt);
    marker.points.push_back(pt_oc);
    marker.colors.push_back(m_optical_center_connector_color);
    marker.colors.push_back(m_optical_center_connector_color);

    marker.points.push_back(pt_rb);
    marker.points.push_back(pt_oc);
    marker.colors.push_back(m_optical_center_connector_color);
    marker.colors.push_back(m_optical_center_connector_color);

    m_markers.push_back(marker);
}

void CameraPoseVisualization::reset()
{
    m_markers.clear();
}

void CameraPoseVisualization::publish_by( ros::Publisher &pub, const std_msgs::Header &header )
{
    visualization_msgs::MarkerArray markerArray_msg;

    for(auto& marker : m_markers) {
        marker.header = header;
        markerArray_msg.markers.push_back(marker);
    }

    pub.publish(markerArray_msg);
}

================================================ FILE: rpvio_estimator/src/utility/CameraPoseVisualization.h ================================================

#pragma once

#include <ros/ros.h>
#include <std_msgs/ColorRGBA.h>
#include <visualization_msgs/Marker.h>
#include <visualization_msgs/MarkerArray.h>
#include <Eigen/Dense>
#include <Eigen/Geometry>

class CameraPoseVisualization {
public:
    std::string m_marker_ns;

    CameraPoseVisualization(float r, float g, float b, float a);

    void setImageBoundaryColor(float r, float g, float b, float a=1.0);
    void setOpticalCenterConnectorColor(float r, float g, float b, float a=1.0);
    void setScale(double s);
    void setLineWidth(double width);

    void add_pose(const Eigen::Vector3d& p, const Eigen::Quaterniond& q);
    void reset();

    void publish_by(ros::Publisher& pub, const std_msgs::Header& header);
    void add_edge(const Eigen::Vector3d& p0, const Eigen::Vector3d& p1);
    void add_loopedge(const Eigen::Vector3d& p0, const Eigen::Vector3d& p1);

private:
    std::vector<visualization_msgs::Marker> m_markers;
    std_msgs::ColorRGBA m_image_boundary_color;
    std_msgs::ColorRGBA m_optical_center_connector_color;
    double m_scale;
    double m_line_width;

    static const Eigen::Vector3d imlt;
    static const Eigen::Vector3d imlb;
    static const Eigen::Vector3d imrt;
    static const Eigen::Vector3d imrb;
    static const Eigen::Vector3d oc;
    static const Eigen::Vector3d lt0;
    static const Eigen::Vector3d lt1;
    static const Eigen::Vector3d lt2;
};

================================================ FILE: rpvio_estimator/src/utility/tic_toc.h ================================================

#pragma once

#include <ctime>
#include <cstdlib>
#include <chrono>

class TicToc
{
  public:
    TicToc()
    {
        tic();
    }

    void tic()
    {
        start = std::chrono::system_clock::now();
    }

    double toc()
    {
        end = std::chrono::system_clock::now();
        std::chrono::duration<double> elapsed_seconds = end - start;
        return elapsed_seconds.count() * 1000;
    }

  private:
    std::chrono::time_point<std::chrono::system_clock> start, end;
};

================================================ FILE: rpvio_estimator/src/utility/utility.cpp ================================================

#include "utility.h"

Eigen::Matrix3d Utility::g2R(const Eigen::Vector3d &g)
{
    Eigen::Matrix3d R0;
    Eigen::Vector3d ng1 = g.normalized();
    Eigen::Vector3d ng2{0, 0, 1.0};
    R0 = Eigen::Quaterniond::FromTwoVectors(ng1, ng2).toRotationMatrix();
    double yaw = Utility::R2ypr(R0).x();
    R0 = Utility::ypr2R(Eigen::Vector3d{-yaw, 0, 0}) * R0;
    // R0 = Utility::ypr2R(Eigen::Vector3d{-90, 0, 0}) * R0;
    return R0;
}

================================================ FILE: rpvio_estimator/src/utility/utility.h ================================================

#pragma once

#include <cmath>
#include <cassert>
#include <cstring>
#include <eigen3/Eigen/Dense>
#include <sys/stat.h>
#include <sys/types.h>
#include <errno.h>
#include <libgen.h>
#include <stdio.h>

class Utility
{
  public:
    template <typename Derived>
    static Eigen::Quaternion<typename Derived::Scalar> deltaQ(const Eigen::MatrixBase<Derived> &theta)
    {
        typedef typename Derived::Scalar Scalar_t;

        Eigen::Quaternion<Scalar_t> dq;
        Eigen::Matrix<Scalar_t, 3, 1> half_theta = theta;
        half_theta /= static_cast<Scalar_t>(2.0);
        dq.w() = static_cast<Scalar_t>(1.0);
        dq.x() = half_theta.x();
        dq.y() = half_theta.y();
        dq.z() = half_theta.z();
        return dq;
    }

    template <typename Derived>
    static Eigen::Matrix<typename Derived::Scalar, 3, 3> skewSymmetric(const Eigen::MatrixBase<Derived> &q)
    {
        Eigen::Matrix<typename Derived::Scalar, 3, 3> ans;
        ans << typename
Derived::Scalar(0), -q(2), q(1),
            q(2), typename Derived::Scalar(0), -q(0),
            -q(1), q(0), typename Derived::Scalar(0);
        return ans;
    }

    template <typename Derived>
    static Eigen::Quaternion<typename Derived::Scalar> positify(const Eigen::QuaternionBase<Derived> &q)
    {
        //printf("a: %f %f %f %f", q.w(), q.x(), q.y(), q.z());
        //Eigen::Quaternion<typename Derived::Scalar> p(-q.w(), -q.x(), -q.y(), -q.z());
        //printf("b: %f %f %f %f", p.w(), p.x(), p.y(), p.z());
        //return q.template w() >= (typename Derived::Scalar)(0.0) ? q : Eigen::Quaternion<typename Derived::Scalar>(-q.w(), -q.x(), -q.y(), -q.z());
        return q;
    }

    template <typename Derived>
    static Eigen::Matrix<typename Derived::Scalar, 4, 4> Qleft(const Eigen::QuaternionBase<Derived> &q)
    {
        Eigen::Quaternion<typename Derived::Scalar> qq = positify(q);
        Eigen::Matrix<typename Derived::Scalar, 4, 4> ans;
        ans(0, 0) = qq.w(), ans.template block<1, 3>(0, 1) = -qq.vec().transpose();
        ans.template block<3, 1>(1, 0) = qq.vec(), ans.template block<3, 3>(1, 1) = qq.w() * Eigen::Matrix<typename Derived::Scalar, 3, 3>::Identity() + skewSymmetric(qq.vec());
        return ans;
    }

    template <typename Derived>
    static Eigen::Matrix<typename Derived::Scalar, 4, 4> Qright(const Eigen::QuaternionBase<Derived> &p)
    {
        Eigen::Quaternion<typename Derived::Scalar> pp = positify(p);
        Eigen::Matrix<typename Derived::Scalar, 4, 4> ans;
        ans(0, 0) = pp.w(), ans.template block<1, 3>(0, 1) = -pp.vec().transpose();
        ans.template block<3, 1>(1, 0) = pp.vec(), ans.template block<3, 3>(1, 1) = pp.w() * Eigen::Matrix<typename Derived::Scalar, 3, 3>::Identity() - skewSymmetric(pp.vec());
        return ans;
    }

    static Eigen::Vector3d R2ypr(const Eigen::Matrix3d &R)
    {
        Eigen::Vector3d n = R.col(0);
        Eigen::Vector3d o = R.col(1);
        Eigen::Vector3d a = R.col(2);

        Eigen::Vector3d ypr(3);
        double y = atan2(n(1), n(0));
        double p = atan2(-n(2), n(0) * cos(y) + n(1) * sin(y));
        double r = atan2(a(0) * sin(y) - a(1) * cos(y), -o(0) * sin(y) + o(1) * cos(y));
        ypr(0) = y;
        ypr(1) = p;
        ypr(2) = r;

        return ypr / M_PI * 180.0;
    }

    template <typename Derived>
    static Eigen::Matrix<typename Derived::Scalar, 3, 3> ypr2R(const Eigen::MatrixBase<Derived> &ypr)
    {
        typedef typename Derived::Scalar Scalar_t;

        Scalar_t y = ypr(0) / 180.0 * M_PI;
        Scalar_t p = ypr(1) / 180.0 * M_PI;
        Scalar_t r = ypr(2) / 180.0 * M_PI;

        Eigen::Matrix<Scalar_t, 3, 3> Rz;
        Rz << cos(y), -sin(y), 0,
            sin(y), cos(y), 0,
            0, 0, 1;

        Eigen::Matrix<Scalar_t, 3, 3> Ry;
        Ry << cos(p), 0., sin(p),
            0., 1., 0.,
            -sin(p), 0., cos(p);

        Eigen::Matrix<Scalar_t, 3, 3> Rx;
        Rx << 1., 0., 0.,
            0., cos(r), -sin(r),
            0., sin(r), cos(r);

        return Rz * Ry * Rx;
    }

    static Eigen::Matrix3d g2R(const Eigen::Vector3d &g);

    template <size_t N>
    struct uint_
    {
    };

    template <size_t N, typename Lambda, typename IterT>
    void unroller(const Lambda &f, const IterT &iter, uint_<N>)
    {
        unroller(f, iter, uint_<N - 1>());
        f(iter + N);
    }

    template <typename Lambda, typename IterT>
    void unroller(const Lambda &f, const IterT &iter, uint_<0>)
    {
        f(iter);
    }

    template <typename T>
    static T normalizeAngle(const T& angle_degrees) {
        T two_pi(2.0 * 180);
        if (angle_degrees > 0)
            return angle_degrees -
                two_pi * std::floor((angle_degrees + T(180)) / two_pi);
        else
            return angle_degrees +
                two_pi * std::floor((-angle_degrees + T(180)) / two_pi);
    };
};

class FileSystemHelper
{
  public:
    /******************************************************************************
     * Recursively create directory if `path` not exists.
     * Return 0 if success.
     *****************************************************************************/
    static int createDirectoryIfNotExists(const char *path)
    {
        struct stat info;
        int statRC = stat(path, &info);
        if( statRC != 0 )
        {
            if (errno == ENOENT)
            {
                printf("%s not exists, trying to create it \n", path);
                if (! createDirectoryIfNotExists(dirname(strdupa(path))))
                {
                    if (mkdir(path, S_IRWXU | S_IRWXG | S_IROTH | S_IXOTH) != 0)
                    {
                        fprintf(stderr, "Failed to create folder %s \n", path);
                        return 1;
                    }
                    else
                        return 0;
                }
                else
                    return 1;
            } // directory not exists
            if (errno == ENOTDIR)
            {
                fprintf(stderr, "%s is not a directory path \n", path);
                return 1;
            } // something in path prefix is not a dir
            return 1;
        }
        return ( info.st_mode & S_IFDIR ) ?
0 : 1;
    }
};

================================================ FILE: rpvio_estimator/src/utility/visualization.cpp ================================================

#include "visualization.h"

using namespace ros;
using namespace Eigen;

ros::Publisher pub_odometry, pub_latest_odometry;
ros::Publisher pub_path, pub_relo_path;
ros::Publisher pub_point_cloud, pub_margin_cloud;
ros::Publisher pub_key_poses;
ros::Publisher pub_relo_relative_pose;
ros::Publisher pub_camera_pose;
ros::Publisher pub_camera_pose_visual;
nav_msgs::Path path, relo_path;
ros::Publisher pub_keyframe_pose;
ros::Publisher pub_keyframe_point;
ros::Publisher pub_extrinsic;
CameraPoseVisualization cameraposevisual(0, 1, 0, 1);
CameraPoseVisualization keyframebasevisual(0.0, 0.0, 1.0, 1.0);
static double sum_of_path = 0;
static Vector3d last_path(0.0, 0.0, 0.0);

void registerPub(ros::NodeHandle &n)
{
    pub_latest_odometry = n.advertise<nav_msgs::Odometry>("imu_propagate", 1000);
    pub_path = n.advertise<nav_msgs::Path>("path", 1000);
    pub_relo_path = n.advertise<nav_msgs::Path>("relocalization_path", 1000);
    pub_odometry = n.advertise<nav_msgs::Odometry>("odometry", 1000);
    pub_point_cloud = n.advertise<sensor_msgs::PointCloud>("point_cloud", 1000);
    pub_margin_cloud = n.advertise<sensor_msgs::PointCloud>("history_cloud", 1000);
    pub_key_poses = n.advertise<visualization_msgs::Marker>("key_poses", 1000);
    pub_camera_pose = n.advertise<nav_msgs::Odometry>("camera_pose", 1000);
    pub_camera_pose_visual = n.advertise<visualization_msgs::MarkerArray>("camera_pose_visual", 1000);
    pub_keyframe_pose = n.advertise<nav_msgs::Odometry>("keyframe_pose", 1000);
    pub_keyframe_point = n.advertise<sensor_msgs::PointCloud>("keyframe_point", 1000);
    pub_extrinsic = n.advertise<nav_msgs::Odometry>("extrinsic", 1000);
    pub_relo_relative_pose = n.advertise<nav_msgs::Odometry>("relo_relative_pose", 1000);

    cameraposevisual.setScale(1);
    cameraposevisual.setLineWidth(0.05);
    keyframebasevisual.setScale(0.1);
    keyframebasevisual.setLineWidth(0.01);
}

void pubLatestOdometry(const Eigen::Vector3d &P, const Eigen::Quaterniond &Q, const Eigen::Vector3d &V, const std_msgs::Header &header)
{
    Eigen::Quaterniond quadrotor_Q = Q ;

    nav_msgs::Odometry odometry;
    odometry.header = header;
    odometry.header.frame_id = "world";
odometry.pose.pose.position.x = P.x();
    odometry.pose.pose.position.y = P.y();
    odometry.pose.pose.position.z = P.z();
    odometry.pose.pose.orientation.x = quadrotor_Q.x();
    odometry.pose.pose.orientation.y = quadrotor_Q.y();
    odometry.pose.pose.orientation.z = quadrotor_Q.z();
    odometry.pose.pose.orientation.w = quadrotor_Q.w();
    odometry.twist.twist.linear.x = V.x();
    odometry.twist.twist.linear.y = V.y();
    odometry.twist.twist.linear.z = V.z();
    pub_latest_odometry.publish(odometry);
}

void printStatistics(const Estimator &estimator, double t)
{
    if (estimator.solver_flag != Estimator::SolverFlag::NON_LINEAR)
        return;
    printf("position: %f, %f, %f\r", estimator.Ps[WINDOW_SIZE].x(), estimator.Ps[WINDOW_SIZE].y(), estimator.Ps[WINDOW_SIZE].z());
    ROS_DEBUG_STREAM("position: " << estimator.Ps[WINDOW_SIZE].transpose());
    ROS_DEBUG_STREAM("velocity: " << estimator.Vs[WINDOW_SIZE].transpose());
    for (int i = 0; i < NUM_OF_CAM; i++)
    {
        //ROS_DEBUG("calibration result for camera %d", i);
        ROS_DEBUG_STREAM("extrinsic tic: " << estimator.tic[i].transpose());
        ROS_DEBUG_STREAM("extrinsic ric: " << Utility::R2ypr(estimator.ric[i]).transpose());
        if (ESTIMATE_EXTRINSIC)
        {
            cv::FileStorage fs(EX_CALIB_RESULT_PATH, cv::FileStorage::WRITE);
            Eigen::Matrix3d eigen_R;
            Eigen::Vector3d eigen_T;
            eigen_R = estimator.ric[i];
            eigen_T = estimator.tic[i];
            cv::Mat cv_R, cv_T;
            cv::eigen2cv(eigen_R, cv_R);
            cv::eigen2cv(eigen_T, cv_T);
            fs << "extrinsicRotation" << cv_R << "extrinsicTranslation" << cv_T;
            fs.release();
        }
    }

    static double sum_of_time = 0;
    static int sum_of_calculation = 0;
    sum_of_time += t;
    sum_of_calculation++;
    ROS_DEBUG("vo solver costs: %f ms", t);
    ROS_DEBUG("average of time %f ms", sum_of_time / sum_of_calculation);

    sum_of_path += (estimator.Ps[WINDOW_SIZE] - last_path).norm();
    last_path = estimator.Ps[WINDOW_SIZE];
    ROS_DEBUG("sum of path %f", sum_of_path);
    if (ESTIMATE_TD)
        ROS_INFO("td %f", estimator.td);
}

void pubOdometry(const Estimator &estimator, const std_msgs::Header &header)
{
    if
(estimator.solver_flag == Estimator::SolverFlag::NON_LINEAR) { nav_msgs::Odometry odometry; odometry.header = header; odometry.header.frame_id = "world"; odometry.child_frame_id = "world"; Quaterniond tmp_Q; tmp_Q = Quaterniond(estimator.Rs[WINDOW_SIZE]); odometry.pose.pose.position.x = estimator.Ps[WINDOW_SIZE].x(); odometry.pose.pose.position.y = estimator.Ps[WINDOW_SIZE].y(); odometry.pose.pose.position.z = estimator.Ps[WINDOW_SIZE].z(); odometry.pose.pose.orientation.x = tmp_Q.x(); odometry.pose.pose.orientation.y = tmp_Q.y(); odometry.pose.pose.orientation.z = tmp_Q.z(); odometry.pose.pose.orientation.w = tmp_Q.w(); odometry.twist.twist.linear.x = estimator.Vs[WINDOW_SIZE].x(); odometry.twist.twist.linear.y = estimator.Vs[WINDOW_SIZE].y(); odometry.twist.twist.linear.z = estimator.Vs[WINDOW_SIZE].z(); pub_odometry.publish(odometry); geometry_msgs::PoseStamped pose_stamped; pose_stamped.header = header; pose_stamped.header.frame_id = "world"; pose_stamped.pose = odometry.pose.pose; path.header = header; path.header.frame_id = "world"; path.poses.push_back(pose_stamped); pub_path.publish(path); Vector3d correct_t; Vector3d correct_v; Quaterniond correct_q; correct_t = estimator.drift_correct_r * estimator.Ps[WINDOW_SIZE] + estimator.drift_correct_t; correct_q = estimator.drift_correct_r * estimator.Rs[WINDOW_SIZE]; odometry.pose.pose.position.x = correct_t.x(); odometry.pose.pose.position.y = correct_t.y(); odometry.pose.pose.position.z = correct_t.z(); odometry.pose.pose.orientation.x = correct_q.x(); odometry.pose.pose.orientation.y = correct_q.y(); odometry.pose.pose.orientation.z = correct_q.z(); odometry.pose.pose.orientation.w = correct_q.w(); pose_stamped.pose = odometry.pose.pose; relo_path.header = header; relo_path.header.frame_id = "world"; relo_path.poses.push_back(pose_stamped); pub_relo_path.publish(relo_path); // write result to file ofstream foutC(RPVIO_RESULT_PATH, ios::app); foutC.setf(ios::fixed, ios::floatfield); foutC.precision(0); foutC << 
header.stamp.toSec() * 1e9 << ","; foutC.precision(5); foutC << estimator.Ps[WINDOW_SIZE].x() << "," << estimator.Ps[WINDOW_SIZE].y() << "," << estimator.Ps[WINDOW_SIZE].z() << "," << tmp_Q.w() << "," << tmp_Q.x() << "," << tmp_Q.y() << "," << tmp_Q.z() << "," << estimator.Vs[WINDOW_SIZE].x() << "," << estimator.Vs[WINDOW_SIZE].y() << "," << estimator.Vs[WINDOW_SIZE].z() << "," << endl; foutC.close(); } } void pubKeyPoses(const Estimator &estimator, const std_msgs::Header &header) { if (estimator.key_poses.size() == 0) return; visualization_msgs::Marker key_poses; key_poses.header = header; key_poses.header.frame_id = "world"; key_poses.ns = "key_poses"; key_poses.type = visualization_msgs::Marker::SPHERE_LIST; key_poses.action = visualization_msgs::Marker::ADD; key_poses.pose.orientation.w = 1.0; key_poses.lifetime = ros::Duration(); //static int key_poses_id = 0; key_poses.id = 0; //key_poses_id++; key_poses.scale.x = 0.05; key_poses.scale.y = 0.05; key_poses.scale.z = 0.05; key_poses.color.r = 1.0; key_poses.color.a = 1.0; for (int i = 0; i <= WINDOW_SIZE; i++) { geometry_msgs::Point pose_marker; Vector3d correct_pose; correct_pose = estimator.key_poses[i]; pose_marker.x = correct_pose.x(); pose_marker.y = correct_pose.y(); pose_marker.z = correct_pose.z(); key_poses.points.push_back(pose_marker); } pub_key_poses.publish(key_poses); } void pubCameraPose(const Estimator &estimator, const std_msgs::Header &header) { int idx2 = WINDOW_SIZE - 1; if (estimator.solver_flag == Estimator::SolverFlag::NON_LINEAR) { int i = idx2; Vector3d P = estimator.Ps[i] + estimator.Rs[i] * estimator.tic[0]; Quaterniond R = Quaterniond(estimator.Rs[i] * estimator.ric[0]); nav_msgs::Odometry odometry; odometry.header = header; odometry.header.frame_id = "world"; odometry.pose.pose.position.x = P.x(); odometry.pose.pose.position.y = P.y(); odometry.pose.pose.position.z = P.z(); odometry.pose.pose.orientation.x = R.x(); odometry.pose.pose.orientation.y = R.y(); 
odometry.pose.pose.orientation.z = R.z();
        odometry.pose.pose.orientation.w = R.w();

        pub_camera_pose.publish(odometry);

        cameraposevisual.reset();
        cameraposevisual.add_pose(P, R);
        cameraposevisual.publish_by(pub_camera_pose_visual, odometry.header);
    }
}

void pubPointCloud(const Estimator &estimator, const std_msgs::Header &header)
{
    sensor_msgs::PointCloud point_cloud, loop_point_cloud;
    point_cloud.header = header;
    loop_point_cloud.header = header;

    for (auto &it_per_id : estimator.f_manager.feature)
    {
        int used_num;
        used_num = it_per_id.feature_per_frame.size();
        if (!(used_num >= 2 && it_per_id.start_frame < WINDOW_SIZE - 2))
            continue;
        if (it_per_id.start_frame > WINDOW_SIZE * 3.0 / 4.0 || it_per_id.solve_flag != 1)
            continue;
        int imu_i = it_per_id.start_frame;
        Vector3d pts_i = it_per_id.feature_per_frame[0].point * it_per_id.estimated_depth;
        Vector3d w_pts_i = estimator.Rs[imu_i] * (estimator.ric[0] * pts_i + estimator.tic[0]) + estimator.Ps[imu_i];

        geometry_msgs::Point32 p;
        p.x = w_pts_i(0);
        p.y = w_pts_i(1);
        p.z = w_pts_i(2);
        point_cloud.points.push_back(p);
    }
    pub_point_cloud.publish(point_cloud);

    // publish marginalized points
    sensor_msgs::PointCloud margin_cloud;
    margin_cloud.header = header;

    for (auto &it_per_id : estimator.f_manager.feature)
    {
        int used_num;
        used_num = it_per_id.feature_per_frame.size();
        if (!(used_num >= 2 && it_per_id.start_frame < WINDOW_SIZE - 2))
            continue;
        //if (it_per_id->start_frame > WINDOW_SIZE * 3.0 / 4.0 || it_per_id->solve_flag != 1)
        //    continue;

        if (it_per_id.start_frame == 0 && it_per_id.feature_per_frame.size() <= 2
            && it_per_id.solve_flag == 1 )
        {
            int imu_i = it_per_id.start_frame;
            Vector3d pts_i = it_per_id.feature_per_frame[0].point * it_per_id.estimated_depth;
            Vector3d w_pts_i = estimator.Rs[imu_i] * (estimator.ric[0] * pts_i + estimator.tic[0]) + estimator.Ps[imu_i];

            geometry_msgs::Point32 p;
            p.x = w_pts_i(0);
            p.y = w_pts_i(1);
            p.z = w_pts_i(2);
            margin_cloud.points.push_back(p);
        }
    }
    pub_margin_cloud.publish(margin_cloud);
}

void
pubTF(const Estimator &estimator, const std_msgs::Header &header) { if( estimator.solver_flag != Estimator::SolverFlag::NON_LINEAR) return; static tf::TransformBroadcaster br; tf::Transform transform; tf::Quaternion q; // body frame Vector3d correct_t; Quaterniond correct_q; correct_t = estimator.Ps[WINDOW_SIZE]; correct_q = estimator.Rs[WINDOW_SIZE]; transform.setOrigin(tf::Vector3(correct_t(0), correct_t(1), correct_t(2))); q.setW(correct_q.w()); q.setX(correct_q.x()); q.setY(correct_q.y()); q.setZ(correct_q.z()); transform.setRotation(q); br.sendTransform(tf::StampedTransform(transform, header.stamp, "world", "body")); // camera frame transform.setOrigin(tf::Vector3(estimator.tic[0].x(), estimator.tic[0].y(), estimator.tic[0].z())); q.setW(Quaterniond(estimator.ric[0]).w()); q.setX(Quaterniond(estimator.ric[0]).x()); q.setY(Quaterniond(estimator.ric[0]).y()); q.setZ(Quaterniond(estimator.ric[0]).z()); transform.setRotation(q); br.sendTransform(tf::StampedTransform(transform, header.stamp, "body", "camera")); nav_msgs::Odometry odometry; odometry.header = header; odometry.header.frame_id = "world"; odometry.pose.pose.position.x = estimator.tic[0].x(); odometry.pose.pose.position.y = estimator.tic[0].y(); odometry.pose.pose.position.z = estimator.tic[0].z(); Quaterniond tmp_q{estimator.ric[0]}; odometry.pose.pose.orientation.x = tmp_q.x(); odometry.pose.pose.orientation.y = tmp_q.y(); odometry.pose.pose.orientation.z = tmp_q.z(); odometry.pose.pose.orientation.w = tmp_q.w(); pub_extrinsic.publish(odometry); } void pubKeyframe(const Estimator &estimator) { // pub camera pose, 2D-3D points of keyframe if (estimator.solver_flag == Estimator::SolverFlag::NON_LINEAR && estimator.marginalization_flag == 0) { int i = WINDOW_SIZE - 2; //Vector3d P = estimator.Ps[i] + estimator.Rs[i] * estimator.tic[0]; Vector3d P = estimator.Ps[i]; Quaterniond R = Quaterniond(estimator.Rs[i]); nav_msgs::Odometry odometry; odometry.header = estimator.Headers[WINDOW_SIZE - 2]; 
odometry.header.frame_id = "world"; odometry.pose.pose.position.x = P.x(); odometry.pose.pose.position.y = P.y(); odometry.pose.pose.position.z = P.z(); odometry.pose.pose.orientation.x = R.x(); odometry.pose.pose.orientation.y = R.y(); odometry.pose.pose.orientation.z = R.z(); odometry.pose.pose.orientation.w = R.w(); //printf("time: %f t: %f %f %f r: %f %f %f %f\n", odometry.header.stamp.toSec(), P.x(), P.y(), P.z(), R.w(), R.x(), R.y(), R.z()); pub_keyframe_pose.publish(odometry); sensor_msgs::PointCloud point_cloud; point_cloud.header = estimator.Headers[WINDOW_SIZE - 2]; for (auto &it_per_id : estimator.f_manager.feature) { int frame_size = it_per_id.feature_per_frame.size(); if(it_per_id.start_frame < WINDOW_SIZE - 2 && it_per_id.start_frame + frame_size - 1 >= WINDOW_SIZE - 2 && it_per_id.solve_flag == 1) { int imu_i = it_per_id.start_frame; Vector3d pts_i = it_per_id.feature_per_frame[0].point * it_per_id.estimated_depth; Vector3d w_pts_i = estimator.Rs[imu_i] * (estimator.ric[0] * pts_i + estimator.tic[0]) + estimator.Ps[imu_i]; geometry_msgs::Point32 p; p.x = w_pts_i(0); p.y = w_pts_i(1); p.z = w_pts_i(2); point_cloud.points.push_back(p); int imu_j = WINDOW_SIZE - 2 - it_per_id.start_frame; sensor_msgs::ChannelFloat32 p_2d; p_2d.values.push_back(it_per_id.feature_per_frame[imu_j].point.x()); p_2d.values.push_back(it_per_id.feature_per_frame[imu_j].point.y()); p_2d.values.push_back(it_per_id.feature_per_frame[imu_j].uv.x()); p_2d.values.push_back(it_per_id.feature_per_frame[imu_j].uv.y()); p_2d.values.push_back(it_per_id.feature_id); point_cloud.channels.push_back(p_2d); } } pub_keyframe_point.publish(point_cloud); } } void pubRelocalization(const Estimator &estimator) { nav_msgs::Odometry odometry; odometry.header.stamp = ros::Time(estimator.relo_frame_stamp); odometry.header.frame_id = "world"; odometry.pose.pose.position.x = estimator.relo_relative_t.x(); odometry.pose.pose.position.y = estimator.relo_relative_t.y(); odometry.pose.pose.position.z = 
estimator.relo_relative_t.z();
    odometry.pose.pose.orientation.x = estimator.relo_relative_q.x();
    odometry.pose.pose.orientation.y = estimator.relo_relative_q.y();
    odometry.pose.pose.orientation.z = estimator.relo_relative_q.z();
    odometry.pose.pose.orientation.w = estimator.relo_relative_q.w();
    odometry.twist.twist.linear.x = estimator.relo_relative_yaw;
    odometry.twist.twist.linear.y = estimator.relo_frame_index;

    pub_relo_relative_pose.publish(odometry);
}

================================================ FILE: rpvio_estimator/src/utility/visualization.h ================================================

#pragma once

#include <ros/ros.h>
#include <std_msgs/Header.h>
#include <std_msgs/Float32.h>
#include <std_msgs/Bool.h>
#include <sensor_msgs/Imu.h>
#include <sensor_msgs/PointCloud.h>
#include <sensor_msgs/Image.h>
#include <sensor_msgs/image_encodings.h>
#include <nav_msgs/Path.h>
#include <nav_msgs/Odometry.h>
#include <geometry_msgs/PointStamped.h>
#include <visualization_msgs/Marker.h>
#include <visualization_msgs/MarkerArray.h>
#include <cv_bridge/cv_bridge.h>
#include "CameraPoseVisualization.h"
#include <eigen3/Eigen/Dense>
#include "../estimator.h"
#include "../parameters.h"
#include <fstream>

extern ros::Publisher pub_odometry;
extern ros::Publisher pub_path, pub_pose;
extern ros::Publisher pub_cloud, pub_map;
extern ros::Publisher pub_key_poses;
extern ros::Publisher pub_ref_pose, pub_cur_pose;
extern ros::Publisher pub_key;
extern nav_msgs::Path path;
extern ros::Publisher pub_pose_graph;
extern int IMAGE_ROW, IMAGE_COL;

void registerPub(ros::NodeHandle &n);

void pubLatestOdometry(const Eigen::Vector3d &P, const Eigen::Quaterniond &Q, const Eigen::Vector3d &V, const std_msgs::Header &header);

void printStatistics(const Estimator &estimator, double t);

void pubOdometry(const Estimator &estimator, const std_msgs::Header &header);

void pubInitialGuess(const Estimator &estimator, const std_msgs::Header &header);

void pubKeyPoses(const Estimator &estimator, const std_msgs::Header &header);

void pubCameraPose(const Estimator &estimator, const std_msgs::Header &header);

void pubPointCloud(const Estimator &estimator, const std_msgs::Header &header);

void pubTF(const Estimator &estimator, const std_msgs::Header &header);

void pubKeyframe(const Estimator &estimator);

void pubRelocalization(const Estimator &estimator);
================================================ FILE: rpvio_feature_tracker/CMakeLists.txt ================================================ cmake_minimum_required(VERSION 2.8.3) project(rpvio_feature_tracker) set(CMAKE_BUILD_TYPE "Release") set(CMAKE_CXX_FLAGS "-std=c++11") set(CMAKE_CXX_FLAGS_RELEASE "-O3 -Wall -g") find_package(catkin REQUIRED COMPONENTS roscpp std_msgs sensor_msgs cv_bridge camera_model message_filters ) find_package(OpenCV REQUIRED) catkin_package() include_directories( ${catkin_INCLUDE_DIRS} ) set(CMAKE_MODULE_PATH ${PROJECT_SOURCE_DIR}/cmake) find_package(Eigen3) include_directories( ${catkin_INCLUDE_DIRS} ${EIGEN3_INCLUDE_DIR} ) add_executable(rpvio_feature_tracker src/feature_tracker_node.cpp src/parameters.cpp src/feature_tracker.cpp ) target_link_libraries(rpvio_feature_tracker ${catkin_LIBRARIES} ${OpenCV_LIBS}) ================================================ FILE: rpvio_feature_tracker/cmake/FindEigen.cmake ================================================ # Ceres Solver - A fast non-linear least squares minimizer # Copyright 2015 Google Inc. All rights reserved. # http://ceres-solver.org/ # # Redistribution and use in source and binary forms, with or without # modification, are permitted provided that the following conditions are met: # # * Redistributions of source code must retain the above copyright notice, # this list of conditions and the following disclaimer. # * Redistributions in binary form must reproduce the above copyright notice, # this list of conditions and the following disclaimer in the documentation # and/or other materials provided with the distribution. # * Neither the name of Google Inc. nor the names of its contributors may be # used to endorse or promote products derived from this software without # specific prior written permission. 
# # THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS" # AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE # IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE # ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE # LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR # CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF # SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS # INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN # CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) # ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE # POSSIBILITY OF SUCH DAMAGE. # # Author: alexs.mac@gmail.com (Alex Stewart) # # FindEigen.cmake - Find Eigen library, version >= 3. # # This module defines the following variables: # # EIGEN_FOUND: TRUE iff Eigen is found. # EIGEN_INCLUDE_DIRS: Include directories for Eigen. # # EIGEN_VERSION: Extracted from Eigen/src/Core/util/Macros.h # EIGEN_WORLD_VERSION: Equal to 3 if EIGEN_VERSION = 3.2.0 # EIGEN_MAJOR_VERSION: Equal to 2 if EIGEN_VERSION = 3.2.0 # EIGEN_MINOR_VERSION: Equal to 0 if EIGEN_VERSION = 3.2.0 # # The following variables control the behaviour of this module: # # EIGEN_INCLUDE_DIR_HINTS: List of additional directories in which to # search for eigen includes, e.g: /timbuktu/eigen3. # # The following variables are also defined by this module, but in line with # CMake recommended FindPackage() module style should NOT be referenced directly # by callers (use the plural variables detailed above instead). These variables # do however affect the behaviour of the module via FIND_[PATH/LIBRARY]() which # are NOT re-called (i.e. search for library is not repeated) if these variables # are set with valid values _in the CMake cache_. 
This means that if these # variables are set directly in the cache, either by the user in the CMake GUI, # or by the user passing -DVAR=VALUE directives to CMake when called (which # explicitly defines a cache variable), then they will be used verbatim, # bypassing the HINTS variables and other hard-coded search locations. # # EIGEN_INCLUDE_DIR: Include directory for CXSparse, not including the # include directory of any dependencies. # Called if we failed to find Eigen or any of it's required dependencies, # unsets all public (designed to be used externally) variables and reports # error message at priority depending upon [REQUIRED/QUIET/] argument. macro(EIGEN_REPORT_NOT_FOUND REASON_MSG) unset(EIGEN_FOUND) unset(EIGEN_INCLUDE_DIRS) # Make results of search visible in the CMake GUI if Eigen has not # been found so that user does not have to toggle to advanced view. mark_as_advanced(CLEAR EIGEN_INCLUDE_DIR) # Note _FIND_[REQUIRED/QUIETLY] variables defined by FindPackage() # use the camelcase library name, not uppercase. if (Eigen_FIND_QUIETLY) message(STATUS "Failed to find Eigen - " ${REASON_MSG} ${ARGN}) elseif (Eigen_FIND_REQUIRED) message(FATAL_ERROR "Failed to find Eigen - " ${REASON_MSG} ${ARGN}) else() # Neither QUIETLY nor REQUIRED, use no priority which emits a message # but continues configuration and allows generation. message("-- Failed to find Eigen - " ${REASON_MSG} ${ARGN}) endif () return() endmacro(EIGEN_REPORT_NOT_FOUND) # Protect against any alternative find_package scripts for this library having # been called previously (in a client project) which set EIGEN_FOUND, but not # the other variables we require / set here which could cause the search logic # here to fail. unset(EIGEN_FOUND) # Search user-installed locations first, so that we prefer user installs # to system installs where both exist. list(APPEND EIGEN_CHECK_INCLUDE_DIRS /usr/local/include /usr/local/homebrew/include # Mac OS X /opt/local/var/macports/software # Mac OS X. 
/opt/local/include /usr/include) # Additional suffixes to try appending to each search path. list(APPEND EIGEN_CHECK_PATH_SUFFIXES eigen3 # Default root directory for Eigen. Eigen/include/eigen3 # Windows (for C:/Program Files prefix) < 3.3 Eigen3/include/eigen3 ) # Windows (for C:/Program Files prefix) >= 3.3 # Search supplied hint directories first if supplied. find_path(EIGEN_INCLUDE_DIR NAMES Eigen/Core PATHS ${EIGEN_INCLUDE_DIR_HINTS} ${EIGEN_CHECK_INCLUDE_DIRS} PATH_SUFFIXES ${EIGEN_CHECK_PATH_SUFFIXES}) if (NOT EIGEN_INCLUDE_DIR OR NOT EXISTS ${EIGEN_INCLUDE_DIR}) eigen_report_not_found( "Could not find eigen3 include directory, set EIGEN_INCLUDE_DIR to " "path to eigen3 include directory, e.g. /usr/local/include/eigen3.") endif (NOT EIGEN_INCLUDE_DIR OR NOT EXISTS ${EIGEN_INCLUDE_DIR}) # Mark internally as found, then verify. EIGEN_REPORT_NOT_FOUND() unsets # if called. set(EIGEN_FOUND TRUE) # Extract Eigen version from Eigen/src/Core/util/Macros.h if (EIGEN_INCLUDE_DIR) set(EIGEN_VERSION_FILE ${EIGEN_INCLUDE_DIR}/Eigen/src/Core/util/Macros.h) if (NOT EXISTS ${EIGEN_VERSION_FILE}) eigen_report_not_found( "Could not find file: ${EIGEN_VERSION_FILE} " "containing version information in Eigen install located at: " "${EIGEN_INCLUDE_DIR}.") else (NOT EXISTS ${EIGEN_VERSION_FILE}) file(READ ${EIGEN_VERSION_FILE} EIGEN_VERSION_FILE_CONTENTS) string(REGEX MATCH "#define EIGEN_WORLD_VERSION [0-9]+" EIGEN_WORLD_VERSION "${EIGEN_VERSION_FILE_CONTENTS}") string(REGEX REPLACE "#define EIGEN_WORLD_VERSION ([0-9]+)" "\\1" EIGEN_WORLD_VERSION "${EIGEN_WORLD_VERSION}") string(REGEX MATCH "#define EIGEN_MAJOR_VERSION [0-9]+" EIGEN_MAJOR_VERSION "${EIGEN_VERSION_FILE_CONTENTS}") string(REGEX REPLACE "#define EIGEN_MAJOR_VERSION ([0-9]+)" "\\1" EIGEN_MAJOR_VERSION "${EIGEN_MAJOR_VERSION}") string(REGEX MATCH "#define EIGEN_MINOR_VERSION [0-9]+" EIGEN_MINOR_VERSION "${EIGEN_VERSION_FILE_CONTENTS}") string(REGEX REPLACE "#define EIGEN_MINOR_VERSION ([0-9]+)" "\\1" 
EIGEN_MINOR_VERSION "${EIGEN_MINOR_VERSION}")

      # This is on a single line s/t CMake does not interpret it as a list of
      # elements and insert ';' separators which would result in 3.;2.;0 nonsense.
      set(EIGEN_VERSION "${EIGEN_WORLD_VERSION}.${EIGEN_MAJOR_VERSION}.${EIGEN_MINOR_VERSION}")
  endif (NOT EXISTS ${EIGEN_VERSION_FILE})
endif (EIGEN_INCLUDE_DIR)

# Set standard CMake FindPackage variables if found.
if (EIGEN_FOUND)
  set(EIGEN_INCLUDE_DIRS ${EIGEN_INCLUDE_DIR})
endif (EIGEN_FOUND)

# Handle REQUIRED / QUIET optional arguments and version.
include(FindPackageHandleStandardArgs)
find_package_handle_standard_args(Eigen
  REQUIRED_VARS EIGEN_INCLUDE_DIRS
  VERSION_VAR EIGEN_VERSION)

# Only mark internal variables as advanced if we found Eigen, otherwise
# leave it visible in the standard GUI for the user to set manually.
if (EIGEN_FOUND)
  mark_as_advanced(FORCE EIGEN_INCLUDE_DIR)
endif (EIGEN_FOUND)

================================================ FILE: rpvio_feature_tracker/package.xml ================================================

<?xml version="1.0"?>
<package>
  <name>rpvio_feature_tracker</name>
  <version>0.0.0</version>
  <description>The rpvio_feature_tracker package</description>
  <maintainer>karnikram</maintainer>
  <license>GPLv3</license>

  <buildtool_depend>catkin</buildtool_depend>
  <build_depend>roscpp</build_depend>
  <build_depend>camera_model</build_depend>
  <build_depend>message_generation</build_depend>
  <run_depend>roscpp</run_depend>
  <run_depend>camera_model</run_depend>
  <run_depend>message_runtime</run_depend>
</package>

================================================ FILE: rpvio_feature_tracker/src/feature_tracker.cpp ================================================

#include "feature_tracker.h"

int FeatureTracker::n_id = 0;

bool inBorder(const cv::Point2f &pt)
{
    const int BORDER_SIZE = 1;
    int img_x = cvRound(pt.x);
    int img_y = cvRound(pt.y);
    return BORDER_SIZE <= img_x && img_x < COL - BORDER_SIZE && BORDER_SIZE <= img_y && img_y < ROW - BORDER_SIZE;
}

void reduceVector(vector<cv::Point2f> &v, vector<uchar> status)
{
    int j = 0;
    for (int i = 0; i < int(v.size()); i++)
        if (status[i])
            v[j++] = v[i];
    v.resize(j);
}

void reduceVector(vector<int> &v, vector<uchar> status)
{
    int j = 0;
    for (int i = 0; i < int(v.size()); i++)
        if (status[i])
            v[j++] = v[i];
    v.resize(j);
}

FeatureTracker::FeatureTracker()
{
}

void FeatureTracker::setMask(const
cv::Mat &_mask)
{
    //mask = cv::Mat(ROW, COL, CV_8UC1, cv::Scalar(255));
    mask = _mask.clone();
    //cv::Mat erosion_mat = cv::getStructuringElement(cv::MORPH_RECT, cv::Size(21,21));
    cv::erode(mask, mask, cv::Mat(), cv::Point(-1, -1), 2, 1, 1);

    // prefer to keep features that are tracked for long time
    vector<pair<int, pair<cv::Point2f, pair<int, int>>>> cnt_pts_id_pid;

    for (unsigned int i = 0; i < forw_pts.size(); i++)
        cnt_pts_id_pid.push_back(make_pair(track_cnt[i], make_pair(forw_pts[i], make_pair(ids[i], plane_ids[i]))));

    sort(cnt_pts_id_pid.begin(), cnt_pts_id_pid.end(), [](const pair<int, pair<cv::Point2f, pair<int, int>>> &a, const pair<int, pair<cv::Point2f, pair<int, int>>> &b)
         {
            return a.first > b.first;
         });

    forw_pts.clear();
    ids.clear();
    plane_ids.clear();
    track_cnt.clear();

    for (auto &it : cnt_pts_id_pid)
    {
        if (mask.at<uchar>(it.second.first) != 0)
        {
            forw_pts.push_back(it.second.first);
            ids.push_back(it.second.second.first);
            plane_ids.push_back(it.second.second.second);
            track_cnt.push_back(it.first);
            cv::circle(mask, it.second.first, MIN_DIST, 0, -1);
        }
    }
}

void FeatureTracker::addPoints(const cv::Mat &_mask)
{
    // Find unique labels
    cv::Mat mask_copy = _mask.clone();
    std::sort(mask_copy.begin<uchar>(), mask_copy.end<uchar>());
    auto last = std::unique(mask_copy.begin<uchar>(), mask_copy.end<uchar>());
    std::vector<uchar> unique_labels(mask_copy.begin<uchar>(), last);

    // Create mask for each label
    std::map<uchar, cv::Mat> unique_masks;
    for(auto &label : unique_labels)
        unique_masks.emplace(label, cv::Mat::zeros(_mask.size(), CV_8UC1));

    for(int i = 0; i < _mask.rows; i++)
        for(int j = 0; j < _mask.cols; j++)
            unique_masks[_mask.at<uchar>(i,j)].at<uchar>(i,j) = 255;

    // Erode each mask to avoid dynamic edges
    for(auto &unique_mask : unique_masks)
        cv::erode(unique_mask.second, unique_mask.second, cv::Mat(), cv::Point(-1, -1), 2, 1, 1);

    // Store plane id for each feature
    for (auto &p : n_pts)
    {
        for(auto &unique_mask : unique_masks)
        {
            if(unique_mask.first == 0)
                continue;
            if(unique_mask.second.at<uchar>(p) == 255)
            {
                forw_pts.push_back(p);
                ids.push_back(-1);
                track_cnt.push_back(1);
                plane_ids.push_back(unique_mask.first);
                break;
            }
        }
    }
}

void FeatureTracker::readImage(const
cv::Mat &_img, const cv::Mat &_mask, double _cur_time)
{
    cv::Mat img;
    TicToc t_r;
    cur_time = _cur_time;

    if (EQUALIZE)
    {
        cv::Ptr<cv::CLAHE> clahe = cv::createCLAHE(3.0, cv::Size(8, 8));
        TicToc t_c;
        clahe->apply(_img, img);
        ROS_DEBUG("CLAHE costs: %fms", t_c.toc());
    }
    else
        img = _img;

    if (forw_img.empty())
    {
        prev_img = cur_img = forw_img = img;
    }
    else
    {
        forw_img = img;
    }

    forw_pts.clear();

    if (cur_pts.size() > 0)
    {
        TicToc t_o;
        vector<uchar> status;
        vector<float> err;
        cv::calcOpticalFlowPyrLK(cur_img, forw_img, cur_pts, forw_pts, status, err, cv::Size(21, 21), 3);

        for (int i = 0; i < int(forw_pts.size()); i++)
            if (status[i] && !inBorder(forw_pts[i]))
                status[i] = 0;
        reduceVector(prev_pts, status);
        reduceVector(cur_pts, status);
        reduceVector(forw_pts, status);
        reduceVector(ids, status);
        reduceVector(plane_ids, status);
        reduceVector(cur_un_pts, status);
        reduceVector(track_cnt, status);
        ROS_DEBUG("temporal optical flow costs: %fms", t_o.toc());
    }

    for (auto &n : track_cnt)
        n++;

    if (PUB_THIS_FRAME)
    {
        rejectWithH();
        ROS_DEBUG("set mask begins");
        TicToc t_m;
        setMask(_mask);
        ROS_DEBUG("set mask costs %fms", t_m.toc());

        ROS_DEBUG("detect feature begins");
        TicToc t_t;
        int n_max_cnt = MAX_CNT - static_cast<int>(forw_pts.size());
        if (n_max_cnt > 0)
        {
            if (mask.empty())
                cout << "mask is empty " << endl;
            if (mask.type() != CV_8UC1)
                cout << "mask type wrong " << endl;
            if (mask.size() != forw_img.size())
                cout << "wrong size " << endl;
            cv::goodFeaturesToTrack(forw_img, n_pts, MAX_CNT - forw_pts.size(), 0.01, MIN_DIST, mask);

            ROS_DEBUG("add feature begins");
            TicToc t_a;
            addPoints(_mask);
            ROS_DEBUG("selectFeature costs: %fms", t_a.toc());
        }
        else
            n_pts.clear();
        ROS_DEBUG("detect feature costs: %fms", t_t.toc());
    }
    prev_img = cur_img;
    prev_pts = cur_pts;
    prev_un_pts = cur_un_pts;
    cur_img = forw_img;
    cur_pts = forw_pts;
    undistortedPoints();
    prev_time = cur_time;
}

void FeatureTracker::rejectWithH()
{
    if (forw_pts.size() >= 4)
    {
        ROS_DEBUG("HM ransac begins");
        TicToc t_f;
        vector<cv::Point2f> un_cur_pts(cur_pts.size()),
un_forw_pts(forw_pts.size());
        for (unsigned int i = 0; i < cur_pts.size(); i++)
        {
            Eigen::Vector3d tmp_p;
            m_camera->liftProjective(Eigen::Vector2d(cur_pts[i].x, cur_pts[i].y), tmp_p);
            tmp_p.x() = FOCAL_LENGTH * tmp_p.x() / tmp_p.z() + COL / 2.0;
            tmp_p.y() = FOCAL_LENGTH * tmp_p.y() / tmp_p.z() + ROW / 2.0;
            un_cur_pts[i] = cv::Point2f(tmp_p.x(), tmp_p.y());

            m_camera->liftProjective(Eigen::Vector2d(forw_pts[i].x, forw_pts[i].y), tmp_p);
            tmp_p.x() = FOCAL_LENGTH * tmp_p.x() / tmp_p.z() + COL / 2.0;
            tmp_p.y() = FOCAL_LENGTH * tmp_p.y() / tmp_p.z() + ROW / 2.0;
            un_forw_pts[i] = cv::Point2f(tmp_p.x(), tmp_p.y());
        }

        vector<uchar> status(un_cur_pts.size());
        //cv::findFundamentalMat(un_cur_pts, un_forw_pts, cv::FM_RANSAC, F_THRESHOLD, 0.99, status);
        //cv::findHomography(un_cur_pts, un_forw_pts, cv::RANSAC, 3, status);
        map<int, vector<cv::Point2f>> pid_un_cur_pts, pid_un_forw_pts;
        map<pair<int, int>, int> new_ids_to_old;
        for (uint i = 0; i < un_cur_pts.size(); i++)
        {
            pid_un_cur_pts[plane_ids[i]].push_back(un_cur_pts[i]);
            pid_un_forw_pts[plane_ids[i]].push_back(un_forw_pts[i]);
            new_ids_to_old[make_pair(plane_ids[i], pid_un_cur_pts[plane_ids[i]].size() - 1)] = i;
        }

        for (auto &imap : pid_un_cur_pts)
        {
            vector<uchar> pid_status;
            cv::findHomography(pid_un_cur_pts[imap.first], pid_un_forw_pts[imap.first], cv::RANSAC, H_THRESHOLD, pid_status, 2000, 0.99);
            for (int i = 0; i < pid_status.size(); i++)
                status[new_ids_to_old[make_pair(imap.first, i)]] = pid_status[i];
        }

        int size_a = cur_pts.size();
        reduceVector(prev_pts, status);
        reduceVector(cur_pts, status);
        reduceVector(forw_pts, status);
        reduceVector(cur_un_pts, status);
        reduceVector(ids, status);
        reduceVector(plane_ids, status);
        reduceVector(track_cnt, status);
        ROS_DEBUG("FM ransac: %d -> %lu: %f", size_a, forw_pts.size(), 1.0 * forw_pts.size() / size_a);
        ROS_DEBUG("FM ransac costs: %fms", t_f.toc());
    }
}

bool FeatureTracker::updateID(unsigned int i)
{
    if (i < ids.size())
    {
        if (ids[i] == -1)
            ids[i] = n_id++;
        return true;
    }
    else
        return false;
}

void
FeatureTracker::readIntrinsicParameter(const string &calib_file)
{
    ROS_INFO("reading parameter of camera %s", calib_file.c_str());
    m_camera = CameraFactory::instance()->generateCameraFromYamlFile(calib_file);
}

void FeatureTracker::showUndistortion(const string &name)
{
    cv::Mat undistortedImg(ROW + 600, COL + 600, CV_8UC1, cv::Scalar(0));
    vector<Eigen::Vector2d> distortedp, undistortedp;
    for (int i = 0; i < COL; i++)
        for (int j = 0; j < ROW; j++)
        {
            Eigen::Vector2d a(i, j);
            Eigen::Vector3d b;
            m_camera->liftProjective(a, b);
            distortedp.push_back(a);
            undistortedp.push_back(Eigen::Vector2d(b.x() / b.z(), b.y() / b.z()));
            //printf("%f,%f->%f,%f,%f\n)\n", a.x(), a.y(), b.x(), b.y(), b.z());
        }
    for (int i = 0; i < int(undistortedp.size()); i++)
    {
        cv::Mat pp(3, 1, CV_32FC1);
        pp.at<float>(0, 0) = undistortedp[i].x() * FOCAL_LENGTH + COL / 2;
        pp.at<float>(1, 0) = undistortedp[i].y() * FOCAL_LENGTH + ROW / 2;
        pp.at<float>(2, 0) = 1.0;
        //cout << trackerData[0].K << endl;
        //printf("%lf %lf\n", p.at<float>(1, 0), p.at<float>(0, 0));
        //printf("%lf %lf\n", pp.at<float>(1, 0), pp.at<float>(0, 0));
        if (pp.at<float>(1, 0) + 300 >= 0 && pp.at<float>(1, 0) + 300 < ROW + 600 && pp.at<float>(0, 0) + 300 >= 0 && pp.at<float>(0, 0) + 300 < COL + 600)
        {
            undistortedImg.at<uchar>(pp.at<float>(1, 0) + 300, pp.at<float>(0, 0) + 300) = cur_img.at<uchar>(distortedp[i].y(), distortedp[i].x());
        }
        else
        {
            //ROS_ERROR("(%f %f) -> (%f %f)", distortedp[i].y, distortedp[i].x, pp.at<float>(1, 0), pp.at<float>(0, 0));
        }
    }
    cv::imshow(name, undistortedImg);
    cv::waitKey(0);
}

void FeatureTracker::undistortedPoints()
{
    cur_un_pts.clear();
    cur_un_pts_map.clear();
    //cv::undistortPoints(cur_pts, un_pts, K, cv::Mat());
    for (unsigned int i = 0; i < cur_pts.size(); i++)
    {
        Eigen::Vector2d a(cur_pts[i].x, cur_pts[i].y);
        Eigen::Vector3d b;
        m_camera->liftProjective(a, b);
        cur_un_pts.push_back(cv::Point2f(b.x() / b.z(), b.y() / b.z()));
        cur_un_pts_map.insert(make_pair(ids[i], cv::Point2f(b.x() / b.z(), b.y() / b.z())));
        //printf("cur pts id %d %f %f", ids[i], cur_un_pts[i].x, cur_un_pts[i].y);
    }
    // calculate points velocity
    if (!prev_un_pts_map.empty())
    {
        double dt = cur_time - prev_time;
        pts_velocity.clear();
        for (unsigned int i = 0; i < cur_un_pts.size(); i++)
        {
            if (ids[i] != -1)
            {
                std::map<int, cv::Point2f>::iterator it;
                it = prev_un_pts_map.find(ids[i]);
                if (it != prev_un_pts_map.end())
                {
                    double v_x = (cur_un_pts[i].x - it->second.x) / dt;
                    double v_y = (cur_un_pts[i].y - it->second.y) / dt;
                    pts_velocity.push_back(cv::Point2f(v_x, v_y));
                }
                else
                    pts_velocity.push_back(cv::Point2f(0, 0));
            }
            else
            {
                pts_velocity.push_back(cv::Point2f(0, 0));
            }
        }
    }
    else
    {
        for (unsigned int i = 0; i < cur_pts.size(); i++)
        {
            pts_velocity.push_back(cv::Point2f(0, 0));
        }
    }
    prev_un_pts_map = cur_un_pts_map;
}

================================================
FILE: rpvio_feature_tracker/src/feature_tracker.h
================================================
#pragma once

#include <cstdio>
#include <iostream>
#include <queue>
#include <execinfo.h>
#include <csignal>

#include <opencv2/opencv.hpp>
#include <eigen3/Eigen/Dense>

#include "camodocal/camera_models/CameraFactory.h"
#include "camodocal/camera_models/CataCamera.h"
#include "camodocal/camera_models/PinholeCamera.h"

#include "parameters.h"
#include "tic_toc.h"

using namespace std;
using namespace camodocal;
using namespace Eigen;

bool inBorder(const cv::Point2f &pt);

void reduceVector(vector<cv::Point2f> &v, vector<uchar> status);
void reduceVector(vector<int> &v, vector<uchar> status);

class FeatureTracker
{
  public:
    FeatureTracker();

    void readImage(const cv::Mat &_img, const cv::Mat &_mask, double _cur_time);
    void setMask(const cv::Mat &_mask);
    void addPoints(const cv::Mat &_mask);
    bool updateID(unsigned int i);
    void readIntrinsicParameter(const string &calib_file);
    void showUndistortion(const string &name);
    void rejectWithH();
    void undistortedPoints();

    cv::Mat mask;
    cv::Mat fisheye_mask;
    cv::Mat prev_img, cur_img, forw_img;
    vector<cv::Point2f> n_pts;
    vector<cv::Point2f> prev_pts, cur_pts, forw_pts;
    vector<cv::Point2f> prev_un_pts, cur_un_pts;
    vector<cv::Point2f> pts_velocity;
    vector<int> ids;
    vector<int> plane_ids;
    vector<int> track_cnt;
    map<int, cv::Point2f> cur_un_pts_map;
    map<int, cv::Point2f> prev_un_pts_map;
    camodocal::CameraPtr m_camera;
    double cur_time;
    double prev_time;

    static int n_id;
};
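The per-id velocity matching done in `FeatureTracker::undistortedPoints()` can be sketched as a small standalone function. This is an illustrative Python mock, not code from the repo; the function name `compute_velocities` and its argument layout are our own:

```python
def compute_velocities(cur_pts, ids, prev_map, dt):
    """Mirror of the velocity step in undistortedPoints(): match features
    by id against the previous frame's (id -> point) map; features that
    are new (id == -1) or unmatched get zero velocity."""
    velocities = []
    for (x, y), fid in zip(cur_pts, ids):
        if fid != -1 and fid in prev_map:
            px, py = prev_map[fid]
            velocities.append(((x - px) / dt, (y - py) / dt))
        else:
            velocities.append((0.0, 0.0))
    return velocities

# Feature 7 moved +0.02 in x over 0.1 s; the second feature is new (id -1).
prev = {7: (0.10, 0.20)}
v = compute_velocities([(0.12, 0.20), (0.5, 0.5)], [7, -1], prev, 0.1)
```

As in the C++, velocities are computed on the normalized (undistorted) coordinates, so they are in normalized units per second, independent of the camera intrinsics.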
================================================
FILE: rpvio_feature_tracker/src/feature_tracker_node.cpp
================================================
#include <ros/ros.h>
#include <sensor_msgs/Image.h>
#include <sensor_msgs/image_encodings.h>
#include <sensor_msgs/PointCloud.h>
#include <sensor_msgs/Imu.h>
#include <std_msgs/Bool.h>
#include <cv_bridge/cv_bridge.h>
#include <opencv2/opencv.hpp>
#include <message_filters/subscriber.h>
#include <message_filters/time_synchronizer.h>

#include "feature_tracker.h"

#define SHOW_UNDISTORTION 0

vector<uchar> r_status;
vector<float> r_err;
queue<sensor_msgs::ImageConstPtr> img_buf;

ros::Publisher pub_img, pub_match;
ros::Publisher pub_restart;

FeatureTracker trackerData[NUM_OF_CAM];
double first_image_time;
int pub_count = 1;
bool first_image_flag = true;
double last_image_time = 0;
bool init_pub = 0;

void callback(const sensor_msgs::ImageConstPtr &img_msg, const sensor_msgs::ImageConstPtr &mask_msg)
{
    if (first_image_flag)
    {
        first_image_flag = false;
        first_image_time = img_msg->header.stamp.toSec();
        last_image_time = img_msg->header.stamp.toSec();
        return;
    }
    // detect unstable camera stream
    if (img_msg->header.stamp.toSec() - last_image_time > 1.0 || img_msg->header.stamp.toSec() < last_image_time)
    {
        ROS_WARN("image discontinue! reset the feature tracker!");
        first_image_flag = true;
        last_image_time = 0;
        pub_count = 1;
        std_msgs::Bool restart_flag;
        restart_flag.data = true;
        pub_restart.publish(restart_flag);
        return;
    }
    last_image_time = img_msg->header.stamp.toSec();
    // frequency control
    if (round(1.0 * pub_count / (img_msg->header.stamp.toSec() - first_image_time)) <= FREQ)
    {
        PUB_THIS_FRAME = true;
        // reset the frequency control
        if (abs(1.0 * pub_count / (img_msg->header.stamp.toSec() - first_image_time) - FREQ) < 0.01 * FREQ)
        {
            first_image_time = img_msg->header.stamp.toSec();
            pub_count = 0;
        }
    }
    else
        PUB_THIS_FRAME = false;

    cv_bridge::CvImageConstPtr ptr, mask_ptr;
    if (img_msg->encoding == "8UC1")
    {
        sensor_msgs::Image img;
        img.header = img_msg->header;
        img.height = img_msg->height;
        img.width = img_msg->width;
        img.is_bigendian = img_msg->is_bigendian;
        img.step = img_msg->step;
        img.data = img_msg->data;
        img.encoding = "mono8";
        ptr = cv_bridge::toCvCopy(img, sensor_msgs::image_encodings::MONO8);
    }
    else
        ptr =
cv_bridge::toCvCopy(img_msg, sensor_msgs::image_encodings::MONO8);

    mask_ptr = cv_bridge::toCvCopy(mask_msg, sensor_msgs::image_encodings::MONO8);

    cv::Mat show_img = ptr->image;
    TicToc t_r;
    for (int i = 0; i < NUM_OF_CAM; i++)
    {
        ROS_DEBUG("processing camera %d", i);
        if (i != 1 || !STEREO_TRACK)
            trackerData[i].readImage(ptr->image.rowRange(ROW * i, ROW * (i + 1)), mask_ptr->image.rowRange(ROW * i, ROW * (i + 1)), img_msg->header.stamp.toSec());
        else
        {
            if (EQUALIZE)
            {
                cv::Ptr<cv::CLAHE> clahe = cv::createCLAHE();
                clahe->apply(ptr->image.rowRange(ROW * i, ROW * (i + 1)), trackerData[i].cur_img);
            }
            else
                trackerData[i].cur_img = ptr->image.rowRange(ROW * i, ROW * (i + 1));
        }

#if SHOW_UNDISTORTION
        trackerData[i].showUndistortion("undistrotion_" + std::to_string(i));
#endif
    }

    for (unsigned int i = 0;; i++)
    {
        bool completed = false;
        for (int j = 0; j < NUM_OF_CAM; j++)
            if (j != 1 || !STEREO_TRACK)
                completed |= trackerData[j].updateID(i);
        if (!completed)
            break;
    }

    if (PUB_THIS_FRAME)
    {
        pub_count++;
        sensor_msgs::PointCloudPtr feature_points(new sensor_msgs::PointCloud);
        sensor_msgs::ChannelFloat32 id_of_point;
        sensor_msgs::ChannelFloat32 plane_id_of_point;
        sensor_msgs::ChannelFloat32 u_of_point;
        sensor_msgs::ChannelFloat32 v_of_point;
        sensor_msgs::ChannelFloat32 velocity_x_of_point;
        sensor_msgs::ChannelFloat32 velocity_y_of_point;

        feature_points->header = img_msg->header;
        feature_points->header.frame_id = "world";

        vector<set<int>> hash_ids(NUM_OF_CAM);
        for (int i = 0; i < NUM_OF_CAM; i++)
        {
            auto &un_pts = trackerData[i].cur_un_pts;
            auto &cur_pts = trackerData[i].cur_pts;
            auto &ids = trackerData[i].ids;
            auto &plane_ids = trackerData[i].plane_ids;
            auto &pts_velocity = trackerData[i].pts_velocity;
            for (unsigned int j = 0; j < ids.size(); j++)
            {
                if (trackerData[i].track_cnt[j] > 1)
                {
                    int p_id = ids[j];
                    hash_ids[i].insert(p_id);
                    geometry_msgs::Point32 p;
                    p.x = un_pts[j].x;
                    p.y = un_pts[j].y;
                    p.z = 1;

                    feature_points->points.push_back(p);
                    id_of_point.values.push_back(p_id * NUM_OF_CAM + i);
                    plane_id_of_point.values.push_back(plane_ids[j]);
                    u_of_point.values.push_back(cur_pts[j].x);
                    v_of_point.values.push_back(cur_pts[j].y);
                    velocity_x_of_point.values.push_back(pts_velocity[j].x);
                    velocity_y_of_point.values.push_back(pts_velocity[j].y);
                }
            }
        }
        feature_points->channels.push_back(id_of_point);
        feature_points->channels.push_back(plane_id_of_point);
        feature_points->channels.push_back(u_of_point);
        feature_points->channels.push_back(v_of_point);
        feature_points->channels.push_back(velocity_x_of_point);
        feature_points->channels.push_back(velocity_y_of_point);
        ROS_DEBUG("publish %f, at %f", feature_points->header.stamp.toSec(), ros::Time::now().toSec());
        // skip the first image, since there is no optical flow speed on the first image
        if (!init_pub)
        {
            init_pub = 1;
        }
        else
            pub_img.publish(feature_points);

        if (SHOW_TRACK)
        {
            ptr = cv_bridge::cvtColor(ptr, sensor_msgs::image_encodings::BGR8);
            //cv::Mat stereo_img(ROW * NUM_OF_CAM, COL, CV_8UC3);
            cv::Mat stereo_img = ptr->image;
            cv::Mat mask_img = mask_ptr->image;

            std::vector<int> pids = trackerData[0].plane_ids;
            std::set<int> unique_pids(pids.begin(), pids.end());
            std::map<int, int> plane_counts;
            for (auto &pid : unique_pids)
                plane_counts[pid] = 0;
            for (auto &pid : pids)
                plane_counts[pid]++;

            int largest_count = 0, largest_pid;
            for (auto &plane_count : plane_counts)
            {
                if (plane_count.second > largest_count)
                {
                    largest_count = plane_count.second;
                    largest_pid = plane_count.first;
                }
            }

            cv::Mat mask = mask_img == largest_pid;
            cv::Mat mask_viz(ROW, COL, CV_8UC3, cv::Scalar(0, 0, 0));
            mask_viz.setTo(cv::Scalar(0, 255, 0), mask);

            for (int i = 0; i < NUM_OF_CAM; i++)
            {
                cv::Mat tmp_img = stereo_img.rowRange(i * ROW, (i + 1) * ROW);
                cv::cvtColor(show_img, tmp_img, CV_GRAY2RGB);
                cv::addWeighted(tmp_img, 1.0, mask_viz, 0.3, 0.0, tmp_img);

                for (unsigned int j = 0; j < trackerData[i].cur_pts.size(); j++)
                {
                    double len = std::min(1.0, 1.0 * trackerData[i].track_cnt[j] / WINDOW_SIZE);
                    cv::circle(tmp_img, trackerData[i].cur_pts[j], 2, cv::Scalar(255 * (1 - len), 0, 255 * len), 2);
                    //draw speed line
                    /*
                    Vector2d tmp_cur_un_pts (trackerData[i].cur_un_pts[j].x, trackerData[i].cur_un_pts[j].y);
                    Vector2d tmp_pts_velocity (trackerData[i].pts_velocity[j].x, trackerData[i].pts_velocity[j].y);
                    Vector3d tmp_prev_un_pts;
                    tmp_prev_un_pts.head(2) = tmp_cur_un_pts - 0.10 * tmp_pts_velocity;
                    tmp_prev_un_pts.z() = 1;
                    Vector2d tmp_prev_uv;
                    trackerData[i].m_camera->spaceToPlane(tmp_prev_un_pts, tmp_prev_uv);
                    cv::line(tmp_img, trackerData[i].cur_pts[j], cv::Point2f(tmp_prev_uv.x(), tmp_prev_uv.y()), cv::Scalar(255, 0, 0), 1, 8, 0);
                    */
                    //char name[10];
                    //sprintf(name, "%d", trackerData[i].plane_ids[j]);
                    //cv::putText(tmp_img, name, trackerData[i].cur_pts[j], cv::FONT_HERSHEY_SIMPLEX, 0.5, cv::Scalar(0, 0, 0));
                }
            }
            //cv::imshow("vis", stereo_img);
            //cv::waitKey(5);
            pub_match.publish(ptr->toImageMsg());
        }
    }
    ROS_INFO("whole feature tracker processing costs: %f", t_r.toc());
}

int main(int argc, char **argv)
{
    ros::init(argc, argv, "rpvio_feature_tracker");
    ros::NodeHandle n("~");
    ros::console::set_logger_level(ROSCONSOLE_DEFAULT_NAME, ros::console::levels::Info);
    readParameters(n);

    for (int i = 0; i < NUM_OF_CAM; i++)
        trackerData[i].readIntrinsicParameter(CAM_NAMES[i]);

    if (FISHEYE)
    {
        for (int i = 0; i < NUM_OF_CAM; i++)
        {
            trackerData[i].fisheye_mask = cv::imread(FISHEYE_MASK, 0);
            if (!trackerData[i].fisheye_mask.data)
            {
                ROS_INFO("load mask fail");
                ROS_BREAK();
            }
            else
                ROS_INFO("load mask success");
        }
    }

    //ros::Subscriber sub_img = n.subscribe(IMAGE_TOPIC, 100, img_callback);
    message_filters::Subscriber<sensor_msgs::Image> image_sub(n, IMAGE_TOPIC, 100);
    message_filters::Subscriber<sensor_msgs::Image> mask_image_sub(n, MASK_TOPIC, 100);
    message_filters::TimeSynchronizer<sensor_msgs::Image, sensor_msgs::Image> sync(image_sub, mask_image_sub, 10);
    sync.registerCallback(boost::bind(&callback, _1, _2));

    pub_img = n.advertise<sensor_msgs::PointCloud>("feature", 1000);
    pub_match = n.advertise<sensor_msgs::Image>("feature_img", 1000);
    pub_restart = n.advertise<std_msgs::Bool>("restart", 1000);
    /*
    if (SHOW_TRACK)
        cv::namedWindow("vis", cv::WINDOW_NORMAL);
    */
    ros::spin();
    return 0;
}
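The frequency-control branch in `callback()` above throttles publishing so the average published rate stays near `FREQ`, and periodically restarts the averaging window so rounding drift cannot accumulate. A minimal Python sketch of that logic (hypothetical helper `should_publish`; the constants and thresholds mirror the C++ above):

```python
def should_publish(pub_count, first_time, cur_time, freq):
    """Mirror of the node's frequency control: publish while the average
    rate achieved so far (pub_count / elapsed) is at or below `freq`.
    Returns (publish, reset_window); reset_window signals that the
    reference time and counter should be restarted, which the C++ does
    once the achieved rate is within 1% of the target."""
    rate = 1.0 * pub_count / (cur_time - first_time)
    if round(rate) <= freq:
        return True, abs(rate - freq) < 0.01 * freq
    return False, False

# Camera at 100 Hz, target 10 Hz: one frame already published 0.05 s
# into the window means a 20 Hz average, so this frame is skipped.
pub, reset = should_publish(1, 0.0, 0.05, 10)
```

Skipped frames are still tracked (optical flow runs every frame); only the downstream publication to the estimator is throttled.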
================================================
FILE: rpvio_feature_tracker/src/parameters.cpp
================================================
#include "parameters.h"

std::string IMAGE_TOPIC;
std::string MASK_TOPIC;
std::string IMU_TOPIC;
std::vector<std::string> CAM_NAMES;
std::string FISHEYE_MASK;
int MAX_CNT;
int MIN_DIST;
int WINDOW_SIZE;
int FREQ;
double H_THRESHOLD;
int SHOW_TRACK;
int STEREO_TRACK;
int EQUALIZE;
int ROW;
int COL;
int FOCAL_LENGTH;
int FISHEYE;
bool PUB_THIS_FRAME;

template <typename T>
T readParam(ros::NodeHandle &n, std::string name)
{
    T ans;
    if (n.getParam(name, ans))
    {
        ROS_INFO_STREAM("Loaded " << name << ": " << ans);
    }
    else
    {
        ROS_ERROR_STREAM("Failed to load " << name);
        n.shutdown();
    }
    return ans;
}

void readParameters(ros::NodeHandle &n)
{
    std::string config_file;
    config_file = readParam<std::string>(n, "config_file");
    cv::FileStorage fsSettings(config_file, cv::FileStorage::READ);
    if (!fsSettings.isOpened())
    {
        std::cerr << "ERROR: Wrong path to settings" << std::endl;
    }
    std::string RPVIO_FOLDER_PATH = readParam<std::string>(n, "rpvio_folder");

    fsSettings["image_topic"] >> IMAGE_TOPIC;
    fsSettings["mask_topic"] >> MASK_TOPIC;
    fsSettings["imu_topic"] >> IMU_TOPIC;
    MAX_CNT = fsSettings["max_cnt"];
    MIN_DIST = fsSettings["min_dist"];
    ROW = fsSettings["image_height"];
    COL = fsSettings["image_width"];
    FREQ = fsSettings["freq"];
    H_THRESHOLD = fsSettings["H_threshold"];
    SHOW_TRACK = fsSettings["show_track"];
    EQUALIZE = fsSettings["equalize"];
    FISHEYE = fsSettings["fisheye"];
    if (FISHEYE == 1)
        FISHEYE_MASK = RPVIO_FOLDER_PATH + "config/fisheye_mask.jpg";
    CAM_NAMES.push_back(config_file);

    WINDOW_SIZE = 20;
    STEREO_TRACK = false;
    FOCAL_LENGTH = 460;
    PUB_THIS_FRAME = false;

    if (FREQ == 0)
        FREQ = 100;

    fsSettings.release();
}

================================================
FILE: rpvio_feature_tracker/src/parameters.h
================================================
#pragma once

#include <ros/ros.h>
#include <opencv2/highgui/highgui.hpp>

extern int ROW;
extern int COL;
extern int FOCAL_LENGTH;
const int NUM_OF_CAM = 1;

extern std::string
IMAGE_TOPIC;
extern std::string MASK_TOPIC;
extern std::string IMU_TOPIC;
extern std::string FISHEYE_MASK;
extern std::vector<std::string> CAM_NAMES;
extern int MAX_CNT;
extern int MIN_DIST;
extern int WINDOW_SIZE;
extern int FREQ;
extern double H_THRESHOLD;
extern int SHOW_TRACK;
extern int STEREO_TRACK;
extern int EQUALIZE;
extern int FISHEYE;
extern bool PUB_THIS_FRAME;

void readParameters(ros::NodeHandle &n);

================================================
FILE: rpvio_feature_tracker/src/tic_toc.h
================================================
#pragma once

#include <ctime>
#include <cstdlib>
#include <chrono>

class TicToc
{
  public:
    TicToc()
    {
        tic();
    }

    void tic()
    {
        start = std::chrono::system_clock::now();
    }

    double toc()
    {
        end = std::chrono::system_clock::now();
        std::chrono::duration<double> elapsed_seconds = end - start;
        return elapsed_seconds.count() * 1000;
    }

  private:
    std::chrono::time_point<std::chrono::system_clock> start, end;
};

================================================
FILE: scripts/convert_vins_to_tum.py
================================================
import pandas as pd
import argparse

def main():
    parser = argparse.ArgumentParser(description='Converts the csv output of VINS-Mono to TUM RGB-D format txt')
    parser.add_argument('input_file_path', help='Path to the csv file')
    parser.add_argument('output_file_path', help='Path to the output txt file')
    args = parser.parse_args()

    data = pd.read_csv(args.input_file_path, usecols=range(8), names=list('txyzwijk'))
    with open(args.output_file_path, 'w') as f:
        for i in range(len(data['t'])):
            f.write(f"{data['t'][i]*1e-9} {data['x'][i]} {data['y'][i]} {data['z'][i]} {data['i'][i]} {data['j'][i]} {data['k'][i]} {data['w'][i]}\n")

if __name__ == '__main__':
    main()

================================================
FILE: scripts/run_advio_12.sh
================================================
#!
/bin/bash

dataset=$1
rm -r $dataset/results/advio
mkdir -p $dataset/results/advio
cd $dataset/results/advio

sed -i "s@~@$HOME@g" ~/catkin_ws/src/rp-vio/config/advio_12_config.yaml

echo -e "######advio-12######\n" > report.txt
for((i = 1; i <= 1; i++))
do
    roslaunch rpvio_estimator advio_12.launch bagfile_path:=$dataset/12-mask.bag |& tee -a run_log.txt
    cp ~/output/rpvio_result_no_loop.csv ./rpvio_est.csv
    python ~/catkin_ws/src/rp-vio/scripts/convert_vins_to_tum.py rpvio_est.csv est_traj_$i.txt
    rm rpvio_est.csv
    evo_ape tum $dataset/groundtruth.txt est_traj_$i.txt --align --save_plot est_traj_$i.pdf |& tee -a report.txt
done

================================================
FILE: scripts/run_ol_market1.sh
================================================
#! /bin/bash

dataset=$1
rm -r $dataset/results/ol
mkdir -p $dataset/results/ol
cd $dataset/results/ol

sed -i "s@~@$HOME@g" ~/catkin_ws/src/rp-vio/config/ol_market1_config.yaml

echo -e "######ol-market1######\n" > report.txt
for((i = 1; i <= 1; i++))
do
    roslaunch rpvio_estimator ol_market1.launch bagfile_path:=$dataset/market1-mask.bag
    cp ~/output/rpvio_result_no_loop.csv ./rpvio_est.csv
    python ~/catkin_ws/src/rp-vio/scripts/convert_vins_to_tum.py rpvio_est.csv est_traj_$i.txt
    rm rpvio_est.csv
    evo_ape tum $dataset/groundtruth.txt est_traj_$i.txt --align --save_plot est_traj_$i.pdf |& tee -a report.txt
done

================================================
FILE: scripts/run_rpvio_sim.sh
================================================
dataset=$1
rm -r $dataset/results/rpvio-sim/
mkdir -p $dataset/results/rpvio-sim/
cd $dataset/results/rpvio-sim/

sed -i "s@~@$HOME@g" ~/catkin_ws/src/rp-vio/config/rpvio_sim_config.yaml

########## static run #############
mkdir static
echo -e "######static######\n" > report.txt
for((i = 1; i <= 1; i++))
do
    roslaunch rpvio_estimator rpvio_sim.launch bagfile_path:=$dataset/static/static.bag
    cp ~/output/rpvio_result_no_loop.csv ./rpvio_est.csv
    python \
~/catkin_ws/src/rp-vio/scripts/convert_vins_to_tum.py rpvio_est.csv static/est_traj_$i.txt
    rm rpvio_est.csv
    evo_ape tum $dataset/static/groundtruth.txt static/est_traj_$i.txt --align --save_plot static/est_traj_$i.pdf |& tee -a report.txt
done

########## c1 run #############
mkdir c1
echo -e "######c1######\n" > report.txt
for((i = 1; i <= 1; i++))
do
    roslaunch rpvio_estimator rpvio_sim.launch bagfile_path:=$dataset/c1/c1.bag
    cp ~/output/rpvio_result_no_loop.csv ./rpvio_est.csv
    python ~/catkin_ws/src/rp-vio/scripts/convert_vins_to_tum.py rpvio_est.csv c1/est_traj_$i.txt
    rm rpvio_est.csv
    evo_ape tum $dataset/c1/groundtruth.txt c1/est_traj_$i.txt --align --save_plot c1/est_traj_$i.pdf |& tee -a report.txt
done

########### c2 run #############
mkdir c2
echo -e "######c2######\n" > report.txt
for((i = 1; i <= 1; i++))
do
    roslaunch rpvio_estimator rpvio_sim.launch bagfile_path:=$dataset/c2/c2.bag
    cp ~/output/rpvio_result_no_loop.csv ./rpvio_est.csv
    python ~/catkin_ws/src/rp-vio/scripts/convert_vins_to_tum.py rpvio_est.csv c2/est_traj_$i.txt
    rm rpvio_est.csv
    evo_ape tum $dataset/c2/groundtruth.txt c2/est_traj_$i.txt --align --save_plot c2/est_traj_$i.pdf |& tee -a report.txt
done

########### c4 run #############
mkdir c4
echo -e "######c4######\n" > report.txt
for((i = 1; i <= 1; i++))
do
    roslaunch rpvio_estimator rpvio_sim.launch bagfile_path:=$dataset/c4/c4.bag
    cp ~/output/rpvio_result_no_loop.csv ./rpvio_est.csv
    python ~/catkin_ws/src/rp-vio/scripts/convert_vins_to_tum.py rpvio_est.csv c4/est_traj_$i.txt
    rm rpvio_est.csv
    evo_ape tum $dataset/c4/groundtruth.txt c4/est_traj_$i.txt --align --save_plot c4/est_traj_$i.pdf |& tee -a report.txt
done

############ c6 run #############
mkdir c6
echo -e "######c6######\n" > report.txt
for((i = 1; i <= 1; i++))
do
    roslaunch rpvio_estimator rpvio_sim.launch bagfile_path:=$dataset/c6/c6.bag
    cp ~/output/rpvio_result_no_loop.csv ./rpvio_est.csv
    python ~/catkin_ws/src/rp-vio/scripts/convert_vins_to_tum.py rpvio_est.csv c6/est_traj_$i.txt
    rm rpvio_est.csv
    evo_ape tum $dataset/c6/groundtruth.txt c6/est_traj_$i.txt --align --save_plot c6/est_traj_$i.pdf |& tee -a report.txt
done

############ c8 run #############
mkdir c8
echo -e "######c8######\n" > report.txt
for((i = 1; i <= 1; i++))
do
    roslaunch rpvio_estimator rpvio_sim.launch bagfile_path:=$dataset/c8/c8.bag
    cp ~/output/rpvio_result_no_loop.csv ./rpvio_est.csv
    python ~/catkin_ws/src/rp-vio/scripts/convert_vins_to_tum.py rpvio_est.csv c8/est_traj_$i.txt
    rm rpvio_est.csv
    evo_ape tum $dataset/c8/groundtruth.txt c8/est_traj_$i.txt --align --save_plot c8/est_traj_$i.pdf |& tee -a report.txt
done
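The conversion that `scripts/convert_vins_to_tum.py` performs on each row (nanosecond timestamps to seconds, and the quaternion reordered from VINS's `w x y z` to TUM's `qx qy qz qw`) can be shown without pandas. This is an illustrative sketch; the function name `vins_row_to_tum` is ours, not part of the repo:

```python
def vins_row_to_tum(row):
    """Convert one VINS-Mono csv row (t_ns, x, y, z, qw, qx, qy, qz, ...)
    into a TUM RGB-D trajectory line 't x y z qx qy qz qw', with the
    timestamp converted from nanoseconds to seconds and the quaternion
    scalar part moved to the end."""
    t, x, y, z, qw, qx, qy, qz = row[:8]
    return f"{t * 1e-9} {x} {y} {z} {qx} {qy} {qz} {qw}"

# Identity orientation: qw = 1, qx = qy = qz = 0 in the VINS row.
line = vins_row_to_tum([1403636579763555584, 1.0, 2.0, 3.0, 1.0, 0.0, 0.0, 0.0])
```

The resulting lines are what `evo_ape tum` in the run scripts above expects as the estimated trajectory.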