[
  {
    "path": "LICENCE",
    "content": "                    GNU GENERAL PUBLIC LICENSE\n                       Version 3, 29 June 2007\n\n Copyright (C) 2007 Free Software Foundation, Inc. <http://fsf.org/>\n Everyone is permitted to copy and distribute verbatim copies\n of this license document, but changing it is not allowed.\n\n                            Preamble\n\n  The GNU General Public License is a free, copyleft license for\nsoftware and other kinds of works.\n\n  The licenses for most software and other practical works are designed\nto take away your freedom to share and change the works.  By contrast,\nthe GNU General Public License is intended to guarantee your freedom to\nshare and change all versions of a program--to make sure it remains free\nsoftware for all its users.  We, the Free Software Foundation, use the\nGNU General Public License for most of our software; it applies also to\nany other work released this way by its authors.  You can apply it to\nyour programs, too.\n\n  When we speak of free software, we are referring to freedom, not\nprice.  Our General Public Licenses are designed to make sure that you\nhave the freedom to distribute copies of free software (and charge for\nthem if you wish), that you receive source code or can get it if you\nwant it, that you can change the software or use pieces of it in new\nfree programs, and that you know you can do these things.\n\n  To protect your rights, we need to prevent others from denying you\nthese rights or asking you to surrender the rights.  Therefore, you have\ncertain responsibilities if you distribute copies of the software, or if\nyou modify it: responsibilities to respect the freedom of others.\n\n  For example, if you distribute copies of such a program, whether\ngratis or for a fee, you must pass on to the recipients the same\nfreedoms that you received.  You must make sure that they, too, receive\nor can get the source code.  
And you must show them these terms so they\nknow their rights.\n\n  Developers that use the GNU GPL protect your rights with two steps:\n(1) assert copyright on the software, and (2) offer you this License\ngiving you legal permission to copy, distribute and/or modify it.\n\n  For the developers' and authors' protection, the GPL clearly explains\nthat there is no warranty for this free software.  For both users' and\nauthors' sake, the GPL requires that modified versions be marked as\nchanged, so that their problems will not be attributed erroneously to\nauthors of previous versions.\n\n  Some devices are designed to deny users access to install or run\nmodified versions of the software inside them, although the manufacturer\ncan do so.  This is fundamentally incompatible with the aim of\nprotecting users' freedom to change the software.  The systematic\npattern of such abuse occurs in the area of products for individuals to\nuse, which is precisely where it is most unacceptable.  Therefore, we\nhave designed this version of the GPL to prohibit the practice for those\nproducts.  If such problems arise substantially in other domains, we\nstand ready to extend this provision to those domains in future versions\nof the GPL, as needed to protect the freedom of users.\n\n  Finally, every program is threatened constantly by software patents.\nStates should not allow patents to restrict development and use of\nsoftware on general-purpose computers, but in those that do, we wish to\navoid the special danger that patents applied to a free program could\nmake it effectively proprietary.  To prevent this, the GPL assures that\npatents cannot be used to render the program non-free.\n\n  The precise terms and conditions for copying, distribution and\nmodification follow.\n\n                       TERMS AND CONDITIONS\n\n  0. 
Definitions.\n\n  \"This License\" refers to version 3 of the GNU General Public License.\n\n  \"Copyright\" also means copyright-like laws that apply to other kinds of\nworks, such as semiconductor masks.\n\n  \"The Program\" refers to any copyrightable work licensed under this\nLicense.  Each licensee is addressed as \"you\".  \"Licensees\" and\n\"recipients\" may be individuals or organizations.\n\n  To \"modify\" a work means to copy from or adapt all or part of the work\nin a fashion requiring copyright permission, other than the making of an\nexact copy.  The resulting work is called a \"modified version\" of the\nearlier work or a work \"based on\" the earlier work.\n\n  A \"covered work\" means either the unmodified Program or a work based\non the Program.\n\n  To \"propagate\" a work means to do anything with it that, without\npermission, would make you directly or secondarily liable for\ninfringement under applicable copyright law, except executing it on a\ncomputer or modifying a private copy.  Propagation includes copying,\ndistribution (with or without modification), making available to the\npublic, and in some countries other activities as well.\n\n  To \"convey\" a work means any kind of propagation that enables other\nparties to make or receive copies.  Mere interaction with a user through\na computer network, with no transfer of a copy, is not conveying.\n\n  An interactive user interface displays \"Appropriate Legal Notices\"\nto the extent that it includes a convenient and prominently visible\nfeature that (1) displays an appropriate copyright notice, and (2)\ntells the user that there is no warranty for the work (except to the\nextent that warranties are provided), that licensees may convey the\nwork under this License, and how to view a copy of this License.  If\nthe interface presents a list of user commands or options, such as a\nmenu, a prominent item in the list meets this criterion.\n\n  1. 
Source Code.\n\n  The \"source code\" for a work means the preferred form of the work\nfor making modifications to it.  \"Object code\" means any non-source\nform of a work.\n\n  A \"Standard Interface\" means an interface that either is an official\nstandard defined by a recognized standards body, or, in the case of\ninterfaces specified for a particular programming language, one that\nis widely used among developers working in that language.\n\n  The \"System Libraries\" of an executable work include anything, other\nthan the work as a whole, that (a) is included in the normal form of\npackaging a Major Component, but which is not part of that Major\nComponent, and (b) serves only to enable use of the work with that\nMajor Component, or to implement a Standard Interface for which an\nimplementation is available to the public in source code form.  A\n\"Major Component\", in this context, means a major essential component\n(kernel, window system, and so on) of the specific operating system\n(if any) on which the executable work runs, or a compiler used to\nproduce the work, or an object code interpreter used to run it.\n\n  The \"Corresponding Source\" for a work in object code form means all\nthe source code needed to generate, install, and (for an executable\nwork) run the object code and to modify the work, including scripts to\ncontrol those activities.  However, it does not include the work's\nSystem Libraries, or general-purpose tools or generally available free\nprograms which are used unmodified in performing those activities but\nwhich are not part of the work.  
For example, Corresponding Source\nincludes interface definition files associated with source files for\nthe work, and the source code for shared libraries and dynamically\nlinked subprograms that the work is specifically designed to require,\nsuch as by intimate data communication or control flow between those\nsubprograms and other parts of the work.\n\n  The Corresponding Source need not include anything that users\ncan regenerate automatically from other parts of the Corresponding\nSource.\n\n  The Corresponding Source for a work in source code form is that\nsame work.\n\n  2. Basic Permissions.\n\n  All rights granted under this License are granted for the term of\ncopyright on the Program, and are irrevocable provided the stated\nconditions are met.  This License explicitly affirms your unlimited\npermission to run the unmodified Program.  The output from running a\ncovered work is covered by this License only if the output, given its\ncontent, constitutes a covered work.  This License acknowledges your\nrights of fair use or other equivalent, as provided by copyright law.\n\n  You may make, run and propagate covered works that you do not\nconvey, without conditions so long as your license otherwise remains\nin force.  You may convey covered works to others for the sole purpose\nof having them make modifications exclusively for you, or provide you\nwith facilities for running those works, provided that you comply with\nthe terms of this License in conveying all material for which you do\nnot control copyright.  Those thus making or running the covered works\nfor you must do so exclusively on your behalf, under your direction\nand control, on terms that prohibit them from making any copies of\nyour copyrighted material outside their relationship with you.\n\n  Conveying under any other circumstances is permitted solely under\nthe conditions stated below.  Sublicensing is not allowed; section 10\nmakes it unnecessary.\n\n  3. 
Protecting Users' Legal Rights From Anti-Circumvention Law.\n\n  No covered work shall be deemed part of an effective technological\nmeasure under any applicable law fulfilling obligations under article\n11 of the WIPO copyright treaty adopted on 20 December 1996, or\nsimilar laws prohibiting or restricting circumvention of such\nmeasures.\n\n  When you convey a covered work, you waive any legal power to forbid\ncircumvention of technological measures to the extent such circumvention\nis effected by exercising rights under this License with respect to\nthe covered work, and you disclaim any intention to limit operation or\nmodification of the work as a means of enforcing, against the work's\nusers, your or third parties' legal rights to forbid circumvention of\ntechnological measures.\n\n  4. Conveying Verbatim Copies.\n\n  You may convey verbatim copies of the Program's source code as you\nreceive it, in any medium, provided that you conspicuously and\nappropriately publish on each copy an appropriate copyright notice;\nkeep intact all notices stating that this License and any\nnon-permissive terms added in accord with section 7 apply to the code;\nkeep intact all notices of the absence of any warranty; and give all\nrecipients a copy of this License along with the Program.\n\n  You may charge any price or no price for each copy that you convey,\nand you may offer support or warranty protection for a fee.\n\n  5. Conveying Modified Source Versions.\n\n  You may convey a work based on the Program, or the modifications to\nproduce it from the Program, in the form of source code under the\nterms of section 4, provided that you also meet all of these conditions:\n\n    a) The work must carry prominent notices stating that you modified\n    it, and giving a relevant date.\n\n    b) The work must carry prominent notices stating that it is\n    released under this License and any conditions added under section\n    7.  
This requirement modifies the requirement in section 4 to\n    \"keep intact all notices\".\n\n    c) You must license the entire work, as a whole, under this\n    License to anyone who comes into possession of a copy.  This\n    License will therefore apply, along with any applicable section 7\n    additional terms, to the whole of the work, and all its parts,\n    regardless of how they are packaged.  This License gives no\n    permission to license the work in any other way, but it does not\n    invalidate such permission if you have separately received it.\n\n    d) If the work has interactive user interfaces, each must display\n    Appropriate Legal Notices; however, if the Program has interactive\n    interfaces that do not display Appropriate Legal Notices, your\n    work need not make them do so.\n\n  A compilation of a covered work with other separate and independent\nworks, which are not by their nature extensions of the covered work,\nand which are not combined with it such as to form a larger program,\nin or on a volume of a storage or distribution medium, is called an\n\"aggregate\" if the compilation and its resulting copyright are not\nused to limit the access or legal rights of the compilation's users\nbeyond what the individual works permit.  Inclusion of a covered work\nin an aggregate does not cause this License to apply to the other\nparts of the aggregate.\n\n  6. 
Conveying Non-Source Forms.\n\n  You may convey a covered work in object code form under the terms\nof sections 4 and 5, provided that you also convey the\nmachine-readable Corresponding Source under the terms of this License,\nin one of these ways:\n\n    a) Convey the object code in, or embodied in, a physical product\n    (including a physical distribution medium), accompanied by the\n    Corresponding Source fixed on a durable physical medium\n    customarily used for software interchange.\n\n    b) Convey the object code in, or embodied in, a physical product\n    (including a physical distribution medium), accompanied by a\n    written offer, valid for at least three years and valid for as\n    long as you offer spare parts or customer support for that product\n    model, to give anyone who possesses the object code either (1) a\n    copy of the Corresponding Source for all the software in the\n    product that is covered by this License, on a durable physical\n    medium customarily used for software interchange, for a price no\n    more than your reasonable cost of physically performing this\n    conveying of source, or (2) access to copy the\n    Corresponding Source from a network server at no charge.\n\n    c) Convey individual copies of the object code with a copy of the\n    written offer to provide the Corresponding Source.  This\n    alternative is allowed only occasionally and noncommercially, and\n    only if you received the object code with such an offer, in accord\n    with subsection 6b.\n\n    d) Convey the object code by offering access from a designated\n    place (gratis or for a charge), and offer equivalent access to the\n    Corresponding Source in the same way through the same place at no\n    further charge.  You need not require recipients to copy the\n    Corresponding Source along with the object code.  
If the place to\n    copy the object code is a network server, the Corresponding Source\n    may be on a different server (operated by you or a third party)\n    that supports equivalent copying facilities, provided you maintain\n    clear directions next to the object code saying where to find the\n    Corresponding Source.  Regardless of what server hosts the\n    Corresponding Source, you remain obligated to ensure that it is\n    available for as long as needed to satisfy these requirements.\n\n    e) Convey the object code using peer-to-peer transmission, provided\n    you inform other peers where the object code and Corresponding\n    Source of the work are being offered to the general public at no\n    charge under subsection 6d.\n\n  A separable portion of the object code, whose source code is excluded\nfrom the Corresponding Source as a System Library, need not be\nincluded in conveying the object code work.\n\n  A \"User Product\" is either (1) a \"consumer product\", which means any\ntangible personal property which is normally used for personal, family,\nor household purposes, or (2) anything designed or sold for incorporation\ninto a dwelling.  In determining whether a product is a consumer product,\ndoubtful cases shall be resolved in favor of coverage.  For a particular\nproduct received by a particular user, \"normally used\" refers to a\ntypical or common use of that class of product, regardless of the status\nof the particular user or of the way in which the particular user\nactually uses, or expects or is expected to use, the product.  
A product\nis a consumer product regardless of whether the product has substantial\ncommercial, industrial or non-consumer uses, unless such uses represent\nthe only significant mode of use of the product.\n\n  \"Installation Information\" for a User Product means any methods,\nprocedures, authorization keys, or other information required to install\nand execute modified versions of a covered work in that User Product from\na modified version of its Corresponding Source.  The information must\nsuffice to ensure that the continued functioning of the modified object\ncode is in no case prevented or interfered with solely because\nmodification has been made.\n\n  If you convey an object code work under this section in, or with, or\nspecifically for use in, a User Product, and the conveying occurs as\npart of a transaction in which the right of possession and use of the\nUser Product is transferred to the recipient in perpetuity or for a\nfixed term (regardless of how the transaction is characterized), the\nCorresponding Source conveyed under this section must be accompanied\nby the Installation Information.  But this requirement does not apply\nif neither you nor any third party retains the ability to install\nmodified object code on the User Product (for example, the work has\nbeen installed in ROM).\n\n  The requirement to provide Installation Information does not include a\nrequirement to continue to provide support service, warranty, or updates\nfor a work that has been modified or installed by the recipient, or for\nthe User Product in which it has been modified or installed.  
Access to a\nnetwork may be denied when the modification itself materially and\nadversely affects the operation of the network or violates the rules and\nprotocols for communication across the network.\n\n  Corresponding Source conveyed, and Installation Information provided,\nin accord with this section must be in a format that is publicly\ndocumented (and with an implementation available to the public in\nsource code form), and must require no special password or key for\nunpacking, reading or copying.\n\n  7. Additional Terms.\n\n  \"Additional permissions\" are terms that supplement the terms of this\nLicense by making exceptions from one or more of its conditions.\nAdditional permissions that are applicable to the entire Program shall\nbe treated as though they were included in this License, to the extent\nthat they are valid under applicable law.  If additional permissions\napply only to part of the Program, that part may be used separately\nunder those permissions, but the entire Program remains governed by\nthis License without regard to the additional permissions.\n\n  When you convey a copy of a covered work, you may at your option\nremove any additional permissions from that copy, or from any part of\nit.  (Additional permissions may be written to require their own\nremoval in certain cases when you modify the work.)  
You may place\nadditional permissions on material, added by you to a covered work,\nfor which you have or can give appropriate copyright permission.\n\n  Notwithstanding any other provision of this License, for material you\nadd to a covered work, you may (if authorized by the copyright holders of\nthat material) supplement the terms of this License with terms:\n\n    a) Disclaiming warranty or limiting liability differently from the\n    terms of sections 15 and 16 of this License; or\n\n    b) Requiring preservation of specified reasonable legal notices or\n    author attributions in that material or in the Appropriate Legal\n    Notices displayed by works containing it; or\n\n    c) Prohibiting misrepresentation of the origin of that material, or\n    requiring that modified versions of such material be marked in\n    reasonable ways as different from the original version; or\n\n    d) Limiting the use for publicity purposes of names of licensors or\n    authors of the material; or\n\n    e) Declining to grant rights under trademark law for use of some\n    trade names, trademarks, or service marks; or\n\n    f) Requiring indemnification of licensors and authors of that\n    material by anyone who conveys the material (or modified versions of\n    it) with contractual assumptions of liability to the recipient, for\n    any liability that these contractual assumptions directly impose on\n    those licensors and authors.\n\n  All other non-permissive additional terms are considered \"further\nrestrictions\" within the meaning of section 10.  If the Program as you\nreceived it, or any part of it, contains a notice stating that it is\ngoverned by this License along with a term that is a further\nrestriction, you may remove that term.  
If a license document contains\na further restriction but permits relicensing or conveying under this\nLicense, you may add to a covered work material governed by the terms\nof that license document, provided that the further restriction does\nnot survive such relicensing or conveying.\n\n  If you add terms to a covered work in accord with this section, you\nmust place, in the relevant source files, a statement of the\nadditional terms that apply to those files, or a notice indicating\nwhere to find the applicable terms.\n\n  Additional terms, permissive or non-permissive, may be stated in the\nform of a separately written license, or stated as exceptions;\nthe above requirements apply either way.\n\n  8. Termination.\n\n  You may not propagate or modify a covered work except as expressly\nprovided under this License.  Any attempt otherwise to propagate or\nmodify it is void, and will automatically terminate your rights under\nthis License (including any patent licenses granted under the third\nparagraph of section 11).\n\n  However, if you cease all violation of this License, then your\nlicense from a particular copyright holder is reinstated (a)\nprovisionally, unless and until the copyright holder explicitly and\nfinally terminates your license, and (b) permanently, if the copyright\nholder fails to notify you of the violation by some reasonable means\nprior to 60 days after the cessation.\n\n  Moreover, your license from a particular copyright holder is\nreinstated permanently if the copyright holder notifies you of the\nviolation by some reasonable means, this is the first time you have\nreceived notice of violation of this License (for any work) from that\ncopyright holder, and you cure the violation prior to 30 days after\nyour receipt of the notice.\n\n  Termination of your rights under this section does not terminate the\nlicenses of parties who have received copies or rights from you under\nthis License.  
If your rights have been terminated and not permanently\nreinstated, you do not qualify to receive new licenses for the same\nmaterial under section 10.\n\n  9. Acceptance Not Required for Having Copies.\n\n  You are not required to accept this License in order to receive or\nrun a copy of the Program.  Ancillary propagation of a covered work\noccurring solely as a consequence of using peer-to-peer transmission\nto receive a copy likewise does not require acceptance.  However,\nnothing other than this License grants you permission to propagate or\nmodify any covered work.  These actions infringe copyright if you do\nnot accept this License.  Therefore, by modifying or propagating a\ncovered work, you indicate your acceptance of this License to do so.\n\n  10. Automatic Licensing of Downstream Recipients.\n\n  Each time you convey a covered work, the recipient automatically\nreceives a license from the original licensors, to run, modify and\npropagate that work, subject to this License.  You are not responsible\nfor enforcing compliance by third parties with this License.\n\n  An \"entity transaction\" is a transaction transferring control of an\norganization, or substantially all assets of one, or subdividing an\norganization, or merging organizations.  If propagation of a covered\nwork results from an entity transaction, each party to that\ntransaction who receives a copy of the work also receives whatever\nlicenses to the work the party's predecessor in interest had or could\ngive under the previous paragraph, plus a right to possession of the\nCorresponding Source of the work from the predecessor in interest, if\nthe predecessor has it or can get it with reasonable efforts.\n\n  You may not impose any further restrictions on the exercise of the\nrights granted or affirmed under this License.  
For example, you may\nnot impose a license fee, royalty, or other charge for exercise of\nrights granted under this License, and you may not initiate litigation\n(including a cross-claim or counterclaim in a lawsuit) alleging that\nany patent claim is infringed by making, using, selling, offering for\nsale, or importing the Program or any portion of it.\n\n  11. Patents.\n\n  A \"contributor\" is a copyright holder who authorizes use under this\nLicense of the Program or a work on which the Program is based.  The\nwork thus licensed is called the contributor's \"contributor version\".\n\n  A contributor's \"essential patent claims\" are all patent claims\nowned or controlled by the contributor, whether already acquired or\nhereafter acquired, that would be infringed by some manner, permitted\nby this License, of making, using, or selling its contributor version,\nbut do not include claims that would be infringed only as a\nconsequence of further modification of the contributor version.  For\npurposes of this definition, \"control\" includes the right to grant\npatent sublicenses in a manner consistent with the requirements of\nthis License.\n\n  Each contributor grants you a non-exclusive, worldwide, royalty-free\npatent license under the contributor's essential patent claims, to\nmake, use, sell, offer for sale, import and otherwise run, modify and\npropagate the contents of its contributor version.\n\n  In the following three paragraphs, a \"patent license\" is any express\nagreement or commitment, however denominated, not to enforce a patent\n(such as an express permission to practice a patent or covenant not to\nsue for patent infringement).  
To \"grant\" such a patent license to a\nparty means to make such an agreement or commitment not to enforce a\npatent against the party.\n\n  If you convey a covered work, knowingly relying on a patent license,\nand the Corresponding Source of the work is not available for anyone\nto copy, free of charge and under the terms of this License, through a\npublicly available network server or other readily accessible means,\nthen you must either (1) cause the Corresponding Source to be so\navailable, or (2) arrange to deprive yourself of the benefit of the\npatent license for this particular work, or (3) arrange, in a manner\nconsistent with the requirements of this License, to extend the patent\nlicense to downstream recipients.  \"Knowingly relying\" means you have\nactual knowledge that, but for the patent license, your conveying the\ncovered work in a country, or your recipient's use of the covered work\nin a country, would infringe one or more identifiable patents in that\ncountry that you have reason to believe are valid.\n\n  If, pursuant to or in connection with a single transaction or\narrangement, you convey, or propagate by procuring conveyance of, a\ncovered work, and grant a patent license to some of the parties\nreceiving the covered work authorizing them to use, propagate, modify\nor convey a specific copy of the covered work, then the patent license\nyou grant is automatically extended to all recipients of the covered\nwork and works based on it.\n\n  A patent license is \"discriminatory\" if it does not include within\nthe scope of its coverage, prohibits the exercise of, or is\nconditioned on the non-exercise of one or more of the rights that are\nspecifically granted under this License.  
You may not convey a covered\nwork if you are a party to an arrangement with a third party that is\nin the business of distributing software, under which you make payment\nto the third party based on the extent of your activity of conveying\nthe work, and under which the third party grants, to any of the\nparties who would receive the covered work from you, a discriminatory\npatent license (a) in connection with copies of the covered work\nconveyed by you (or copies made from those copies), or (b) primarily\nfor and in connection with specific products or compilations that\ncontain the covered work, unless you entered into that arrangement,\nor that patent license was granted, prior to 28 March 2007.\n\n  Nothing in this License shall be construed as excluding or limiting\nany implied license or other defenses to infringement that may\notherwise be available to you under applicable patent law.\n\n  12. No Surrender of Others' Freedom.\n\n  If conditions are imposed on you (whether by court order, agreement or\notherwise) that contradict the conditions of this License, they do not\nexcuse you from the conditions of this License.  If you cannot convey a\ncovered work so as to satisfy simultaneously your obligations under this\nLicense and any other pertinent obligations, then as a consequence you may\nnot convey it at all.  For example, if you agree to terms that obligate you\nto collect a royalty for further conveying from those to whom you convey\nthe Program, the only way you could satisfy both those terms and this\nLicense would be to refrain entirely from conveying the Program.\n\n  13. Use with the GNU Affero General Public License.\n\n  Notwithstanding any other provision of this License, you have\npermission to link or combine any covered work with a work licensed\nunder version 3 of the GNU Affero General Public License into a single\ncombined work, and to convey the resulting work.  
The terms of this\nLicense will continue to apply to the part which is the covered work,\nbut the special requirements of the GNU Affero General Public License,\nsection 13, concerning interaction through a network will apply to the\ncombination as such.\n\n  14. Revised Versions of this License.\n\n  The Free Software Foundation may publish revised and/or new versions of\nthe GNU General Public License from time to time.  Such new versions will\nbe similar in spirit to the present version, but may differ in detail to\naddress new problems or concerns.\n\n  Each version is given a distinguishing version number.  If the\nProgram specifies that a certain numbered version of the GNU General\nPublic License \"or any later version\" applies to it, you have the\noption of following the terms and conditions either of that numbered\nversion or of any later version published by the Free Software\nFoundation.  If the Program does not specify a version number of the\nGNU General Public License, you may choose any version ever published\nby the Free Software Foundation.\n\n  If the Program specifies that a proxy can decide which future\nversions of the GNU General Public License can be used, that proxy's\npublic statement of acceptance of a version permanently authorizes you\nto choose that version for the Program.\n\n  Later license versions may give you additional or different\npermissions.  However, no additional obligations are imposed on any\nauthor or copyright holder as a result of your choosing to follow a\nlater version.\n\n  15. Disclaimer of Warranty.\n\n  THERE IS NO WARRANTY FOR THE PROGRAM, TO THE EXTENT PERMITTED BY\nAPPLICABLE LAW.  EXCEPT WHEN OTHERWISE STATED IN WRITING THE COPYRIGHT\nHOLDERS AND/OR OTHER PARTIES PROVIDE THE PROGRAM \"AS IS\" WITHOUT WARRANTY\nOF ANY KIND, EITHER EXPRESSED OR IMPLIED, INCLUDING, BUT NOT LIMITED TO,\nTHE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR\nPURPOSE.  
THE ENTIRE RISK AS TO THE QUALITY AND PERFORMANCE OF THE PROGRAM\nIS WITH YOU.  SHOULD THE PROGRAM PROVE DEFECTIVE, YOU ASSUME THE COST OF\nALL NECESSARY SERVICING, REPAIR OR CORRECTION.\n\n  16. Limitation of Liability.\n\n  IN NO EVENT UNLESS REQUIRED BY APPLICABLE LAW OR AGREED TO IN WRITING\nWILL ANY COPYRIGHT HOLDER, OR ANY OTHER PARTY WHO MODIFIES AND/OR CONVEYS\nTHE PROGRAM AS PERMITTED ABOVE, BE LIABLE TO YOU FOR DAMAGES, INCLUDING ANY\nGENERAL, SPECIAL, INCIDENTAL OR CONSEQUENTIAL DAMAGES ARISING OUT OF THE\nUSE OR INABILITY TO USE THE PROGRAM (INCLUDING BUT NOT LIMITED TO LOSS OF\nDATA OR DATA BEING RENDERED INACCURATE OR LOSSES SUSTAINED BY YOU OR THIRD\nPARTIES OR A FAILURE OF THE PROGRAM TO OPERATE WITH ANY OTHER PROGRAMS),\nEVEN IF SUCH HOLDER OR OTHER PARTY HAS BEEN ADVISED OF THE POSSIBILITY OF\nSUCH DAMAGES.\n\n  17. Interpretation of Sections 15 and 16.\n\n  If the disclaimer of warranty and limitation of liability provided\nabove cannot be given local legal effect according to their terms,\nreviewing courts shall apply local law that most closely approximates\nan absolute waiver of all civil liability in connection with the\nProgram, unless a warranty or assumption of liability accompanies a\ncopy of the Program in return for a fee.\n\n                     END OF TERMS AND CONDITIONS\n\n            How to Apply These Terms to Your New Programs\n\n  If you develop a new program, and you want it to be of the greatest\npossible use to the public, the best way to achieve this is to make it\nfree software which everyone can redistribute and change under these terms.\n\n  To do so, attach the following notices to the program.  
It is safest\nto attach them to the start of each source file to most effectively\nstate the exclusion of warranty; and each file should have at least\nthe \"copyright\" line and a pointer to where the full notice is found.\n\n    {one line to give the program's name and a brief idea of what it does.}\n    Copyright (C) {year}  {name of author}\n\n    This program is free software: you can redistribute it and/or modify\n    it under the terms of the GNU General Public License as published by\n    the Free Software Foundation, either version 3 of the License, or\n    (at your option) any later version.\n\n    This program is distributed in the hope that it will be useful,\n    but WITHOUT ANY WARRANTY; without even the implied warranty of\n    MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the\n    GNU General Public License for more details.\n\n    You should have received a copy of the GNU General Public License\n    along with this program.  If not, see <http://www.gnu.org/licenses/>.\n\nAlso add information on how to contact you by electronic and paper mail.\n\n  If the program does terminal interaction, make it output a short\nnotice like this when it starts in an interactive mode:\n\n    {project}  Copyright (C) {year}  {fullname}\n    This program comes with ABSOLUTELY NO WARRANTY; for details type `show w'.\n    This is free software, and you are welcome to redistribute it\n    under certain conditions; type `show c' for details.\n\nThe hypothetical commands `show w' and `show c' should show the appropriate\nparts of the General Public License.  
Of course, your program's commands\nmight be different; for a GUI interface, you would use an \"about box\".\n\n  You should also get your employer (if you work as a programmer) or school,\nif any, to sign a \"copyright disclaimer\" for the program, if necessary.\nFor more information on this, and how to apply and follow the GNU GPL, see\n<http://www.gnu.org/licenses/>.\n\n  The GNU General Public License does not permit incorporating your program\ninto proprietary programs.  If your program is a subroutine library, you\nmay consider it more useful to permit linking proprietary applications with\nthe library.  If this is what you want to do, use the GNU Lesser General\nPublic License instead of this License.  But first, please read\n<http://www.gnu.org/philosophy/why-not-lgpl.html>.\n"
  },
  {
    "path": "README.md",
    "content": "## RP-VIO: Robust Plane-based Visual-Inertial Odometry for Dynamic Environments\nKarnik Ram, Chaitanya Kharyal, Sudarshan S. Harithas, K. Madhava Krishna.\n\n[[`arXiv`](https://arxiv.org/pdf/2103.10400.pdf)]\n[[`Project Page`](https://karnikram.info/rp-vio/)]\n\nIn IROS 2021\n\n<p align=\"center\">\n<a href=\"https://user-images.githubusercontent.com/12653355/111314569-88dfb000-8687-11eb-87c8-212f7ad13489.png\"><img src=\"https://user-images.githubusercontent.com/12653355/111314569-88dfb000-8687-11eb-87c8-212f7ad13489.png\" width=\"700\"/></a>\n</p>\nRP-VIO is a monocular visual-inertial odometry (VIO) system that uses only planar features and their induced homographies, during both initialization and sliding-window estimation, for increased robustness and accuracy in dynamic environments.\n\n## Setup\nOur evaluation setup is a 6-core Intel Core i5-8400 CPU with 8GB RAM and a 1 TB HDD, running Ubuntu 18.04.1. We recommend using a more powerful setup, especially for heavy datasets like ADVIO or OpenLORIS.\n\n### Pre-requisites\n[ROS Melodic](http://wiki.ros.org/melodic) (OpenCV 3.2.0, Eigen 3.3.4-4)<br>\n[Ceres Solver 1.14.0](https://github.com/ceres-solver/ceres-solver/releases)<br>\n[EVO](https://github.com/MichaelGrupp/evo)\n\nThe versions indicated are the versions used in our evaluation setup, and we do not guarantee our code to run on newer versions like ROS Noetic (OpenCV 4.2).\n\n### Build\nRun the following commands in your terminal to clone our project and build,\n\n```\n    cd ~/catkin_ws/src\n    git clone https://github.com/karnikram/rp-vio.git\n    cd ../\n    catkin_make -j4\n    source ~/catkin_ws/devel/setup.bash\n```\n\n\n## Evaluation\nWe provide evaluation scripts to run RP-VIO on the [RPVIO-Sim](https://github.com/karnikram/rp-vio#rpvio-sim-dataset-1) dataset, and select sequences from the [OpenLORIS-Scene]((https://lifelong-robotic-vision.github.io/dataset/scene.html)), [ADVIO](https://github.com/AaltoVision/ADVIO), and 
[VIODE](https://github.com/kminoda/VIODE) datasets. The output errors from your evaluation might not exactly match those reported in our paper, but should be similar.\n\n### RPVIO-Sim Dataset\nDownload the [dataset](https://github.com/karnikram/rp-vio#rpvio-sim-dataset-1) files to a parent folder, and then run the following commands to launch our evaluation script. The script runs RP-VIO on each of the six sequences once and computes the ATE error statistics.\n\n```\n    cd ~/catkin_ws/src/rp-vio/scripts/\n    ./run_rpvio_sim.sh <PATH-TO-DATASET-FOLDER>\n```\n\nTo run the multiple-planes version (RPVIO-Multi), check out the corresponding branch by running `git checkout rpvio-multi`, and re-run the above script.\n\n### Real-world sequences\nWe evaluate on two real-world sequences: the market1-1 sequence from the OpenLORIS-Scene dataset and the metro station sequence (12) from the ADVIO dataset. Both sequences, along with their segmented plane masks, are available as bagfiles for download [here](https://iiitaphyd-my.sharepoint.com/:f:/g/personal/robotics_iiit_ac_in/EozZ6vJP5UZFmZhA-9w0bBcBvTXpszD42fPx3x3ZlKvD6A?e=FtzFRz). After downloading and extracting the files, run the following commands for evaluation,\n\n```\n    cd ~/catkin_ws/src/rp-vio/scripts/\n    ./run_ol_market1.sh <PATH-TO-EXTRACTED-DATASET-FOLDER>\n    ./run_advio_12.sh <PATH-TO-EXTRACTED-DATASET-FOLDER>\n```\n\n### Own data\nTo run RP-VIO on your own data, you need to provide synchronized monocular images, IMU readings, and plane masks on three separate ROS topics. The camera and IMU need to be properly calibrated and synchronized, as there is no online calibration. A plane segmentation model for generating plane masks from images is provided [below](https://github.com/karnikram/rp-vio#plane-segmentation).\n\nA semantic segmentation model can also be used, as long as the RGB labels of the (static) planar semantic classes are provided. 
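As an aside on the metric reported by the evaluation scripts above: they use EVO to compute ATE statistics. As a rough illustration of the metric itself (a minimal sketch, not the scripts' actual implementation), the ATE RMSE between two time-associated, pre-aligned trajectories reduces to:

```python
import math

def ate_rmse(gt_positions, est_positions):
    """RMSE of the per-pose position error between two time-associated,
    pre-aligned trajectories given as lists of (x, y, z) tuples."""
    assert len(gt_positions) == len(est_positions) and gt_positions
    sq_errors = [
        sum((g - e) ** 2 for g, e in zip(gt, est))
        for gt, est in zip(gt_positions, est_positions)
    ]
    return math.sqrt(sum(sq_errors) / len(sq_errors))

# Toy check: an estimate offset by a constant 0.1 m along x.
gt  = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (2.0, 0.0, 0.0)]
est = [(0.1, 0.0, 0.0), (1.1, 0.0, 0.0), (2.1, 0.0, 0.0)]
print(f"ATE RMSE: {ate_rmse(gt, est):.3f} m")
```

EVO additionally performs timestamp association and SE(3)/Sim(3) alignment of the estimated trajectory against the groundtruth before computing this statistic.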
As an example, we evaluate on a sequence from the VIODE dataset (provided [here](https://iiitaphyd-my.sharepoint.com/:f:/g/personal/robotics_iiit_ac_in/EoxFVvuAxUdFsnXu0XJY0egBMFxB9D8XbNqe0jkUkRdjVg?e=G18fDo)) using semantic segmentation labels, which are specified in the [config file](https://github.com/karnikram/rp-vio/blob/semantic-viode/config/viode_config.yaml). To run, \n\n```\n    cd ~/catkin_ws/src/rp-vio/scripts\n    git checkout semantic-viode\n    ./run_viode_night.sh <PATH-TO-EXTRACTED-DATASET-FOLDER>\n```\n\n## Plane segmentation\nWe provide a pre-trained plane instance segmentation model, based on the [Plane-Recover](https://github.com/fuy34/planerecover) model. We retrained their model, with an added inter-plane constraint, on their SYNTHIA training data and two additional sequences (00, 01) from the ScanNet dataset. The model was trained on a single Titan X (Maxwell) GPU for about 700K iterations. We also provide the training script.\n\nWe run the model offline, after extracting and [processing](https://github.com/fuy34/planerecover#preparing-training-data) the input RGB images from their ROS bagfiles. Follow the steps given below to run the pre-trained model on your custom dataset (requires CUDA 9.0),\n\n#### Create Environment\n\nRun the following commands to create a suitable conda environment,\n```\ncd plane_segmentation/\nconda create --name plane_seg --file requirements.txt\nconda activate plane_seg\n```\n\n#### Run inference\n\nNow extract images from your dataset to a test folder, resize them to (192, 320) (height, width), and run the following, \n\n```\npython inference.py --dataset=<PATH_TO_DATASET> --output_dir=<PATH_TO_OUTPUT_DIRECTORY> --test_list=<TEST_DATA_LIST.txt FILE> --ckpt_file=<MODEL> --use_preprocessed=true \n```\n*TEST_DATA_LIST.txt* is a file that points to every image within the test dataset; an example can be found [here](./plane_segmentation/openloris.txt).  
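Such a list file can be generated with a short script. This is a minimal sketch — the `my_dataset/test` layout and frame names below are hypothetical, and the exact per-line format should be checked against the linked openloris.txt example:

```python
from pathlib import Path

# Hypothetical layout: my_dataset/test/ holds the resized 192x320 frames.
dataset = Path("my_dataset")
(dataset / "test").mkdir(parents=True, exist_ok=True)
for name in ("000000.png", "000001.png"):  # stand-ins for real frames
    (dataset / "test" / name).touch()

# Assumed format: one image path per line, relative to the dataset root.
entries = sorted(p.relative_to(dataset).as_posix()
                 for p in (dataset / "test").glob("*.png"))
Path("test_list.txt").write_text("\n".join(entries) + "\n")
```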
*PATH_TO_DATASET* is the path to the parent directory of the test folder.\n\nThe inference results are stored in three folders: *plane_sgmts* (predicted masks in grayscale), *plane_sgmts_vis* (predicted masks in color), and *plane_sgmts_modified* (grayscale masks formatted for visualization; feed this output to the CRF inference).\n\n#### Run CRF inference\n\nWe also use a dense CRF model (from [PyDenseCRF](https://github.com/lucasb-eyer/pydensecrf)) to further refine the output masks. To run,\n\n```\npython crf_inference.py <rgb_image_dir> <labels_dir> <output_dir>\n```\nwhere *labels_dir* is the path to the *plane_sgmts_modified* folder.\n\nWe then write these output mask images back into the original bagfile on a separate topic for running with RP-VIO.\n\n## RPVIO-Sim Dataset\n<figure>\n<a href=\"https://user-images.githubusercontent.com/12653355/111727645-48538280-8891-11eb-90db-027f82087586.png\"><img src=\"https://user-images.githubusercontent.com/12653355/111727645-48538280-8891-11eb-90db-027f82087586.png\" width=\"400\"/></a>\n</figure>\n<br>\n\nFor an effective evaluation of the capabilities of modern VINS systems, we generate a highly dynamic visual-inertial dataset using [AirSim](https://github.com/microsoft/AirSim/), with dynamic characters present throughout the sequences (including during initialization) and with sufficient IMU excitation. Dynamic characters are progressively added, keeping everything else fixed, from no characters in the `static` sequence up to eight characters in the `C8` sequence. 
All six generated sequences in rosbag format, along with their groundtruth files, have been made available via [Zenodo](https://zenodo.org/record/4603494#.YE4BzlMzZH4).\n\nEach rosbag contains RGB images published on the `/image` topic at 20 Hz, IMU measurements published on the `/imu` topic at ~1000 Hz (which we sub-sample to 200 Hz for our evaluations), and plane-instance mask images published on the `/mask` topic at 20 Hz. The groundtruth trajectory is saved as a txt file in TUM format. The parameters for the camera and IMU used in our dataset are as follows,\n<br>\n<figure>\n<a href=\"https://user-images.githubusercontent.com/12653355/111068192-c3ade080-84ed-11eb-82ba-486ee0cfa2a4.png\"><img src=\"https://user-images.githubusercontent.com/12653355/111068192-c3ade080-84ed-11eb-82ba-486ee0cfa2a4.png\" width=\"300\"/></a>\n</figure>\n\nTo quantify the dynamic nature of our generated sequences, we compute the percentage of dynamic pixels out of all the pixels present in every image. We report these values in the following table,\n<figure>\n<a href=\"https://user-images.githubusercontent.com/12653355/111068119-6f0a6580-84ed-11eb-86ee-12571e7c7476.png\"><img src=\"https://user-images.githubusercontent.com/12653355/111068119-6f0a6580-84ed-11eb-86ee-12571e7c7476.png\" width=\"400\"/></a>\n</figure>\n\n### TO-DO\n- [ ] Provide Unreal Engine environment\n- [ ] Provide AirSim recording scripts\n\n## Acknowledgement\nOur code is built upon [VINS-Mono](https://github.com/HKUST-Aerial-Robotics/VINS-Mono). Its implementations of feature tracking, IMU preintegration, IMU state initialization, the reprojection factor, and marginalization are used as-is. Our contributions include planar feature tracking, planar-homography-based initialization, and the planar homography factor. 
All these changes (corresponding to a slightly older version) are available as a [git patch file](./rpvio.patch).\n\nFor our simulated dataset, we imported several high-quality assets from the [FlightGoggles](https://flightgoggles.mit.edu/) project into [Unreal Engine](https://www.unrealengine.com) before integrating it with AirSim. The dynamic characters were downloaded from [Mixamo](https://mixamo.com).\n"
  },
  {
    "path": "camera_model/CMakeLists.txt",
    "content": "cmake_minimum_required(VERSION 2.8.3)\nproject(camera_model)\n\nset(CMAKE_BUILD_TYPE \"Release\")\nset(CMAKE_CXX_FLAGS \"-std=c++11\")\nset(CMAKE_CXX_FLAGS_RELEASE \"-O3 -fPIC\")\n\nfind_package(catkin REQUIRED COMPONENTS\n    roscpp\n    std_msgs\n    )\n\nfind_package(Boost REQUIRED COMPONENTS filesystem program_options system)\ninclude_directories(${Boost_INCLUDE_DIRS})\n\nfind_package(OpenCV REQUIRED)\n\n# set(EIGEN_INCLUDE_DIR \"/usr/local/include/eigen3\")\nfind_package(Ceres REQUIRED)\ninclude_directories(${CERES_INCLUDE_DIRS})\n\n\ncatkin_package(\n    INCLUDE_DIRS include\n    LIBRARIES camera_model\n    CATKIN_DEPENDS roscpp std_msgs\n#    DEPENDS system_lib\n    )\n\ninclude_directories(\n    ${catkin_INCLUDE_DIRS}\n    )\n\ninclude_directories(\"include\")\n\n\nadd_executable(Calibration \n    src/intrinsic_calib.cc\n    src/chessboard/Chessboard.cc\n    src/calib/CameraCalibration.cc\n    src/camera_models/Camera.cc\n    src/camera_models/CameraFactory.cc\n    src/camera_models/CostFunctionFactory.cc\n    src/camera_models/PinholeCamera.cc\n    src/camera_models/CataCamera.cc\n    src/camera_models/EquidistantCamera.cc\n    src/camera_models/ScaramuzzaCamera.cc\n    src/sparse_graph/Transform.cc\n    src/gpl/gpl.cc\n    src/gpl/EigenQuaternionParameterization.cc)\n\nadd_library(camera_model\n    src/chessboard/Chessboard.cc\n    src/calib/CameraCalibration.cc\n    src/camera_models/Camera.cc\n    src/camera_models/CameraFactory.cc\n    src/camera_models/CostFunctionFactory.cc\n    src/camera_models/PinholeCamera.cc\n    src/camera_models/CataCamera.cc\n    src/camera_models/EquidistantCamera.cc\n    src/camera_models/ScaramuzzaCamera.cc\n    src/sparse_graph/Transform.cc\n    src/gpl/gpl.cc\n    src/gpl/EigenQuaternionParameterization.cc)\n\ntarget_link_libraries(Calibration ${Boost_LIBRARIES} ${OpenCV_LIBS} ${CERES_LIBRARIES})\ntarget_link_libraries(camera_model ${Boost_LIBRARIES} ${OpenCV_LIBS} ${CERES_LIBRARIES})\n"
  },
  {
    "path": "camera_model/cmake/FindEigen.cmake",
    "content": "# Ceres Solver - A fast non-linear least squares minimizer\n# Copyright 2015 Google Inc. All rights reserved.\n# http://ceres-solver.org/\n#\n# Redistribution and use in source and binary forms, with or without\n# modification, are permitted provided that the following conditions are met:\n#\n# * Redistributions of source code must retain the above copyright notice,\n#   this list of conditions and the following disclaimer.\n# * Redistributions in binary form must reproduce the above copyright notice,\n#   this list of conditions and the following disclaimer in the documentation\n#   and/or other materials provided with the distribution.\n# * Neither the name of Google Inc. nor the names of its contributors may be\n#   used to endorse or promote products derived from this software without\n#   specific prior written permission.\n#\n# THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS \"AS IS\"\n# AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE\n# IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE\n# ARE DISCLAIMED. 
IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE\n# LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR\n# CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF\n# SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS\n# INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN\n# CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE)\n# ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE\n# POSSIBILITY OF SUCH DAMAGE.\n#\n# Author: alexs.mac@gmail.com (Alex Stewart)\n#\n\n# FindEigen.cmake - Find Eigen library, version >= 3.\n#\n# This module defines the following variables:\n#\n# EIGEN_FOUND: TRUE iff Eigen is found.\n# EIGEN_INCLUDE_DIRS: Include directories for Eigen.\n#\n# EIGEN_VERSION: Extracted from Eigen/src/Core/util/Macros.h\n# EIGEN_WORLD_VERSION: Equal to 3 if EIGEN_VERSION = 3.2.0\n# EIGEN_MAJOR_VERSION: Equal to 2 if EIGEN_VERSION = 3.2.0\n# EIGEN_MINOR_VERSION: Equal to 0 if EIGEN_VERSION = 3.2.0\n#\n# The following variables control the behaviour of this module:\n#\n# EIGEN_INCLUDE_DIR_HINTS: List of additional directories in which to\n#                          search for eigen includes, e.g: /timbuktu/eigen3.\n#\n# The following variables are also defined by this module, but in line with\n# CMake recommended FindPackage() module style should NOT be referenced directly\n# by callers (use the plural variables detailed above instead).  These variables\n# do however affect the behaviour of the module via FIND_[PATH/LIBRARY]() which\n# are NOT re-called (i.e. search for library is not repeated) if these variables\n# are set with valid values _in the CMake cache_. 
This means that if these\n# variables are set directly in the cache, either by the user in the CMake GUI,\n# or by the user passing -DVAR=VALUE directives to CMake when called (which\n# explicitly defines a cache variable), then they will be used verbatim,\n# bypassing the HINTS variables and other hard-coded search locations.\n#\n# EIGEN_INCLUDE_DIR: Include directory for CXSparse, not including the\n#                    include directory of any dependencies.\n\n# Called if we failed to find Eigen or any of it's required dependencies,\n# unsets all public (designed to be used externally) variables and reports\n# error message at priority depending upon [REQUIRED/QUIET/<NONE>] argument.\nmacro(EIGEN_REPORT_NOT_FOUND REASON_MSG)\n  unset(EIGEN_FOUND)\n  unset(EIGEN_INCLUDE_DIRS)\n  # Make results of search visible in the CMake GUI if Eigen has not\n  # been found so that user does not have to toggle to advanced view.\n  mark_as_advanced(CLEAR EIGEN_INCLUDE_DIR)\n  # Note <package>_FIND_[REQUIRED/QUIETLY] variables defined by FindPackage()\n  # use the camelcase library name, not uppercase.\n  if (Eigen_FIND_QUIETLY)\n    message(STATUS \"Failed to find Eigen - \" ${REASON_MSG} ${ARGN})\n  elseif (Eigen_FIND_REQUIRED)\n    message(FATAL_ERROR \"Failed to find Eigen - \" ${REASON_MSG} ${ARGN})\n  else()\n    # Neither QUIETLY nor REQUIRED, use no priority which emits a message\n    # but continues configuration and allows generation.\n    message(\"-- Failed to find Eigen - \" ${REASON_MSG} ${ARGN})\n  endif ()\n  return()\nendmacro(EIGEN_REPORT_NOT_FOUND)\n\n# Protect against any alternative find_package scripts for this library having\n# been called previously (in a client project) which set EIGEN_FOUND, but not\n# the other variables we require / set here which could cause the search logic\n# here to fail.\nunset(EIGEN_FOUND)\n\n# Search user-installed locations first, so that we prefer user installs\n# to system installs where both exist.\nlist(APPEND 
EIGEN_CHECK_INCLUDE_DIRS\n  /usr/local/include\n  /usr/local/homebrew/include # Mac OS X\n  /opt/local/var/macports/software # Mac OS X.\n  /opt/local/include\n  /usr/include)\n# Additional suffixes to try appending to each search path.\nlist(APPEND EIGEN_CHECK_PATH_SUFFIXES\n  eigen3 # Default root directory for Eigen.\n  Eigen/include/eigen3 # Windows (for C:/Program Files prefix) < 3.3\n  Eigen3/include/eigen3 ) # Windows (for C:/Program Files prefix) >= 3.3\n\n# Search supplied hint directories first if supplied.\nfind_path(EIGEN_INCLUDE_DIR\n  NAMES Eigen/Core\n  PATHS ${EIGEN_INCLUDE_DIR_HINTS}\n  ${EIGEN_CHECK_INCLUDE_DIRS}\n  PATH_SUFFIXES ${EIGEN_CHECK_PATH_SUFFIXES})\n\nif (NOT EIGEN_INCLUDE_DIR OR\n    NOT EXISTS ${EIGEN_INCLUDE_DIR})\n  eigen_report_not_found(\n    \"Could not find eigen3 include directory, set EIGEN_INCLUDE_DIR to \"\n    \"path to eigen3 include directory, e.g. /usr/local/include/eigen3.\")\nendif (NOT EIGEN_INCLUDE_DIR OR\n       NOT EXISTS ${EIGEN_INCLUDE_DIR})\n\n# Mark internally as found, then verify. 
EIGEN_REPORT_NOT_FOUND() unsets\n# if called.\nset(EIGEN_FOUND TRUE)\n\n# Extract Eigen version from Eigen/src/Core/util/Macros.h\nif (EIGEN_INCLUDE_DIR)\n  set(EIGEN_VERSION_FILE ${EIGEN_INCLUDE_DIR}/Eigen/src/Core/util/Macros.h)\n  if (NOT EXISTS ${EIGEN_VERSION_FILE})\n    eigen_report_not_found(\n      \"Could not find file: ${EIGEN_VERSION_FILE} \"\n      \"containing version information in Eigen install located at: \"\n      \"${EIGEN_INCLUDE_DIR}.\")\n  else (NOT EXISTS ${EIGEN_VERSION_FILE})\n    file(READ ${EIGEN_VERSION_FILE} EIGEN_VERSION_FILE_CONTENTS)\n\n    string(REGEX MATCH \"#define EIGEN_WORLD_VERSION [0-9]+\"\n      EIGEN_WORLD_VERSION \"${EIGEN_VERSION_FILE_CONTENTS}\")\n    string(REGEX REPLACE \"#define EIGEN_WORLD_VERSION ([0-9]+)\" \"\\\\1\"\n      EIGEN_WORLD_VERSION \"${EIGEN_WORLD_VERSION}\")\n\n    string(REGEX MATCH \"#define EIGEN_MAJOR_VERSION [0-9]+\"\n      EIGEN_MAJOR_VERSION \"${EIGEN_VERSION_FILE_CONTENTS}\")\n    string(REGEX REPLACE \"#define EIGEN_MAJOR_VERSION ([0-9]+)\" \"\\\\1\"\n      EIGEN_MAJOR_VERSION \"${EIGEN_MAJOR_VERSION}\")\n\n    string(REGEX MATCH \"#define EIGEN_MINOR_VERSION [0-9]+\"\n      EIGEN_MINOR_VERSION \"${EIGEN_VERSION_FILE_CONTENTS}\")\n    string(REGEX REPLACE \"#define EIGEN_MINOR_VERSION ([0-9]+)\" \"\\\\1\"\n      EIGEN_MINOR_VERSION \"${EIGEN_MINOR_VERSION}\")\n\n    # This is on a single line s/t CMake does not interpret it as a list of\n    # elements and insert ';' separators which would result in 3.;2.;0 nonsense.\n    set(EIGEN_VERSION \"${EIGEN_WORLD_VERSION}.${EIGEN_MAJOR_VERSION}.${EIGEN_MINOR_VERSION}\")\n  endif (NOT EXISTS ${EIGEN_VERSION_FILE})\nendif (EIGEN_INCLUDE_DIR)\n\n# Set standard CMake FindPackage variables if found.\nif (EIGEN_FOUND)\n  set(EIGEN_INCLUDE_DIRS ${EIGEN_INCLUDE_DIR})\nendif (EIGEN_FOUND)\n\n# Handle REQUIRED / QUIET optional arguments and version.\ninclude(FindPackageHandleStandardArgs)\nfind_package_handle_standard_args(Eigen\n  REQUIRED_VARS 
EIGEN_INCLUDE_DIRS\n  VERSION_VAR EIGEN_VERSION)\n\n# Only mark internal variables as advanced if we found Eigen, otherwise\n# leave it visible in the standard GUI for the user to set manually.\nif (EIGEN_FOUND)\n  mark_as_advanced(FORCE EIGEN_INCLUDE_DIR)\nendif (EIGEN_FOUND)\n"
  },
  {
    "path": "camera_model/include/camodocal/calib/CameraCalibration.h",
    "content": "#ifndef CAMERACALIBRATION_H\n#define CAMERACALIBRATION_H\n\n#include <opencv2/core/core.hpp>\n\n#include \"camodocal/camera_models/Camera.h\"\n\nnamespace camodocal\n{\n\nclass CameraCalibration\n{\npublic:\n    EIGEN_MAKE_ALIGNED_OPERATOR_NEW\n    CameraCalibration();\n\n    CameraCalibration(Camera::ModelType modelType,\n                      const std::string& cameraName,\n                      const cv::Size& imageSize,\n                      const cv::Size& boardSize,\n                      float squareSize);\n\n    void clear(void);\n\n    void addChessboardData(const std::vector<cv::Point2f>& corners);\n\n    bool calibrate(void);\n\n    int sampleCount(void) const;\n    std::vector<std::vector<cv::Point2f> >& imagePoints(void);\n    const std::vector<std::vector<cv::Point2f> >& imagePoints(void) const;\n    std::vector<std::vector<cv::Point3f> >& scenePoints(void);\n    const std::vector<std::vector<cv::Point3f> >& scenePoints(void) const;\n    CameraPtr& camera(void);\n    const CameraConstPtr camera(void) const;\n\n    Eigen::Matrix2d& measurementCovariance(void);\n    const Eigen::Matrix2d& measurementCovariance(void) const;\n\n    cv::Mat& cameraPoses(void);\n    const cv::Mat& cameraPoses(void) const;\n\n    void drawResults(std::vector<cv::Mat>& images) const;\n\n    void writeParams(const std::string& filename) const;\n\n    bool writeChessboardData(const std::string& filename) const;\n    bool readChessboardData(const std::string& filename);\n\n    void setVerbose(bool verbose);\n\nprivate:\n    bool calibrateHelper(CameraPtr& camera,\n                         std::vector<cv::Mat>& rvecs, std::vector<cv::Mat>& tvecs) const;\n\n    void optimize(CameraPtr& camera,\n                  std::vector<cv::Mat>& rvecs, std::vector<cv::Mat>& tvecs) const;\n\n    template<typename T>\n    void readData(std::ifstream& ifs, T& data) const;\n\n    template<typename T>\n    void writeData(std::ofstream& ofs, T data) const;\n\n    cv::Size 
m_boardSize;\n    float m_squareSize;\n\n    CameraPtr m_camera;\n    cv::Mat m_cameraPoses;\n\n    std::vector<std::vector<cv::Point2f> > m_imagePoints;\n    std::vector<std::vector<cv::Point3f> > m_scenePoints;\n\n    Eigen::Matrix2d m_measurementCovariance;\n\n    bool m_verbose;\n};\n\n}\n\n#endif\n"
  },
  {
    "path": "camera_model/include/camodocal/camera_models/Camera.h",
    "content": "#ifndef CAMERA_H\n#define CAMERA_H\n\n#include <boost/shared_ptr.hpp>\n#include <eigen3/Eigen/Dense>\n#include <opencv2/core/core.hpp>\n#include <vector>\n\nnamespace camodocal\n{\n\nclass Camera\n{\npublic:\n    EIGEN_MAKE_ALIGNED_OPERATOR_NEW\n    enum ModelType\n    {\n        KANNALA_BRANDT,\n        MEI,\n        PINHOLE,\n        SCARAMUZZA\n    };\n\n    class Parameters\n    {\n    public:\n        EIGEN_MAKE_ALIGNED_OPERATOR_NEW\n        Parameters(ModelType modelType);\n\n        Parameters(ModelType modelType, const std::string& cameraName,\n                   int w, int h);\n\n        ModelType& modelType(void);\n        std::string& cameraName(void);\n        int& imageWidth(void);\n        int& imageHeight(void);\n\n        ModelType modelType(void) const;\n        const std::string& cameraName(void) const;\n        int imageWidth(void) const;\n        int imageHeight(void) const;\n\n        int nIntrinsics(void) const;\n\n        virtual bool readFromYamlFile(const std::string& filename) = 0;\n        virtual void writeToYamlFile(const std::string& filename) const = 0;\n\n    protected:\n        ModelType m_modelType;\n        int m_nIntrinsics;\n        std::string m_cameraName;\n        int m_imageWidth;\n        int m_imageHeight;\n    };\n\n    virtual ModelType modelType(void) const = 0;\n    virtual const std::string& cameraName(void) const = 0;\n    virtual int imageWidth(void) const = 0;\n    virtual int imageHeight(void) const = 0;\n\n    virtual cv::Mat& mask(void);\n    virtual const cv::Mat& mask(void) const;\n\n    virtual void estimateIntrinsics(const cv::Size& boardSize,\n                                    const std::vector< std::vector<cv::Point3f> >& objectPoints,\n                                    const std::vector< std::vector<cv::Point2f> >& imagePoints) = 0;\n    virtual void estimateExtrinsics(const std::vector<cv::Point3f>& objectPoints,\n                                    const std::vector<cv::Point2f>& 
imagePoints,\n                                    cv::Mat& rvec, cv::Mat& tvec) const;\n\n    // Lift points from the image plane to the sphere\n    virtual void liftSphere(const Eigen::Vector2d& p, Eigen::Vector3d& P) const = 0;\n    //%output P\n\n    // Lift points from the image plane to the projective space\n    virtual void liftProjective(const Eigen::Vector2d& p, Eigen::Vector3d& P) const = 0;\n    //%output P\n\n    // Projects 3D points to the image plane (Pi function)\n    virtual void spaceToPlane(const Eigen::Vector3d& P, Eigen::Vector2d& p) const = 0;\n    //%output p\n\n    // Projects 3D points to the image plane (Pi function)\n    // and calculates jacobian\n    //virtual void spaceToPlane(const Eigen::Vector3d& P, Eigen::Vector2d& p,\n    //                          Eigen::Matrix<double,2,3>& J) const = 0;\n    //%output p\n    //%output J\n\n    virtual void undistToPlane(const Eigen::Vector2d& p_u, Eigen::Vector2d& p) const = 0;\n    //%output p\n\n    //virtual void initUndistortMap(cv::Mat& map1, cv::Mat& map2, double fScale = 1.0) const = 0;\n    virtual cv::Mat initUndistortRectifyMap(cv::Mat& map1, cv::Mat& map2,\n                                            float fx = -1.0f, float fy = -1.0f,\n                                            cv::Size imageSize = cv::Size(0, 0),\n                                            float cx = -1.0f, float cy = -1.0f,\n                                            cv::Mat rmat = cv::Mat::eye(3, 3, CV_32F)) const = 0;\n\n    virtual int parameterCount(void) const = 0;\n\n    virtual void readParameters(const std::vector<double>& parameters) = 0;\n    virtual void writeParameters(std::vector<double>& parameters) const = 0;\n\n    virtual void writeParametersToYamlFile(const std::string& filename) const = 0;\n\n    virtual std::string parametersToString(void) const = 0;\n\n    /**\n     * \\brief Calculates the reprojection distance between points\n     *\n     * \\param P1 first 3D point coordinates\n     * 
\\param P2 second 3D point coordinates\n     * \\return euclidean distance in the plane\n     */\n    double reprojectionDist(const Eigen::Vector3d& P1, const Eigen::Vector3d& P2) const;\n\n    double reprojectionError(const std::vector< std::vector<cv::Point3f> >& objectPoints,\n                             const std::vector< std::vector<cv::Point2f> >& imagePoints,\n                             const std::vector<cv::Mat>& rvecs,\n                             const std::vector<cv::Mat>& tvecs,\n                             cv::OutputArray perViewErrors = cv::noArray()) const;\n\n    double reprojectionError(const Eigen::Vector3d& P,\n                             const Eigen::Quaterniond& camera_q,\n                             const Eigen::Vector3d& camera_t,\n                             const Eigen::Vector2d& observed_p) const;\n\n    void projectPoints(const std::vector<cv::Point3f>& objectPoints,\n                       const cv::Mat& rvec,\n                       const cv::Mat& tvec,\n                       std::vector<cv::Point2f>& imagePoints) const;\nprotected:\n    cv::Mat m_mask;\n};\n\ntypedef boost::shared_ptr<Camera> CameraPtr;\ntypedef boost::shared_ptr<const Camera> CameraConstPtr;\n\n}\n\n#endif\n"
  },
  {
    "path": "camera_model/include/camodocal/camera_models/CameraFactory.h",
    "content": "#ifndef CAMERAFACTORY_H\n#define CAMERAFACTORY_H\n\n#include <boost/shared_ptr.hpp>\n#include <opencv2/core/core.hpp>\n\n#include \"camodocal/camera_models/Camera.h\"\n\nnamespace camodocal\n{\n\nclass CameraFactory\n{\npublic:\n    EIGEN_MAKE_ALIGNED_OPERATOR_NEW\n    CameraFactory();\n\n    static boost::shared_ptr<CameraFactory> instance(void);\n\n    CameraPtr generateCamera(Camera::ModelType modelType,\n                             const std::string& cameraName,\n                             cv::Size imageSize) const;\n\n    CameraPtr generateCameraFromYamlFile(const std::string& filename);\n\nprivate:\n    static boost::shared_ptr<CameraFactory> m_instance;\n};\n\n}\n\n#endif\n"
  },
  {
    "path": "camera_model/include/camodocal/camera_models/CataCamera.h",
    "content": "#ifndef CATACAMERA_H\r\n#define CATACAMERA_H\r\n\r\n#include <opencv2/core/core.hpp>\r\n#include <string>\r\n\r\n#include \"ceres/rotation.h\"\r\n#include \"Camera.h\"\r\n\r\nnamespace camodocal\r\n{\r\n\r\n/**\r\n * C. Mei, and P. Rives, Single View Point Omnidirectional Camera Calibration\r\n * from Planar Grids, ICRA 2007\r\n */\r\n\r\nclass CataCamera: public Camera\r\n{\r\npublic:\r\n    class Parameters: public Camera::Parameters\r\n    {\r\n    public:\r\n        Parameters();\r\n        Parameters(const std::string& cameraName,\r\n                   int w, int h,\r\n                   double xi,\r\n                   double k1, double k2, double p1, double p2,\r\n                   double gamma1, double gamma2, double u0, double v0);\r\n\r\n        double& xi(void);\r\n        double& k1(void);\r\n        double& k2(void);\r\n        double& p1(void);\r\n        double& p2(void);\r\n        double& gamma1(void);\r\n        double& gamma2(void);\r\n        double& u0(void);\r\n        double& v0(void);\r\n\r\n        double xi(void) const;\r\n        double k1(void) const;\r\n        double k2(void) const;\r\n        double p1(void) const;\r\n        double p2(void) const;\r\n        double gamma1(void) const;\r\n        double gamma2(void) const;\r\n        double u0(void) const;\r\n        double v0(void) const;\r\n\r\n        bool readFromYamlFile(const std::string& filename);\r\n        void writeToYamlFile(const std::string& filename) const;\r\n\r\n        Parameters& operator=(const Parameters& other);\r\n        friend std::ostream& operator<< (std::ostream& out, const Parameters& params);\r\n\r\n    private:\r\n        double m_xi;\r\n        double m_k1;\r\n        double m_k2;\r\n        double m_p1;\r\n        double m_p2;\r\n        double m_gamma1;\r\n        double m_gamma2;\r\n        double m_u0;\r\n        double m_v0;\r\n    };\r\n\r\n    CataCamera();\r\n\r\n    /**\r\n    * \\brief Constructor from the projection model 
parameters\r\n    */\r\n    CataCamera(const std::string& cameraName,\r\n               int imageWidth, int imageHeight,\r\n               double xi, double k1, double k2, double p1, double p2,\r\n               double gamma1, double gamma2, double u0, double v0);\r\n    /**\r\n    * \\brief Constructor from the projection model parameters\r\n    */\r\n    CataCamera(const Parameters& params);\r\n\r\n    Camera::ModelType modelType(void) const;\r\n    const std::string& cameraName(void) const;\r\n    int imageWidth(void) const;\r\n    int imageHeight(void) const;\r\n\r\n    void estimateIntrinsics(const cv::Size& boardSize,\r\n                            const std::vector< std::vector<cv::Point3f> >& objectPoints,\r\n                            const std::vector< std::vector<cv::Point2f> >& imagePoints);\r\n\r\n    // Lift points from the image plane to the sphere\r\n    void liftSphere(const Eigen::Vector2d& p, Eigen::Vector3d& P) const;\r\n    //%output P\r\n\r\n    // Lift points from the image plane to the projective space\r\n    void liftProjective(const Eigen::Vector2d& p, Eigen::Vector3d& P) const;\r\n    //%output P\r\n\r\n    // Projects 3D points to the image plane (Pi function)\r\n    void spaceToPlane(const Eigen::Vector3d& P, Eigen::Vector2d& p) const;\r\n    //%output p\r\n\r\n    // Projects 3D points to the image plane (Pi function)\r\n    // and calculates jacobian\r\n    void spaceToPlane(const Eigen::Vector3d& P, Eigen::Vector2d& p,\r\n                      Eigen::Matrix<double,2,3>& J) const;\r\n    //%output p\r\n    //%output J\r\n\r\n    void undistToPlane(const Eigen::Vector2d& p_u, Eigen::Vector2d& p) const;\r\n    //%output p\r\n\r\n    template <typename T>\r\n    static void spaceToPlane(const T* const params,\r\n                             const T* const q, const T* const t,\r\n                             const Eigen::Matrix<T, 3, 1>& P,\r\n                             Eigen::Matrix<T, 2, 1>& p);\r\n\r\n    void distortion(const 
Eigen::Vector2d& p_u, Eigen::Vector2d& d_u) const;\r\n    void distortion(const Eigen::Vector2d& p_u, Eigen::Vector2d& d_u,\r\n                    Eigen::Matrix2d& J) const;\r\n\r\n    void initUndistortMap(cv::Mat& map1, cv::Mat& map2, double fScale = 1.0) const;\r\n    cv::Mat initUndistortRectifyMap(cv::Mat& map1, cv::Mat& map2,\r\n                                    float fx = -1.0f, float fy = -1.0f,\r\n                                    cv::Size imageSize = cv::Size(0, 0),\r\n                                    float cx = -1.0f, float cy = -1.0f,\r\n                                    cv::Mat rmat = cv::Mat::eye(3, 3, CV_32F)) const;\r\n\r\n    int parameterCount(void) const;\r\n\r\n    const Parameters& getParameters(void) const;\r\n    void setParameters(const Parameters& parameters);\r\n\r\n    void readParameters(const std::vector<double>& parameterVec);\r\n    void writeParameters(std::vector<double>& parameterVec) const;\r\n\r\n    void writeParametersToYamlFile(const std::string& filename) const;\r\n\r\n    std::string parametersToString(void) const;\r\n\r\nprivate:\r\n    Parameters mParameters;\r\n\r\n    double m_inv_K11, m_inv_K13, m_inv_K22, m_inv_K23;\r\n    bool m_noDistortion;\r\n};\r\n\r\ntypedef boost::shared_ptr<CataCamera> CataCameraPtr;\r\ntypedef boost::shared_ptr<const CataCamera> CataCameraConstPtr;\r\n\r\ntemplate <typename T>\r\nvoid\r\nCataCamera::spaceToPlane(const T* const params,\r\n                         const T* const q, const T* const t,\r\n                         const Eigen::Matrix<T, 3, 1>& P,\r\n                         Eigen::Matrix<T, 2, 1>& p)\r\n{\r\n    T P_w[3];\r\n    P_w[0] = T(P(0));\r\n    P_w[1] = T(P(1));\r\n    P_w[2] = T(P(2));\r\n\r\n    // Convert quaternion from Eigen convention (x, y, z, w)\r\n    // to Ceres convention (w, x, y, z)\r\n    T q_ceres[4] = {q[3], q[0], q[1], q[2]};\r\n\r\n    T P_c[3];\r\n    ceres::QuaternionRotatePoint(q_ceres, P_w, P_c);\r\n\r\n    P_c[0] += t[0];\r\n    P_c[1] += 
t[1];\r\n    P_c[2] += t[2];\r\n\r\n    // project 3D object point to the image plane\r\n    T xi = params[0];\r\n    T k1 = params[1];\r\n    T k2 = params[2];\r\n    T p1 = params[3];\r\n    T p2 = params[4];\r\n    T gamma1 = params[5];\r\n    T gamma2 = params[6];\r\n    T alpha = T(0); //cameraParams.alpha();\r\n    T u0 = params[7];\r\n    T v0 = params[8];\r\n\r\n    // Transform to model plane\r\n    T len = sqrt(P_c[0] * P_c[0] + P_c[1] * P_c[1] + P_c[2] * P_c[2]);\r\n    P_c[0] /= len;\r\n    P_c[1] /= len;\r\n    P_c[2] /= len;\r\n\r\n    T u = P_c[0] / (P_c[2] + xi);\r\n    T v = P_c[1] / (P_c[2] + xi);\r\n\r\n    T rho_sqr = u * u + v * v;\r\n    T L = T(1.0) + k1 * rho_sqr + k2 * rho_sqr * rho_sqr;\r\n    T du = T(2.0) * p1 * u * v + p2 * (rho_sqr + T(2.0) * u * u);\r\n    T dv = p1 * (rho_sqr + T(2.0) * v * v) + T(2.0) * p2 * u * v;\r\n\r\n    u = L * u + du;\r\n    v = L * v + dv;\r\n    p(0) = gamma1 * (u + alpha * v) + u0;\r\n    p(1) = gamma2 * v + v0;\r\n}\r\n\r\n}\r\n\r\n#endif\r\n"
  },
  {
    "path": "camera_model/include/camodocal/camera_models/CostFunctionFactory.h",
    "content": "#ifndef COSTFUNCTIONFACTORY_H\n#define COSTFUNCTIONFACTORY_H\n\n#include <boost/shared_ptr.hpp>\n#include <opencv2/core/core.hpp>\n\n#include \"camodocal/camera_models/Camera.h\"\n\nnamespace ceres\n{\n    class CostFunction;\n}\n\nnamespace camodocal\n{\n\nenum\n{\n    CAMERA_INTRINSICS =         1 << 0,\n    CAMERA_POSE =               1 << 1,\n    POINT_3D =                  1 << 2,\n    ODOMETRY_INTRINSICS =       1 << 3,\n    ODOMETRY_3D_POSE =          1 << 4,\n    ODOMETRY_6D_POSE =          1 << 5,\n    CAMERA_ODOMETRY_TRANSFORM = 1 << 6\n};\n\nclass CostFunctionFactory\n{\npublic:\n    EIGEN_MAKE_ALIGNED_OPERATOR_NEW\n    CostFunctionFactory();\n\n    static boost::shared_ptr<CostFunctionFactory> instance(void);\n\n    ceres::CostFunction* generateCostFunction(const CameraConstPtr& camera,\n                                              const Eigen::Vector3d& observed_P,\n                                              const Eigen::Vector2d& observed_p,\n                                              int flags) const;\n\n    ceres::CostFunction* generateCostFunction(const CameraConstPtr& camera,\n                                              const Eigen::Vector3d& observed_P,\n                                              const Eigen::Vector2d& observed_p,\n                                              const Eigen::Matrix2d& sqrtPrecisionMat,\n                                              int flags) const;\n\n    ceres::CostFunction* generateCostFunction(const CameraConstPtr& camera,\n                                              const Eigen::Vector2d& observed_p,\n                                              int flags, bool optimize_cam_odo_z = true) const;\n\n    ceres::CostFunction* generateCostFunction(const CameraConstPtr& camera,\n                                              const Eigen::Vector2d& observed_p,\n                                              const Eigen::Matrix2d& sqrtPrecisionMat,\n                                         
     int flags, bool optimize_cam_odo_z = true) const;\n\n    ceres::CostFunction* generateCostFunction(const CameraConstPtr& camera,\n                                              const Eigen::Vector3d& odo_pos,\n                                              const Eigen::Vector3d& odo_att,\n                                              const Eigen::Vector2d& observed_p,\n                                              int flags, bool optimize_cam_odo_z = true) const;\n\n    ceres::CostFunction* generateCostFunction(const CameraConstPtr& camera,\n                                              const Eigen::Quaterniond& cam_odo_q,\n                                              const Eigen::Vector3d& cam_odo_t,\n                                              const Eigen::Vector3d& odo_pos,\n                                              const Eigen::Vector3d& odo_att,\n                                              const Eigen::Vector2d& observed_p,\n                                              int flags) const;\n\n    ceres::CostFunction* generateCostFunction(const CameraConstPtr& cameraLeft,\n                                              const CameraConstPtr& cameraRight,\n                                              const Eigen::Vector3d& observed_P,\n                                              const Eigen::Vector2d& observed_p_left,\n                                              const Eigen::Vector2d& observed_p_right) const;\n\nprivate:\n    static boost::shared_ptr<CostFunctionFactory> m_instance;\n};\n\n}\n\n#endif\n"
  },
  {
    "path": "camera_model/include/camodocal/camera_models/EquidistantCamera.h",
    "content": "#ifndef EQUIDISTANTCAMERA_H\r\n#define EQUIDISTANTCAMERA_H\r\n\r\n#include <opencv2/core/core.hpp>\r\n#include <string>\r\n\r\n#include \"ceres/rotation.h\"\r\n#include \"Camera.h\"\r\n\r\nnamespace camodocal\r\n{\r\n\r\n/**\r\n * J. Kannala, and S. Brandt, A Generic Camera Model and Calibration Method\r\n * for Conventional, Wide-Angle, and Fish-Eye Lenses, PAMI 2006\r\n */\r\n\r\nclass EquidistantCamera: public Camera\r\n{\r\npublic:\r\n    class Parameters: public Camera::Parameters\r\n    {\r\n    public:\r\n        Parameters();\r\n        Parameters(const std::string& cameraName,\r\n                   int w, int h,\r\n                   double k2, double k3, double k4, double k5,\r\n                   double mu, double mv,\r\n                   double u0, double v0);\r\n\r\n        double& k2(void);\r\n        double& k3(void);\r\n        double& k4(void);\r\n        double& k5(void);\r\n        double& mu(void);\r\n        double& mv(void);\r\n        double& u0(void);\r\n        double& v0(void);\r\n\r\n        double k2(void) const;\r\n        double k3(void) const;\r\n        double k4(void) const;\r\n        double k5(void) const;\r\n        double mu(void) const;\r\n        double mv(void) const;\r\n        double u0(void) const;\r\n        double v0(void) const;\r\n\r\n        bool readFromYamlFile(const std::string& filename);\r\n        void writeToYamlFile(const std::string& filename) const;\r\n\r\n        Parameters& operator=(const Parameters& other);\r\n        friend std::ostream& operator<< (std::ostream& out, const Parameters& params);\r\n\r\n    private:\r\n        // projection\r\n        double m_k2;\r\n        double m_k3;\r\n        double m_k4;\r\n        double m_k5;\r\n\r\n        double m_mu;\r\n        double m_mv;\r\n        double m_u0;\r\n        double m_v0;\r\n    };\r\n\r\n    EquidistantCamera();\r\n\r\n    /**\r\n    * \\brief Constructor from the projection model parameters\r\n    */\r\n    
EquidistantCamera(const std::string& cameraName,\r\n                      int imageWidth, int imageHeight,\r\n                      double k2, double k3, double k4, double k5,\r\n                      double mu, double mv,\r\n                      double u0, double v0);\r\n    /**\r\n    * \\brief Constructor from the projection model parameters\r\n    */\r\n    EquidistantCamera(const Parameters& params);\r\n\r\n    Camera::ModelType modelType(void) const;\r\n    const std::string& cameraName(void) const;\r\n    int imageWidth(void) const;\r\n    int imageHeight(void) const;\r\n\r\n    void estimateIntrinsics(const cv::Size& boardSize,\r\n                            const std::vector< std::vector<cv::Point3f> >& objectPoints,\r\n                            const std::vector< std::vector<cv::Point2f> >& imagePoints);\r\n\r\n    // Lift points from the image plane to the sphere\r\n    virtual void liftSphere(const Eigen::Vector2d& p, Eigen::Vector3d& P) const;\r\n    //%output P\r\n\r\n    // Lift points from the image plane to the projective space\r\n    void liftProjective(const Eigen::Vector2d& p, Eigen::Vector3d& P) const;\r\n    //%output P\r\n\r\n    // Projects 3D points to the image plane (Pi function)\r\n    void spaceToPlane(const Eigen::Vector3d& P, Eigen::Vector2d& p) const;\r\n    //%output p\r\n\r\n    // Projects 3D points to the image plane (Pi function)\r\n    // and calculates jacobian\r\n    void spaceToPlane(const Eigen::Vector3d& P, Eigen::Vector2d& p,\r\n                      Eigen::Matrix<double,2,3>& J) const;\r\n    //%output p\r\n    //%output J\r\n\r\n    void undistToPlane(const Eigen::Vector2d& p_u, Eigen::Vector2d& p) const;\r\n    //%output p\r\n\r\n    template <typename T>\r\n    static void spaceToPlane(const T* const params,\r\n                             const T* const q, const T* const t,\r\n                             const Eigen::Matrix<T, 3, 1>& P,\r\n                             Eigen::Matrix<T, 2, 1>& p);\r\n\r\n    void 
initUndistortMap(cv::Mat& map1, cv::Mat& map2, double fScale = 1.0) const;\r\n    cv::Mat initUndistortRectifyMap(cv::Mat& map1, cv::Mat& map2,\r\n                                    float fx = -1.0f, float fy = -1.0f,\r\n                                    cv::Size imageSize = cv::Size(0, 0),\r\n                                    float cx = -1.0f, float cy = -1.0f,\r\n                                    cv::Mat rmat = cv::Mat::eye(3, 3, CV_32F)) const;\r\n\r\n    int parameterCount(void) const;\r\n\r\n    const Parameters& getParameters(void) const;\r\n    void setParameters(const Parameters& parameters);\r\n\r\n    void readParameters(const std::vector<double>& parameterVec);\r\n    void writeParameters(std::vector<double>& parameterVec) const;\r\n\r\n    void writeParametersToYamlFile(const std::string& filename) const;\r\n\r\n    std::string parametersToString(void) const;\r\n\r\nprivate:\r\n    template<typename T>\r\n    static T r(T k2, T k3, T k4, T k5, T theta);\r\n\r\n\r\n    void fitOddPoly(const std::vector<double>& x, const std::vector<double>& y,\r\n                    int n, std::vector<double>& coeffs) const;\r\n\r\n    void backprojectSymmetric(const Eigen::Vector2d& p_u,\r\n                              double& theta, double& phi) const;\r\n\r\n    Parameters mParameters;\r\n\r\n    double m_inv_K11, m_inv_K13, m_inv_K22, m_inv_K23;\r\n};\r\n\r\ntypedef boost::shared_ptr<EquidistantCamera> EquidistantCameraPtr;\r\ntypedef boost::shared_ptr<const EquidistantCamera> EquidistantCameraConstPtr;\r\n\r\ntemplate<typename T>\r\nT\r\nEquidistantCamera::r(T k2, T k3, T k4, T k5, T theta)\r\n{\r\n    // k1 = 1\r\n    return theta +\r\n           k2 * theta * theta * theta +\r\n           k3 * theta * theta * theta * theta * theta +\r\n           k4 * theta * theta * theta * theta * theta * theta * theta +\r\n           k5 * theta * theta * theta * theta * theta * theta * theta * theta * theta;\r\n}\r\n\r\ntemplate <typename 
T>\r\nvoid\r\nEquidistantCamera::spaceToPlane(const T* const params,\r\n                                const T* const q, const T* const t,\r\n                                const Eigen::Matrix<T, 3, 1>& P,\r\n                                Eigen::Matrix<T, 2, 1>& p)\r\n{\r\n    T P_w[3];\r\n    P_w[0] = T(P(0));\r\n    P_w[1] = T(P(1));\r\n    P_w[2] = T(P(2));\r\n\r\n    // Convert quaternion from Eigen convention (x, y, z, w)\r\n    // to Ceres convention (w, x, y, z)\r\n    T q_ceres[4] = {q[3], q[0], q[1], q[2]};\r\n\r\n    T P_c[3];\r\n    ceres::QuaternionRotatePoint(q_ceres, P_w, P_c);\r\n\r\n    P_c[0] += t[0];\r\n    P_c[1] += t[1];\r\n    P_c[2] += t[2];\r\n\r\n    // project 3D object point to the image plane;\r\n    T k2 = params[0];\r\n    T k3 = params[1];\r\n    T k4 = params[2];\r\n    T k5 = params[3];\r\n    T mu = params[4];\r\n    T mv = params[5];\r\n    T u0 = params[6];\r\n    T v0 = params[7];\r\n\r\n    T len = sqrt(P_c[0] * P_c[0] + P_c[1] * P_c[1] + P_c[2] * P_c[2]);\r\n    T theta = acos(P_c[2] / len);\r\n    T phi = atan2(P_c[1], P_c[0]);\r\n\r\n    Eigen::Matrix<T,2,1> p_u = r(k2, k3, k4, k5, theta) * Eigen::Matrix<T,2,1>(cos(phi), sin(phi));\r\n\r\n    p(0) = mu * p_u(0) + u0;\r\n    p(1) = mv * p_u(1) + v0;\r\n}\r\n\r\n}\r\n\r\n#endif\r\n"
  },
  {
    "path": "camera_model/include/camodocal/camera_models/PinholeCamera.h",
    "content": "#ifndef PINHOLECAMERA_H\n#define PINHOLECAMERA_H\n\n#include <opencv2/core/core.hpp>\n#include <string>\n\n#include \"ceres/rotation.h\"\n#include \"Camera.h\"\n\nnamespace camodocal\n{\n\nclass PinholeCamera: public Camera\n{\npublic:\n    class Parameters: public Camera::Parameters\n    {\n    public:\n        Parameters();\n        Parameters(const std::string& cameraName,\n                   int w, int h,\n                   double k1, double k2, double p1, double p2,\n                   double fx, double fy, double cx, double cy);\n\n        double& k1(void);\n        double& k2(void);\n        double& p1(void);\n        double& p2(void);\n        double& fx(void);\n        double& fy(void);\n        double& cx(void);\n        double& cy(void);\n\n        double xi(void) const;\n        double k1(void) const;\n        double k2(void) const;\n        double p1(void) const;\n        double p2(void) const;\n        double fx(void) const;\n        double fy(void) const;\n        double cx(void) const;\n        double cy(void) const;\n\n        bool readFromYamlFile(const std::string& filename);\n        void writeToYamlFile(const std::string& filename) const;\n\n        Parameters& operator=(const Parameters& other);\n        friend std::ostream& operator<< (std::ostream& out, const Parameters& params);\n\n    private:\n        double m_k1;\n        double m_k2;\n        double m_p1;\n        double m_p2;\n        double m_fx;\n        double m_fy;\n        double m_cx;\n        double m_cy;\n    };\n\n    PinholeCamera();\n\n    /**\n    * \\brief Constructor from the projection model parameters\n    */\n    PinholeCamera(const std::string& cameraName,\n                  int imageWidth, int imageHeight,\n                  double k1, double k2, double p1, double p2,\n                  double fx, double fy, double cx, double cy);\n    /**\n    * \\brief Constructor from the projection model parameters\n    */\n    PinholeCamera(const Parameters& 
params);\n\n    Camera::ModelType modelType(void) const;\n    const std::string& cameraName(void) const;\n    int imageWidth(void) const;\n    int imageHeight(void) const;\n\n    void estimateIntrinsics(const cv::Size& boardSize,\n                            const std::vector< std::vector<cv::Point3f> >& objectPoints,\n                            const std::vector< std::vector<cv::Point2f> >& imagePoints);\n\n    // Lift points from the image plane to the sphere\n    virtual void liftSphere(const Eigen::Vector2d& p, Eigen::Vector3d& P) const;\n    //%output P\n\n    // Lift points from the image plane to the projective space\n    void liftProjective(const Eigen::Vector2d& p, Eigen::Vector3d& P) const;\n    //%output P\n\n    // Projects 3D points to the image plane (Pi function)\n    void spaceToPlane(const Eigen::Vector3d& P, Eigen::Vector2d& p) const;\n    //%output p\n\n    // Projects 3D points to the image plane (Pi function)\n    // and calculates jacobian\n    void spaceToPlane(const Eigen::Vector3d& P, Eigen::Vector2d& p,\n                      Eigen::Matrix<double,2,3>& J) const;\n    //%output p\n    //%output J\n\n    void undistToPlane(const Eigen::Vector2d& p_u, Eigen::Vector2d& p) const;\n    //%output p\n\n    template <typename T>\n    static void spaceToPlane(const T* const params,\n                             const T* const q, const T* const t,\n                             const Eigen::Matrix<T, 3, 1>& P,\n                             Eigen::Matrix<T, 2, 1>& p);\n\n    void distortion(const Eigen::Vector2d& p_u, Eigen::Vector2d& d_u) const;\n    void distortion(const Eigen::Vector2d& p_u, Eigen::Vector2d& d_u,\n                    Eigen::Matrix2d& J) const;\n\n    void initUndistortMap(cv::Mat& map1, cv::Mat& map2, double fScale = 1.0) const;\n    cv::Mat initUndistortRectifyMap(cv::Mat& map1, cv::Mat& map2,\n                                    float fx = -1.0f, float fy = -1.0f,\n                                    cv::Size imageSize = 
cv::Size(0, 0),\n                                    float cx = -1.0f, float cy = -1.0f,\n                                    cv::Mat rmat = cv::Mat::eye(3, 3, CV_32F)) const;\n\n    int parameterCount(void) const;\n\n    const Parameters& getParameters(void) const;\n    void setParameters(const Parameters& parameters);\n\n    void readParameters(const std::vector<double>& parameterVec);\n    void writeParameters(std::vector<double>& parameterVec) const;\n\n    void writeParametersToYamlFile(const std::string& filename) const;\n\n    std::string parametersToString(void) const;\n\nprivate:\n    Parameters mParameters;\n\n    double m_inv_K11, m_inv_K13, m_inv_K22, m_inv_K23;\n    bool m_noDistortion;\n};\n\ntypedef boost::shared_ptr<PinholeCamera> PinholeCameraPtr;\ntypedef boost::shared_ptr<const PinholeCamera> PinholeCameraConstPtr;\n\ntemplate <typename T>\nvoid\nPinholeCamera::spaceToPlane(const T* const params,\n                            const T* const q, const T* const t,\n                            const Eigen::Matrix<T, 3, 1>& P,\n                            Eigen::Matrix<T, 2, 1>& p)\n{\n    T P_w[3];\n    P_w[0] = T(P(0));\n    P_w[1] = T(P(1));\n    P_w[2] = T(P(2));\n\n    // Convert quaternion from Eigen convention (x, y, z, w)\n    // to Ceres convention (w, x, y, z)\n    T q_ceres[4] = {q[3], q[0], q[1], q[2]};\n\n    T P_c[3];\n    ceres::QuaternionRotatePoint(q_ceres, P_w, P_c);\n\n    P_c[0] += t[0];\n    P_c[1] += t[1];\n    P_c[2] += t[2];\n\n    // project 3D object point to the image plane\n    T k1 = params[0];\n    T k2 = params[1];\n    T p1 = params[2];\n    T p2 = params[3];\n    T fx = params[4];\n    T fy = params[5];\n    T alpha = T(0); //cameraParams.alpha();\n    T cx = params[6];\n    T cy = params[7];\n\n    // Transform to model plane\n    T u = P_c[0] / P_c[2];\n    T v = P_c[1] / P_c[2];\n\n    T rho_sqr = u * u + v * v;\n    T L = T(1.0) + k1 * rho_sqr + k2 * rho_sqr * rho_sqr;\n    T du = T(2.0) * p1 * u * v + p2 * (rho_sqr 
+ T(2.0) * u * u);\n    T dv = p1 * (rho_sqr + T(2.0) * v * v) + T(2.0) * p2 * u * v;\n\n    u = L * u + du;\n    v = L * v + dv;\n    p(0) = fx * (u + alpha * v) + cx;\n    p(1) = fy * v + cy;\n}\n\n}\n\n#endif\n"
  },
  {
    "path": "camera_model/include/camodocal/camera_models/ScaramuzzaCamera.h",
    "content": "#ifndef SCARAMUZZACAMERA_H\r\n#define SCARAMUZZACAMERA_H\r\n\r\n#include <opencv2/core/core.hpp>\r\n#include <string>\r\n\r\n#include \"ceres/rotation.h\"\r\n#include \"Camera.h\"\r\n\r\nnamespace camodocal\r\n{\r\n\r\n#define SCARAMUZZA_POLY_SIZE 5\r\n#define SCARAMUZZA_INV_POLY_SIZE 20\r\n\r\n#define SCARAMUZZA_CAMERA_NUM_PARAMS (SCARAMUZZA_POLY_SIZE + SCARAMUZZA_INV_POLY_SIZE + 2 /*center*/ + 3 /*affine*/)\r\n\r\n/**\r\n * Scaramuzza Camera (Omnidirectional)\r\n * https://sites.google.com/site/scarabotix/ocamcalib-toolbox\r\n */\r\n\r\nclass OCAMCamera: public Camera\r\n{\r\npublic:\r\n    class Parameters: public Camera::Parameters\r\n    {\r\n    public:\r\n        Parameters();\r\n\r\n        double& C(void) { return m_C; }\r\n        double& D(void) { return m_D; }\r\n        double& E(void) { return m_E; }\r\n\r\n        double& center_x(void) { return m_center_x; }\r\n        double& center_y(void) { return m_center_y; }\r\n\r\n        double& poly(int idx) { return m_poly[idx]; }\r\n        double& inv_poly(int idx) { return m_inv_poly[idx]; }\r\n\r\n        double C(void) const { return m_C; }\r\n        double D(void) const { return m_D; }\r\n        double E(void) const { return m_E; }\r\n\r\n        double center_x(void) const { return m_center_x; }\r\n        double center_y(void) const { return m_center_y; }\r\n\r\n        double poly(int idx) const { return m_poly[idx]; }\r\n        double inv_poly(int idx) const { return m_inv_poly[idx]; }\r\n\r\n        bool readFromYamlFile(const std::string& filename);\r\n        void writeToYamlFile(const std::string& filename) const;\r\n\r\n        Parameters& operator=(const Parameters& other);\r\n        friend std::ostream& operator<< (std::ostream& out, const Parameters& params);\r\n\r\n    private:\r\n        double m_poly[SCARAMUZZA_POLY_SIZE];\r\n        double m_inv_poly[SCARAMUZZA_INV_POLY_SIZE];\r\n        double m_C;\r\n        double m_D;\r\n        double m_E;\r\n        double 
m_center_x;\r\n        double m_center_y;\r\n    };\r\n\r\n    OCAMCamera();\r\n\r\n    /**\r\n    * \\brief Constructor from the projection model parameters\r\n    */\r\n    OCAMCamera(const Parameters& params);\r\n\r\n    Camera::ModelType modelType(void) const;\r\n    const std::string& cameraName(void) const;\r\n    int imageWidth(void) const;\r\n    int imageHeight(void) const;\r\n\r\n    void estimateIntrinsics(const cv::Size& boardSize,\r\n                            const std::vector< std::vector<cv::Point3f> >& objectPoints,\r\n                            const std::vector< std::vector<cv::Point2f> >& imagePoints);\r\n\r\n    // Lift points from the image plane to the sphere\r\n    void liftSphere(const Eigen::Vector2d& p, Eigen::Vector3d& P) const;\r\n    //%output P\r\n\r\n    // Lift points from the image plane to the projective space\r\n    void liftProjective(const Eigen::Vector2d& p, Eigen::Vector3d& P) const;\r\n    //%output P\r\n\r\n    // Projects 3D points to the image plane (Pi function)\r\n    void spaceToPlane(const Eigen::Vector3d& P, Eigen::Vector2d& p) const;\r\n    //%output p\r\n\r\n    // Projects 3D points to the image plane (Pi function)\r\n    // and calculates jacobian\r\n    //void spaceToPlane(const Eigen::Vector3d& P, Eigen::Vector2d& p,\r\n    //                  Eigen::Matrix<double,2,3>& J) const;\r\n    //%output p\r\n    //%output J\r\n\r\n    void undistToPlane(const Eigen::Vector2d& p_u, Eigen::Vector2d& p) const;\r\n    //%output p\r\n\r\n    template <typename T>\r\n    static void spaceToPlane(const T* const params,\r\n                             const T* const q, const T* const t,\r\n                             const Eigen::Matrix<T, 3, 1>& P,\r\n                             Eigen::Matrix<T, 2, 1>& p);\r\n    template <typename T>\r\n    static void spaceToSphere(const T* const params,\r\n                              const T* const q, const T* const t,\r\n                              const Eigen::Matrix<T, 3, 1>& 
P,\r\n                              Eigen::Matrix<T, 3, 1>& P_s);\r\n    template <typename T>\r\n    static void LiftToSphere(const T* const params,\r\n                              const Eigen::Matrix<T, 2, 1>& p,\r\n                              Eigen::Matrix<T, 3, 1>& P);\r\n\r\n    template <typename T>\r\n    static void SphereToPlane(const T* const params, const Eigen::Matrix<T, 3, 1>& P,\r\n                               Eigen::Matrix<T, 2, 1>& p);\r\n\r\n\r\n    void initUndistortMap(cv::Mat& map1, cv::Mat& map2, double fScale = 1.0) const;\r\n    cv::Mat initUndistortRectifyMap(cv::Mat& map1, cv::Mat& map2,\r\n                                    float fx = -1.0f, float fy = -1.0f,\r\n                                    cv::Size imageSize = cv::Size(0, 0),\r\n                                    float cx = -1.0f, float cy = -1.0f,\r\n                                    cv::Mat rmat = cv::Mat::eye(3, 3, CV_32F)) const;\r\n\r\n    int parameterCount(void) const;\r\n\r\n    const Parameters& getParameters(void) const;\r\n    void setParameters(const Parameters& parameters);\r\n\r\n    void readParameters(const std::vector<double>& parameterVec);\r\n    void writeParameters(std::vector<double>& parameterVec) const;\r\n\r\n    void writeParametersToYamlFile(const std::string& filename) const;\r\n\r\n    std::string parametersToString(void) const;\r\n\r\nprivate:\r\n    Parameters mParameters;\r\n\r\n    double m_inv_scale;\r\n};\r\n\r\ntypedef boost::shared_ptr<OCAMCamera> OCAMCameraPtr;\r\ntypedef boost::shared_ptr<const OCAMCamera> OCAMCameraConstPtr;\r\n\r\ntemplate <typename T>\r\nvoid\r\nOCAMCamera::spaceToPlane(const T* const params,\r\n                         const T* const q, const T* const t,\r\n                         const Eigen::Matrix<T, 3, 1>& P,\r\n                         Eigen::Matrix<T, 2, 1>& p)\r\n{\r\n    T P_c[3];\r\n    {\r\n        T P_w[3];\r\n        P_w[0] = T(P(0));\r\n        P_w[1] = T(P(1));\r\n        P_w[2] = T(P(2));\r\n\r\n   
     // Convert quaternion from Eigen convention (x, y, z, w)\r\n        // to Ceres convention (w, x, y, z)\r\n        T q_ceres[4] = {q[3], q[0], q[1], q[2]};\r\n\r\n        ceres::QuaternionRotatePoint(q_ceres, P_w, P_c);\r\n\r\n        P_c[0] += t[0];\r\n        P_c[1] += t[1];\r\n        P_c[2] += t[2];\r\n    }\r\n\r\n    T c = params[0];\r\n    T d = params[1];\r\n    T e = params[2];\r\n    T xc[2] = { params[3], params[4] };\r\n\r\n    //T poly[SCARAMUZZA_POLY_SIZE];\r\n    //for (int i=0; i < SCARAMUZZA_POLY_SIZE; i++)\r\n    //    poly[i] = params[5+i];\r\n\r\n    T inv_poly[SCARAMUZZA_INV_POLY_SIZE];\r\n    for (int i=0; i < SCARAMUZZA_INV_POLY_SIZE; i++)\r\n        inv_poly[i] = params[5 + SCARAMUZZA_POLY_SIZE + i];\r\n\r\n    T norm_sqr = P_c[0] * P_c[0] + P_c[1] * P_c[1];\r\n    T norm = T(0.0);\r\n    if (norm_sqr > T(0.0))\r\n        norm = sqrt(norm_sqr);\r\n\r\n    T theta = atan2(-P_c[2], norm);\r\n    T rho = T(0.0);\r\n    T theta_i = T(1.0);\r\n\r\n    for (int i = 0; i < SCARAMUZZA_INV_POLY_SIZE; i++)\r\n    {\r\n        rho += theta_i * inv_poly[i];\r\n        theta_i *= theta;\r\n    }\r\n\r\n    T invNorm = T(1.0) / norm;\r\n    T xn[2] = {\r\n        P_c[0] * invNorm * rho,\r\n        P_c[1] * invNorm * rho\r\n    };\r\n\r\n    p(0) = xn[0] * c + xn[1] * d + xc[0];\r\n    p(1) = xn[0] * e + xn[1]     + xc[1];\r\n}\r\n\r\ntemplate <typename T>\r\nvoid\r\nOCAMCamera::spaceToSphere(const T* const params,\r\n                          const T* const q, const T* const t,\r\n                          const Eigen::Matrix<T, 3, 1>& P,\r\n                          Eigen::Matrix<T, 3, 1>& P_s)\r\n{\r\n    T P_c[3];\r\n    {\r\n        T P_w[3];\r\n        P_w[0] = T(P(0));\r\n        P_w[1] = T(P(1));\r\n        P_w[2] = T(P(2));\r\n\r\n        // Convert quaternion from Eigen convention (x, y, z, w)\r\n        // to Ceres convention (w, x, y, z)\r\n        T q_ceres[4] = {q[3], q[0], q[1], q[2]};\r\n\r\n        
ceres::QuaternionRotatePoint(q_ceres, P_w, P_c);\r\n\r\n        P_c[0] += t[0];\r\n        P_c[1] += t[1];\r\n        P_c[2] += t[2];\r\n    }\r\n\r\n    //T poly[SCARAMUZZA_POLY_SIZE];\r\n    //for (int i=0; i < SCARAMUZZA_POLY_SIZE; i++)\r\n    //    poly[i] = params[5+i];\r\n\r\n    T norm_sqr = P_c[0] * P_c[0] + P_c[1] * P_c[1] + P_c[2] * P_c[2];\r\n    T norm = T(0.0);\r\n    if (norm_sqr > T(0.0))\r\n        norm = sqrt(norm_sqr);\r\n\r\n    P_s(0) = P_c[0] / norm;\r\n    P_s(1) = P_c[1] / norm;\r\n    P_s(2) = P_c[2] / norm;\r\n}\r\n\r\ntemplate <typename T>\r\nvoid\r\nOCAMCamera::LiftToSphere(const T* const params,\r\n                          const Eigen::Matrix<T, 2, 1>& p,\r\n                          Eigen::Matrix<T, 3, 1>& P)\r\n{\r\n    T c = params[0];\r\n    T d = params[1];\r\n    T e = params[2];\r\n    T cc[2] = { params[3], params[4] };\r\n    T poly[SCARAMUZZA_POLY_SIZE];\r\n    for (int i=0; i < SCARAMUZZA_POLY_SIZE; i++)\r\n       poly[i] = params[5+i];\r\n\r\n    // Relative to Center\r\n    T p_2d[2];\r\n    p_2d[0] = T(p(0));\r\n    p_2d[1] = T(p(1));\r\n\r\n    T xc[2] = { p_2d[0] - cc[0], p_2d[1] - cc[1]};\r\n\r\n    T inv_scale = T(1.0) / (c - d * e);\r\n\r\n    // Affine Transformation\r\n    T xc_a[2];\r\n\r\n    xc_a[0] = inv_scale * (xc[0] - d * xc[1]);\r\n    xc_a[1] = inv_scale * (-e * xc[0] + c * xc[1]);\r\n\r\n    T norm_sqr = xc_a[0] * xc_a[0] + xc_a[1] * xc_a[1];\r\n    T phi = sqrt(norm_sqr);\r\n    T phi_i = T(1.0);\r\n    T z = T(0.0);\r\n\r\n    for (int i = 0; i < SCARAMUZZA_POLY_SIZE; i++)\r\n    {\r\n        if (i!=1) {\r\n            z += phi_i * poly[i];\r\n        }\r\n        phi_i *= phi;\r\n    }\r\n\r\n    T p_3d[3];\r\n    p_3d[0] = xc[0];\r\n    p_3d[1] = xc[1];\r\n    p_3d[2] = -z;\r\n\r\n    T p_3d_norm_sqr = p_3d[0] * p_3d[0] + p_3d[1] * p_3d[1] + p_3d[2] * p_3d[2];\r\n    T p_3d_norm = sqrt(p_3d_norm_sqr);\r\n\r\n    P << p_3d[0] / p_3d_norm, p_3d[1] / p_3d_norm, p_3d[2] / p_3d_norm;\r\n}\r\n\r\ntemplate 
<typename T>\r\nvoid OCAMCamera::SphereToPlane(const T* const params, const Eigen::Matrix<T, 3, 1>& P,\r\n                               Eigen::Matrix<T, 2, 1>& p) {\r\n    T P_c[3];\r\n    {\r\n        P_c[0] = T(P(0));\r\n        P_c[1] = T(P(1));\r\n        P_c[2] = T(P(2));\r\n    }\r\n\r\n    T c = params[0];\r\n    T d = params[1];\r\n    T e = params[2];\r\n    T xc[2] = {params[3], params[4]};\r\n\r\n    T inv_poly[SCARAMUZZA_INV_POLY_SIZE];\r\n    for (int i = 0; i < SCARAMUZZA_INV_POLY_SIZE; i++)\r\n        inv_poly[i] = params[5 + SCARAMUZZA_POLY_SIZE + i];\r\n\r\n    T norm_sqr = P_c[0] * P_c[0] + P_c[1] * P_c[1];\r\n    T norm = T(0.0);\r\n    if (norm_sqr > T(0.0)) norm = sqrt(norm_sqr);\r\n\r\n    T theta = atan2(-P_c[2], norm);\r\n    T rho = T(0.0);\r\n    T theta_i = T(1.0);\r\n\r\n    for (int i = 0; i < SCARAMUZZA_INV_POLY_SIZE; i++) {\r\n        rho += theta_i * inv_poly[i];\r\n        theta_i *= theta;\r\n    }\r\n\r\n    T invNorm = T(1.0) / norm;\r\n    T xn[2] = {P_c[0] * invNorm * rho, P_c[1] * invNorm * rho};\r\n\r\n    p(0) = xn[0] * c + xn[1] * d + xc[0];\r\n    p(1) = xn[0] * e + xn[1] + xc[1];\r\n}\r\n}\r\n\r\n#endif\r\n"
  },
  {
    "path": "camera_model/include/camodocal/chessboard/Chessboard.h",
    "content": "#ifndef CHESSBOARD_H\n#define CHESSBOARD_H\n\n#include <boost/shared_ptr.hpp>\n#include <opencv2/core/core.hpp>\n\nnamespace camodocal\n{\n\n// forward declarations\nclass ChessboardCorner;\ntypedef boost::shared_ptr<ChessboardCorner> ChessboardCornerPtr;\nclass ChessboardQuad;\ntypedef boost::shared_ptr<ChessboardQuad> ChessboardQuadPtr;\n\nclass Chessboard\n{\npublic:\n    Chessboard(cv::Size boardSize, cv::Mat& image);\n\n    void findCorners(bool useOpenCV = false);\n    const std::vector<cv::Point2f>& getCorners(void) const;\n    bool cornersFound(void) const;\n\n    const cv::Mat& getImage(void) const;\n    const cv::Mat& getSketch(void) const;\n\nprivate:\n    bool findChessboardCorners(const cv::Mat& image,\n                               const cv::Size& patternSize,\n                               std::vector<cv::Point2f>& corners,\n                               int flags, bool useOpenCV);\n\n    bool findChessboardCornersImproved(const cv::Mat& image,\n                                       const cv::Size& patternSize,\n                                       std::vector<cv::Point2f>& corners,\n                                       int flags);\n\n    void cleanFoundConnectedQuads(std::vector<ChessboardQuadPtr>& quadGroup, cv::Size patternSize);\n\n    void findConnectedQuads(std::vector<ChessboardQuadPtr>& quads,\n                            std::vector<ChessboardQuadPtr>& group,\n                            int group_idx, int dilation);\n\n//    int checkQuadGroup(std::vector<ChessboardQuadPtr>& quadGroup,\n//                       std::vector<ChessboardCornerPtr>& outCorners,\n//                       cv::Size patternSize);\n\n    void labelQuadGroup(std::vector<ChessboardQuadPtr>& quad_group,\n                        cv::Size patternSize, bool firstRun);\n\n    void findQuadNeighbors(std::vector<ChessboardQuadPtr>& quads, int dilation);\n\n    int augmentBestRun(std::vector<ChessboardQuadPtr>& candidateQuads, int candidateDilation,\n   
                    std::vector<ChessboardQuadPtr>& existingQuads, int existingDilation);\n\n    void generateQuads(std::vector<ChessboardQuadPtr>& quads,\n                       cv::Mat& image, int flags,\n                       int dilation, bool firstRun);\n\n    bool checkQuadGroup(std::vector<ChessboardQuadPtr>& quads,\n                        std::vector<ChessboardCornerPtr>& corners,\n                        cv::Size patternSize);\n\n    void getQuadrangleHypotheses(const std::vector< std::vector<cv::Point> >& contours,\n                                 std::vector< std::pair<float, int> >& quads,\n                                 int classId) const;\n\n    bool checkChessboard(const cv::Mat& image, cv::Size patternSize) const;\n\n    bool checkBoardMonotony(std::vector<ChessboardCornerPtr>& corners,\n                            cv::Size patternSize);\n\n    bool matchCorners(ChessboardQuadPtr& quad1, int corner1,\n                      ChessboardQuadPtr& quad2, int corner2) const;\n\n    cv::Mat mImage;\n    cv::Mat mSketch;\n    std::vector<cv::Point2f> mCorners;\n    cv::Size mBoardSize;\n    bool mCornersFound;\n};\n\n}\n\n#endif\n"
  },
  {
    "path": "camera_model/include/camodocal/chessboard/ChessboardCorner.h",
    "content": "#ifndef CHESSBOARDCORNER_H\n#define CHESSBOARDCORNER_H\n\n#include <boost/shared_ptr.hpp>\n#include <opencv2/core/core.hpp>\n\nnamespace camodocal\n{\n\nclass ChessboardCorner;\ntypedef boost::shared_ptr<ChessboardCorner> ChessboardCornerPtr;\n\nclass ChessboardCorner\n{\npublic:\n    ChessboardCorner() : row(0), column(0), needsNeighbor(true), count(0) {}\n\n    float meanDist(int &n) const\n    {\n        float sum = 0;\n        n = 0;\n        for (int i = 0; i < 4; ++i)\n        {\n            if (neighbors[i].get())\n            {\n                float dx = neighbors[i]->pt.x - pt.x;\n                float dy = neighbors[i]->pt.y - pt.y;\n                sum += sqrt(dx*dx + dy*dy);\n                n++;\n            }\n        }\n        return sum / std::max(n, 1);\n    }\n\n    cv::Point2f pt;                     // X and y coordinates\n    int row;                            // Row and column of the corner\n    int column;                         // in the found pattern\n    bool needsNeighbor;                 // Does the corner require a neighbor?\n    int count;                          // number of corner neighbors\n    ChessboardCornerPtr neighbors[4];   // pointer to all corner neighbors\n};\n\n}\n\n#endif\n"
  },
  {
    "path": "camera_model/include/camodocal/chessboard/ChessboardQuad.h",
    "content": "#ifndef CHESSBOARDQUAD_H\n#define CHESSBOARDQUAD_H\n\n#include <boost/shared_ptr.hpp>\n\n#include \"camodocal/chessboard/ChessboardCorner.h\"\n\nnamespace camodocal\n{\n\nclass ChessboardQuad;\ntypedef boost::shared_ptr<ChessboardQuad> ChessboardQuadPtr;\n\nclass ChessboardQuad\n{\npublic:\n    ChessboardQuad() : count(0), group_idx(-1), edge_len(FLT_MAX), labeled(false) {}\n\n    int count;                         // Number of quad neighbors\n    int group_idx;                     // Quad group ID\n    float edge_len;                    // Smallest side length^2\n    ChessboardCornerPtr corners[4];    // Coordinates of quad corners\n    ChessboardQuadPtr neighbors[4];    // Pointers of quad neighbors\n    bool labeled;                      // Has this corner been labeled?\n};\n\n}\n\n#endif\n"
  },
  {
    "path": "camera_model/include/camodocal/chessboard/Spline.h",
    "content": "/*  dynamo:- Event driven molecular dynamics simulator\n    http://www.marcusbannerman.co.uk/dynamo\n    Copyright (C) 2011  Marcus N Campbell Bannerman <m.bannerman@gmail.com>\n\n    This program is free software: you can redistribute it and/or\n    modify it under the terms of the GNU General Public License\n    version 3 as published by the Free Software Foundation.\n\n    This program is distributed in the hope that it will be useful,\n    but WITHOUT ANY WARRANTY; without even the implied warranty of\n    MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the\n    GNU General Public License for more details.\n\n    You should have received a copy of the GNU General Public License\n    along with this program.  If not, see <http://www.gnu.org/licenses/>.\n*/\n\n#pragma once\n\n#include <boost/numeric/ublas/vector.hpp>\n#include <boost/numeric/ublas/vector_proxy.hpp>\n#include <boost/numeric/ublas/matrix.hpp>\n#include <boost/numeric/ublas/triangular.hpp>\n#include <boost/numeric/ublas/lu.hpp>\n#include <exception>\n\nnamespace ublas = boost::numeric::ublas;\n\nclass Spline : private std::vector<std::pair<double, double> >\n{\npublic:\n  //The boundary conditions available\n  enum BC_type {\n\tFIXED_1ST_DERIV_BC,\n\tFIXED_2ND_DERIV_BC,\n\tPARABOLIC_RUNOUT_BC\n  };\n\n  enum Spline_type {\n\tLINEAR,\n\tCUBIC\n  };\n\n  //Constructor takes the boundary conditions as arguments, this\n  //sets the first derivative (gradient) at the lower and upper\n  //end points\n  Spline():\n\t_valid(false),\n\t_BCLow(FIXED_2ND_DERIV_BC), _BCHigh(FIXED_2ND_DERIV_BC),\n\t_BCLowVal(0), _BCHighVal(0),\n\t_type(CUBIC)\n  {}\n\n  typedef std::vector<std::pair<double, double> > base;\n  typedef base::const_iterator const_iterator;\n\n  //Standard STL read-only container stuff\n  const_iterator begin() const { return base::begin(); }\n  const_iterator end() const { return base::end(); }\n  void clear() { _valid = false; base::clear(); _data.clear(); }\n  size_t 
size() const { return base::size(); }\n  size_t max_size() const { return base::max_size(); }\n  size_t capacity() const { return base::capacity(); }\n  bool empty() const { return base::empty(); }\n\n  //Add a point to the spline, and invalidate it so its\n  //recalculated on the next access\n  inline void addPoint(double x, double y)\n  {\n\t_valid = false;\n\tbase::push_back(std::pair<double, double>(x,y));\n  }\n\n  //Reset the boundary conditions\n  inline void setLowBC(BC_type BC, double val = 0)\n  { _BCLow = BC; _BCLowVal = val; _valid = false; }\n\n  inline void setHighBC(BC_type BC, double val = 0)\n  { _BCHigh = BC; _BCHighVal = val; _valid = false; }\n\n  void setType(Spline_type type) { _type = type; _valid = false; }\n\n  //Check if the spline has been calculated, then generate the\n  //spline interpolated value\n  double operator()(double xval)\n  {\n\tif (!_valid) generate();\n\n\t//Special cases when we're outside the range of the spline points\n\tif (xval <= x(0)) return lowCalc(xval);\n\tif (xval >= x(size()-1)) return highCalc(xval);\n\n\t//Check all intervals except the last one\n\tfor (std::vector<SplineData>::const_iterator iPtr = _data.begin();\n\t\t iPtr != _data.end()-1; ++iPtr)\n\t\tif ((xval >= iPtr->x) && (xval <= (iPtr+1)->x))\n\t\t  return splineCalc(iPtr, xval);\n\n\treturn splineCalc(_data.end() - 1, xval);\n  }\n\nprivate:\n\n  ///////PRIVATE DATA MEMBERS\n  struct SplineData { double x,a,b,c,d; };\n  //vector of calculated spline data\n  std::vector<SplineData> _data;\n  //Second derivative at each point\n  ublas::vector<double> _ddy;\n  //Tracks whether the spline parameters have been calculated for\n  //the current set of points\n  bool _valid;\n  //The boundary conditions\n  BC_type _BCLow, _BCHigh;\n  //The values of the boundary conditions\n  double _BCLowVal, _BCHighVal;\n\n  Spline_type _type;\n\n  ///////PRIVATE FUNCTIONS\n  //Function to calculate the value of a given spline at a point xval\n  inline double 
splineCalc(std::vector<SplineData>::const_iterator i, double xval)\n  {\n\tconst double lx = xval - i->x;\n\treturn ((i->a * lx + i->b) * lx + i->c) * lx + i->d;\n  }\n\n  inline double lowCalc(double xval)\n  {\n\tconst double lx = xval - x(0);\n\n\tif (_type == LINEAR)\n\t  return lx * _BCLowVal + y(0);\n\n\tconst double firstDeriv = (y(1) - y(0)) / h(0) - 2 * h(0) * (_data[0].b + 2 * _data[1].b) / 6;\n\n\tswitch(_BCLow)\n\t  {\n\t  case FIXED_1ST_DERIV_BC:\n\t\treturn lx * _BCLowVal + y(0);\n\t  case FIXED_2ND_DERIV_BC:\n\t\t  return lx * lx * _BCLowVal + firstDeriv * lx + y(0);\n\t  case PARABOLIC_RUNOUT_BC:\n\t\treturn lx * lx * _ddy[0] + lx * firstDeriv  + y(0);\n\t  }\n\tthrow std::runtime_error(\"Unknown BC\");\n  }\n\n  inline double highCalc(double xval)\n  {\n\tconst double lx = xval - x(size() - 1);\n\n\tif (_type == LINEAR)\n\t  return lx * _BCHighVal + y(size() - 1);\n\n\tconst double firstDeriv = 2 * h(size() - 2) * (_ddy[size() - 2] + 2 * _ddy[size() - 1]) / 6 + (y(size() - 1) - y(size() - 2)) / h(size() - 2);\n\n\tswitch(_BCHigh)\n\t  {\n\t  case FIXED_1ST_DERIV_BC:\n\t\treturn lx * _BCHighVal + y(size() - 1);\n\t  case FIXED_2ND_DERIV_BC:\n\t\treturn lx * lx * _BCHighVal + firstDeriv * lx + y(size() - 1);\n\t  case PARABOLIC_RUNOUT_BC:\n\t\treturn lx * lx * _ddy[size()-1] + lx * firstDeriv  + y(size() - 1);\n\t  }\n\tthrow std::runtime_error(\"Unknown BC\");\n  }\n\n  //These just provide access to the point data in a clean way\n  inline double x(size_t i) const { return operator[](i).first; }\n  inline double y(size_t i) const { return operator[](i).second; }\n  inline double h(size_t i) const { return x(i+1) - x(i); }\n\n  //Invert an arbitrary matrix using the boost ublas library\n  template<class T>\n  bool InvertMatrix(ublas::matrix<T> A,\n\t\tublas::matrix<T>& inverse)\n  {\n\tusing namespace ublas;\n\n\t// create a permutation matrix for the LU-factorization\n\tpermutation_matrix<std::size_t> pm(A.size1());\n\n\t// perform 
LU-factorization\n\tint res = lu_factorize(A,pm);\n\tif( res != 0 ) return false;\n\n\t// create identity matrix of \"inverse\"\n\tinverse.assign(ublas::identity_matrix<T>(A.size1()));\n\n\t// backsubstitute to get the inverse\n\tlu_substitute(A, pm, inverse);\n\n\treturn true;\n  }\n\n  //This function will recalculate the spline parameters and store\n  //them in _data, ready for spline interpolation\n  void generate()\n  {\n\tif (size() < 2)\n\t  throw std::runtime_error(\"Spline requires at least 2 points\");\n\n\t//If any spline points are at the same x location, we have to\n\t//just slightly separate them\n\t{\n\t  bool testPassed(false);\n\t  while (!testPassed)\n\t\t{\n\t\t  testPassed = true;\n\t\t  std::sort(base::begin(), base::end());\n\n\t\t  for (base::iterator iPtr = base::begin(); iPtr != base::end() - 1; ++iPtr)\n\t\tif (iPtr->first == (iPtr+1)->first)\n\t\t  {\n\t\t\tif ((iPtr+1)->first != 0)\n\t\t\t  (iPtr+1)->first += (iPtr+1)->first\n\t\t\t* std::numeric_limits<double>::epsilon() * 10;\n\t\t\telse\n\t\t\t  (iPtr+1)->first = std::numeric_limits<double>::epsilon() * 10;\n\t\t\ttestPassed = false;\n\t\t\tbreak;\n\t\t  }\n\t\t}\n\t}\n\n\tconst size_t e = size() - 1;\n\n\tswitch (_type)\n\t  {\n\t  case LINEAR:\n\t\t{\n\t\t  _data.resize(e);\n\t\t  for (size_t i(0); i < e; ++i)\n\t\t{\n\t\t  _data[i].x = x(i);\n\t\t  _data[i].a = 0;\n\t\t  _data[i].b = 0;\n\t\t  _data[i].c = (y(i+1) - y(i)) / (x(i+1) - x(i));\n\t\t  _data[i].d = y(i);\n\t\t}\n\t\t  break;\n\t\t}\n\t  case CUBIC:\n\t\t{\n\t\t  ublas::matrix<double> A(size(), size());\n\t\t  for (size_t yv(0); yv <= e; ++yv)\n\t\tfor (size_t xv(0); xv <= e; ++xv)\n\t\t  A(xv,yv) = 0;\n\n\t\t  for (size_t i(1); i < e; ++i)\n\t\t{\n\t\t  A(i-1,i) = h(i-1);\n\t\t  A(i,i) = 2 * (h(i-1) + h(i));\n\t\t  A(i+1,i) = h(i);\n\t\t}\n\n\t\t  ublas::vector<double> C(size());\n\t\t  for (size_t xv(0); xv <= e; ++xv)\n\t\tC(xv) = 0;\n\n\t\t  for (size_t i(1); i < e; ++i)\n\t\tC(i) = 6 *\n\t\t  ((y(i+1) - y(i)) / 
h(i)\n\t\t   - (y(i) - y(i-1)) / h(i-1));\n\n\t\t  //Boundary conditions\n\t\t  switch(_BCLow)\n\t\t{\n\t\tcase FIXED_1ST_DERIV_BC:\n\t\t  C(0) = 6 * ((y(1) - y(0)) / h(0) - _BCLowVal);\n\t\t  A(0,0) = 2 * h(0);\n\t\t  A(1,0) = h(0);\n\t\t  break;\n\t\tcase FIXED_2ND_DERIV_BC:\n\t\t  C(0) = _BCLowVal;\n\t\t  A(0,0) = 1;\n\t\t  break;\n\t\tcase PARABOLIC_RUNOUT_BC:\n\t\t  C(0) = 0; A(0,0) = 1; A(1,0) = -1;\n\t\t  break;\n\t\t}\n\n\t\t  switch(_BCHigh)\n\t\t{\n\t\tcase FIXED_1ST_DERIV_BC:\n\t\t  C(e) = 6 * (_BCHighVal - (y(e) - y(e-1)) / h(e-1));\n\t\t  A(e,e) = 2 * h(e - 1);\n\t\t  A(e-1,e) = h(e - 1);\n\t\t  break;\n\t\tcase FIXED_2ND_DERIV_BC:\n\t\t  C(e) = _BCHighVal;\n\t\t  A(e,e) = 1;\n\t\t  break;\n\t\tcase PARABOLIC_RUNOUT_BC:\n\t\t  C(e) = 0; A(e,e) = 1; A(e-1,e) = -1;\n\t\t  break;\n\t\t}\n\n\t\t  ublas::matrix<double> AInv(size(), size());\n\t\t  InvertMatrix(A,AInv);\n\n\t\t  _ddy = ublas::prod(C, AInv);\n\n\t\t  _data.resize(size()-1);\n\t\t  for (size_t i(0); i < e; ++i)\n\t\t{\n\t\t  _data[i].x = x(i);\n\t\t  _data[i].a = (_ddy(i+1) - _ddy(i)) / (6 * h(i));\n\t\t  _data[i].b = _ddy(i) / 2;\n\t\t  _data[i].c = (y(i+1) - y(i)) / h(i) - _ddy(i+1) * h(i) / 6 - _ddy(i) * h(i) / 3;\n\t\t  _data[i].d = y(i);\n\t\t}\n\t\t}\n\t  }\n\t_valid = true;\n  }\n};\n"
  },
  {
    "path": "camera_model/include/camodocal/gpl/EigenQuaternionParameterization.h",
    "content": "#ifndef EIGENQUATERNIONPARAMETERIZATION_H\n#define EIGENQUATERNIONPARAMETERIZATION_H\n\n#include \"ceres/local_parameterization.h\"\n\nnamespace camodocal\n{\n\nclass EigenQuaternionParameterization : public ceres::LocalParameterization\n{\npublic:\n    virtual ~EigenQuaternionParameterization() {}\n    virtual bool Plus(const double* x,\n                      const double* delta,\n                      double* x_plus_delta) const;\n    virtual bool ComputeJacobian(const double* x,\n                                 double* jacobian) const;\n    virtual int GlobalSize() const { return 4; }\n    virtual int LocalSize() const { return 3; }\n\nprivate:\n    template<typename T>\n    void EigenQuaternionProduct(const T z[4], const T w[4], T zw[4]) const;\n};\n\n\ntemplate<typename T>\nvoid\nEigenQuaternionParameterization::EigenQuaternionProduct(const T z[4], const T w[4], T zw[4]) const\n{\n    zw[0] = z[3] * w[0] + z[0] * w[3] + z[1] * w[2] - z[2] * w[1];\n    zw[1] = z[3] * w[1] - z[0] * w[2] + z[1] * w[3] + z[2] * w[0];\n    zw[2] = z[3] * w[2] + z[0] * w[1] - z[1] * w[0] + z[2] * w[3];\n    zw[3] = z[3] * w[3] - z[0] * w[0] - z[1] * w[1] - z[2] * w[2];\n}\n\n}\n\n#endif\n\n"
  },
  {
    "path": "camera_model/include/camodocal/gpl/EigenUtils.h",
    "content": "#ifndef EIGENUTILS_H\n#define EIGENUTILS_H\n\n#include <eigen3/Eigen/Dense>\n\n#include \"ceres/rotation.h\"\n#include \"camodocal/gpl/gpl.h\"\n\nnamespace camodocal\n{\n\n// Returns the 3D cross product skew symmetric matrix of a given 3D vector\ntemplate<typename T>\nEigen::Matrix<T, 3, 3> skew(const Eigen::Matrix<T, 3, 1>& vec)\n{\n    return (Eigen::Matrix<T, 3, 3>() << T(0), -vec(2), vec(1),\n                                        vec(2), T(0), -vec(0),\n                                        -vec(1), vec(0), T(0)).finished();\n}\n\ntemplate<typename Derived>\ntypename Eigen::MatrixBase<Derived>::PlainObject sqrtm(const Eigen::MatrixBase<Derived>& A)\n{\n    Eigen::SelfAdjointEigenSolver<typename Derived::PlainObject> es(A);\n\n    return es.operatorSqrt();\n}\n\ntemplate<typename T>\nEigen::Matrix<T, 3, 3> AngleAxisToRotationMatrix(const Eigen::Matrix<T, 3, 1>& rvec)\n{\n    T angle = rvec.norm();\n    if (angle == T(0))\n    {\n        return Eigen::Matrix<T, 3, 3>::Identity();\n    }\n\n    Eigen::Matrix<T, 3, 1> axis;\n    axis = rvec.normalized();\n\n    Eigen::Matrix<T, 3, 3> rmat;\n    rmat = Eigen::AngleAxis<T>(angle, axis);\n\n    return rmat;\n}\n\ntemplate<typename T>\nEigen::Quaternion<T> AngleAxisToQuaternion(const Eigen::Matrix<T, 3, 1>& rvec)\n{\n    Eigen::Matrix<T, 3, 3> rmat = AngleAxisToRotationMatrix<T>(rvec);\n\n    return Eigen::Quaternion<T>(rmat);\n}\n\ntemplate<typename T>\nvoid AngleAxisToQuaternion(const Eigen::Matrix<T, 3, 1>& rvec, T* q)\n{\n    Eigen::Quaternion<T> quat = AngleAxisToQuaternion<T>(rvec);\n\n    q[0] = quat.x();\n    q[1] = quat.y();\n    q[2] = quat.z();\n    q[3] = quat.w();\n}\n\ntemplate<typename T>\nEigen::Matrix<T, 3, 1> RotationToAngleAxis(const Eigen::Matrix<T, 3, 3> & rmat)\n{\n    Eigen::AngleAxis<T> angleaxis; \n    angleaxis.fromRotationMatrix(rmat); \n    return angleaxis.angle() * angleaxis.axis(); \n    \n}\n\ntemplate<typename T>\nvoid QuaternionToAngleAxis(const T* const q, 
Eigen::Matrix<T, 3, 1>& rvec)\n{\n    Eigen::Quaternion<T> quat(q[3], q[0], q[1], q[2]);\n\n    Eigen::Matrix<T, 3, 3> rmat = quat.toRotationMatrix();\n\n    Eigen::AngleAxis<T> angleaxis;\n    angleaxis.fromRotationMatrix(rmat);\n\n    rvec = angleaxis.angle() * angleaxis.axis();\n}\n\ntemplate<typename T>\nEigen::Matrix<T, 3, 3> QuaternionToRotation(const T* const q)\n{\n    T R[9];\n    ceres::QuaternionToRotation(q, R);\n\n    Eigen::Matrix<T, 3, 3> rmat;\n    for (int i = 0; i < 3; ++i)\n    {\n        for (int j = 0; j < 3; ++j)\n        {\n            rmat(i,j) = R[i * 3 + j];\n        }\n    }\n\n    return rmat;\n}\n\ntemplate<typename T>\nvoid QuaternionToRotation(const T* const q, T* rot)\n{\n    ceres::QuaternionToRotation(q, rot);\n}\n\ntemplate<typename T>\nEigen::Matrix<T,4,4> QuaternionMultMatLeft(const Eigen::Quaternion<T>& q)\n{\n    return (Eigen::Matrix<T,4,4>() << q.w(), -q.z(), q.y(), q.x(),\n                                      q.z(), q.w(), -q.x(), q.y(),\n                                      -q.y(), q.x(), q.w(), q.z(),\n                                      -q.x(), -q.y(), -q.z(), q.w()).finished();\n}\n\ntemplate<typename T>\nEigen::Matrix<T,4,4> QuaternionMultMatRight(const Eigen::Quaternion<T>& q)\n{\n    return (Eigen::Matrix<T,4,4>() << q.w(), q.z(), -q.y(), q.x(),\n                                      -q.z(), q.w(), q.x(), q.y(),\n                                      q.y(), -q.x(), q.w(), q.z(),\n                                      -q.x(), -q.y(), -q.z(), q.w()).finished();\n}\n\n/// @param theta - rotation about screw axis\n/// @param d - projection of tvec on the rotation axis\n/// @param l - screw axis direction\n/// @param m - screw axis moment\ntemplate<typename T>\nvoid AngleAxisAndTranslationToScrew(const Eigen::Matrix<T, 3, 1>& rvec,\n                                    const Eigen::Matrix<T, 3, 1>& tvec,\n                                    T& theta, T& d,\n                                    Eigen::Matrix<T, 3, 1>& 
l,\n                                    Eigen::Matrix<T, 3, 1>& m)\n{\n    theta = rvec.norm();\n    if (theta == 0)\n    {\n        // screw axis is undefined for a zero rotation; bail out early\n        // instead of normalizing a zero vector and evaluating 1/tan(0)\n        l.setZero();\n        m.setZero();\n        std::cout << \"Warning: Undefined screw! Returned 0. \" << std::endl;\n        return;\n    }\n\n    l = rvec.normalized();\n\n    Eigen::Matrix<T, 3, 1> t = tvec;\n\n    d = t.transpose() * l;\n\n    // point on screw axis - projection of origin on screw axis\n    Eigen::Matrix<T, 3, 1> c;\n    c = 0.5 * (t - d * l + (1.0 / tan(theta / 2.0) * l).cross(t));\n\n    // c and hence the screw axis is not defined if theta is either 0 or M_PI\n    m = c.cross(l);\n}\n\ntemplate<typename T>\nEigen::Matrix<T, 3, 3> RPY2mat(T roll, T pitch, T yaw)\n{\n    Eigen::Matrix<T, 3, 3> m;\n\n    T cr = cos(roll);\n    T sr = sin(roll);\n    T cp = cos(pitch);\n    T sp = sin(pitch);\n    T cy = cos(yaw);\n    T sy = sin(yaw);\n\n    m(0,0) = cy * cp;\n    m(0,1) = cy * sp * sr - sy * cr;\n    m(0,2) = cy * sp * cr + sy * sr;\n    m(1,0) = sy * cp;\n    m(1,1) = sy * sp * sr + cy * cr;\n    m(1,2) = sy * sp * cr - cy * sr;\n    m(2,0) = - sp;\n    m(2,1) = cp * sr;\n    m(2,2) = cp * cr;\n    return m; \n}\n\ntemplate<typename T>\nvoid mat2RPY(const Eigen::Matrix<T, 3, 3>& m, T& roll, T& pitch, T& yaw)\n{\n    roll = atan2(m(2,1), m(2,2));\n    pitch = atan2(-m(2,0), sqrt(m(2,1) * m(2,1) + m(2,2) * m(2,2)));\n    yaw = atan2(m(1,0), m(0,0));\n}\n\ntemplate<typename T>\nEigen::Matrix<T, 4, 4> homogeneousTransform(const Eigen::Matrix<T, 3, 3>& R, const Eigen::Matrix<T, 3, 1>& t)\n{\n    Eigen::Matrix<T, 4, 4> H;\n    H.setIdentity();\n\n    H.block(0,0,3,3) = R;\n    H.block(0,3,3,1) = t;\n\n    return H;\n}\n\ntemplate<typename T>\nEigen::Matrix<T, 4, 4> poseWithCartesianTranslation(const T* const q, const T* const p)\n{\n    Eigen::Matrix<T, 4, 4> pose = Eigen::Matrix<T, 4, 4>::Identity();\n\n    T rotation[9];\n    ceres::QuaternionToRotation(q, rotation);\n    for (int i = 0; i < 3; ++i)\n    {\n        for (int j = 0; 
j < 3; ++j)\n        {\n            pose(i,j) = rotation[i * 3 + j];\n        }\n    }\n\n    pose(0,3) = p[0];\n    pose(1,3) = p[1];\n    pose(2,3) = p[2];\n\n    return pose;\n}\n\ntemplate<typename T>\nEigen::Matrix<T, 4, 4> poseWithSphericalTranslation(const T* const q, const T* const p, const T scale = T(1.0))\n{\n    Eigen::Matrix<T, 4, 4> pose = Eigen::Matrix<T, 4, 4>::Identity();\n\n    T rotation[9];\n    ceres::QuaternionToRotation(q, rotation);\n    for (int i = 0; i < 3; ++i)\n    {\n        for (int j = 0; j < 3; ++j)\n        {\n            pose(i,j) = rotation[i * 3 + j];\n        }\n    }\n\n    T theta = p[0];\n    T phi = p[1];\n    pose(0,3) = sin(theta) * cos(phi) * scale;\n    pose(1,3) = sin(theta) * sin(phi) * scale;\n    pose(2,3) = cos(theta) * scale;\n\n    return pose;\n}\n\n// Returns the Sampson error of a given essential matrix and 2 image points\ntemplate<typename T>\nT sampsonError(const Eigen::Matrix<T, 3, 3>& E,\n               const Eigen::Matrix<T, 3, 1>& p1,\n               const Eigen::Matrix<T, 3, 1>& p2)\n{\n    Eigen::Matrix<T, 3, 1> Ex1 = E * p1;\n    Eigen::Matrix<T, 3, 1> Etx2 = E.transpose() * p2;\n\n    T x2tEx1 = p2.dot(Ex1);\n\n    // compute Sampson error\n    T err = square(x2tEx1) / (square(Ex1(0,0)) + square(Ex1(1,0)) + square(Etx2(0,0)) + square(Etx2(1,0)));\n\n    return err;\n}\n\n// Returns the Sampson error of a given rotation/translation and 2 image points\ntemplate<typename T>\nT sampsonError(const Eigen::Matrix<T, 3, 3>& R,\n               const Eigen::Matrix<T, 3, 1>& t,\n               const Eigen::Matrix<T, 3, 1>& p1,\n               const Eigen::Matrix<T, 3, 1>& p2)\n{\n    // construct essential matrix\n    Eigen::Matrix<T, 3, 3> E = skew(t) * R;\n\n    Eigen::Matrix<T, 3, 1> Ex1 = E * p1;\n    Eigen::Matrix<T, 3, 1> Etx2 = E.transpose() * p2;\n\n    T x2tEx1 = p2.dot(Ex1);\n\n    // compute Sampson error\n    T err = square(x2tEx1) / (square(Ex1(0,0)) + square(Ex1(1,0)) + square(Etx2(0,0)) + 
square(Etx2(1,0)));\n\n    return err;\n}\n\n// Returns the Sampson error of a given rotation/translation and 2 image points\ntemplate<typename T>\nT sampsonError(const Eigen::Matrix<T, 4, 4>& H,\n               const Eigen::Matrix<T, 3, 1>& p1,\n               const Eigen::Matrix<T, 3, 1>& p2)\n{\n    Eigen::Matrix<T, 3, 3> R = H.block(0, 0, 3, 3);\n    Eigen::Matrix<T, 3, 1> t = H.block(0, 3, 3, 1);\n\n    return sampsonError(R, t, p1, p2);\n}\n\ntemplate<typename T>\nEigen::Matrix<T, 3, 1>\ntransformPoint(const Eigen::Matrix<T, 4, 4>& H, const Eigen::Matrix<T, 3, 1>& P)\n{\n    Eigen::Matrix<T, 3, 1> P_trans = H.block(0, 0, 3, 3) * P + H.block(0, 3, 3, 1);\n\n    return P_trans;\n}\n\ntemplate<typename T>\nEigen::Matrix<T, 4, 4>\nestimate3DRigidTransform(const std::vector<Eigen::Matrix<T, 3, 1>, Eigen::aligned_allocator<Eigen::Matrix<T, 3, 1> > >& points1,\n                         const std::vector<Eigen::Matrix<T, 3, 1>, Eigen::aligned_allocator<Eigen::Matrix<T, 3, 1> > >& points2)\n{\n    // compute centroids\n    Eigen::Matrix<T, 3, 1> c1, c2;\n    c1.setZero(); c2.setZero();\n\n    for (size_t i = 0; i < points1.size(); ++i)\n    {\n        c1 += points1.at(i);\n        c2 += points2.at(i);\n    }\n\n    c1 /= points1.size();\n    c2 /= points1.size();\n\n    Eigen::Matrix<T, Eigen::Dynamic, Eigen::Dynamic> X(3, points1.size());\n    Eigen::Matrix<T, Eigen::Dynamic, Eigen::Dynamic> Y(3, points1.size());\n    for (size_t i = 0; i < points1.size(); ++i)\n    {\n        X.col(i) = points1.at(i) - c1;\n        Y.col(i) = points2.at(i) - c2;\n    }\n\n    Eigen::Matrix<T, 3, 3> H = X * Y.transpose();\n\n    Eigen::JacobiSVD< Eigen::Matrix<T, Eigen::Dynamic, Eigen::Dynamic> > svd(H, Eigen::ComputeFullU | Eigen::ComputeFullV);\n\n    Eigen::Matrix<T, 3, 3> U = svd.matrixU();\n    Eigen::Matrix<T, 3, 3> V = svd.matrixV();\n    if (U.determinant() * V.determinant() < 0.0)\n    {\n        V.col(2) *= -1.0;\n    }\n\n    Eigen::Matrix<T, 3, 3> R = V * U.transpose();\n 
   Eigen::Matrix<T, 3, 1> t = c2 - R * c1;\n\n    return homogeneousTransform(R, t);\n}\n\ntemplate<typename T>\nEigen::Matrix<T, 4, 4>\nestimate3DRigidSimilarityTransform(const std::vector<Eigen::Matrix<T, 3, 1>, Eigen::aligned_allocator<Eigen::Matrix<T, 3, 1> > >& points1,\n                                   const std::vector<Eigen::Matrix<T, 3, 1>, Eigen::aligned_allocator<Eigen::Matrix<T, 3, 1> > >& points2)\n{\n    // compute centroids\n    Eigen::Matrix<T, 3, 1> c1, c2;\n    c1.setZero(); c2.setZero();\n\n    for (size_t i = 0; i < points1.size(); ++i)\n    {\n        c1 += points1.at(i);\n        c2 += points2.at(i);\n    }\n\n    c1 /= points1.size();\n    c2 /= points1.size();\n\n    Eigen::Matrix<T, Eigen::Dynamic, Eigen::Dynamic> X(3, points1.size());\n    Eigen::Matrix<T, Eigen::Dynamic, Eigen::Dynamic> Y(3, points1.size());\n    for (size_t i = 0; i < points1.size(); ++i)\n    {\n        X.col(i) = points1.at(i) - c1;\n        Y.col(i) = points2.at(i) - c2;\n    }\n\n    Eigen::Matrix<T, 3, 3> H = X * Y.transpose();\n\n    Eigen::JacobiSVD< Eigen::Matrix<T, Eigen::Dynamic, Eigen::Dynamic> > svd(H, Eigen::ComputeFullU | Eigen::ComputeFullV);\n\n    Eigen::Matrix<T, 3, 3> U = svd.matrixU();\n    Eigen::Matrix<T, 3, 3> V = svd.matrixV();\n    if (U.determinant() * V.determinant() < 0.0)\n    {\n        V.col(2) *= -1.0;\n    }\n\n    Eigen::Matrix<T, 3, 3> R = V * U.transpose();\n\n    std::vector<Eigen::Matrix<T, 3, 1>, Eigen::aligned_allocator<Eigen::Matrix<T, 3, 1> > > rotatedPoints1(points1.size());\n    for (size_t i = 0; i < points1.size(); ++i)\n    {\n        rotatedPoints1.at(i) = R * (points1.at(i) - c1);\n    }\n\n    double sum_ss = 0.0, sum_tt = 0.0;\n    for (size_t i = 0; i < points1.size(); ++i)\n    {\n        sum_ss += (points1.at(i) - c1).squaredNorm();\n        sum_tt += (points2.at(i) - c2).dot(rotatedPoints1.at(i));\n    }\n\n    double scale = sum_tt / sum_ss;\n\n    Eigen::Matrix<T, 3, 3> sR = scale * R;\n    Eigen::Matrix<T, 3, 1> 
t = c2 - sR * c1;\n\n    return homogeneousTransform(sR, t);\n}\n\n}\n\n#endif\n"
  },
  {
    "path": "camera_model/include/camodocal/gpl/gpl.h",
    "content": "#ifndef GPL_H\r\n#define GPL_H\r\n\r\n#include <algorithm>\r\n#include <cmath>\r\n#include <opencv2/core/core.hpp>\r\n\r\nnamespace camodocal\r\n{\r\n\r\ntemplate<class T>\r\nconst T clamp(const T& v, const T& a, const T& b)\r\n{\r\n\treturn std::min(b, std::max(a, v));\r\n}\r\n\r\ndouble hypot3(double x, double y, double z);\r\nfloat hypot3f(float x, float y, float z);\r\n\r\ntemplate<class T>\r\nconst T normalizeTheta(const T& theta)\r\n{\r\n\tT normTheta = theta;\r\n\r\n\twhile (normTheta < - M_PI)\r\n\t{\r\n\t\tnormTheta += 2.0 * M_PI;\r\n\t}\r\n\twhile (normTheta > M_PI)\r\n\t{\r\n\t\tnormTheta -= 2.0 * M_PI;\r\n\t}\r\n\r\n\treturn normTheta;\r\n}\r\n\r\ndouble d2r(double deg);\r\nfloat d2r(float deg);\r\ndouble r2d(double rad);\r\nfloat r2d(float rad);\r\n\r\ndouble sinc(double theta);\r\n\r\ntemplate<class T>\r\nconst T square(const T& x)\r\n{\r\n\treturn x * x;\r\n}\r\n\r\ntemplate<class T>\r\nconst T cube(const T& x)\r\n{\r\n\treturn x * x * x;\r\n}\r\n\r\ntemplate<class T>\r\nconst T random(const T& a, const T& b)\r\n{\r\n\treturn static_cast<double>(rand()) / RAND_MAX * (b - a) + a;\r\n}\r\n\r\ntemplate<class T>\r\nconst T randomNormal(const T& sigma)\r\n{\r\n    T x1, x2, w;\r\n\r\n    do\r\n    {\r\n        x1 = 2.0 * random(0.0, 1.0) - 1.0;\r\n        x2 = 2.0 * random(0.0, 1.0) - 1.0;\r\n        w = x1 * x1 + x2 * x2;\r\n    }\r\n    while (w >= 1.0 || w == 0.0);\r\n\r\n    w = sqrt((-2.0 * log(w)) / w);\r\n\r\n    return x1 * w * sigma;\r\n}\r\n\r\nunsigned long long timeInMicroseconds(void);\r\n\r\ndouble timeInSeconds(void);\r\n\r\nvoid colorDepthImage(cv::Mat& imgDepth,\r\n                     cv::Mat& imgColoredDepth,\r\n                     float minRange, float maxRange);\r\n\r\nbool colormap(const std::string& name, unsigned char idx,\r\n              float& r, float& g, float& b);\r\n\r\nstd::vector<cv::Point2i> bresLine(int x0, int y0, int x1, int y1);\r\nstd::vector<cv::Point2i> bresCircle(int x0, int y0, int 
r);\r\n\r\nvoid fitCircle(const std::vector<cv::Point2d>& points,\r\n               double& centerX, double& centerY, double& radius);\r\n\r\nstd::vector<cv::Point2d> intersectCircles(double x1, double y1, double r1,\r\n                                          double x2, double y2, double r2);\r\n\r\nvoid LLtoUTM(double latitude, double longitude,\r\n             double& utmNorthing, double& utmEasting,\r\n             std::string& utmZone);\r\nvoid UTMtoLL(double utmNorthing, double utmEasting,\r\n             const std::string& utmZone,\r\n             double& latitude, double& longitude);\r\n\r\nlong int timestampDiff(uint64_t t1, uint64_t t2);\r\n\r\n}\r\n\r\n#endif\r\n"
  },
  {
    "path": "camera_model/include/camodocal/sparse_graph/Transform.h",
    "content": "#ifndef TRANSFORM_H\n#define TRANSFORM_H\n\n#include <boost/shared_ptr.hpp>\n#include <eigen3/Eigen/Dense>\n#include <stdint.h>\n\nnamespace camodocal\n{\n\nclass Transform\n{\npublic:\n    EIGEN_MAKE_ALIGNED_OPERATOR_NEW\n\n    Transform();\n    Transform(const Eigen::Matrix4d& H);\n\n    Eigen::Quaterniond& rotation(void);\n    const Eigen::Quaterniond& rotation(void) const;\n    double* rotationData(void);\n    const double* const rotationData(void) const;\n\n    Eigen::Vector3d& translation(void);\n    const Eigen::Vector3d& translation(void) const;\n    double* translationData(void);\n    const double* const translationData(void) const;\n\n    Eigen::Matrix4d toMatrix(void) const;\n\nprivate:\n    Eigen::Quaterniond m_q;\n    Eigen::Vector3d m_t;\n};\n\n}\n\n#endif\n"
  },
  {
    "path": "camera_model/instruction",
    "content": "rosrun camera_model Calibration -w 8 -h 11 -s 70 -i ~/bag/PX/calib/\n"
  },
  {
    "path": "camera_model/package.xml",
    "content": "<?xml version=\"1.0\"?>\n<package>\n  <name>camera_model</name>\n  <version>0.0.0</version>\n  <description>The camera_model package</description>\n\n  <!-- One maintainer tag required, multiple allowed, one person per tag -->\n  <!-- Example:  -->\n  <!-- <maintainer email=\"jane.doe@example.com\">Jane Doe</maintainer> -->\n  <maintainer email=\"ionel.heng@ieee.org\">lionel</maintainer>\n\n\n  <!-- One license tag required, multiple allowed, one license per tag -->\n  <!-- Commonly used license strings: -->\n  <!--   BSD, MIT, Boost Software License, GPLv2, GPLv3, LGPLv2.1, LGPLv3 -->\n  <license>GPLv3</license>\n\n\n  <!-- Url tags are optional, but multiple are allowed, one per tag -->\n  <!-- Optional attribute type can be: website, bugtracker, or repository -->\n  <!-- Example: -->\n  <!-- <url type=\"website\">http://wiki.ros.org/camera_model</url> -->\n\n\n  <!-- Author tags are optional, multiple are allowed, one per tag -->\n  <!-- Authors do not have to be maintainers, but could be -->\n  <!-- Example: -->\n  <!-- <author email=\"jane.doe@example.com\">Jane Doe</author> -->\n\n\n  <!-- The *_depend tags are used to specify dependencies -->\n  <!-- Dependencies can be catkin packages or system dependencies -->\n  <!-- Examples: -->\n  <!-- Use build_depend for packages you need at compile time: -->\n  <!--   <build_depend>message_generation</build_depend> -->\n  <!-- Use buildtool_depend for build tool packages: -->\n  <!--   <buildtool_depend>catkin</buildtool_depend> -->\n  <!-- Use run_depend for packages you need at runtime: -->\n  <!--   <run_depend>message_runtime</run_depend> -->\n  <!-- Use test_depend for packages you need only for testing: -->\n  <!--   <test_depend>gtest</test_depend> -->\n  <buildtool_depend>catkin</buildtool_depend>\n  <build_depend>roscpp</build_depend>\n  <build_depend>std_msgs</build_depend>\n  <run_depend>roscpp</run_depend>\n  <run_depend>std_msgs</run_depend>\n\n\n  <!-- The export tag contains other, 
unspecified, tags -->\n  <export>\n    <!-- Other tools can request additional information be placed here -->\n\n  </export>\n</package>\n"
  },
  {
    "path": "camera_model/readme.md",
    "content": "Part of [camodocal](https://github.com/hengli/camodocal).\n\n[Google Ceres](http://ceres-solver.org) is required.\n\n# Calibration:\n\nUse [intrinsic_calib.cc](https://github.com/dvorak0/camera_model/blob/master/src/intrinsic_calib.cc) to calibrate your camera.\n\n# Undistortion:\n\nSee [Camera.h](https://github.com/dvorak0/camera_model/blob/master/include/camodocal/camera_models/Camera.h) for the general interface:\n\n - liftProjective: Lifts points from the image plane to projective space.\n - spaceToPlane: Projects 3D points onto the image plane (the Pi function).\n"
  },
  {
    "path": "camera_model/src/calib/CameraCalibration.cc",
    "content": "#include \"camodocal/calib/CameraCalibration.h\"\n\n#include <cstdio>\n#include <eigen3/Eigen/Dense>\n#include <iomanip>\n#include <iostream>\n#include <algorithm>\n#include <fstream>\n#include <opencv2/core/core.hpp>\n#include <opencv2/core/eigen.hpp>\n#include <opencv2/imgproc/imgproc.hpp>\n#include <opencv2/calib3d/calib3d.hpp>\n\n#include \"camodocal/camera_models/CameraFactory.h\"\n#include \"camodocal/sparse_graph/Transform.h\"\n#include \"camodocal/gpl/EigenQuaternionParameterization.h\"\n#include \"camodocal/gpl/EigenUtils.h\"\n#include \"camodocal/camera_models/CostFunctionFactory.h\"\n\n#include \"ceres/ceres.h\"\nnamespace camodocal\n{\n\nCameraCalibration::CameraCalibration()\n : m_boardSize(cv::Size(0,0))\n , m_squareSize(0.0f)\n , m_verbose(false)\n{\n\n}\n\nCameraCalibration::CameraCalibration(const Camera::ModelType modelType,\n                                     const std::string& cameraName,\n                                     const cv::Size& imageSize,\n                                     const cv::Size& boardSize,\n                                     float squareSize)\n : m_boardSize(boardSize)\n , m_squareSize(squareSize)\n , m_verbose(false)\n{\n    m_camera = CameraFactory::instance()->generateCamera(modelType, cameraName, imageSize);\n}\n\nvoid\nCameraCalibration::clear(void)\n{\n    m_imagePoints.clear();\n    m_scenePoints.clear();\n}\n\nvoid\nCameraCalibration::addChessboardData(const std::vector<cv::Point2f>& corners)\n{\n    m_imagePoints.push_back(corners);\n\n    std::vector<cv::Point3f> scenePointsInView;\n    for (int i = 0; i < m_boardSize.height; ++i)\n    {\n        for (int j = 0; j < m_boardSize.width; ++j)\n        {\n            scenePointsInView.push_back(cv::Point3f(i * m_squareSize, j * m_squareSize, 0.0));\n        }\n    }\n    m_scenePoints.push_back(scenePointsInView);\n}\n\nbool\nCameraCalibration::calibrate(void)\n{\n    int imageCount = m_imagePoints.size();\n\n    // compute intrinsic camera 
parameters and extrinsic parameters for each of the views\n    std::vector<cv::Mat> rvecs;\n    std::vector<cv::Mat> tvecs;\n    bool ret = calibrateHelper(m_camera, rvecs, tvecs);\n\n    m_cameraPoses = cv::Mat(imageCount, 6, CV_64F);\n    for (int i = 0; i < imageCount; ++i)\n    {\n        m_cameraPoses.at<double>(i,0) = rvecs.at(i).at<double>(0);\n        m_cameraPoses.at<double>(i,1) = rvecs.at(i).at<double>(1);\n        m_cameraPoses.at<double>(i,2) = rvecs.at(i).at<double>(2);\n        m_cameraPoses.at<double>(i,3) = tvecs.at(i).at<double>(0);\n        m_cameraPoses.at<double>(i,4) = tvecs.at(i).at<double>(1);\n        m_cameraPoses.at<double>(i,5) = tvecs.at(i).at<double>(2);\n    }\n\n    // Compute measurement covariance.\n    std::vector<std::vector<cv::Point2f> > errVec(m_imagePoints.size());\n    Eigen::Vector2d errSum = Eigen::Vector2d::Zero();\n    size_t errCount = 0;\n    for (size_t i = 0; i < m_imagePoints.size(); ++i)\n    {\n        std::vector<cv::Point2f> estImagePoints;\n        m_camera->projectPoints(m_scenePoints.at(i), rvecs.at(i), tvecs.at(i),\n                                estImagePoints);\n\n        for (size_t j = 0; j < m_imagePoints.at(i).size(); ++j)\n        {\n            cv::Point2f pObs = m_imagePoints.at(i).at(j);\n            cv::Point2f pEst = estImagePoints.at(j);\n\n            cv::Point2f err = pObs - pEst;\n\n            errVec.at(i).push_back(err);\n\n            errSum += Eigen::Vector2d(err.x, err.y);\n        }\n\n        errCount += m_imagePoints.at(i).size();\n    }\n\n    Eigen::Vector2d errMean = errSum / static_cast<double>(errCount);\n\n    Eigen::Matrix2d measurementCovariance = Eigen::Matrix2d::Zero();\n    for (size_t i = 0; i < errVec.size(); ++i)\n    {\n        for (size_t j = 0; j < errVec.at(i).size(); ++j)\n        {\n            cv::Point2f err = errVec.at(i).at(j);\n            double d0 = err.x - errMean(0);\n            double d1 = err.y - errMean(1);\n\n            measurementCovariance(0,0) += 
d0 * d0;\n            measurementCovariance(0,1) += d0 * d1;\n            measurementCovariance(1,1) += d1 * d1;\n        }\n    }\n    measurementCovariance /= static_cast<double>(errCount);\n    measurementCovariance(1,0) = measurementCovariance(0,1);\n\n    m_measurementCovariance = measurementCovariance;\n\n    return ret;\n}\n\nint\nCameraCalibration::sampleCount(void) const\n{\n    return m_imagePoints.size();\n}\n\nstd::vector<std::vector<cv::Point2f> >&\nCameraCalibration::imagePoints(void)\n{\n    return m_imagePoints;\n}\n\nconst std::vector<std::vector<cv::Point2f> >&\nCameraCalibration::imagePoints(void) const\n{\n    return m_imagePoints;\n}\n\nstd::vector<std::vector<cv::Point3f> >&\nCameraCalibration::scenePoints(void)\n{\n    return m_scenePoints;\n}\n\nconst std::vector<std::vector<cv::Point3f> >&\nCameraCalibration::scenePoints(void) const\n{\n    return m_scenePoints;\n}\n\nCameraPtr&\nCameraCalibration::camera(void)\n{\n    return m_camera;\n}\n\nconst CameraConstPtr\nCameraCalibration::camera(void) const\n{\n    return m_camera;\n}\n\nEigen::Matrix2d&\nCameraCalibration::measurementCovariance(void)\n{\n    return m_measurementCovariance;\n}\n\nconst Eigen::Matrix2d&\nCameraCalibration::measurementCovariance(void) const\n{\n    return m_measurementCovariance;\n}\n\ncv::Mat&\nCameraCalibration::cameraPoses(void)\n{\n    return m_cameraPoses;\n}\n\nconst cv::Mat&\nCameraCalibration::cameraPoses(void) const\n{\n    return m_cameraPoses;\n}\n\nvoid\nCameraCalibration::drawResults(std::vector<cv::Mat>& images) const\n{\n    std::vector<cv::Mat> rvecs, tvecs;\n\n    for (size_t i = 0; i < images.size(); ++i)\n    {\n        cv::Mat rvec(3, 1, CV_64F);\n        rvec.at<double>(0) = m_cameraPoses.at<double>(i,0);\n        rvec.at<double>(1) = m_cameraPoses.at<double>(i,1);\n        rvec.at<double>(2) = m_cameraPoses.at<double>(i,2);\n\n        cv::Mat tvec(3, 1, CV_64F);\n        tvec.at<double>(0) = m_cameraPoses.at<double>(i,3);\n        
tvec.at<double>(1) = m_cameraPoses.at<double>(i,4);\n        tvec.at<double>(2) = m_cameraPoses.at<double>(i,5);\n\n        rvecs.push_back(rvec);\n        tvecs.push_back(tvec);\n    }\n\n    int drawShiftBits = 4;\n    int drawMultiplier = 1 << drawShiftBits;\n\n    cv::Scalar green(0, 255, 0);\n    cv::Scalar red(0, 0, 255);\n\n    for (size_t i = 0; i < images.size(); ++i)\n    {\n        cv::Mat& image = images.at(i);\n        if (image.channels() == 1)\n        {\n            cv::cvtColor(image, image, CV_GRAY2RGB);\n        }\n\n        std::vector<cv::Point2f> estImagePoints;\n        m_camera->projectPoints(m_scenePoints.at(i), rvecs.at(i), tvecs.at(i),\n                                estImagePoints);\n\n        float errorSum = 0.0f;\n        float errorMax = std::numeric_limits<float>::min();\n\n        for (size_t j = 0; j < m_imagePoints.at(i).size(); ++j)\n        {\n            cv::Point2f pObs = m_imagePoints.at(i).at(j);\n            cv::Point2f pEst = estImagePoints.at(j);\n\n            cv::circle(image,\n                       cv::Point(cvRound(pObs.x * drawMultiplier),\n                                 cvRound(pObs.y * drawMultiplier)),\n                       5, green, 2, CV_AA, drawShiftBits);\n\n            cv::circle(image,\n                       cv::Point(cvRound(pEst.x * drawMultiplier),\n                                 cvRound(pEst.y * drawMultiplier)),\n                       5, red, 2, CV_AA, drawShiftBits);\n\n            float error = cv::norm(pObs - pEst);\n\n            errorSum += error;\n            if (error > errorMax)\n            {\n                errorMax = error;\n            }\n        }\n\n        std::ostringstream oss;\n        oss << \"Reprojection error: avg = \" << errorSum / m_imagePoints.at(i).size()\n            << \"   max = \" << errorMax;\n\n        cv::putText(image, oss.str(), cv::Point(10, image.rows - 10),\n                    cv::FONT_HERSHEY_COMPLEX, 0.5, cv::Scalar(255, 255, 255),\n                   
 1, CV_AA);\n    }\n}\n\nvoid\nCameraCalibration::writeParams(const std::string& filename) const\n{\n    m_camera->writeParametersToYamlFile(filename);\n}\n\nbool\nCameraCalibration::writeChessboardData(const std::string& filename) const\n{\n    std::ofstream ofs(filename.c_str(), std::ios::out | std::ios::binary);\n    if (!ofs.is_open())\n    {\n        return false;\n    }\n\n    writeData(ofs, m_boardSize.width);\n    writeData(ofs, m_boardSize.height);\n    writeData(ofs, m_squareSize);\n\n    writeData(ofs, m_measurementCovariance(0,0));\n    writeData(ofs, m_measurementCovariance(0,1));\n    writeData(ofs, m_measurementCovariance(1,0));\n    writeData(ofs, m_measurementCovariance(1,1));\n\n    writeData(ofs, m_cameraPoses.rows);\n    writeData(ofs, m_cameraPoses.cols);\n    writeData(ofs, m_cameraPoses.type());\n    for (int i = 0; i < m_cameraPoses.rows; ++i)\n    {\n        for (int j = 0; j < m_cameraPoses.cols; ++j)\n        {\n            writeData(ofs, m_cameraPoses.at<double>(i,j));\n        }\n    }\n\n    writeData(ofs, m_imagePoints.size());\n    for (size_t i = 0; i < m_imagePoints.size(); ++i)\n    {\n        writeData(ofs, m_imagePoints.at(i).size());\n        for (size_t j = 0; j < m_imagePoints.at(i).size(); ++j)\n        {\n            const cv::Point2f& ipt = m_imagePoints.at(i).at(j);\n\n            writeData(ofs, ipt.x);\n            writeData(ofs, ipt.y);\n        }\n    }\n\n    writeData(ofs, m_scenePoints.size());\n    for (size_t i = 0; i < m_scenePoints.size(); ++i)\n    {\n        writeData(ofs, m_scenePoints.at(i).size());\n        for (size_t j = 0; j < m_scenePoints.at(i).size(); ++j)\n        {\n            const cv::Point3f& spt = m_scenePoints.at(i).at(j);\n\n            writeData(ofs, spt.x);\n            writeData(ofs, spt.y);\n            writeData(ofs, spt.z);\n        }\n    }\n\n    return true;\n}\n\nbool\nCameraCalibration::readChessboardData(const std::string& filename)\n{\n    std::ifstream ifs(filename.c_str(), 
std::ios::in | std::ios::binary);\n    if (!ifs.is_open())\n    {\n        return false;\n    }\n\n    readData(ifs, m_boardSize.width);\n    readData(ifs, m_boardSize.height);\n    readData(ifs, m_squareSize);\n\n    readData(ifs, m_measurementCovariance(0,0));\n    readData(ifs, m_measurementCovariance(0,1));\n    readData(ifs, m_measurementCovariance(1,0));\n    readData(ifs, m_measurementCovariance(1,1));\n\n    int rows, cols, type;\n    readData(ifs, rows);\n    readData(ifs, cols);\n    readData(ifs, type);\n    m_cameraPoses = cv::Mat(rows, cols, type);\n\n    for (int i = 0; i < m_cameraPoses.rows; ++i)\n    {\n        for (int j = 0; j < m_cameraPoses.cols; ++j)\n        {\n            readData(ifs, m_cameraPoses.at<double>(i,j));\n        }\n    }\n\n    size_t nImagePointSets;\n    readData(ifs, nImagePointSets);\n\n    m_imagePoints.clear();\n    m_imagePoints.resize(nImagePointSets);\n    for (size_t i = 0; i < m_imagePoints.size(); ++i)\n    {\n        size_t nImagePoints;\n        readData(ifs, nImagePoints);\n        m_imagePoints.at(i).resize(nImagePoints);\n\n        for (size_t j = 0; j < m_imagePoints.at(i).size(); ++j)\n        {\n            cv::Point2f& ipt = m_imagePoints.at(i).at(j);\n            readData(ifs, ipt.x);\n            readData(ifs, ipt.y);\n        }\n    }\n\n    size_t nScenePointSets;\n    readData(ifs, nScenePointSets);\n\n    m_scenePoints.clear();\n    m_scenePoints.resize(nScenePointSets);\n    for (size_t i = 0; i < m_scenePoints.size(); ++i)\n    {\n        size_t nScenePoints;\n        readData(ifs, nScenePoints);\n        m_scenePoints.at(i).resize(nScenePoints);\n\n        for (size_t j = 0; j < m_scenePoints.at(i).size(); ++j)\n        {\n            cv::Point3f& spt = m_scenePoints.at(i).at(j);\n            readData(ifs, spt.x);\n            readData(ifs, spt.y);\n            readData(ifs, spt.z);\n        }\n    }\n\n    return true;\n}\n\nvoid\nCameraCalibration::setVerbose(bool verbose)\n{\n    m_verbose = 
verbose;\n}\n\nbool\nCameraCalibration::calibrateHelper(CameraPtr& camera,\n                                   std::vector<cv::Mat>& rvecs, std::vector<cv::Mat>& tvecs) const\n{\n    rvecs.assign(m_scenePoints.size(), cv::Mat());\n    tvecs.assign(m_scenePoints.size(), cv::Mat());\n\n    // STEP 1: Estimate intrinsics\n    camera->estimateIntrinsics(m_boardSize, m_scenePoints, m_imagePoints);\n\n    // STEP 2: Estimate extrinsics\n    for (size_t i = 0; i < m_scenePoints.size(); ++i)\n    {\n        camera->estimateExtrinsics(m_scenePoints.at(i), m_imagePoints.at(i), rvecs.at(i), tvecs.at(i));\n    }\n\n    if (m_verbose)\n    {\n        std::cout << \"[\" << camera->cameraName() << \"] \"\n                  << \"# INFO: \" << \"Initial reprojection error: \"\n                  << std::fixed << std::setprecision(3)\n                  << camera->reprojectionError(m_scenePoints, m_imagePoints, rvecs, tvecs)\n                  << \" pixels\" << std::endl;\n    }\n\n    // STEP 3: optimization using ceres\n    optimize(camera, rvecs, tvecs);\n\n    if (m_verbose)\n    {\n        double err = camera->reprojectionError(m_scenePoints, m_imagePoints, rvecs, tvecs);\n        std::cout << \"[\" << camera->cameraName() << \"] \" << \"# INFO: Final reprojection error: \"\n                  << err << \" pixels\" << std::endl;\n        std::cout << \"[\" << camera->cameraName() << \"] \" << \"# INFO: \"\n                  << camera->parametersToString() << std::endl;\n    }\n\n    return true;\n}\n\nvoid\nCameraCalibration::optimize(CameraPtr& camera,\n                            std::vector<cv::Mat>& rvecs, std::vector<cv::Mat>& tvecs) const\n{\n    // Use ceres to do optimization\n    ceres::Problem problem;\n\n    std::vector<Transform, Eigen::aligned_allocator<Transform> > transformVec(rvecs.size());\n    for (size_t i = 0; i < rvecs.size(); ++i)\n    {\n        Eigen::Vector3d rvec;\n        cv::cv2eigen(rvecs.at(i), rvec);\n\n        transformVec.at(i).rotation() = 
Eigen::AngleAxisd(rvec.norm(), rvec.normalized());\n        transformVec.at(i).translation() << tvecs[i].at<double>(0),\n                                            tvecs[i].at<double>(1),\n                                            tvecs[i].at<double>(2);\n    }\n\n    std::vector<double> intrinsicCameraParams;\n    m_camera->writeParameters(intrinsicCameraParams);\n\n    // create residuals for each observation\n    for (size_t i = 0; i < m_imagePoints.size(); ++i)\n    {\n        for (size_t j = 0; j < m_imagePoints.at(i).size(); ++j)\n        {\n            const cv::Point3f& spt = m_scenePoints.at(i).at(j);\n            const cv::Point2f& ipt = m_imagePoints.at(i).at(j);\n\n            ceres::CostFunction* costFunction =\n                CostFunctionFactory::instance()->generateCostFunction(camera,\n                                                                      Eigen::Vector3d(spt.x, spt.y, spt.z),\n                                                                      Eigen::Vector2d(ipt.x, ipt.y),\n                                                                      CAMERA_INTRINSICS | CAMERA_POSE);\n\n            ceres::LossFunction* lossFunction = new ceres::CauchyLoss(1.0);\n            problem.AddResidualBlock(costFunction, lossFunction,\n                                     intrinsicCameraParams.data(),\n                                     transformVec.at(i).rotationData(),\n                                     transformVec.at(i).translationData());\n        }\n\n        ceres::LocalParameterization* quaternionParameterization =\n            new EigenQuaternionParameterization;\n\n        problem.SetParameterization(transformVec.at(i).rotationData(),\n                                    quaternionParameterization);\n    }\n\n    std::cout << \"begin ceres\" << std::endl;\n    ceres::Solver::Options options;\n    options.max_num_iterations = 1000;\n    options.num_threads = 1;\n\n    if (m_verbose)\n    {\n        
options.minimizer_progress_to_stdout = true;\n    }\n\n    ceres::Solver::Summary summary;\n    ceres::Solve(options, &problem, &summary);\n    std::cout << \"end ceres\" << std::endl;\n\n    if (m_verbose)\n    {\n        std::cout << summary.FullReport() << std::endl;\n    }\n\n    camera->readParameters(intrinsicCameraParams);\n\n    for (size_t i = 0; i < rvecs.size(); ++i)\n    {\n        Eigen::AngleAxisd aa(transformVec.at(i).rotation());\n\n        Eigen::Vector3d rvec = aa.angle() * aa.axis();\n        cv::eigen2cv(rvec, rvecs.at(i));\n\n        cv::Mat& tvec = tvecs.at(i);\n        tvec.at<double>(0) = transformVec.at(i).translation()(0);\n        tvec.at<double>(1) = transformVec.at(i).translation()(1);\n        tvec.at<double>(2) = transformVec.at(i).translation()(2);\n    }\n}\n\ntemplate<typename T>\nvoid\nCameraCalibration::readData(std::ifstream& ifs, T& data) const\n{\n    char* buffer = new char[sizeof(T)];\n\n    ifs.read(buffer, sizeof(T));\n\n    data = *(reinterpret_cast<T*>(buffer));\n\n    delete[] buffer;\n}\n\ntemplate<typename T>\nvoid\nCameraCalibration::writeData(std::ofstream& ofs, T data) const\n{\n    char* pData = reinterpret_cast<char*>(&data);\n\n    ofs.write(pData, sizeof(T));\n}\n\n}\n"
  },
  {
    "path": "camera_model/src/camera_models/Camera.cc",
    "content": "#include \"camodocal/camera_models/Camera.h\"\n#include \"camodocal/camera_models/ScaramuzzaCamera.h\"\n\n#include <opencv2/calib3d/calib3d.hpp>\n\nnamespace camodocal\n{\n\nCamera::Parameters::Parameters(ModelType modelType)\n : m_modelType(modelType)\n , m_imageWidth(0)\n , m_imageHeight(0)\n{\n    switch (modelType)\n    {\n    case KANNALA_BRANDT:\n        m_nIntrinsics = 8;\n        break;\n    case PINHOLE:\n        m_nIntrinsics = 8;\n        break;\n    case SCARAMUZZA:\n        m_nIntrinsics = SCARAMUZZA_CAMERA_NUM_PARAMS;\n        break;\n    case MEI:\n    default:\n        m_nIntrinsics = 9;\n    }\n}\n\nCamera::Parameters::Parameters(ModelType modelType,\n                               const std::string& cameraName,\n                               int w, int h)\n : m_modelType(modelType)\n , m_cameraName(cameraName)\n , m_imageWidth(w)\n , m_imageHeight(h)\n{\n    switch (modelType)\n    {\n    case KANNALA_BRANDT:\n        m_nIntrinsics = 8;\n        break;\n    case PINHOLE:\n        m_nIntrinsics = 8;\n        break;\n    case SCARAMUZZA:\n        m_nIntrinsics = SCARAMUZZA_CAMERA_NUM_PARAMS;\n        break;\n    case MEI:\n    default:\n        m_nIntrinsics = 9;\n    }\n}\n\nCamera::ModelType&\nCamera::Parameters::modelType(void)\n{\n    return m_modelType;\n}\n\nstd::string&\nCamera::Parameters::cameraName(void)\n{\n    return m_cameraName;\n}\n\nint&\nCamera::Parameters::imageWidth(void)\n{\n    return m_imageWidth;\n}\n\nint&\nCamera::Parameters::imageHeight(void)\n{\n    return m_imageHeight;\n}\n\nCamera::ModelType\nCamera::Parameters::modelType(void) const\n{\n    return m_modelType;\n}\n\nconst std::string&\nCamera::Parameters::cameraName(void) const\n{\n    return m_cameraName;\n}\n\nint\nCamera::Parameters::imageWidth(void) const\n{\n    return m_imageWidth;\n}\n\nint\nCamera::Parameters::imageHeight(void) const\n{\n    return m_imageHeight;\n}\n\nint\nCamera::Parameters::nIntrinsics(void) const\n{\n    return 
m_nIntrinsics;\n}\n\ncv::Mat&\nCamera::mask(void)\n{\n    return m_mask;\n}\n\nconst cv::Mat&\nCamera::mask(void) const\n{\n    return m_mask;\n}\n\nvoid\nCamera::estimateExtrinsics(const std::vector<cv::Point3f>& objectPoints,\n                           const std::vector<cv::Point2f>& imagePoints,\n                           cv::Mat& rvec, cv::Mat& tvec) const\n{\n    std::vector<cv::Point2f> Ms(imagePoints.size());\n    for (size_t i = 0; i < Ms.size(); ++i)\n    {\n        Eigen::Vector3d P;\n        liftProjective(Eigen::Vector2d(imagePoints.at(i).x, imagePoints.at(i).y), P);\n\n        P /= P(2);\n\n        Ms.at(i).x = P(0);\n        Ms.at(i).y = P(1);\n    }\n\n    // assume unit focal length, zero principal point, and zero distortion\n    cv::solvePnP(objectPoints, Ms, cv::Mat::eye(3, 3, CV_64F), cv::noArray(), rvec, tvec);\n}\n\ndouble\nCamera::reprojectionDist(const Eigen::Vector3d& P1, const Eigen::Vector3d& P2) const\n{\n    Eigen::Vector2d p1, p2;\n\n    spaceToPlane(P1, p1);\n    spaceToPlane(P2, p2);\n\n    return (p1 - p2).norm();\n}\n\ndouble\nCamera::reprojectionError(const std::vector< std::vector<cv::Point3f> >& objectPoints,\n                          const std::vector< std::vector<cv::Point2f> >& imagePoints,\n                          const std::vector<cv::Mat>& rvecs,\n                          const std::vector<cv::Mat>& tvecs,\n                          cv::OutputArray _perViewErrors) const\n{\n    int imageCount = objectPoints.size();\n    size_t pointsSoFar = 0;\n    double totalErr = 0.0;\n\n    bool computePerViewErrors = _perViewErrors.needed();\n    cv::Mat perViewErrors;\n    if (computePerViewErrors)\n    {\n        _perViewErrors.create(imageCount, 1, CV_64F);\n        perViewErrors = _perViewErrors.getMat();\n    }\n\n    for (int i = 0; i < imageCount; ++i)\n    {\n        size_t pointCount = imagePoints.at(i).size();\n\n        pointsSoFar += pointCount;\n\n        std::vector<cv::Point2f> estImagePoints;\n        
projectPoints(objectPoints.at(i), rvecs.at(i), tvecs.at(i),\n                      estImagePoints);\n\n        double err = 0.0;\n        for (size_t j = 0; j < imagePoints.at(i).size(); ++j)\n        {\n            err += cv::norm(imagePoints.at(i).at(j) - estImagePoints.at(j));\n        }\n\n        if (computePerViewErrors)\n        {\n            perViewErrors.at<double>(i) = err / pointCount;\n        }\n\n        totalErr += err;\n    }\n\n    return totalErr / pointsSoFar;\n}\n\ndouble\nCamera::reprojectionError(const Eigen::Vector3d& P,\n                          const Eigen::Quaterniond& camera_q,\n                          const Eigen::Vector3d& camera_t,\n                          const Eigen::Vector2d& observed_p) const\n{\n    Eigen::Vector3d P_cam = camera_q.toRotationMatrix() * P + camera_t;\n\n    Eigen::Vector2d p;\n    spaceToPlane(P_cam, p);\n\n    return (p - observed_p).norm();\n}\n\nvoid\nCamera::projectPoints(const std::vector<cv::Point3f>& objectPoints,\n                      const cv::Mat& rvec,\n                      const cv::Mat& tvec,\n                      std::vector<cv::Point2f>& imagePoints) const\n{\n    // project 3D object points to the image plane\n    imagePoints.reserve(objectPoints.size());\n\n    //double\n    cv::Mat R0;\n    cv::Rodrigues(rvec, R0);\n\n    Eigen::MatrixXd R(3,3);\n    R << R0.at<double>(0,0), R0.at<double>(0,1), R0.at<double>(0,2),\n         R0.at<double>(1,0), R0.at<double>(1,1), R0.at<double>(1,2),\n         R0.at<double>(2,0), R0.at<double>(2,1), R0.at<double>(2,2);\n\n    Eigen::Vector3d t;\n    t << tvec.at<double>(0), tvec.at<double>(1), tvec.at<double>(2);\n\n    for (size_t i = 0; i < objectPoints.size(); ++i)\n    {\n        const cv::Point3f& objectPoint = objectPoints.at(i);\n\n        // Rotate and translate\n        Eigen::Vector3d P;\n        P << objectPoint.x, objectPoint.y, objectPoint.z;\n\n        P = R * P + t;\n\n        Eigen::Vector2d p;\n        spaceToPlane(P, p);\n\n        
imagePoints.push_back(cv::Point2f(p(0), p(1)));\n    }\n}\n\n}\n"
  },
  {
    "path": "camera_model/src/camera_models/CameraFactory.cc",
    "content": "#include \"camodocal/camera_models/CameraFactory.h\"\n\n#include <boost/algorithm/string.hpp>\n\n\n#include \"camodocal/camera_models/CataCamera.h\"\n#include \"camodocal/camera_models/EquidistantCamera.h\"\n#include \"camodocal/camera_models/PinholeCamera.h\"\n#include \"camodocal/camera_models/ScaramuzzaCamera.h\"\n\n#include \"ceres/ceres.h\"\n\nnamespace camodocal\n{\n\nboost::shared_ptr<CameraFactory> CameraFactory::m_instance;\n\nCameraFactory::CameraFactory()\n{\n\n}\n\nboost::shared_ptr<CameraFactory>\nCameraFactory::instance(void)\n{\n    if (m_instance.get() == 0)\n    {\n        m_instance.reset(new CameraFactory);\n    }\n\n    return m_instance;\n}\n\nCameraPtr\nCameraFactory::generateCamera(Camera::ModelType modelType,\n                              const std::string& cameraName,\n                              cv::Size imageSize) const\n{\n    switch (modelType)\n    {\n    case Camera::KANNALA_BRANDT:\n    {\n        EquidistantCameraPtr camera(new EquidistantCamera);\n\n        EquidistantCamera::Parameters params = camera->getParameters();\n        params.cameraName() = cameraName;\n        params.imageWidth() = imageSize.width;\n        params.imageHeight() = imageSize.height;\n        camera->setParameters(params);\n        return camera;\n    }\n    case Camera::PINHOLE:\n    {\n        PinholeCameraPtr camera(new PinholeCamera);\n\n        PinholeCamera::Parameters params = camera->getParameters();\n        params.cameraName() = cameraName;\n        params.imageWidth() = imageSize.width;\n        params.imageHeight() = imageSize.height;\n        camera->setParameters(params);\n        return camera;\n    }\n    case Camera::SCARAMUZZA:\n    {\n        OCAMCameraPtr camera(new OCAMCamera);\n\n        OCAMCamera::Parameters params = camera->getParameters();\n        params.cameraName() = cameraName;\n        params.imageWidth() = imageSize.width;\n        params.imageHeight() = imageSize.height;\n        
camera->setParameters(params);\n        return camera;\n    }\n    case Camera::MEI:\n    default:\n    {\n        CataCameraPtr camera(new CataCamera);\n\n        CataCamera::Parameters params = camera->getParameters();\n        params.cameraName() = cameraName;\n        params.imageWidth() = imageSize.width;\n        params.imageHeight() = imageSize.height;\n        camera->setParameters(params);\n        return camera;\n    }\n    }\n}\n\nCameraPtr\nCameraFactory::generateCameraFromYamlFile(const std::string& filename)\n{\n    cv::FileStorage fs(filename, cv::FileStorage::READ);\n\n    if (!fs.isOpened())\n    {\n        return CameraPtr();\n    }\n\n    Camera::ModelType modelType = Camera::MEI;\n    if (!fs[\"model_type\"].isNone())\n    {\n        std::string sModelType;\n        fs[\"model_type\"] >> sModelType;\n\n        if (boost::iequals(sModelType, \"kannala_brandt\"))\n        {\n            modelType = Camera::KANNALA_BRANDT;\n        }\n        else if (boost::iequals(sModelType, \"mei\"))\n        {\n            modelType = Camera::MEI;\n        }\n        else if (boost::iequals(sModelType, \"scaramuzza\"))\n        {\n            modelType = Camera::SCARAMUZZA;\n        }\n        else if (boost::iequals(sModelType, \"pinhole\"))\n        {\n            modelType = Camera::PINHOLE;\n        }\n        else\n        {\n            std::cerr << \"# ERROR: Unknown camera model: \" << sModelType << std::endl;\n            return CameraPtr();\n        }\n    }\n\n    switch (modelType)\n    {\n    case Camera::KANNALA_BRANDT:\n    {\n        EquidistantCameraPtr camera(new EquidistantCamera);\n\n        EquidistantCamera::Parameters params = camera->getParameters();\n        params.readFromYamlFile(filename);\n        camera->setParameters(params);\n        return camera;\n    }\n    case Camera::PINHOLE:\n    {\n        PinholeCameraPtr camera(new PinholeCamera);\n\n        PinholeCamera::Parameters params = camera->getParameters();\n        
params.readFromYamlFile(filename);\n        camera->setParameters(params);\n        return camera;\n    }\n    case Camera::SCARAMUZZA:\n    {\n        OCAMCameraPtr camera(new OCAMCamera);\n\n        OCAMCamera::Parameters params = camera->getParameters();\n        params.readFromYamlFile(filename);\n        camera->setParameters(params);\n        return camera;\n    }\n    case Camera::MEI:\n    default:\n    {\n        CataCameraPtr camera(new CataCamera);\n\n        CataCamera::Parameters params = camera->getParameters();\n        params.readFromYamlFile(filename);\n        camera->setParameters(params);\n        return camera;\n    }\n    }\n\n    return CameraPtr();\n}\n\n}\n\n"
  },
  {
    "path": "camera_model/src/camera_models/CataCamera.cc",
    "content": "#include \"camodocal/camera_models/CataCamera.h\"\n\n#include <cmath>\n#include <cstdio>\n#include <eigen3/Eigen/Dense>\n#include <iomanip>\n#include <iostream>\n#include <opencv2/calib3d/calib3d.hpp>\n#include <opencv2/core/eigen.hpp>\n#include <opencv2/imgproc/imgproc.hpp>\n\n#include \"camodocal/gpl/gpl.h\"\n\nnamespace camodocal\n{\n\nCataCamera::Parameters::Parameters()\n : Camera::Parameters(MEI)\n , m_xi(0.0)\n , m_k1(0.0)\n , m_k2(0.0)\n , m_p1(0.0)\n , m_p2(0.0)\n , m_gamma1(0.0)\n , m_gamma2(0.0)\n , m_u0(0.0)\n , m_v0(0.0)\n{\n\n}\n\nCataCamera::Parameters::Parameters(const std::string& cameraName,\n                                   int w, int h,\n                                   double xi,\n                                   double k1, double k2,\n                                   double p1, double p2,\n                                   double gamma1, double gamma2,\n                                   double u0, double v0)\n : Camera::Parameters(MEI, cameraName, w, h)\n , m_xi(xi)\n , m_k1(k1)\n , m_k2(k2)\n , m_p1(p1)\n , m_p2(p2)\n , m_gamma1(gamma1)\n , m_gamma2(gamma2)\n , m_u0(u0)\n , m_v0(v0)\n{\n}\n\ndouble&\nCataCamera::Parameters::xi(void)\n{\n    return m_xi;\n}\n\ndouble&\nCataCamera::Parameters::k1(void)\n{\n    return m_k1;\n}\n\ndouble&\nCataCamera::Parameters::k2(void)\n{\n    return m_k2;\n}\n\ndouble&\nCataCamera::Parameters::p1(void)\n{\n    return m_p1;\n}\n\ndouble&\nCataCamera::Parameters::p2(void)\n{\n    return m_p2;\n}\n\ndouble&\nCataCamera::Parameters::gamma1(void)\n{\n    return m_gamma1;\n}\n\ndouble&\nCataCamera::Parameters::gamma2(void)\n{\n    return m_gamma2;\n}\n\ndouble&\nCataCamera::Parameters::u0(void)\n{\n    return m_u0;\n}\n\ndouble&\nCataCamera::Parameters::v0(void)\n{\n    return m_v0;\n}\n\ndouble\nCataCamera::Parameters::xi(void) const\n{\n    return m_xi;\n}\n\ndouble\nCataCamera::Parameters::k1(void) const\n{\n    return m_k1;\n}\n\ndouble\nCataCamera::Parameters::k2(void) const\n{\n    
return m_k2;\n}\n\ndouble\nCataCamera::Parameters::p1(void) const\n{\n    return m_p1;\n}\n\ndouble\nCataCamera::Parameters::p2(void) const\n{\n    return m_p2;\n}\n\ndouble\nCataCamera::Parameters::gamma1(void) const\n{\n    return m_gamma1;\n}\n\ndouble\nCataCamera::Parameters::gamma2(void) const\n{\n    return m_gamma2;\n}\n\ndouble\nCataCamera::Parameters::u0(void) const\n{\n    return m_u0;\n}\n\ndouble\nCataCamera::Parameters::v0(void) const\n{\n    return m_v0;\n}\n\nbool\nCataCamera::Parameters::readFromYamlFile(const std::string& filename)\n{\n    cv::FileStorage fs(filename, cv::FileStorage::READ);\n\n    if (!fs.isOpened())\n    {\n        return false;\n    }\n\n    if (!fs[\"model_type\"].isNone())\n    {\n        std::string sModelType;\n        fs[\"model_type\"] >> sModelType;\n\n        if (sModelType.compare(\"MEI\") != 0)\n        {\n            return false;\n        }\n    }\n\n    m_modelType = MEI;\n    fs[\"camera_name\"] >> m_cameraName;\n    m_imageWidth = static_cast<int>(fs[\"image_width\"]);\n    m_imageHeight = static_cast<int>(fs[\"image_height\"]);\n\n    cv::FileNode n = fs[\"mirror_parameters\"];\n    m_xi = static_cast<double>(n[\"xi\"]);\n\n    n = fs[\"distortion_parameters\"];\n    m_k1 = static_cast<double>(n[\"k1\"]);\n    m_k2 = static_cast<double>(n[\"k2\"]);\n    m_p1 = static_cast<double>(n[\"p1\"]);\n    m_p2 = static_cast<double>(n[\"p2\"]);\n\n    n = fs[\"projection_parameters\"];\n    m_gamma1 = static_cast<double>(n[\"gamma1\"]);\n    m_gamma2 = static_cast<double>(n[\"gamma2\"]);\n    m_u0 = static_cast<double>(n[\"u0\"]);\n    m_v0 = static_cast<double>(n[\"v0\"]);\n\n    return true;\n}\n\nvoid\nCataCamera::Parameters::writeToYamlFile(const std::string& filename) const\n{\n    cv::FileStorage fs(filename, cv::FileStorage::WRITE);\n\n    fs << \"model_type\" << \"MEI\";\n    fs << \"camera_name\" << m_cameraName;\n    fs << \"image_width\" << m_imageWidth;\n    fs << \"image_height\" << m_imageHeight;\n\n    // 
mirror: xi\n    fs << \"mirror_parameters\";\n    fs << \"{\" << \"xi\" << m_xi << \"}\";\n\n    // radial distortion: k1, k2\n    // tangential distortion: p1, p2\n    fs << \"distortion_parameters\";\n    fs << \"{\" << \"k1\" << m_k1\n              << \"k2\" << m_k2\n              << \"p1\" << m_p1\n              << \"p2\" << m_p2 << \"}\";\n\n    // projection: gamma1, gamma2, u0, v0\n    fs << \"projection_parameters\";\n    fs << \"{\" << \"gamma1\" << m_gamma1\n              << \"gamma2\" << m_gamma2\n              << \"u0\" << m_u0\n              << \"v0\" << m_v0 << \"}\";\n\n    fs.release();\n}\n\nCataCamera::Parameters&\nCataCamera::Parameters::operator=(const CataCamera::Parameters& other)\n{\n    if (this != &other)\n    {\n        m_modelType = other.m_modelType;\n        m_cameraName = other.m_cameraName;\n        m_imageWidth = other.m_imageWidth;\n        m_imageHeight = other.m_imageHeight;\n        m_xi = other.m_xi;\n        m_k1 = other.m_k1;\n        m_k2 = other.m_k2;\n        m_p1 = other.m_p1;\n        m_p2 = other.m_p2;\n        m_gamma1 = other.m_gamma1;\n        m_gamma2 = other.m_gamma2;\n        m_u0 = other.m_u0;\n        m_v0 = other.m_v0;\n    }\n\n    return *this;\n}\n\nstd::ostream&\noperator<< (std::ostream& out, const CataCamera::Parameters& params)\n{\n    out << \"Camera Parameters:\" << std::endl;\n    out << \"    model_type \" << \"MEI\" << std::endl;\n    out << \"   camera_name \" << params.m_cameraName << std::endl;\n    out << \"   image_width \" << params.m_imageWidth << std::endl;\n    out << \"  image_height \" << params.m_imageHeight << std::endl;\n\n    out << \"Mirror Parameters\" << std::endl;\n    out << std::fixed << std::setprecision(10);\n    out << \"            xi \" << params.m_xi << std::endl;\n\n    // radial distortion: k1, k2\n    // tangential distortion: p1, p2\n    out << \"Distortion Parameters\" << std::endl;\n    out << \"            k1 \" << params.m_k1 << std::endl\n        << \"            
k2 \" << params.m_k2 << std::endl\n        << \"            p1 \" << params.m_p1 << std::endl\n        << \"            p2 \" << params.m_p2 << std::endl;\n\n    // projection: gamma1, gamma2, u0, v0\n    out << \"Projection Parameters\" << std::endl;\n    out << \"        gamma1 \" << params.m_gamma1 << std::endl\n        << \"        gamma2 \" << params.m_gamma2 << std::endl\n        << \"            u0 \" << params.m_u0 << std::endl\n        << \"            v0 \" << params.m_v0 << std::endl;\n\n    return out;\n}\n\nCataCamera::CataCamera()\n : m_inv_K11(1.0)\n , m_inv_K13(0.0)\n , m_inv_K22(1.0)\n , m_inv_K23(0.0)\n , m_noDistortion(true)\n{\n\n}\n\nCataCamera::CataCamera(const std::string& cameraName,\n                       int imageWidth, int imageHeight,\n                       double xi, double k1, double k2, double p1, double p2,\n                       double gamma1, double gamma2, double u0, double v0)\n : mParameters(cameraName, imageWidth, imageHeight,\n               xi, k1, k2, p1, p2, gamma1, gamma2, u0, v0)\n{\n    if ((mParameters.k1() == 0.0) &&\n        (mParameters.k2() == 0.0) &&\n        (mParameters.p1() == 0.0) &&\n        (mParameters.p2() == 0.0))\n    {\n        m_noDistortion = true;\n    }\n    else\n    {\n        m_noDistortion = false;\n    }\n\n    // Inverse camera projection matrix parameters\n    m_inv_K11 = 1.0 / mParameters.gamma1();\n    m_inv_K13 = -mParameters.u0() / mParameters.gamma1();\n    m_inv_K22 = 1.0 / mParameters.gamma2();\n    m_inv_K23 = -mParameters.v0() / mParameters.gamma2();\n}\n\nCataCamera::CataCamera(const CataCamera::Parameters& params)\n : mParameters(params)\n{\n    if ((mParameters.k1() == 0.0) &&\n        (mParameters.k2() == 0.0) &&\n        (mParameters.p1() == 0.0) &&\n        (mParameters.p2() == 0.0))\n    {\n        m_noDistortion = true;\n    }\n    else\n    {\n        m_noDistortion = false;\n    }\n\n    // Inverse camera projection matrix parameters\n    m_inv_K11 = 1.0 / 
mParameters.gamma1();\n    m_inv_K13 = -mParameters.u0() / mParameters.gamma1();\n    m_inv_K22 = 1.0 / mParameters.gamma2();\n    m_inv_K23 = -mParameters.v0() / mParameters.gamma2();\n}\n\nCamera::ModelType\nCataCamera::modelType(void) const\n{\n    return mParameters.modelType();\n}\n\nconst std::string&\nCataCamera::cameraName(void) const\n{\n    return mParameters.cameraName();\n}\n\nint\nCataCamera::imageWidth(void) const\n{\n    return mParameters.imageWidth();\n}\n\nint\nCataCamera::imageHeight(void) const\n{\n    return mParameters.imageHeight();\n}\n\nvoid\nCataCamera::estimateIntrinsics(const cv::Size& boardSize,\n                               const std::vector< std::vector<cv::Point3f> >& objectPoints,\n                               const std::vector< std::vector<cv::Point2f> >& imagePoints)\n{\n    Parameters params = getParameters();\n\n    double u0 = params.imageWidth() / 2.0;\n    double v0 = params.imageHeight() / 2.0;\n\n    double gamma0 = 0.0;\n    double minReprojErr = std::numeric_limits<double>::max();\n\n    std::vector<cv::Mat> rvecs, tvecs;\n    rvecs.assign(objectPoints.size(), cv::Mat());\n    tvecs.assign(objectPoints.size(), cv::Mat());\n\n    params.xi() = 1.0;\n    params.k1() = 0.0;\n    params.k2() = 0.0;\n    params.p1() = 0.0;\n    params.p2() = 0.0;\n    params.u0() = u0;\n    params.v0() = v0;\n\n    // Initialize gamma (focal length)\n    // Use non-radial line image and xi = 1\n    for (size_t i = 0; i < imagePoints.size(); ++i)\n    {\n        for (int r = 0; r < boardSize.height; ++r)\n        {\n            cv::Mat P(boardSize.width, 4, CV_64F);\n            for (int c = 0; c < boardSize.width; ++c)\n            {\n                const cv::Point2f& imagePoint = imagePoints.at(i).at(r * boardSize.width + c);\n\n                double u = imagePoint.x - u0;\n                double v = imagePoint.y - v0;\n\n                P.at<double>(c, 0) = u;\n                P.at<double>(c, 1) = v;\n                P.at<double>(c, 2) 
= 0.5;\n                P.at<double>(c, 3) = -0.5 * (square(u) + square(v));\n            }\n\n            cv::Mat C;\n            cv::SVD::solveZ(P, C);\n\n            double t = square(C.at<double>(0)) + square(C.at<double>(1)) + C.at<double>(2) * C.at<double>(3);\n            if (t < 0.0)\n            {\n                continue;\n            }\n\n            // check that line image is not radial\n            double d = sqrt(1.0 / t);\n            double nx = C.at<double>(0) * d;\n            double ny = C.at<double>(1) * d;\n            if (hypot(nx, ny) > 0.95)\n            {\n                continue;\n            }\n\n            double gamma = sqrt(C.at<double>(2) / C.at<double>(3));\n\n            params.gamma1() = gamma;\n            params.gamma2() = gamma;\n            setParameters(params);\n\n            for (size_t j = 0; j < objectPoints.size(); ++j)\n            {\n                estimateExtrinsics(objectPoints.at(j), imagePoints.at(j), rvecs.at(j), tvecs.at(j));\n            }\n\n            double reprojErr = reprojectionError(objectPoints, imagePoints, rvecs, tvecs, cv::noArray());\n\n            if (reprojErr < minReprojErr)\n            {\n                minReprojErr = reprojErr;\n                gamma0 = gamma;\n            }\n        }\n    }\n\n    if (gamma0 <= 0.0 && minReprojErr >= std::numeric_limits<double>::max())\n    {\n        std::cout << \"[\" << params.cameraName() << \"] \"\n                  << \"# INFO: CataCamera model fails with given data. 
\" << std::endl;\n\n        return;\n    }\n\n    params.gamma1() = gamma0;\n    params.gamma2() = gamma0;\n    setParameters(params);\n}\n\n/** \n * \\brief Lifts a point from the image plane to the unit sphere\n *\n * \\param p image coordinates\n * \\param P coordinates of the point on the sphere\n */\nvoid\nCataCamera::liftSphere(const Eigen::Vector2d& p, Eigen::Vector3d& P) const\n{\n    double mx_d, my_d,mx2_d, mxy_d, my2_d, mx_u, my_u;\n    double rho2_d, rho4_d, radDist_d, Dx_d, Dy_d, inv_denom_d;\n    double lambda;\n\n    // Lift points to normalised plane\n    mx_d = m_inv_K11 * p(0) + m_inv_K13;\n    my_d = m_inv_K22 * p(1) + m_inv_K23;\n\n    if (m_noDistortion)\n    {\n        mx_u = mx_d;\n        my_u = my_d;\n    }\n    else\n    {\n        // Apply inverse distortion model\n        if (0)\n        {\n            double k1 = mParameters.k1();\n            double k2 = mParameters.k2();\n            double p1 = mParameters.p1();\n            double p2 = mParameters.p2();\n\n            // Inverse distortion model\n            // proposed by Heikkila\n            mx2_d = mx_d*mx_d;\n            my2_d = my_d*my_d;\n            mxy_d = mx_d*my_d;\n            rho2_d = mx2_d+my2_d;\n            rho4_d = rho2_d*rho2_d;\n            radDist_d = k1*rho2_d+k2*rho4_d;\n            Dx_d = mx_d*radDist_d + p2*(rho2_d+2*mx2_d) + 2*p1*mxy_d;\n            Dy_d = my_d*radDist_d + p1*(rho2_d+2*my2_d) + 2*p2*mxy_d;\n            inv_denom_d = 1/(1+4*k1*rho2_d+6*k2*rho4_d+8*p1*my_d+8*p2*mx_d);\n\n            mx_u = mx_d - inv_denom_d*Dx_d;\n            my_u = my_d - inv_denom_d*Dy_d;\n        }\n        else\n        {\n            // Recursive distortion model\n            int n = 6;\n            Eigen::Vector2d d_u;\n            distortion(Eigen::Vector2d(mx_d, my_d), d_u);\n            // Approximate value\n            mx_u = mx_d - d_u(0);\n            my_u = my_d - d_u(1);\n\n            for (int i = 1; i < n; ++i)\n            {\n                
distortion(Eigen::Vector2d(mx_u, my_u), d_u);\n                mx_u = mx_d - d_u(0);\n                my_u = my_d - d_u(1);\n            }\n        }\n    }\n\n    // Lift normalised points to the sphere (inv_hslash)\n    double xi = mParameters.xi();\n    if (xi == 1.0)\n    {\n        lambda = 2.0 / (mx_u * mx_u + my_u * my_u + 1.0);\n        P << lambda * mx_u, lambda * my_u, lambda - 1.0;\n    }\n    else\n    {\n        lambda = (xi + sqrt(1.0 + (1.0 - xi * xi) * (mx_u * mx_u + my_u * my_u))) / (1.0 + mx_u * mx_u + my_u * my_u);\n        P << lambda * mx_u, lambda * my_u, lambda - xi;\n    }\n}\n\n/** \n * \\brief Lifts a point from the image plane to its projective ray\n *\n * \\param p image coordinates\n * \\param P coordinates of the projective ray\n */\nvoid\nCataCamera::liftProjective(const Eigen::Vector2d& p, Eigen::Vector3d& P) const\n{\n    double mx_d, my_d,mx2_d, mxy_d, my2_d, mx_u, my_u;\n    double rho2_d, rho4_d, radDist_d, Dx_d, Dy_d, inv_denom_d;\n    //double lambda;\n\n    // Lift points to normalised plane\n    mx_d = m_inv_K11 * p(0) + m_inv_K13;\n    my_d = m_inv_K22 * p(1) + m_inv_K23;\n\n    if (m_noDistortion)\n    {\n        mx_u = mx_d;\n        my_u = my_d;\n    }\n    else\n    {\n        if (0)\n        {\n            double k1 = mParameters.k1();\n            double k2 = mParameters.k2();\n            double p1 = mParameters.p1();\n            double p2 = mParameters.p2();\n\n            // Apply inverse distortion model\n            // proposed by Heikkila\n            mx2_d = mx_d*mx_d;\n            my2_d = my_d*my_d;\n            mxy_d = mx_d*my_d;\n            rho2_d = mx2_d+my2_d;\n            rho4_d = rho2_d*rho2_d;\n            radDist_d = k1*rho2_d+k2*rho4_d;\n            Dx_d = mx_d*radDist_d + p2*(rho2_d+2*mx2_d) + 2*p1*mxy_d;\n            Dy_d = my_d*radDist_d + p1*(rho2_d+2*my2_d) + 2*p2*mxy_d;\n            inv_denom_d = 1/(1+4*k1*rho2_d+6*k2*rho4_d+8*p1*my_d+8*p2*mx_d);\n\n            mx_u = mx_d - inv_denom_d*Dx_d;\n 
           my_u = my_d - inv_denom_d*Dy_d;\n        }\n        else\n        {\n            // Recursive distortion model\n            int n = 8;\n            Eigen::Vector2d d_u;\n            distortion(Eigen::Vector2d(mx_d, my_d), d_u);\n            // Approximate value\n            mx_u = mx_d - d_u(0);\n            my_u = my_d - d_u(1);\n\n            for (int i = 1; i < n; ++i)\n            {\n                distortion(Eigen::Vector2d(mx_u, my_u), d_u);\n                mx_u = mx_d - d_u(0);\n                my_u = my_d - d_u(1);\n            }\n        }\n    }\n\n    // Obtain a projective ray\n    double xi = mParameters.xi();\n    if (xi == 1.0)\n    {\n        P << mx_u, my_u, (1.0 - mx_u * mx_u - my_u * my_u) / 2.0;\n    }\n    else\n    {\n        // Reuse variable\n        rho2_d = mx_u * mx_u + my_u * my_u;\n        P << mx_u, my_u, 1.0 - xi * (rho2_d + 1.0) / (xi + sqrt(1.0 + (1.0 - xi * xi) * rho2_d));\n    }\n}\n\n\n/** \n * \\brief Project a 3D point (\\a x,\\a y,\\a z) to the image plane in (\\a u,\\a v)\n *\n * \\param P 3D point coordinates\n * \\param p return value, contains the image point coordinates\n */\nvoid\nCataCamera::spaceToPlane(const Eigen::Vector3d& P, Eigen::Vector2d& p) const\n{\n    Eigen::Vector2d p_u, p_d;\n\n    // Project points to the normalised plane\n    double z = P(2) + mParameters.xi() * P.norm();\n    p_u << P(0) / z, P(1) / z;\n\n    if (m_noDistortion)\n    {\n        p_d = p_u;\n    }\n    else\n    {\n        // Apply distortion\n        Eigen::Vector2d d_u;\n        distortion(p_u, d_u);\n        p_d = p_u + d_u;\n    }\n\n    // Apply generalised projection matrix\n    p << mParameters.gamma1() * p_d(0) + mParameters.u0(),\n         mParameters.gamma2() * p_d(1) + mParameters.v0();\n}\n\n#if 0\n/** \n * \\brief Project a 3D point to the image plane and calculate Jacobian\n *\n * \\param P 3D point coordinates\n * \\param p return value, contains the image point coordinates\n 
*/\nvoid\nCataCamera::spaceToPlane(const Eigen::Vector3d& P, Eigen::Vector2d& p,\n                        Eigen::Matrix<double,2,3>& J) const\n{\n    double xi = mParameters.xi();\n\n    Eigen::Vector2d p_u, p_d;\n    double norm, inv_denom;\n    double dxdmx, dydmx, dxdmy, dydmy;\n\n    norm = P.norm();\n    // Project points to the normalised plane\n    inv_denom = 1.0 / (P(2) + xi * norm);\n    p_u << inv_denom * P(0), inv_denom * P(1);\n\n    // Calculate jacobian\n    inv_denom = inv_denom * inv_denom / norm;\n    double dudx = inv_denom * (norm * P(2) + xi * (P(1) * P(1) + P(2) * P(2)));\n    double dvdx = -inv_denom * xi * P(0) * P(1);\n    double dudy = dvdx;\n    double dvdy = inv_denom * (norm * P(2) + xi * (P(0) * P(0) + P(2) * P(2)));\n    inv_denom = inv_denom * (-xi * P(2) - norm); // reuse variable\n    double dudz = P(0) * inv_denom;\n    double dvdz = P(1) * inv_denom;\n\n    if (m_noDistortion)\n    {\n        p_d = p_u;\n    }\n    else\n    {\n        // Apply distortion\n        Eigen::Vector2d d_u;\n        distortion(p_u, d_u);\n        p_d = p_u + d_u;\n    }\n\n    double gamma1 = mParameters.gamma1();\n    double gamma2 = mParameters.gamma2();\n\n    // Make the product of the jacobians\n    // and add projection matrix jacobian\n    inv_denom = gamma1 * (dudx * dxdmx + dvdx * dxdmy); // reuse\n    dvdx = gamma2 * (dudx * dydmx + dvdx * dydmy);\n    dudx = inv_denom;\n\n    inv_denom = gamma1 * (dudy * dxdmx + dvdy * dxdmy); // reuse\n    dvdy = gamma2 * (dudy * dydmx + dvdy * dydmy);\n    dudy = inv_denom;\n\n    inv_denom = gamma1 * (dudz * dxdmx + dvdz * dxdmy); // reuse\n    dvdz = gamma2 * (dudz * dydmx + dvdz * dydmy);\n    dudz = inv_denom;\n    \n    // Apply generalised projection matrix\n    p << gamma1 * p_d(0) + mParameters.u0(),\n         gamma2 * p_d(1) + mParameters.v0();\n\n    J << dudx, dudy, dudz,\n         dvdx, dvdy, dvdz;\n}\n#endif\n\n/** \n * \\brief Projects an undistorted 2D point p_u to the image plane\n *\n * 
\\param p_u 2D point coordinates\n * \\return image point coordinates\n */\nvoid\nCataCamera::undistToPlane(const Eigen::Vector2d& p_u, Eigen::Vector2d& p) const\n{\n    Eigen::Vector2d p_d;\n\n    if (m_noDistortion)\n    {\n        p_d = p_u;\n    }\n    else\n    {\n        // Apply distortion\n        Eigen::Vector2d d_u;\n        distortion(p_u, d_u);\n        p_d = p_u + d_u;\n    }\n\n    // Apply generalised projection matrix\n    p << mParameters.gamma1() * p_d(0) + mParameters.u0(),\n         mParameters.gamma2() * p_d(1) + mParameters.v0();\n}\n\n/** \n * \\brief Apply distortion to input point (from the normalised plane)\n *  \n * \\param p_u undistorted coordinates of point on the normalised plane\n * \\return to obtain the distorted point: p_d = p_u + d_u\n */\nvoid\nCataCamera::distortion(const Eigen::Vector2d& p_u, Eigen::Vector2d& d_u) const\n{\n    double k1 = mParameters.k1();\n    double k2 = mParameters.k2();\n    double p1 = mParameters.p1();\n    double p2 = mParameters.p2();\n\n    double mx2_u, my2_u, mxy_u, rho2_u, rad_dist_u;\n\n    mx2_u = p_u(0) * p_u(0);\n    my2_u = p_u(1) * p_u(1);\n    mxy_u = p_u(0) * p_u(1);\n    rho2_u = mx2_u + my2_u;\n    rad_dist_u = k1 * rho2_u + k2 * rho2_u * rho2_u;\n    d_u << p_u(0) * rad_dist_u + 2.0 * p1 * mxy_u + p2 * (rho2_u + 2.0 * mx2_u),\n           p_u(1) * rad_dist_u + 2.0 * p2 * mxy_u + p1 * (rho2_u + 2.0 * my2_u);\n}\n\n/** \n * \\brief Apply distortion to input point (from the normalised plane)\n *        and calculate Jacobian\n *\n * \\param p_u undistorted coordinates of point on the normalised plane\n * \\return to obtain the distorted point: p_d = p_u + d_u\n */\nvoid\nCataCamera::distortion(const Eigen::Vector2d& p_u, Eigen::Vector2d& d_u,\n                       Eigen::Matrix2d& J) const\n{\n    double k1 = mParameters.k1();\n    double k2 = mParameters.k2();\n    double p1 = mParameters.p1();\n    double p2 = mParameters.p2();\n\n    double mx2_u, my2_u, mxy_u, rho2_u, rad_dist_u;\n\n  
  mx2_u = p_u(0) * p_u(0);\n    my2_u = p_u(1) * p_u(1);\n    mxy_u = p_u(0) * p_u(1);\n    rho2_u = mx2_u + my2_u;\n    rad_dist_u = k1 * rho2_u + k2 * rho2_u * rho2_u;\n    d_u << p_u(0) * rad_dist_u + 2.0 * p1 * mxy_u + p2 * (rho2_u + 2.0 * mx2_u),\n           p_u(1) * rad_dist_u + 2.0 * p2 * mxy_u + p1 * (rho2_u + 2.0 * my2_u);\n\n    double dxdmx = 1.0 + rad_dist_u + k1 * 2.0 * mx2_u + k2 * rho2_u * 4.0 * mx2_u + 2.0 * p1 * p_u(1) + 6.0 * p2 * p_u(0);\n    double dydmx = k1 * 2.0 * p_u(0) * p_u(1) + k2 * 4.0 * rho2_u * p_u(0) * p_u(1) + p1 * 2.0 * p_u(0) + 2.0 * p2 * p_u(1);\n    double dxdmy = dydmx;\n    double dydmy = 1.0 + rad_dist_u + k1 * 2.0 * my2_u + k2 * rho2_u * 4.0 * my2_u + 6.0 * p1 * p_u(1) + 2.0 * p2 * p_u(0);\n\n    J << dxdmx, dxdmy,\n         dydmx, dydmy;\n}\n\nvoid\nCataCamera::initUndistortMap(cv::Mat& map1, cv::Mat& map2, double fScale) const\n{\n    cv::Size imageSize(mParameters.imageWidth(), mParameters.imageHeight());\n\n    cv::Mat mapX = cv::Mat::zeros(imageSize, CV_32F);\n    cv::Mat mapY = cv::Mat::zeros(imageSize, CV_32F);\n\n    for (int v = 0; v < imageSize.height; ++v)\n    {\n        for (int u = 0; u < imageSize.width; ++u)\n        {\n            double mx_u = m_inv_K11 / fScale * u + m_inv_K13 / fScale;\n            double my_u = m_inv_K22 / fScale * v + m_inv_K23 / fScale;\n\n            double xi = mParameters.xi();\n            double d2 = mx_u * mx_u + my_u * my_u;\n\n            Eigen::Vector3d P;\n            P << mx_u, my_u, 1.0 - xi * (d2 + 1.0) / (xi + sqrt(1.0 + (1.0 - xi * xi) * d2));\n\n            Eigen::Vector2d p;\n            spaceToPlane(P, p);\n\n            mapX.at<float>(v,u) = p(0);\n            mapY.at<float>(v,u) = p(1);\n        }\n    }\n\n    cv::convertMaps(mapX, mapY, map1, map2, CV_32FC1, false);\n}\n\ncv::Mat\nCataCamera::initUndistortRectifyMap(cv::Mat& map1, cv::Mat& map2,\n                                    float fx, float fy,\n                                    cv::Size imageSize,\n       
                             float cx, float cy,\n                                    cv::Mat rmat) const\n{\n    if (imageSize == cv::Size(0, 0))\n    {\n        imageSize = cv::Size(mParameters.imageWidth(), mParameters.imageHeight());\n    }\n\n    cv::Mat mapX = cv::Mat::zeros(imageSize.height, imageSize.width, CV_32F);\n    cv::Mat mapY = cv::Mat::zeros(imageSize.height, imageSize.width, CV_32F);\n\n    Eigen::Matrix3f K_rect;\n\n    if (cx == -1.0f && cy == -1.0f)\n    {\n        K_rect << fx, 0, imageSize.width / 2,\n                  0, fy, imageSize.height / 2,\n                  0, 0, 1;\n    }\n    else\n    {\n        K_rect << fx, 0, cx,\n                  0, fy, cy,\n                  0, 0, 1;\n    }\n\n    if (fx == -1.0f || fy == -1.0f)\n    {\n        K_rect(0,0) = mParameters.gamma1();\n        K_rect(1,1) = mParameters.gamma2();\n    }\n\n    Eigen::Matrix3f K_rect_inv = K_rect.inverse();\n\n    Eigen::Matrix3f R, R_inv;\n    cv::cv2eigen(rmat, R);\n    R_inv = R.inverse();\n\n    for (int v = 0; v < imageSize.height; ++v)\n    {\n        for (int u = 0; u < imageSize.width; ++u)\n        {\n            Eigen::Vector3f xo;\n            xo << u, v, 1;\n\n            Eigen::Vector3f uo = R_inv * K_rect_inv * xo;\n\n            Eigen::Vector2d p;\n            spaceToPlane(uo.cast<double>(), p);\n\n            mapX.at<float>(v,u) = p(0);\n            mapY.at<float>(v,u) = p(1);\n        }\n    }\n\n    cv::convertMaps(mapX, mapY, map1, map2, CV_32FC1, false);\n\n    cv::Mat K_rect_cv;\n    cv::eigen2cv(K_rect, K_rect_cv);\n    return K_rect_cv;\n}\n\nint\nCataCamera::parameterCount(void) const\n{\n    return 9;\n}\n\nconst CataCamera::Parameters&\nCataCamera::getParameters(void) const\n{\n    return mParameters;\n}\n\nvoid\nCataCamera::setParameters(const CataCamera::Parameters& parameters)\n{\n    mParameters = parameters;\n\n    if ((mParameters.k1() == 0.0) &&\n        (mParameters.k2() == 0.0) &&\n        (mParameters.p1() == 0.0) &&\n        
(mParameters.p2() == 0.0))\n    {\n        m_noDistortion = true;\n    }\n    else\n    {\n        m_noDistortion = false;\n    }\n\n    m_inv_K11 = 1.0 / mParameters.gamma1();\n    m_inv_K13 = -mParameters.u0() / mParameters.gamma1();\n    m_inv_K22 = 1.0 / mParameters.gamma2();\n    m_inv_K23 = -mParameters.v0() / mParameters.gamma2();\n}\n\nvoid\nCataCamera::readParameters(const std::vector<double>& parameterVec)\n{\n    if ((int)parameterVec.size() != parameterCount())\n    {\n        return;\n    }\n\n    Parameters params = getParameters();\n\n    params.xi() = parameterVec.at(0);\n    params.k1() = parameterVec.at(1);\n    params.k2() = parameterVec.at(2);\n    params.p1() = parameterVec.at(3);\n    params.p2() = parameterVec.at(4);\n    params.gamma1() = parameterVec.at(5);\n    params.gamma2() = parameterVec.at(6);\n    params.u0() = parameterVec.at(7);\n    params.v0() = parameterVec.at(8);\n\n    setParameters(params);\n}\n\nvoid\nCataCamera::writeParameters(std::vector<double>& parameterVec) const\n{\n    parameterVec.resize(parameterCount());\n    parameterVec.at(0) = mParameters.xi();\n    parameterVec.at(1) = mParameters.k1();\n    parameterVec.at(2) = mParameters.k2();\n    parameterVec.at(3) = mParameters.p1();\n    parameterVec.at(4) = mParameters.p2();\n    parameterVec.at(5) = mParameters.gamma1();\n    parameterVec.at(6) = mParameters.gamma2();\n    parameterVec.at(7) = mParameters.u0();\n    parameterVec.at(8) = mParameters.v0();\n}\n\nvoid\nCataCamera::writeParametersToYamlFile(const std::string& filename) const\n{\n    mParameters.writeToYamlFile(filename);\n}\n\nstd::string\nCataCamera::parametersToString(void) const\n{\n    std::ostringstream oss;\n    oss << mParameters;\n\n    return oss.str();\n}\n\n}\n"
  },
  {
    "path": "camera_model/src/camera_models/CostFunctionFactory.cc",
    "content": "#include \"camodocal/camera_models/CostFunctionFactory.h\"\n\n#include \"ceres/ceres.h\"\n#include \"camodocal/camera_models/CataCamera.h\"\n#include \"camodocal/camera_models/EquidistantCamera.h\"\n#include \"camodocal/camera_models/PinholeCamera.h\"\n#include \"camodocal/camera_models/ScaramuzzaCamera.h\"\n\nnamespace camodocal\n{\n\ntemplate<typename T>\nvoid\nworldToCameraTransform(const T* const q_cam_odo, const T* const t_cam_odo,\n                       const T* const p_odo, const T* const att_odo,\n                       T* q, T* t, bool optimize_cam_odo_z = true)\n{\n    Eigen::Quaternion<T> q_z_inv(cos(att_odo[0] / T(2)), T(0), T(0), -sin(att_odo[0] / T(2)));\n    Eigen::Quaternion<T> q_y_inv(cos(att_odo[1] / T(2)), T(0), -sin(att_odo[1] / T(2)), T(0));\n    Eigen::Quaternion<T> q_x_inv(cos(att_odo[2] / T(2)), -sin(att_odo[2] / T(2)), T(0), T(0));\n\n    Eigen::Quaternion<T> q_zyx_inv = q_x_inv * q_y_inv * q_z_inv;\n\n    T q_odo[4] = {q_zyx_inv.w(), q_zyx_inv.x(), q_zyx_inv.y(), q_zyx_inv.z()};\n\n    T q_odo_cam[4] = {q_cam_odo[3], -q_cam_odo[0], -q_cam_odo[1], -q_cam_odo[2]};\n\n    T q0[4];\n    ceres::QuaternionProduct(q_odo_cam, q_odo, q0);\n\n    T t0[3];\n    T t_odo[3] = {p_odo[0], p_odo[1], p_odo[2]};\n\n    ceres::QuaternionRotatePoint(q_odo, t_odo, t0);\n\n    t0[0] += t_cam_odo[0];\n    t0[1] += t_cam_odo[1];\n\n    if (optimize_cam_odo_z)\n    {\n        t0[2] += t_cam_odo[2];\n    }\n\n    ceres::QuaternionRotatePoint(q_odo_cam, t0, t);\n    t[0] = -t[0];\n    t[1] = -t[1];\n    t[2] = -t[2];\n\n    // Convert quaternion from Ceres convention (w, x, y, z)\n    // to Eigen convention (x, y, z, w)\n    q[0] = q0[1];\n    q[1] = q0[2];\n    q[2] = q0[3];\n    q[3] = q0[0];\n}\n\ntemplate<class CameraT>\nclass ReprojectionError1\n{\npublic:\n    EIGEN_MAKE_ALIGNED_OPERATOR_NEW\n\n    ReprojectionError1(const Eigen::Vector3d& observed_P,\n                       const Eigen::Vector2d& observed_p)\n        : 
m_observed_P(observed_P), m_observed_p(observed_p)\n        , m_sqrtPrecisionMat(Eigen::Matrix2d::Identity()) {}\n\n    ReprojectionError1(const Eigen::Vector3d& observed_P,\n                       const Eigen::Vector2d& observed_p,\n                       const Eigen::Matrix2d& sqrtPrecisionMat)\n        : m_observed_P(observed_P), m_observed_p(observed_p)\n        , m_sqrtPrecisionMat(sqrtPrecisionMat) {}\n\n    ReprojectionError1(const std::vector<double>& intrinsic_params,\n                       const Eigen::Vector3d& observed_P,\n                       const Eigen::Vector2d& observed_p)\n        : m_intrinsic_params(intrinsic_params)\n        , m_observed_P(observed_P), m_observed_p(observed_p) {}\n\n    // variables: camera intrinsics and camera extrinsics\n    template <typename T>\n    bool operator()(const T* const intrinsic_params,\n                    const T* const q,\n                    const T* const t,\n                    T* residuals) const\n    {\n        Eigen::Matrix<T, 3, 1> P = m_observed_P.cast<T>();\n\n        Eigen::Matrix<T, 2, 1> predicted_p;\n        CameraT::spaceToPlane(intrinsic_params, q, t, P, predicted_p);\n\n        Eigen::Matrix<T, 2, 1> e = predicted_p - m_observed_p.cast<T>();\n\n        Eigen::Matrix<T, 2, 1> e_weighted = m_sqrtPrecisionMat.cast<T>() * e;\n\n        residuals[0] = e_weighted(0);\n        residuals[1] = e_weighted(1);\n\n        return true;\n    }\n\n    // variables: camera-odometry transforms and odometry poses\n    template <typename T>\n    bool operator()(const T* const q_cam_odo, const T* const t_cam_odo,\n                    const T* const p_odo, const T* const att_odo,\n                    T* residuals) const\n    {\n        T q[4], t[3];\n        worldToCameraTransform(q_cam_odo, t_cam_odo, p_odo, att_odo, q, t);\n\n        Eigen::Matrix<T, 3, 1> P = m_observed_P.cast<T>();\n\n        std::vector<T> intrinsic_params(m_intrinsic_params.begin(), m_intrinsic_params.end());\n\n        // project 3D 
object point to the image plane\n        Eigen::Matrix<T, 2, 1> predicted_p;\n        CameraT::spaceToPlane(intrinsic_params.data(), q, t, P, predicted_p);\n\n        residuals[0] = predicted_p(0) - T(m_observed_p(0));\n        residuals[1] = predicted_p(1) - T(m_observed_p(1));\n\n        return true;\n    }\n\n//private:\n    // camera intrinsics\n    std::vector<double> m_intrinsic_params;\n\n    // observed 3D point\n    Eigen::Vector3d m_observed_P;\n\n    // observed 2D point\n    Eigen::Vector2d m_observed_p;\n\n    // square root of precision matrix\n    Eigen::Matrix2d m_sqrtPrecisionMat;\n};\n\n// variables: camera extrinsics, 3D point\ntemplate<class CameraT>\nclass ReprojectionError2\n{\npublic:\n    EIGEN_MAKE_ALIGNED_OPERATOR_NEW\n\n    ReprojectionError2(const std::vector<double>& intrinsic_params,\n                       const Eigen::Vector2d& observed_p)\n        : m_intrinsic_params(intrinsic_params), m_observed_p(observed_p) {}\n\n    template <typename T>\n    bool operator()(const T* const q, const T* const t,\n                    const T* const point, T* residuals) const\n    {\n        Eigen::Matrix<T, 3, 1> P;\n        P(0) = T(point[0]);\n        P(1) = T(point[1]);\n        P(2) = T(point[2]);\n\n        std::vector<T> intrinsic_params(m_intrinsic_params.begin(), m_intrinsic_params.end());\n\n        // project 3D object point to the image plane\n        Eigen::Matrix<T, 2, 1> predicted_p;\n        CameraT::spaceToPlane(intrinsic_params.data(), q, t, P, predicted_p);\n\n        residuals[0] = predicted_p(0) - T(m_observed_p(0));\n        residuals[1] = predicted_p(1) - T(m_observed_p(1));\n\n        return true;\n    }\n\nprivate:\n    // camera intrinsics\n    std::vector<double> m_intrinsic_params;\n\n    // observed 2D point\n    Eigen::Vector2d m_observed_p;\n};\n\ntemplate<class CameraT>\nclass ReprojectionError3\n{\npublic:\n    EIGEN_MAKE_ALIGNED_OPERATOR_NEW\n\n    ReprojectionError3(const Eigen::Vector2d& observed_p)\n        : 
m_observed_p(observed_p)\n        , m_sqrtPrecisionMat(Eigen::Matrix2d::Identity())\n        , m_optimize_cam_odo_z(true) {}\n\n    ReprojectionError3(const Eigen::Vector2d& observed_p,\n                       const Eigen::Matrix2d& sqrtPrecisionMat)\n        : m_observed_p(observed_p)\n        , m_sqrtPrecisionMat(sqrtPrecisionMat)\n        , m_optimize_cam_odo_z(true) {}\n\n    ReprojectionError3(const std::vector<double>& intrinsic_params,\n                       const Eigen::Vector2d& observed_p)\n        : m_intrinsic_params(intrinsic_params)\n        , m_observed_p(observed_p)\n        , m_sqrtPrecisionMat(Eigen::Matrix2d::Identity())\n        , m_optimize_cam_odo_z(true) {}\n\n    ReprojectionError3(const std::vector<double>& intrinsic_params,\n                       const Eigen::Vector2d& observed_p,\n                       const Eigen::Matrix2d& sqrtPrecisionMat)\n        : m_intrinsic_params(intrinsic_params)\n        , m_observed_p(observed_p)\n        , m_sqrtPrecisionMat(sqrtPrecisionMat)\n        , m_optimize_cam_odo_z(true) {}\n\n\n    ReprojectionError3(const std::vector<double>& intrinsic_params,\n                       const Eigen::Vector3d& odo_pos,\n                       const Eigen::Vector3d& odo_att,\n                       const Eigen::Vector2d& observed_p,\n                       bool optimize_cam_odo_z)\n        : m_intrinsic_params(intrinsic_params)\n        , m_odo_pos(odo_pos), m_odo_att(odo_att)\n        , m_observed_p(observed_p)\n        , m_optimize_cam_odo_z(optimize_cam_odo_z) {}\n\n    ReprojectionError3(const std::vector<double>& intrinsic_params,\n                       const Eigen::Quaterniond& cam_odo_q,\n                       const Eigen::Vector3d& cam_odo_t,\n                       const Eigen::Vector3d& odo_pos,\n                       const Eigen::Vector3d& odo_att,\n                       const Eigen::Vector2d& observed_p)\n        : m_intrinsic_params(intrinsic_params)\n        , m_cam_odo_q(cam_odo_q), 
m_cam_odo_t(cam_odo_t)\n        , m_odo_pos(odo_pos), m_odo_att(odo_att)\n        , m_observed_p(observed_p)\n        , m_optimize_cam_odo_z(true) {}\n\n    // variables: camera intrinsics, camera-to-odometry transform,\n    //            odometry extrinsics, 3D point\n    template <typename T>\n    bool operator()(const T* const intrinsic_params,\n                    const T* const q_cam_odo, const T* const t_cam_odo,\n                    const T* const p_odo, const T* const att_odo,\n                    const T* const point, T* residuals) const\n    {\n        T q[4], t[3];\n        worldToCameraTransform(q_cam_odo, t_cam_odo, p_odo, att_odo, q, t, m_optimize_cam_odo_z);\n\n        Eigen::Matrix<T, 3, 1> P(point[0], point[1], point[2]);\n\n        // project 3D object point to the image plane\n        Eigen::Matrix<T, 2, 1> predicted_p;\n        CameraT::spaceToPlane(intrinsic_params, q, t, P, predicted_p);\n\n        Eigen::Matrix<T, 2, 1> err = predicted_p - m_observed_p.cast<T>();\n        Eigen::Matrix<T, 2, 1> err_weighted = m_sqrtPrecisionMat.cast<T>() * err;\n\n        residuals[0] = err_weighted(0);\n        residuals[1] = err_weighted(1);\n\n        return true;\n    }\n\n    // variables: camera-to-odometry transform, 3D point\n    template <typename T>\n    bool operator()(const T* const q_cam_odo, const T* const t_cam_odo,\n                    const T* const point, T* residuals) const\n    {\n        T p_odo[3] = {T(m_odo_pos(0)), T(m_odo_pos(1)), T(m_odo_pos(2))};\n        T att_odo[3] = {T(m_odo_att(0)), T(m_odo_att(1)), T(m_odo_att(2))};\n        T q[4], t[3];\n\n        worldToCameraTransform(q_cam_odo, t_cam_odo, p_odo, att_odo, q, t, m_optimize_cam_odo_z);\n\n        std::vector<T> intrinsic_params(m_intrinsic_params.begin(), m_intrinsic_params.end());\n        Eigen::Matrix<T, 3, 1> P(point[0], point[1], point[2]);\n\n        // project 3D object point to the image plane\n        Eigen::Matrix<T, 2, 1> predicted_p;\n        
CameraT::spaceToPlane(intrinsic_params.data(), q, t, P, predicted_p);\n\n        residuals[0] = predicted_p(0) - T(m_observed_p(0));\n        residuals[1] = predicted_p(1) - T(m_observed_p(1));\n\n        return true;\n    }\n\n    // variables: camera-to-odometry transform, odometry extrinsics, 3D point\n    template <typename T>\n    bool operator()(const T* const q_cam_odo, const T* const t_cam_odo,\n                    const T* const p_odo, const T* const att_odo,\n                    const T* const point, T* residuals) const\n    {\n        T q[4], t[3];\n        worldToCameraTransform(q_cam_odo, t_cam_odo, p_odo, att_odo, q, t, m_optimize_cam_odo_z);\n\n        std::vector<T> intrinsic_params(m_intrinsic_params.begin(), m_intrinsic_params.end());\n        Eigen::Matrix<T, 3, 1> P(point[0], point[1], point[2]);\n\n        // project 3D object point to the image plane\n        Eigen::Matrix<T, 2, 1> predicted_p;\n        CameraT::spaceToPlane(intrinsic_params.data(), q, t, P, predicted_p);\n\n        Eigen::Matrix<T, 2, 1> err = predicted_p - m_observed_p.cast<T>();\n        Eigen::Matrix<T, 2, 1> err_weighted = m_sqrtPrecisionMat.cast<T>() * err;\n\n        residuals[0] = err_weighted(0);\n        residuals[1] = err_weighted(1);\n\n        return true;\n    }\n\n    // variables: 3D point\n    template <typename T>\n    bool operator()(const T* const point, T* residuals) const\n    {\n        T q_cam_odo[4] = {T(m_cam_odo_q.coeffs()(0)), T(m_cam_odo_q.coeffs()(1)), T(m_cam_odo_q.coeffs()(2)), T(m_cam_odo_q.coeffs()(3))};\n        T t_cam_odo[3] = {T(m_cam_odo_t(0)), T(m_cam_odo_t(1)), T(m_cam_odo_t(2))};\n        T p_odo[3] = {T(m_odo_pos(0)), T(m_odo_pos(1)), T(m_odo_pos(2))};\n        T att_odo[3] = {T(m_odo_att(0)), T(m_odo_att(1)), T(m_odo_att(2))};\n        T q[4], t[3];\n\n        worldToCameraTransform(q_cam_odo, t_cam_odo, p_odo, att_odo, q, t, m_optimize_cam_odo_z);\n\n        std::vector<T> intrinsic_params(m_intrinsic_params.begin(), 
m_intrinsic_params.end());\n        Eigen::Matrix<T, 3, 1> P(point[0], point[1], point[2]);\n\n        // project 3D object point to the image plane\n        Eigen::Matrix<T, 2, 1> predicted_p;\n        CameraT::spaceToPlane(intrinsic_params.data(), q, t, P, predicted_p);\n\n        residuals[0] = predicted_p(0) - T(m_observed_p(0));\n        residuals[1] = predicted_p(1) - T(m_observed_p(1));\n\n        return true;\n    }\n\nprivate:\n    // camera intrinsics\n    std::vector<double> m_intrinsic_params;\n\n    // observed camera-odometry transform\n    Eigen::Quaterniond m_cam_odo_q;\n    Eigen::Vector3d m_cam_odo_t;\n\n    // observed odometry\n    Eigen::Vector3d m_odo_pos;\n    Eigen::Vector3d m_odo_att;\n\n    // observed 2D point\n    Eigen::Vector2d m_observed_p;\n\n    Eigen::Matrix2d m_sqrtPrecisionMat;\n\n    bool m_optimize_cam_odo_z;\n};\n\n// variables: camera intrinsics and camera extrinsics\ntemplate<class CameraT>\nclass StereoReprojectionError\n{\npublic:\n    EIGEN_MAKE_ALIGNED_OPERATOR_NEW\n\n    StereoReprojectionError(const Eigen::Vector3d& observed_P,\n                            const Eigen::Vector2d& observed_p_l,\n                            const Eigen::Vector2d& observed_p_r)\n        : m_observed_P(observed_P)\n        , m_observed_p_l(observed_p_l)\n        , m_observed_p_r(observed_p_r)\n    {\n\n    }\n\n    template <typename T>\n    bool operator()(const T* const intrinsic_params_l,\n                    const T* const intrinsic_params_r,\n                    const T* const q_l,\n                    const T* const t_l,\n                    const T* const q_l_r,\n                    const T* const t_l_r,\n                    T* residuals) const\n    {\n        Eigen::Matrix<T, 3, 1> P;\n        P(0) = T(m_observed_P(0));\n        P(1) = T(m_observed_P(1));\n        P(2) = T(m_observed_P(2));\n\n        Eigen::Matrix<T, 2, 1> predicted_p_l;\n        CameraT::spaceToPlane(intrinsic_params_l, q_l, t_l, P, predicted_p_l);\n\n        
Eigen::Quaternion<T> q_r = Eigen::Quaternion<T>(q_l_r) * Eigen::Quaternion<T>(q_l);\n\n        Eigen::Matrix<T, 3, 1> t_r;\n        t_r(0) = t_l[0];\n        t_r(1) = t_l[1];\n        t_r(2) = t_l[2];\n\n        t_r = Eigen::Quaternion<T>(q_l_r) * t_r;\n        t_r(0) += t_l_r[0];\n        t_r(1) += t_l_r[1];\n        t_r(2) += t_l_r[2];\n\n        Eigen::Matrix<T, 2, 1> predicted_p_r;\n        CameraT::spaceToPlane(intrinsic_params_r, q_r.coeffs().data(), t_r.data(), P, predicted_p_r);\n\n        residuals[0] = predicted_p_l(0) - T(m_observed_p_l(0));\n        residuals[1] = predicted_p_l(1) - T(m_observed_p_l(1));\n        residuals[2] = predicted_p_r(0) - T(m_observed_p_r(0));\n        residuals[3] = predicted_p_r(1) - T(m_observed_p_r(1));\n\n        return true;\n    }\n\nprivate:\n    // observed 3D point\n    Eigen::Vector3d m_observed_P;\n\n    // observed 2D point\n    Eigen::Vector2d m_observed_p_l;\n    Eigen::Vector2d m_observed_p_r;\n};\n\ntemplate <class CameraT>\nclass ComprehensionError {\n  public:\n    EIGEN_MAKE_ALIGNED_OPERATOR_NEW\n\n    ComprehensionError(const Eigen::Vector3d& observed_P, const Eigen::Vector2d& observed_p)\n        : m_observed_P(observed_P),\n          m_observed_p(observed_p),\n          m_sqrtPrecisionMat(Eigen::Matrix2d::Identity()) {\n    }\n\n    template <typename T>\n    bool operator()(const T* const intrinsic_params, const T* const q, const T* const t,\n                    T* residuals) const {\n        {\n            Eigen::Matrix<T, 2, 1> p = m_observed_p.cast<T>();\n        \n            Eigen::Matrix<T, 3, 1> predicted_img_P;\n            CameraT::LiftToSphere(intrinsic_params, p, predicted_img_P);\n                \n            Eigen::Matrix<T, 2, 1> predicted_p;\n            CameraT::SphereToPlane(intrinsic_params, predicted_img_P, predicted_p);\n\n            Eigen::Matrix<T, 2, 1> e = predicted_p - m_observed_p.cast<T>();\n\n            Eigen::Matrix<T, 2, 1> e_weighted = m_sqrtPrecisionMat.cast<T>() * 
e;\n\n            residuals[0] = e_weighted(0);\n            residuals[1] = e_weighted(1);\n        }\n\n        {\n            Eigen::Matrix<T, 3, 1> P = m_observed_P.cast<T>();\n\n            Eigen::Matrix<T, 2, 1> predicted_p;\n            CameraT::spaceToPlane(intrinsic_params, q, t, P, predicted_p);\n\n            Eigen::Matrix<T, 2, 1> e = predicted_p - m_observed_p.cast<T>();\n\n            Eigen::Matrix<T, 2, 1> e_weighted = m_sqrtPrecisionMat.cast<T>() * e;\n\n            residuals[2] = e_weighted(0);\n            residuals[3] = e_weighted(1);\n        }\n\n        return true;\n    }\n\n    // private:\n    // camera intrinsics\n    // std::vector<double> m_intrinsic_params;\n\n    // observed 3D point\n    Eigen::Vector3d m_observed_P;\n\n    // observed 2D point\n    Eigen::Vector2d m_observed_p;\n\n    // square root of precision matrix\n    Eigen::Matrix2d m_sqrtPrecisionMat;\n};\n\nboost::shared_ptr<CostFunctionFactory> CostFunctionFactory::m_instance;\n\nCostFunctionFactory::CostFunctionFactory()\n{\n\n}\n\nboost::shared_ptr<CostFunctionFactory>\nCostFunctionFactory::instance(void)\n{\n    if (m_instance.get() == 0)\n    {\n        m_instance.reset(new CostFunctionFactory);\n    }\n\n    return m_instance;\n}\n\nceres::CostFunction*\nCostFunctionFactory::generateCostFunction(const CameraConstPtr& camera,\n        const Eigen::Vector3d& observed_P,\n        const Eigen::Vector2d& observed_p,\n        int flags) const\n{\n    ceres::CostFunction* costFunction = 0;\n\n    std::vector<double> intrinsic_params;\n    camera->writeParameters(intrinsic_params);\n\n    switch (flags)\n    {\n    case CAMERA_INTRINSICS | CAMERA_POSE:\n        switch (camera->modelType())\n        {\n        case Camera::KANNALA_BRANDT:\n            costFunction =\n                new ceres::AutoDiffCostFunction<ReprojectionError1<EquidistantCamera>, 2, 8, 4, 3>(\n                new ReprojectionError1<EquidistantCamera>(observed_P, observed_p));\n            break;\n        
case Camera::PINHOLE:\n            costFunction =\n                new ceres::AutoDiffCostFunction<ReprojectionError1<PinholeCamera>, 2, 8, 4, 3>(\n                new ReprojectionError1<PinholeCamera>(observed_P, observed_p));\n            break;\n        case Camera::MEI:\n            costFunction =\n                new ceres::AutoDiffCostFunction<ReprojectionError1<CataCamera>, 2, 9, 4, 3>(\n                new ReprojectionError1<CataCamera>(observed_P, observed_p));\n            break;\n        case Camera::SCARAMUZZA:\n            costFunction =\n                new ceres::AutoDiffCostFunction<ComprehensionError<OCAMCamera>, 4, SCARAMUZZA_CAMERA_NUM_PARAMS, 4, 3>(\n                new ComprehensionError<OCAMCamera>(observed_P, observed_p));\n                // new ceres::AutoDiffCostFunction<ReprojectionError1<OCAMCamera>, 2, SCARAMUZZA_CAMERA_NUM_PARAMS, 4, 3>(\n                // new ReprojectionError1<OCAMCamera>(observed_P, observed_p));\n            break;\n        }\n        break;\n    case CAMERA_ODOMETRY_TRANSFORM | ODOMETRY_6D_POSE:\n        switch (camera->modelType())\n        {\n        case Camera::KANNALA_BRANDT:\n            costFunction =\n                new ceres::AutoDiffCostFunction<ReprojectionError1<EquidistantCamera>, 2, 4, 3, 3, 3>(\n                new ReprojectionError1<EquidistantCamera>(intrinsic_params, observed_P, observed_p));\n            break;\n        case Camera::PINHOLE:\n            costFunction =\n                new ceres::AutoDiffCostFunction<ReprojectionError1<PinholeCamera>, 2, 4, 3, 3, 3>(\n                new ReprojectionError1<PinholeCamera>(intrinsic_params, observed_P, observed_p));\n            break;\n        case Camera::MEI:\n            costFunction =\n                new ceres::AutoDiffCostFunction<ReprojectionError1<CataCamera>, 2, 4, 3, 3, 3>(\n                new ReprojectionError1<CataCamera>(intrinsic_params, observed_P, observed_p));\n            break;\n        case Camera::SCARAMUZZA:\n            
costFunction =\n                new ceres::AutoDiffCostFunction<ReprojectionError1<OCAMCamera>, 2, 4, 3, 3, 3>(\n                new ReprojectionError1<OCAMCamera>(intrinsic_params, observed_P, observed_p));\n            break;\n        }\n        break;\n    }\n\n    return costFunction;\n}\n\nceres::CostFunction*\nCostFunctionFactory::generateCostFunction(const CameraConstPtr& camera,\n        const Eigen::Vector3d& observed_P,\n        const Eigen::Vector2d& observed_p,\n        const Eigen::Matrix2d& sqrtPrecisionMat,\n        int flags) const\n{\n    ceres::CostFunction* costFunction = 0;\n\n    std::vector<double> intrinsic_params;\n    camera->writeParameters(intrinsic_params);\n\n    switch (flags)\n    {\n    case CAMERA_INTRINSICS | CAMERA_POSE:\n        switch (camera->modelType())\n        {\n        case Camera::KANNALA_BRANDT:\n            costFunction =\n                new ceres::AutoDiffCostFunction<ReprojectionError1<EquidistantCamera>, 2, 8, 4, 3>(\n                new ReprojectionError1<EquidistantCamera>(observed_P, observed_p, sqrtPrecisionMat));\n            break;\n        case Camera::PINHOLE:\n            costFunction =\n                new ceres::AutoDiffCostFunction<ReprojectionError1<PinholeCamera>, 2, 8, 4, 3>(\n                new ReprojectionError1<PinholeCamera>(observed_P, observed_p, sqrtPrecisionMat));\n            break;\n        case Camera::MEI:\n            costFunction =\n                new ceres::AutoDiffCostFunction<ReprojectionError1<CataCamera>, 2, 9, 4, 3>(\n                new ReprojectionError1<CataCamera>(observed_P, observed_p, sqrtPrecisionMat));\n            break;\n        case Camera::SCARAMUZZA:\n            costFunction =\n                new ceres::AutoDiffCostFunction<ReprojectionError1<OCAMCamera>, 2, SCARAMUZZA_CAMERA_NUM_PARAMS, 4, 3>(\n                new ReprojectionError1<OCAMCamera>(observed_P, observed_p, sqrtPrecisionMat));\n            break;\n        }\n        break;\n    }\n\n    return 
costFunction;\n}\n\nceres::CostFunction*\nCostFunctionFactory::generateCostFunction(const CameraConstPtr& camera,\n        const Eigen::Vector2d& observed_p,\n        int flags, bool optimize_cam_odo_z) const\n{\n    ceres::CostFunction* costFunction = 0;\n\n    std::vector<double> intrinsic_params;\n    camera->writeParameters(intrinsic_params);\n\n    switch (flags)\n    {\n    case CAMERA_POSE | POINT_3D:\n        switch (camera->modelType())\n        {\n        case Camera::KANNALA_BRANDT:\n            costFunction =\n                new ceres::AutoDiffCostFunction<ReprojectionError2<EquidistantCamera>, 2, 4, 3, 3>(\n                new ReprojectionError2<EquidistantCamera>(intrinsic_params, observed_p));\n            break;\n        case Camera::PINHOLE:\n            costFunction =\n                new ceres::AutoDiffCostFunction<ReprojectionError2<PinholeCamera>, 2, 4, 3, 3>(\n                new ReprojectionError2<PinholeCamera>(intrinsic_params, observed_p));\n            break;\n        case Camera::MEI:\n            costFunction =\n                new ceres::AutoDiffCostFunction<ReprojectionError2<CataCamera>, 2, 4, 3, 3>(\n                new ReprojectionError2<CataCamera>(intrinsic_params, observed_p));\n            break;\n        case Camera::SCARAMUZZA:\n            costFunction =\n                new ceres::AutoDiffCostFunction<ReprojectionError2<OCAMCamera>, 2, 4, 3, 3>(\n                new ReprojectionError2<OCAMCamera>(intrinsic_params, observed_p));\n            break;\n        }\n        break;\n    case CAMERA_ODOMETRY_TRANSFORM | ODOMETRY_3D_POSE | POINT_3D:\n        switch (camera->modelType())\n        {\n        case Camera::KANNALA_BRANDT:\n            if (optimize_cam_odo_z)\n            {\n                costFunction =\n                    new ceres::AutoDiffCostFunction<ReprojectionError3<EquidistantCamera>, 2, 4, 3, 2, 1, 3>(\n                    new ReprojectionError3<EquidistantCamera>(intrinsic_params, observed_p));\n            
}\n            else\n            {\n                costFunction =\n                    new ceres::AutoDiffCostFunction<ReprojectionError3<EquidistantCamera>, 2, 4, 2, 2, 1, 3>(\n                    new ReprojectionError3<EquidistantCamera>(intrinsic_params, observed_p));\n            }\n            break;\n        case Camera::PINHOLE:\n            if (optimize_cam_odo_z)\n            {\n                costFunction =\n                    new ceres::AutoDiffCostFunction<ReprojectionError3<PinholeCamera>, 2, 4, 3, 2, 1, 3>(\n                    new ReprojectionError3<PinholeCamera>(intrinsic_params, observed_p));\n            }\n            else\n            {\n                costFunction =\n                    new ceres::AutoDiffCostFunction<ReprojectionError3<PinholeCamera>, 2, 4, 2, 2, 1, 3>(\n                    new ReprojectionError3<PinholeCamera>(intrinsic_params, observed_p));\n            }\n            break;\n        case Camera::MEI:\n            if (optimize_cam_odo_z)\n            {\n                costFunction =\n                    new ceres::AutoDiffCostFunction<ReprojectionError3<CataCamera>, 2, 4, 3, 2, 1, 3>(\n                    new ReprojectionError3<CataCamera>(intrinsic_params, observed_p));\n            }\n            else\n            {\n                costFunction =\n                    new ceres::AutoDiffCostFunction<ReprojectionError3<CataCamera>, 2, 4, 2, 2, 1, 3>(\n                    new ReprojectionError3<CataCamera>(intrinsic_params, observed_p));\n            }\n            break;\n        case Camera::SCARAMUZZA:\n            if (optimize_cam_odo_z)\n            {\n                costFunction =\n                    new ceres::AutoDiffCostFunction<ReprojectionError3<OCAMCamera>, 2, 4, 3, 2, 1, 3>(\n                    new ReprojectionError3<OCAMCamera>(intrinsic_params, observed_p));\n            }\n            else\n            {\n                costFunction =\n                    new 
ceres::AutoDiffCostFunction<ReprojectionError3<OCAMCamera>, 2, 4, 2, 2, 1, 3>(\n                    new ReprojectionError3<OCAMCamera>(intrinsic_params, observed_p));\n            }\n            break;\n        }\n        break;\n    case CAMERA_ODOMETRY_TRANSFORM | ODOMETRY_6D_POSE | POINT_3D:\n        switch (camera->modelType())\n        {\n        case Camera::KANNALA_BRANDT:\n            if (optimize_cam_odo_z)\n            {\n                costFunction =\n                    new ceres::AutoDiffCostFunction<ReprojectionError3<EquidistantCamera>, 2, 4, 3, 3, 3, 3>(\n                    new ReprojectionError3<EquidistantCamera>(intrinsic_params, observed_p));\n            }\n            else\n            {\n                costFunction =\n                    new ceres::AutoDiffCostFunction<ReprojectionError3<EquidistantCamera>, 2, 4, 2, 3, 3, 3>(\n                    new ReprojectionError3<EquidistantCamera>(intrinsic_params, observed_p));\n            }\n            break;\n        case Camera::PINHOLE:\n            if (optimize_cam_odo_z)\n            {\n                costFunction =\n                    new ceres::AutoDiffCostFunction<ReprojectionError3<PinholeCamera>, 2, 4, 3, 3, 3, 3>(\n                    new ReprojectionError3<PinholeCamera>(intrinsic_params, observed_p));\n            }\n            else\n            {\n                costFunction =\n                    new ceres::AutoDiffCostFunction<ReprojectionError3<PinholeCamera>, 2, 4, 2, 3, 3, 3>(\n                    new ReprojectionError3<PinholeCamera>(intrinsic_params, observed_p));\n            }\n            break;\n        case Camera::MEI:\n            if (optimize_cam_odo_z)\n            {\n                costFunction =\n                    new ceres::AutoDiffCostFunction<ReprojectionError3<CataCamera>, 2, 4, 3, 3, 3, 3>(\n                    new ReprojectionError3<CataCamera>(intrinsic_params, observed_p));\n            }\n            else\n            {\n                costFunction =\n    
                new ceres::AutoDiffCostFunction<ReprojectionError3<CataCamera>, 2, 4, 2, 3, 3, 3>(\n                    new ReprojectionError3<CataCamera>(intrinsic_params, observed_p));\n            }\n            break;\n        case Camera::SCARAMUZZA:\n            if (optimize_cam_odo_z)\n            {\n                costFunction =\n                    new ceres::AutoDiffCostFunction<ReprojectionError3<OCAMCamera>, 2, 4, 3, 3, 3, 3>(\n                    new ReprojectionError3<OCAMCamera>(intrinsic_params, observed_p));\n            }\n            else\n            {\n                costFunction =\n                    new ceres::AutoDiffCostFunction<ReprojectionError3<OCAMCamera>, 2, 4, 2, 3, 3, 3>(\n                    new ReprojectionError3<OCAMCamera>(intrinsic_params, observed_p));\n            }\n            break;\n        }\n        break;\n    case CAMERA_INTRINSICS | CAMERA_ODOMETRY_TRANSFORM | ODOMETRY_3D_POSE | POINT_3D:\n        switch (camera->modelType())\n        {\n        case Camera::KANNALA_BRANDT:\n            if (optimize_cam_odo_z)\n            {\n                costFunction =\n                    new ceres::AutoDiffCostFunction<ReprojectionError3<EquidistantCamera>, 2, 8, 4, 3, 2, 1, 3>(\n                    new ReprojectionError3<EquidistantCamera>(observed_p));\n            }\n            else\n            {\n                costFunction =\n                    new ceres::AutoDiffCostFunction<ReprojectionError3<EquidistantCamera>, 2, 8, 4, 2, 2, 1, 3>(\n                    new ReprojectionError3<EquidistantCamera>(observed_p));\n            }\n            break;\n        case Camera::PINHOLE:\n            if (optimize_cam_odo_z)\n            {\n                costFunction =\n                    new ceres::AutoDiffCostFunction<ReprojectionError3<PinholeCamera>, 2, 8, 4, 3, 2, 1, 3>(\n                    new ReprojectionError3<PinholeCamera>(observed_p));\n            }\n            else\n            {\n                costFunction =\n        
            new ceres::AutoDiffCostFunction<ReprojectionError3<PinholeCamera>, 2, 8, 4, 2, 2, 1, 3>(\n                    new ReprojectionError3<PinholeCamera>(observed_p));\n            }\n            break;\n        case Camera::MEI:\n            if (optimize_cam_odo_z)\n            {\n                costFunction =\n                    new ceres::AutoDiffCostFunction<ReprojectionError3<CataCamera>, 2, 9, 4, 3, 2, 1, 3>(\n                    new ReprojectionError3<CataCamera>(observed_p));\n            }\n            else\n            {\n                costFunction =\n                    new ceres::AutoDiffCostFunction<ReprojectionError3<CataCamera>, 2, 9, 4, 2, 2, 1, 3>(\n                    new ReprojectionError3<CataCamera>(observed_p));\n            }\n            break;\n        case Camera::SCARAMUZZA:\n            if (optimize_cam_odo_z)\n            {\n                costFunction =\n                    new ceres::AutoDiffCostFunction<ReprojectionError3<OCAMCamera>, 2, SCARAMUZZA_CAMERA_NUM_PARAMS, 4, 3, 2, 1, 3>(\n                    new ReprojectionError3<OCAMCamera>(observed_p));\n            }\n            else\n            {\n                costFunction =\n                    new ceres::AutoDiffCostFunction<ReprojectionError3<OCAMCamera>, 2, SCARAMUZZA_CAMERA_NUM_PARAMS, 4, 2, 2, 1, 3>(\n                    new ReprojectionError3<OCAMCamera>(observed_p));\n            }\n            break;\n        }\n        break;\n    case CAMERA_INTRINSICS | CAMERA_ODOMETRY_TRANSFORM | ODOMETRY_6D_POSE | POINT_3D:\n        switch (camera->modelType())\n        {\n        case Camera::KANNALA_BRANDT:\n            if (optimize_cam_odo_z)\n            {\n                costFunction =\n                    new ceres::AutoDiffCostFunction<ReprojectionError3<EquidistantCamera>, 2, 8, 4, 3, 3, 3, 3>(\n                    new ReprojectionError3<EquidistantCamera>(observed_p));\n            }\n            else\n            {\n                costFunction =\n             
       new ceres::AutoDiffCostFunction<ReprojectionError3<EquidistantCamera>, 2, 8, 4, 2, 3, 3, 3>(\n                    new ReprojectionError3<EquidistantCamera>(observed_p));\n            }\n            break;\n        case Camera::PINHOLE:\n            if (optimize_cam_odo_z)\n            {\n                costFunction =\n                    new ceres::AutoDiffCostFunction<ReprojectionError3<PinholeCamera>, 2, 8, 4, 3, 3, 3, 3>(\n                    new ReprojectionError3<PinholeCamera>(observed_p));\n            }\n            else\n            {\n                costFunction =\n                    new ceres::AutoDiffCostFunction<ReprojectionError3<PinholeCamera>, 2, 8, 4, 2, 3, 3, 3>(\n                    new ReprojectionError3<PinholeCamera>(observed_p));\n            }\n            break;\n        case Camera::MEI:\n            if (optimize_cam_odo_z)\n            {\n                costFunction =\n                    new ceres::AutoDiffCostFunction<ReprojectionError3<CataCamera>, 2, 9, 4, 3, 3, 3, 3>(\n                    new ReprojectionError3<CataCamera>(observed_p));\n            }\n            else\n            {\n                costFunction =\n                    new ceres::AutoDiffCostFunction<ReprojectionError3<CataCamera>, 2, 9, 4, 2, 3, 3, 3>(\n                    new ReprojectionError3<CataCamera>(observed_p));\n            }\n            break;\n        case Camera::SCARAMUZZA:\n            if (optimize_cam_odo_z)\n            {\n                costFunction =\n                    new ceres::AutoDiffCostFunction<ReprojectionError3<OCAMCamera>, 2, SCARAMUZZA_CAMERA_NUM_PARAMS, 4, 3, 3, 3, 3>(\n                    new ReprojectionError3<OCAMCamera>(observed_p));\n            }\n            else\n            {\n                costFunction =\n                    new ceres::AutoDiffCostFunction<ReprojectionError3<OCAMCamera>, 2, SCARAMUZZA_CAMERA_NUM_PARAMS, 4, 2, 3, 3, 3>(\n                    new ReprojectionError3<OCAMCamera>(observed_p));\n            
}\n            break;\n        }\n        break;\n    }\n\n    return costFunction;\n}\n\nceres::CostFunction*\nCostFunctionFactory::generateCostFunction(const CameraConstPtr& camera,\n        const Eigen::Vector2d& observed_p,\n        const Eigen::Matrix2d& sqrtPrecisionMat,\n        int flags, bool optimize_cam_odo_z) const\n{\n    ceres::CostFunction* costFunction = 0;\n\n    std::vector<double> intrinsic_params;\n    camera->writeParameters(intrinsic_params);\n\n    switch (flags)\n    {\n    case CAMERA_ODOMETRY_TRANSFORM | ODOMETRY_6D_POSE | POINT_3D:\n        switch (camera->modelType())\n        {\n        case Camera::KANNALA_BRANDT:\n            if (optimize_cam_odo_z)\n            {\n                costFunction =\n                    new ceres::AutoDiffCostFunction<ReprojectionError3<EquidistantCamera>, 2, 4, 3, 3, 3, 3>(\n                    new ReprojectionError3<EquidistantCamera>(intrinsic_params, observed_p, sqrtPrecisionMat));\n            }\n            else\n            {\n                costFunction =\n                    new ceres::AutoDiffCostFunction<ReprojectionError3<EquidistantCamera>, 2, 4, 2, 3, 3, 3>(\n                    new ReprojectionError3<EquidistantCamera>(intrinsic_params, observed_p, sqrtPrecisionMat));\n            }\n            break;\n        case Camera::PINHOLE:\n            if (optimize_cam_odo_z)\n            {\n                costFunction =\n                    new ceres::AutoDiffCostFunction<ReprojectionError3<PinholeCamera>, 2, 4, 3, 3, 3, 3>(\n                    new ReprojectionError3<PinholeCamera>(intrinsic_params, observed_p, sqrtPrecisionMat));\n            }\n            else\n            {\n                costFunction =\n                    new ceres::AutoDiffCostFunction<ReprojectionError3<PinholeCamera>, 2, 4, 2, 3, 3, 3>(\n                    new ReprojectionError3<PinholeCamera>(intrinsic_params, observed_p, sqrtPrecisionMat));\n            }\n            break;\n        case Camera::MEI:\n            if 
(optimize_cam_odo_z)\n            {\n                costFunction =\n                    new ceres::AutoDiffCostFunction<ReprojectionError3<CataCamera>, 2, 4, 3, 3, 3, 3>(\n                    new ReprojectionError3<CataCamera>(intrinsic_params, observed_p, sqrtPrecisionMat));\n            }\n            else\n            {\n                costFunction =\n                    new ceres::AutoDiffCostFunction<ReprojectionError3<CataCamera>, 2, 4, 2, 3, 3, 3>(\n                    new ReprojectionError3<CataCamera>(intrinsic_params, observed_p, sqrtPrecisionMat));\n            }\n            break;\n        case Camera::SCARAMUZZA:\n            if (optimize_cam_odo_z)\n            {\n                costFunction =\n                    new ceres::AutoDiffCostFunction<ReprojectionError3<OCAMCamera>, 2, 4, 3, 3, 3, 3>(\n                    new ReprojectionError3<OCAMCamera>(intrinsic_params, observed_p, sqrtPrecisionMat));\n            }\n            else\n            {\n                costFunction =\n                    new ceres::AutoDiffCostFunction<ReprojectionError3<OCAMCamera>, 2, 4, 2, 3, 3, 3>(\n                    new ReprojectionError3<OCAMCamera>(intrinsic_params, observed_p, sqrtPrecisionMat));\n            }\n            break;\n        }\n        break;\n    case CAMERA_INTRINSICS | CAMERA_ODOMETRY_TRANSFORM | ODOMETRY_6D_POSE | POINT_3D:\n        switch (camera->modelType())\n        {\n        case Camera::KANNALA_BRANDT:\n            if (optimize_cam_odo_z)\n            {\n                costFunction =\n                    new ceres::AutoDiffCostFunction<ReprojectionError3<EquidistantCamera>, 2, 8, 4, 3, 3, 3, 3>(\n                    new ReprojectionError3<EquidistantCamera>(observed_p, sqrtPrecisionMat));\n            }\n            else\n            {\n                costFunction =\n                    new ceres::AutoDiffCostFunction<ReprojectionError3<EquidistantCamera>, 2, 8, 4, 2, 3, 3, 3>(\n                    new 
ReprojectionError3<EquidistantCamera>(observed_p, sqrtPrecisionMat));\n            }\n            break;\n        case Camera::PINHOLE:\n            if (optimize_cam_odo_z)\n            {\n                costFunction =\n                    new ceres::AutoDiffCostFunction<ReprojectionError3<PinholeCamera>, 2, 8, 4, 3, 3, 3, 3>(\n                    new ReprojectionError3<PinholeCamera>(observed_p, sqrtPrecisionMat));\n            }\n            else\n            {\n                costFunction =\n                    new ceres::AutoDiffCostFunction<ReprojectionError3<PinholeCamera>, 2, 8, 4, 2, 3, 3, 3>(\n                    new ReprojectionError3<PinholeCamera>(observed_p, sqrtPrecisionMat));\n            }\n            break;\n        case Camera::MEI:\n            if (optimize_cam_odo_z)\n            {\n                costFunction =\n                    new ceres::AutoDiffCostFunction<ReprojectionError3<CataCamera>, 2, 9, 4, 3, 3, 3, 3>(\n                    new ReprojectionError3<CataCamera>(observed_p, sqrtPrecisionMat));\n            }\n            else\n            {\n                costFunction =\n                    new ceres::AutoDiffCostFunction<ReprojectionError3<CataCamera>, 2, 9, 4, 2, 3, 3, 3>(\n                    new ReprojectionError3<CataCamera>(observed_p, sqrtPrecisionMat));\n            }\n            break;\n        case Camera::SCARAMUZZA:\n            if (optimize_cam_odo_z)\n            {\n                costFunction =\n                    new ceres::AutoDiffCostFunction<ReprojectionError3<OCAMCamera>, 2, SCARAMUZZA_CAMERA_NUM_PARAMS, 4, 3, 3, 3, 3>(\n                    new ReprojectionError3<OCAMCamera>(observed_p, sqrtPrecisionMat));\n            }\n            else\n            {\n                costFunction =\n                    new ceres::AutoDiffCostFunction<ReprojectionError3<OCAMCamera>, 2, SCARAMUZZA_CAMERA_NUM_PARAMS, 4, 2, 3, 3, 3>(\n                    new ReprojectionError3<OCAMCamera>(observed_p, sqrtPrecisionMat));\n        
    }\n            break;\n        }\n        break;\n    }\n\n    return costFunction;\n}\n\nceres::CostFunction*\nCostFunctionFactory::generateCostFunction(const CameraConstPtr& camera,\n        const Eigen::Vector3d& odo_pos,\n        const Eigen::Vector3d& odo_att,\n        const Eigen::Vector2d& observed_p,\n        int flags, bool optimize_cam_odo_z) const\n{\n    ceres::CostFunction* costFunction = 0;\n\n    std::vector<double> intrinsic_params;\n    camera->writeParameters(intrinsic_params);\n\n    switch (flags)\n    {\n    case CAMERA_ODOMETRY_TRANSFORM | POINT_3D:\n        switch (camera->modelType())\n        {\n        case Camera::KANNALA_BRANDT:\n            if (optimize_cam_odo_z)\n            {\n                costFunction =\n                    new ceres::AutoDiffCostFunction<ReprojectionError3<EquidistantCamera>, 2, 4, 3, 3>(\n                    new ReprojectionError3<EquidistantCamera>(intrinsic_params, odo_pos, odo_att, observed_p, optimize_cam_odo_z));\n            }\n            else\n            {\n                costFunction =\n                    new ceres::AutoDiffCostFunction<ReprojectionError3<EquidistantCamera>, 2, 4, 2, 3>(\n                    new ReprojectionError3<EquidistantCamera>(intrinsic_params, odo_pos, odo_att, observed_p, optimize_cam_odo_z));\n            }\n            break;\n        case Camera::PINHOLE:\n            if (optimize_cam_odo_z)\n            {\n                costFunction =\n                    new ceres::AutoDiffCostFunction<ReprojectionError3<PinholeCamera>, 2, 4, 3, 3>(\n                    new ReprojectionError3<PinholeCamera>(intrinsic_params, odo_pos, odo_att, observed_p, optimize_cam_odo_z));\n            }\n            else\n            {\n                costFunction =\n                    new ceres::AutoDiffCostFunction<ReprojectionError3<PinholeCamera>, 2, 4, 2, 3>(\n                    new ReprojectionError3<PinholeCamera>(intrinsic_params, odo_pos, odo_att, observed_p, 
optimize_cam_odo_z));\n            }\n            break;\n        case Camera::MEI:\n            if (optimize_cam_odo_z)\n            {\n                costFunction =\n                    new ceres::AutoDiffCostFunction<ReprojectionError3<CataCamera>, 2, 4, 3, 3>(\n                    new ReprojectionError3<CataCamera>(intrinsic_params, odo_pos, odo_att, observed_p, optimize_cam_odo_z));\n            }\n            else\n            {\n                costFunction =\n                    new ceres::AutoDiffCostFunction<ReprojectionError3<CataCamera>, 2, 4, 2, 3>(\n                    new ReprojectionError3<CataCamera>(intrinsic_params, odo_pos, odo_att, observed_p, optimize_cam_odo_z));\n            }\n            break;\n        case Camera::SCARAMUZZA:\n            if (optimize_cam_odo_z)\n            {\n                costFunction =\n                    new ceres::AutoDiffCostFunction<ReprojectionError3<OCAMCamera>, 2, 4, 3, 3>(\n                    new ReprojectionError3<OCAMCamera>(intrinsic_params, odo_pos, odo_att, observed_p, optimize_cam_odo_z));\n            }\n            else\n            {\n                costFunction =\n                    new ceres::AutoDiffCostFunction<ReprojectionError3<OCAMCamera>, 2, 4, 2, 3>(\n                    new ReprojectionError3<OCAMCamera>(intrinsic_params, odo_pos, odo_att, observed_p, optimize_cam_odo_z));\n            }\n            break;\n        }\n        break;\n    }\n\n    return costFunction;\n}\n\nceres::CostFunction*\nCostFunctionFactory::generateCostFunction(const CameraConstPtr& camera,\n        const Eigen::Quaterniond& cam_odo_q,\n        const Eigen::Vector3d& cam_odo_t,\n        const Eigen::Vector3d& odo_pos,\n        const Eigen::Vector3d& odo_att,\n        const Eigen::Vector2d& observed_p,\n        int flags) const\n{\n    ceres::CostFunction* costFunction = 0;\n\n    std::vector<double> intrinsic_params;\n    camera->writeParameters(intrinsic_params);\n\n    switch (flags)\n    {\n    case 
POINT_3D:\n        switch (camera->modelType())\n        {\n        case Camera::KANNALA_BRANDT:\n            costFunction =\n                new ceres::AutoDiffCostFunction<ReprojectionError3<EquidistantCamera>, 2, 3>(\n                new ReprojectionError3<EquidistantCamera>(intrinsic_params, cam_odo_q, cam_odo_t, odo_pos, odo_att, observed_p));\n            break;\n        case Camera::PINHOLE:\n            costFunction =\n                new ceres::AutoDiffCostFunction<ReprojectionError3<PinholeCamera>, 2, 3>(\n                new ReprojectionError3<PinholeCamera>(intrinsic_params, cam_odo_q, cam_odo_t, odo_pos, odo_att, observed_p));\n            break;\n        case Camera::MEI:\n            costFunction =\n                new ceres::AutoDiffCostFunction<ReprojectionError3<CataCamera>, 2, 3>(\n                new ReprojectionError3<CataCamera>(intrinsic_params, cam_odo_q, cam_odo_t, odo_pos, odo_att, observed_p));\n            break;\n        case Camera::SCARAMUZZA:\n            costFunction =\n                new ceres::AutoDiffCostFunction<ReprojectionError3<OCAMCamera>, 2, 3>(\n                new ReprojectionError3<OCAMCamera>(intrinsic_params, cam_odo_q, cam_odo_t, odo_pos, odo_att, observed_p));\n            break;\n        }\n        break;\n    }\n\n    return costFunction;\n}\n\nceres::CostFunction*\nCostFunctionFactory::generateCostFunction(const CameraConstPtr& cameraL,\n        const CameraConstPtr& cameraR,\n        const Eigen::Vector3d& observed_P,\n        const Eigen::Vector2d& observed_p_l,\n        const Eigen::Vector2d& observed_p_r) const\n{\n    ceres::CostFunction* costFunction = 0;\n\n    if (cameraL->modelType() != cameraR->modelType())\n    {\n        return costFunction;\n    }\n\n    switch (cameraL->modelType())\n    {\n    case Camera::KANNALA_BRANDT:\n        costFunction =\n            new ceres::AutoDiffCostFunction<StereoReprojectionError<EquidistantCamera>, 4, 8, 8, 4, 3, 4, 3>(\n            new 
StereoReprojectionError<EquidistantCamera>(observed_P, observed_p_l, observed_p_r));\n        break;\n    case Camera::PINHOLE:\n        costFunction =\n            new ceres::AutoDiffCostFunction<StereoReprojectionError<PinholeCamera>, 4, 8, 8, 4, 3, 4, 3>(\n            new StereoReprojectionError<PinholeCamera>(observed_P, observed_p_l, observed_p_r));\n        break;\n    case Camera::MEI:\n        costFunction =\n            new ceres::AutoDiffCostFunction<StereoReprojectionError<CataCamera>, 4, 9, 9, 4, 3, 4, 3>(\n            new StereoReprojectionError<CataCamera>(observed_P, observed_p_l, observed_p_r));\n        break;\n    case Camera::SCARAMUZZA:\n        costFunction =\n            new ceres::AutoDiffCostFunction<StereoReprojectionError<OCAMCamera>, 4, SCARAMUZZA_CAMERA_NUM_PARAMS, SCARAMUZZA_CAMERA_NUM_PARAMS, 4, 3, 4, 3>(\n            new StereoReprojectionError<OCAMCamera>(observed_P, observed_p_l, observed_p_r));\n        break;\n    }\n\n    return costFunction;\n}\n\n}\n\n"
  },
  {
    "path": "camera_model/src/camera_models/EquidistantCamera.cc",
    "content": "#include \"camodocal/camera_models/EquidistantCamera.h\"\n\n#include <cmath>\n#include <cstdio>\n#include <eigen3/Eigen/Dense>\n#include <iomanip>\n#include <iostream>\n#include <opencv2/calib3d/calib3d.hpp>\n#include <opencv2/core/eigen.hpp>\n#include <opencv2/imgproc/imgproc.hpp>\n\n#include \"camodocal/gpl/gpl.h\"\n\nnamespace camodocal\n{\n\nEquidistantCamera::Parameters::Parameters()\n : Camera::Parameters(KANNALA_BRANDT)\n , m_k2(0.0)\n , m_k3(0.0)\n , m_k4(0.0)\n , m_k5(0.0)\n , m_mu(0.0)\n , m_mv(0.0)\n , m_u0(0.0)\n , m_v0(0.0)\n{\n\n}\n\nEquidistantCamera::Parameters::Parameters(const std::string& cameraName,\n                                          int w, int h,\n                                          double k2, double k3, double k4, double k5,\n                                          double mu, double mv,\n                                          double u0, double v0)\n : Camera::Parameters(KANNALA_BRANDT, cameraName, w, h)\n , m_k2(k2)\n , m_k3(k3)\n , m_k4(k4)\n , m_k5(k5)\n , m_mu(mu)\n , m_mv(mv)\n , m_u0(u0)\n , m_v0(v0)\n{\n\n}\n\ndouble&\nEquidistantCamera::Parameters::k2(void)\n{\n    return m_k2;\n}\n\ndouble&\nEquidistantCamera::Parameters::k3(void)\n{\n    return m_k3;\n}\n\ndouble&\nEquidistantCamera::Parameters::k4(void)\n{\n    return m_k4;\n}\n\ndouble&\nEquidistantCamera::Parameters::k5(void)\n{\n    return m_k5;\n}\n\ndouble&\nEquidistantCamera::Parameters::mu(void)\n{\n    return m_mu;\n}\n\ndouble&\nEquidistantCamera::Parameters::mv(void)\n{\n    return m_mv;\n}\n\ndouble&\nEquidistantCamera::Parameters::u0(void)\n{\n    return m_u0;\n}\n\ndouble&\nEquidistantCamera::Parameters::v0(void)\n{\n    return m_v0;\n}\n\ndouble\nEquidistantCamera::Parameters::k2(void) const\n{\n    return m_k2;\n}\n\ndouble\nEquidistantCamera::Parameters::k3(void) const\n{\n    return m_k3;\n}\n\ndouble\nEquidistantCamera::Parameters::k4(void) const\n{\n    return m_k4;\n}\n\ndouble\nEquidistantCamera::Parameters::k5(void) const\n{\n  
  return m_k5;\n}\n\ndouble\nEquidistantCamera::Parameters::mu(void) const\n{\n    return m_mu;\n}\n\ndouble\nEquidistantCamera::Parameters::mv(void) const\n{\n    return m_mv;\n}\n\ndouble\nEquidistantCamera::Parameters::u0(void) const\n{\n    return m_u0;\n}\n\ndouble\nEquidistantCamera::Parameters::v0(void) const\n{\n    return m_v0;\n}\n\nbool\nEquidistantCamera::Parameters::readFromYamlFile(const std::string& filename)\n{\n    cv::FileStorage fs(filename, cv::FileStorage::READ);\n\n    if (!fs.isOpened())\n    {\n        return false;\n    }\n\n    if (!fs[\"model_type\"].isNone())\n    {\n        std::string sModelType;\n        fs[\"model_type\"] >> sModelType;\n\n        if (sModelType.compare(\"KANNALA_BRANDT\") != 0)\n        {\n            return false;\n        }\n    }\n\n    m_modelType = KANNALA_BRANDT;\n    fs[\"camera_name\"] >> m_cameraName;\n    m_imageWidth = static_cast<int>(fs[\"image_width\"]);\n    m_imageHeight = static_cast<int>(fs[\"image_height\"]);\n\n    cv::FileNode n = fs[\"projection_parameters\"];\n    m_k2 = static_cast<double>(n[\"k2\"]);\n    m_k3 = static_cast<double>(n[\"k3\"]);\n    m_k4 = static_cast<double>(n[\"k4\"]);\n    m_k5 = static_cast<double>(n[\"k5\"]);\n    m_mu = static_cast<double>(n[\"mu\"]);\n    m_mv = static_cast<double>(n[\"mv\"]);\n    m_u0 = static_cast<double>(n[\"u0\"]);\n    m_v0 = static_cast<double>(n[\"v0\"]);\n\n    return true;\n}\n\nvoid\nEquidistantCamera::Parameters::writeToYamlFile(const std::string& filename) const\n{\n    cv::FileStorage fs(filename, cv::FileStorage::WRITE);\n\n    fs << \"model_type\" << \"KANNALA_BRANDT\";\n    fs << \"camera_name\" << m_cameraName;\n    fs << \"image_width\" << m_imageWidth;\n    fs << \"image_height\" << m_imageHeight;\n\n    // projection: k2, k3, k4, k5, mu, mv, u0, v0\n    fs << \"projection_parameters\";\n    fs << \"{\" << \"k2\" << m_k2\n              << \"k3\" << m_k3\n              << \"k4\" << m_k4\n              << \"k5\" << m_k5\n              
<< \"mu\" << m_mu\n              << \"mv\" << m_mv\n              << \"u0\" << m_u0\n              << \"v0\" << m_v0 << \"}\";\n\n    fs.release();\n}\n\nEquidistantCamera::Parameters&\nEquidistantCamera::Parameters::operator=(const EquidistantCamera::Parameters& other)\n{\n    if (this != &other)\n    {\n        m_modelType = other.m_modelType;\n        m_cameraName = other.m_cameraName;\n        m_imageWidth = other.m_imageWidth;\n        m_imageHeight = other.m_imageHeight;\n        m_k2 = other.m_k2;\n        m_k3 = other.m_k3;\n        m_k4 = other.m_k4;\n        m_k5 = other.m_k5;\n        m_mu = other.m_mu;\n        m_mv = other.m_mv;\n        m_u0 = other.m_u0;\n        m_v0 = other.m_v0;\n    }\n\n    return *this;\n}\n\nstd::ostream&\noperator<< (std::ostream& out, const EquidistantCamera::Parameters& params)\n{\n    out << \"Camera Parameters:\" << std::endl;\n    out << \"    model_type \" << \"KANNALA_BRANDT\" << std::endl;\n    out << \"   camera_name \" << params.m_cameraName << std::endl;\n    out << \"   image_width \" << params.m_imageWidth << std::endl;\n    out << \"  image_height \" << params.m_imageHeight << std::endl;\n\n    // projection: k2, k3, k4, k5, mu, mv, u0, v0\n    out << \"Projection Parameters\" << std::endl;\n    out << \"            k2 \" << params.m_k2 << std::endl\n        << \"            k3 \" << params.m_k3 << std::endl\n        << \"            k4 \" << params.m_k4 << std::endl\n        << \"            k5 \" << params.m_k5 << std::endl\n        << \"            mu \" << params.m_mu << std::endl\n        << \"            mv \" << params.m_mv << std::endl\n        << \"            u0 \" << params.m_u0 << std::endl\n        << \"            v0 \" << params.m_v0 << std::endl;\n\n    return out;\n}\n\nEquidistantCamera::EquidistantCamera()\n : m_inv_K11(1.0)\n , m_inv_K13(0.0)\n , m_inv_K22(1.0)\n , m_inv_K23(0.0)\n{\n\n}\n\nEquidistantCamera::EquidistantCamera(const std::string& cameraName,\n                                   
  int imageWidth, int imageHeight,\n                                     double k2, double k3, double k4, double k5,\n                                     double mu, double mv,\n                                     double u0, double v0)\n : mParameters(cameraName, imageWidth, imageHeight,\n               k2, k3, k4, k5, mu, mv, u0, v0)\n{\n    // Inverse camera projection matrix parameters\n    m_inv_K11 = 1.0 / mParameters.mu();\n    m_inv_K13 = -mParameters.u0() / mParameters.mu();\n    m_inv_K22 = 1.0 / mParameters.mv();\n    m_inv_K23 = -mParameters.v0() / mParameters.mv();\n}\n\nEquidistantCamera::EquidistantCamera(const EquidistantCamera::Parameters& params)\n : mParameters(params)\n{\n    // Inverse camera projection matrix parameters\n    m_inv_K11 = 1.0 / mParameters.mu();\n    m_inv_K13 = -mParameters.u0() / mParameters.mu();\n    m_inv_K22 = 1.0 / mParameters.mv();\n    m_inv_K23 = -mParameters.v0() / mParameters.mv();\n}\n\nCamera::ModelType\nEquidistantCamera::modelType(void) const\n{\n    return mParameters.modelType();\n}\n\nconst std::string&\nEquidistantCamera::cameraName(void) const\n{\n    return mParameters.cameraName();\n}\n\nint\nEquidistantCamera::imageWidth(void) const\n{\n    return mParameters.imageWidth();\n}\n\nint\nEquidistantCamera::imageHeight(void) const\n{\n    return mParameters.imageHeight();\n}\n\nvoid\nEquidistantCamera::estimateIntrinsics(const cv::Size& boardSize,\n                                      const std::vector< std::vector<cv::Point3f> >& objectPoints,\n                                      const std::vector< std::vector<cv::Point2f> >& imagePoints)\n{\n    Parameters params = getParameters();\n\n    double u0 = params.imageWidth() / 2.0;\n    double v0 = params.imageHeight() / 2.0;\n\n    double minReprojErr = std::numeric_limits<double>::max();\n\n    std::vector<cv::Mat> rvecs, tvecs;\n    rvecs.assign(objectPoints.size(), cv::Mat());\n    tvecs.assign(objectPoints.size(), cv::Mat());\n\n    params.k2() = 0.0;\n   
 params.k3() = 0.0;\n    params.k4() = 0.0;\n    params.k5() = 0.0;\n    params.u0() = u0;\n    params.v0() = v0;\n\n    // Initialize focal length\n    // C. Hughes, P. Denny, M. Glavin, and E. Jones,\n    // Equidistant Fish-Eye Calibration and Rectification by Vanishing Point\n    // Extraction, PAMI 2010\n    // Find circles from rows of chessboard corners, and for each pair\n    // of circles, find vanishing points: v1 and v2.\n    // f = ||v1 - v2|| / PI;\n    double f0 = 0.0;\n    for (size_t i = 0; i < imagePoints.size(); ++i)\n    {\n        std::vector<Eigen::Vector2d> center(boardSize.height);\n        double radius[boardSize.height];\n        for (int r = 0; r < boardSize.height; ++r)\n        {\n            std::vector<cv::Point2d> circle;\n            for (int c = 0; c < boardSize.width; ++c)\n            {\n                circle.push_back(imagePoints.at(i).at(r * boardSize.width + c));\n            }\n\n            fitCircle(circle, center[r](0), center[r](1), radius[r]);\n        }\n\n        for (int j = 0; j < boardSize.height; ++j)\n        {\n            for (int k = j + 1; k < boardSize.height; ++k)\n            {\n                // find distance between pair of vanishing points which\n                // correspond to intersection points of 2 circles\n                std::vector<cv::Point2d> ipts;\n                ipts = intersectCircles(center[j](0), center[j](1), radius[j],\n                                        center[k](0), center[k](1), radius[k]);\n\n                if (ipts.size() < 2)\n                {\n                    continue;\n                }\n\n                double f = cv::norm(ipts.at(0) - ipts.at(1)) / M_PI;\n\n                params.mu() = f;\n                params.mv() = f;\n\n                setParameters(params);\n\n                for (size_t l = 0; l < objectPoints.size(); ++l)\n                {\n                    estimateExtrinsics(objectPoints.at(l), imagePoints.at(l), rvecs.at(l), tvecs.at(l));\n          
      }\n\n                double reprojErr = reprojectionError(objectPoints, imagePoints, rvecs, tvecs, cv::noArray());\n\n                if (reprojErr < minReprojErr)\n                {\n                    minReprojErr = reprojErr;\n                    f0 = f;\n                }\n            }\n        }\n    }\n\n    if (f0 <= 0.0 && minReprojErr >= std::numeric_limits<double>::max())\n    {\n        std::cout << \"[\" << params.cameraName() << \"] \"\n                  << \"# INFO: Kannala-Brandt model fails with given data.\" << std::endl;\n\n        return;\n    }\n\n    params.mu() = f0;\n    params.mv() = f0;\n\n    setParameters(params);\n}\n\n/**\n * \\brief Lifts a point from the image plane to the unit sphere\n *\n * \\param p image coordinates\n * \\param P coordinates of the point on the sphere\n */\nvoid\nEquidistantCamera::liftSphere(const Eigen::Vector2d& p, Eigen::Vector3d& P) const\n{\n    liftProjective(p, P);\n}\n\n/** \n * \\brief Lifts a point from the image plane to its projective ray\n *\n * \\param p image coordinates\n * \\param P coordinates of the projective ray\n */\nvoid\nEquidistantCamera::liftProjective(const Eigen::Vector2d& p, Eigen::Vector3d& P) const\n{\n    // Lift points to normalised plane\n    Eigen::Vector2d p_u;\n    p_u << m_inv_K11 * p(0) + m_inv_K13,\n           m_inv_K22 * p(1) + m_inv_K23;\n\n    // Obtain a projective ray\n    double theta, phi;\n    backprojectSymmetric(p_u, theta, phi);\n\n    P(0) = sin(theta) * cos(phi);\n    P(1) = sin(theta) * sin(phi);\n    P(2) = cos(theta);\n}\n\n/** \n * \\brief Project a 3D point (\\a x,\\a y,\\a z) to the image plane in (\\a u,\\a v)\n *\n * \\param P 3D point coordinates\n * \\param p return value, contains the image point coordinates\n */\nvoid\nEquidistantCamera::spaceToPlane(const Eigen::Vector3d& P, Eigen::Vector2d& p) const\n{\n    double theta = acos(P(2) / P.norm());\n    double phi = atan2(P(1), P(0));\n\n    Eigen::Vector2d p_u = r(mParameters.k2(), 
mParameters.k3(), mParameters.k4(), mParameters.k5(), theta) * Eigen::Vector2d(cos(phi), sin(phi));\n\n    // Apply generalised projection matrix\n    p << mParameters.mu() * p_u(0) + mParameters.u0(),\n         mParameters.mv() * p_u(1) + mParameters.v0();\n}\n\n\n/** \n * \\brief Project a 3D point to the image plane and calculate Jacobian\n *\n * \\param P 3D point coordinates\n * \\param p return value, contains the image point coordinates\n */\nvoid\nEquidistantCamera::spaceToPlane(const Eigen::Vector3d& P, Eigen::Vector2d& p,\n                                Eigen::Matrix<double,2,3>& J) const\n{\n    double theta = acos(P(2) / P.norm());\n    double phi = atan2(P(1), P(0));\n\n    Eigen::Vector2d p_u = r(mParameters.k2(), mParameters.k3(), mParameters.k4(), mParameters.k5(), theta) * Eigen::Vector2d(cos(phi), sin(phi));\n\n    // Apply generalised projection matrix\n    p << mParameters.mu() * p_u(0) + mParameters.u0(),\n         mParameters.mv() * p_u(1) + mParameters.v0();\n}\n\n/** \n * \\brief Projects an undistorted 2D point p_u to the image plane\n *\n * \\param p_u 2D point coordinates\n * \\return image point coordinates\n */\nvoid\nEquidistantCamera::undistToPlane(const Eigen::Vector2d& p_u, Eigen::Vector2d& p) const\n{\n//    Eigen::Vector2d p_d;\n//\n//    if (m_noDistortion)\n//    {\n//        p_d = p_u;\n//    }\n//    else\n//    {\n//        // Apply distortion\n//        Eigen::Vector2d d_u;\n//        distortion(p_u, d_u);\n//        p_d = p_u + d_u;\n//    }\n//\n//    // Apply generalised projection matrix\n//    p << mParameters.gamma1() * p_d(0) + mParameters.u0(),\n//         mParameters.gamma2() * p_d(1) + mParameters.v0();\n}\n\nvoid\nEquidistantCamera::initUndistortMap(cv::Mat& map1, cv::Mat& map2, double fScale) const\n{\n    cv::Size imageSize(mParameters.imageWidth(), mParameters.imageHeight());\n\n    cv::Mat mapX = cv::Mat::zeros(imageSize, CV_32F);\n    cv::Mat mapY = cv::Mat::zeros(imageSize, CV_32F);\n\n    for (int v = 0; v 
< imageSize.height; ++v)\n    {\n        for (int u = 0; u < imageSize.width; ++u)\n        {\n            double mx_u = m_inv_K11 / fScale * u + m_inv_K13 / fScale;\n            double my_u = m_inv_K22 / fScale * v + m_inv_K23 / fScale;\n\n            double theta, phi;\n            backprojectSymmetric(Eigen::Vector2d(mx_u, my_u), theta, phi);\n\n            Eigen::Vector3d P;\n            P << sin(theta) * cos(phi), sin(theta) * sin(phi), cos(theta);\n\n            Eigen::Vector2d p;\n            spaceToPlane(P, p);\n\n            mapX.at<float>(v,u) = p(0);\n            mapY.at<float>(v,u) = p(1);\n        }\n    }\n\n    cv::convertMaps(mapX, mapY, map1, map2, CV_32FC1, false);\n}\n\ncv::Mat\nEquidistantCamera::initUndistortRectifyMap(cv::Mat& map1, cv::Mat& map2,\n                                           float fx, float fy,\n                                           cv::Size imageSize,\n                                           float cx, float cy,\n                                           cv::Mat rmat) const\n{\n    if (imageSize == cv::Size(0, 0))\n    {\n        imageSize = cv::Size(mParameters.imageWidth(), mParameters.imageHeight());\n    }\n\n    cv::Mat mapX = cv::Mat::zeros(imageSize.height, imageSize.width, CV_32F);\n    cv::Mat mapY = cv::Mat::zeros(imageSize.height, imageSize.width, CV_32F);\n\n    Eigen::Matrix3f K_rect;\n\n    if (cx == -1.0f && cy == -1.0f)\n    {\n        K_rect << fx, 0, imageSize.width / 2,\n                  0, fy, imageSize.height / 2,\n                  0, 0, 1;\n    }\n    else\n    {\n        K_rect << fx, 0, cx,\n                  0, fy, cy,\n                  0, 0, 1;\n    }\n\n    if (fx == -1.0f || fy == -1.0f)\n    {\n        K_rect(0,0) = mParameters.mu();\n        K_rect(1,1) = mParameters.mv();\n    }\n\n    Eigen::Matrix3f K_rect_inv = K_rect.inverse();\n\n    Eigen::Matrix3f R, R_inv;\n    cv::cv2eigen(rmat, R);\n    R_inv = R.inverse();\n\n    for (int v = 0; v < imageSize.height; ++v)\n    {\n        for 
(int u = 0; u < imageSize.width; ++u)\n        {\n            Eigen::Vector3f xo;\n            xo << u, v, 1;\n\n            Eigen::Vector3f uo = R_inv * K_rect_inv * xo;\n\n            Eigen::Vector2d p;\n            spaceToPlane(uo.cast<double>(), p);\n\n            mapX.at<float>(v,u) = p(0);\n            mapY.at<float>(v,u) = p(1);\n        }\n    }\n\n    cv::convertMaps(mapX, mapY, map1, map2, CV_32FC1, false);\n\n    cv::Mat K_rect_cv;\n    cv::eigen2cv(K_rect, K_rect_cv);\n    return K_rect_cv;\n}\n\nint\nEquidistantCamera::parameterCount(void) const\n{\n    return 8;\n}\n\nconst EquidistantCamera::Parameters&\nEquidistantCamera::getParameters(void) const\n{\n    return mParameters;\n}\n\nvoid\nEquidistantCamera::setParameters(const EquidistantCamera::Parameters& parameters)\n{\n    mParameters = parameters;\n\n    // Inverse camera projection matrix parameters\n    m_inv_K11 = 1.0 / mParameters.mu();\n    m_inv_K13 = -mParameters.u0() / mParameters.mu();\n    m_inv_K22 = 1.0 / mParameters.mv();\n    m_inv_K23 = -mParameters.v0() / mParameters.mv();\n}\n\nvoid\nEquidistantCamera::readParameters(const std::vector<double>& parameterVec)\n{\n    if (parameterVec.size() != parameterCount())\n    {\n        return;\n    }\n\n    Parameters params = getParameters();\n\n    params.k2() = parameterVec.at(0);\n    params.k3() = parameterVec.at(1);\n    params.k4() = parameterVec.at(2);\n    params.k5() = parameterVec.at(3);\n    params.mu() = parameterVec.at(4);\n    params.mv() = parameterVec.at(5);\n    params.u0() = parameterVec.at(6);\n    params.v0() = parameterVec.at(7);\n\n    setParameters(params);\n}\n\nvoid\nEquidistantCamera::writeParameters(std::vector<double>& parameterVec) const\n{\n    parameterVec.resize(parameterCount());\n    parameterVec.at(0) = mParameters.k2();\n    parameterVec.at(1) = mParameters.k3();\n    parameterVec.at(2) = mParameters.k4();\n    parameterVec.at(3) = mParameters.k5();\n    parameterVec.at(4) = mParameters.mu();\n    
parameterVec.at(5) = mParameters.mv();\n    parameterVec.at(6) = mParameters.u0();\n    parameterVec.at(7) = mParameters.v0();\n}\n\nvoid\nEquidistantCamera::writeParametersToYamlFile(const std::string& filename) const\n{\n    mParameters.writeToYamlFile(filename);\n}\n\nstd::string\nEquidistantCamera::parametersToString(void) const\n{\n    std::ostringstream oss;\n    oss << mParameters;\n\n    return oss.str();\n}\n\nvoid\nEquidistantCamera::fitOddPoly(const std::vector<double>& x, const std::vector<double>& y,\n                              int n, std::vector<double>& coeffs) const\n{\n    std::vector<int> pows;\n    for (int i = 1; i <= n; i += 2)\n    {\n        pows.push_back(i);\n    }\n\n    Eigen::MatrixXd X(x.size(), pows.size());\n    Eigen::MatrixXd Y(y.size(), 1);\n    for (size_t i = 0; i < x.size(); ++i)\n    {\n        for (size_t j = 0; j < pows.size(); ++j)\n        {\n            X(i,j) = pow(x.at(i), pows.at(j));\n        }\n        Y(i,0) = y.at(i);\n    }\n\n    Eigen::MatrixXd A = (X.transpose() * X).inverse() * X.transpose() * Y;\n\n    coeffs.resize(A.rows());\n    for (int i = 0; i < A.rows(); ++i)\n    {\n        coeffs.at(i) = A(i,0);\n    }\n}\n\nvoid\nEquidistantCamera::backprojectSymmetric(const Eigen::Vector2d& p_u,\n                                        double& theta, double& phi) const\n{\n    double tol = 1e-10;\n    double p_u_norm = p_u.norm();\n\n    if (p_u_norm < 1e-10)\n    {\n        phi = 0.0;\n    }\n    else\n    {\n        phi = atan2(p_u(1), p_u(0));\n    }\n\n    int npow = 9;\n    if (mParameters.k5() == 0.0)\n    {\n        npow -= 2;\n    }\n    if (mParameters.k4() == 0.0)\n    {\n        npow -= 2;\n    }\n    if (mParameters.k3() == 0.0)\n    {\n        npow -= 2;\n    }\n    if (mParameters.k2() == 0.0)\n    {\n        npow -= 2;\n    }\n\n    Eigen::MatrixXd coeffs(npow + 1, 1);\n    coeffs.setZero();\n    coeffs(0) = -p_u_norm;\n    coeffs(1) = 1.0;\n\n    if (npow >= 3)\n    {\n        coeffs(3) = 
mParameters.k2();\n    }\n    if (npow >= 5)\n    {\n        coeffs(5) = mParameters.k3();\n    }\n    if (npow >= 7)\n    {\n        coeffs(7) = mParameters.k4();\n    }\n    if (npow >= 9)\n    {\n        coeffs(9) = mParameters.k5();\n    }\n\n    if (npow == 1)\n    {\n        theta = p_u_norm;\n    }\n    else\n    {\n        // Get eigenvalues of companion matrix corresponding to polynomial.\n        // Eigenvalues correspond to roots of polynomial.\n        Eigen::MatrixXd A(npow, npow);\n        A.setZero();\n        A.block(1, 0, npow - 1, npow - 1).setIdentity();\n        A.col(npow - 1) = - coeffs.block(0, 0, npow, 1) / coeffs(npow);\n\n        Eigen::EigenSolver<Eigen::MatrixXd> es(A);\n        Eigen::MatrixXcd eigval = es.eigenvalues();\n\n        std::vector<double> thetas;\n        for (int i = 0; i < eigval.rows(); ++i)\n        {\n            if (fabs(eigval(i).imag()) > tol)\n            {\n                continue;\n            }\n\n            double t = eigval(i).real();\n\n            if (t < -tol)\n            {\n                continue;\n            }\n            else if (t < 0.0)\n            {\n                t = 0.0;\n            }\n\n            thetas.push_back(t);\n        }\n\n        if (thetas.empty())\n        {\n            theta = p_u_norm;\n        }\n        else\n        {\n            theta = *std::min_element(thetas.begin(), thetas.end());\n        }\n    }\n}\n\n}\n"
  },
  {
    "path": "camera_model/src/camera_models/PinholeCamera.cc",
    "content": "#include \"camodocal/camera_models/PinholeCamera.h\"\n\n#include <cmath>\n#include <cstdio>\n#include <eigen3/Eigen/Dense>\n#include <iomanip>\n#include <opencv2/calib3d/calib3d.hpp>\n#include <opencv2/core/eigen.hpp>\n#include <opencv2/imgproc/imgproc.hpp>\n\n#include \"camodocal/gpl/gpl.h\"\n\nnamespace camodocal\n{\n\nPinholeCamera::Parameters::Parameters()\n : Camera::Parameters(PINHOLE)\n , m_k1(0.0)\n , m_k2(0.0)\n , m_p1(0.0)\n , m_p2(0.0)\n , m_fx(0.0)\n , m_fy(0.0)\n , m_cx(0.0)\n , m_cy(0.0)\n{\n\n}\n\nPinholeCamera::Parameters::Parameters(const std::string& cameraName,\n                                      int w, int h,\n                                      double k1, double k2,\n                                      double p1, double p2,\n                                      double fx, double fy,\n                                      double cx, double cy)\n : Camera::Parameters(PINHOLE, cameraName, w, h)\n , m_k1(k1)\n , m_k2(k2)\n , m_p1(p1)\n , m_p2(p2)\n , m_fx(fx)\n , m_fy(fy)\n , m_cx(cx)\n , m_cy(cy)\n{\n}\n\ndouble&\nPinholeCamera::Parameters::k1(void)\n{\n    return m_k1;\n}\n\ndouble&\nPinholeCamera::Parameters::k2(void)\n{\n    return m_k2;\n}\n\ndouble&\nPinholeCamera::Parameters::p1(void)\n{\n    return m_p1;\n}\n\ndouble&\nPinholeCamera::Parameters::p2(void)\n{\n    return m_p2;\n}\n\ndouble&\nPinholeCamera::Parameters::fx(void)\n{\n    return m_fx;\n}\n\ndouble&\nPinholeCamera::Parameters::fy(void)\n{\n    return m_fy;\n}\n\ndouble&\nPinholeCamera::Parameters::cx(void)\n{\n    return m_cx;\n}\n\ndouble&\nPinholeCamera::Parameters::cy(void)\n{\n    return m_cy;\n}\n\ndouble\nPinholeCamera::Parameters::k1(void) const\n{\n    return m_k1;\n}\n\ndouble\nPinholeCamera::Parameters::k2(void) const\n{\n    return m_k2;\n}\n\ndouble\nPinholeCamera::Parameters::p1(void) const\n{\n    return m_p1;\n}\n\ndouble\nPinholeCamera::Parameters::p2(void) const\n{\n    return m_p2;\n}\n\ndouble\nPinholeCamera::Parameters::fx(void) 
const\n{\n    return m_fx;\n}\n\ndouble\nPinholeCamera::Parameters::fy(void) const\n{\n    return m_fy;\n}\n\ndouble\nPinholeCamera::Parameters::cx(void) const\n{\n    return m_cx;\n}\n\ndouble\nPinholeCamera::Parameters::cy(void) const\n{\n    return m_cy;\n}\n\nbool\nPinholeCamera::Parameters::readFromYamlFile(const std::string& filename)\n{\n    cv::FileStorage fs(filename, cv::FileStorage::READ);\n\n    if (!fs.isOpened())\n    {\n        return false;\n    }\n\n    if (!fs[\"model_type\"].isNone())\n    {\n        std::string sModelType;\n        fs[\"model_type\"] >> sModelType;\n\n        if (sModelType.compare(\"PINHOLE\") != 0)\n        {\n            return false;\n        }\n    }\n\n    m_modelType = PINHOLE;\n    fs[\"camera_name\"] >> m_cameraName;\n    m_imageWidth = static_cast<int>(fs[\"image_width\"]);\n    m_imageHeight = static_cast<int>(fs[\"image_height\"]);\n\n    cv::FileNode n = fs[\"distortion_parameters\"];\n    m_k1 = static_cast<double>(n[\"k1\"]);\n    m_k2 = static_cast<double>(n[\"k2\"]);\n    m_p1 = static_cast<double>(n[\"p1\"]);\n    m_p2 = static_cast<double>(n[\"p2\"]);\n\n    n = fs[\"projection_parameters\"];\n    m_fx = static_cast<double>(n[\"fx\"]);\n    m_fy = static_cast<double>(n[\"fy\"]);\n    m_cx = static_cast<double>(n[\"cx\"]);\n    m_cy = static_cast<double>(n[\"cy\"]);\n\n    return true;\n}\n\nvoid\nPinholeCamera::Parameters::writeToYamlFile(const std::string& filename) const\n{\n    cv::FileStorage fs(filename, cv::FileStorage::WRITE);\n\n    fs << \"model_type\" << \"PINHOLE\";\n    fs << \"camera_name\" << m_cameraName;\n    fs << \"image_width\" << m_imageWidth;\n    fs << \"image_height\" << m_imageHeight;\n\n    // radial distortion: k1, k2\n    // tangential distortion: p1, p2\n    fs << \"distortion_parameters\";\n    fs << \"{\" << \"k1\" << m_k1\n              << \"k2\" << m_k2\n              << \"p1\" << m_p1\n              << \"p2\" << m_p2 << \"}\";\n\n    // projection: fx, fy, cx, cy\n    fs << 
\"projection_parameters\";\n    fs << \"{\" << \"fx\" << m_fx\n              << \"fy\" << m_fy\n              << \"cx\" << m_cx\n              << \"cy\" << m_cy << \"}\";\n\n    fs.release();\n}\n\nPinholeCamera::Parameters&\nPinholeCamera::Parameters::operator=(const PinholeCamera::Parameters& other)\n{\n    if (this != &other)\n    {\n        m_modelType = other.m_modelType;\n        m_cameraName = other.m_cameraName;\n        m_imageWidth = other.m_imageWidth;\n        m_imageHeight = other.m_imageHeight;\n        m_k1 = other.m_k1;\n        m_k2 = other.m_k2;\n        m_p1 = other.m_p1;\n        m_p2 = other.m_p2;\n        m_fx = other.m_fx;\n        m_fy = other.m_fy;\n        m_cx = other.m_cx;\n        m_cy = other.m_cy;\n    }\n\n    return *this;\n}\n\nstd::ostream&\noperator<< (std::ostream& out, const PinholeCamera::Parameters& params)\n{\n    out << \"Camera Parameters:\" << std::endl;\n    out << \"    model_type \" << \"PINHOLE\" << std::endl;\n    out << \"   camera_name \" << params.m_cameraName << std::endl;\n    out << \"   image_width \" << params.m_imageWidth << std::endl;\n    out << \"  image_height \" << params.m_imageHeight << std::endl;\n\n    // radial distortion: k1, k2\n    // tangential distortion: p1, p2\n    out << \"Distortion Parameters\" << std::endl;\n    out << \"            k1 \" << params.m_k1 << std::endl\n        << \"            k2 \" << params.m_k2 << std::endl\n        << \"            p1 \" << params.m_p1 << std::endl\n        << \"            p2 \" << params.m_p2 << std::endl;\n\n    // projection: fx, fy, cx, cy\n    out << \"Projection Parameters\" << std::endl;\n    out << \"            fx \" << params.m_fx << std::endl\n        << \"            fy \" << params.m_fy << std::endl\n        << \"            cx \" << params.m_cx << std::endl\n        << \"            cy \" << params.m_cy << std::endl;\n\n    return out;\n}\n\nPinholeCamera::PinholeCamera()\n : m_inv_K11(1.0)\n , m_inv_K13(0.0)\n , m_inv_K22(1.0)\n , 
m_inv_K23(0.0)\n , m_noDistortion(true)\n{\n\n}\n\nPinholeCamera::PinholeCamera(const std::string& cameraName,\n                             int imageWidth, int imageHeight,\n                             double k1, double k2, double p1, double p2,\n                             double fx, double fy, double cx, double cy)\n : mParameters(cameraName, imageWidth, imageHeight,\n               k1, k2, p1, p2, fx, fy, cx, cy)\n{\n    if ((mParameters.k1() == 0.0) &&\n        (mParameters.k2() == 0.0) &&\n        (mParameters.p1() == 0.0) &&\n        (mParameters.p2() == 0.0))\n    {\n        m_noDistortion = true;\n    }\n    else\n    {\n        m_noDistortion = false;\n    }\n\n    // Inverse camera projection matrix parameters\n    m_inv_K11 = 1.0 / mParameters.fx();\n    m_inv_K13 = -mParameters.cx() / mParameters.fx();\n    m_inv_K22 = 1.0 / mParameters.fy();\n    m_inv_K23 = -mParameters.cy() / mParameters.fy();\n}\n\nPinholeCamera::PinholeCamera(const PinholeCamera::Parameters& params)\n : mParameters(params)\n{\n    if ((mParameters.k1() == 0.0) &&\n        (mParameters.k2() == 0.0) &&\n        (mParameters.p1() == 0.0) &&\n        (mParameters.p2() == 0.0))\n    {\n        m_noDistortion = true;\n    }\n    else\n    {\n        m_noDistortion = false;\n    }\n\n    // Inverse camera projection matrix parameters\n    m_inv_K11 = 1.0 / mParameters.fx();\n    m_inv_K13 = -mParameters.cx() / mParameters.fx();\n    m_inv_K22 = 1.0 / mParameters.fy();\n    m_inv_K23 = -mParameters.cy() / mParameters.fy();\n}\n\nCamera::ModelType\nPinholeCamera::modelType(void) const\n{\n    return mParameters.modelType();\n}\n\nconst std::string&\nPinholeCamera::cameraName(void) const\n{\n    return mParameters.cameraName();\n}\n\nint\nPinholeCamera::imageWidth(void) const\n{\n    return mParameters.imageWidth();\n}\n\nint\nPinholeCamera::imageHeight(void) const\n{\n    return mParameters.imageHeight();\n}\n\nvoid\nPinholeCamera::estimateIntrinsics(const cv::Size& boardSize,\n          
                        const std::vector< std::vector<cv::Point3f> >& objectPoints,\n                                  const std::vector< std::vector<cv::Point2f> >& imagePoints)\n{\n    // Z. Zhang, A Flexible New Technique for Camera Calibration, PAMI 2000\n\n    Parameters params = getParameters();\n\n    params.k1() = 0.0;\n    params.k2() = 0.0;\n    params.p1() = 0.0;\n    params.p2() = 0.0;\n\n    double cx = params.imageWidth() / 2.0;\n    double cy = params.imageHeight() / 2.0;\n    params.cx() = cx;\n    params.cy() = cy;\n\n    size_t nImages = imagePoints.size();\n\n    cv::Mat A(nImages * 2, 2, CV_64F);\n    cv::Mat b(nImages * 2, 1, CV_64F);\n\n    for (size_t i = 0; i < nImages; ++i)\n    {\n        const std::vector<cv::Point3f>& oPoints = objectPoints.at(i);\n\n        std::vector<cv::Point2f> M(oPoints.size());\n        for (size_t j = 0; j < M.size(); ++j)\n        {\n            M.at(j) = cv::Point2f(oPoints.at(j).x, oPoints.at(j).y);\n        }\n\n        cv::Mat H = cv::findHomography(M, imagePoints.at(i));\n\n        H.at<double>(0,0) -= H.at<double>(2,0) * cx;\n        H.at<double>(0,1) -= H.at<double>(2,1) * cx;\n        H.at<double>(0,2) -= H.at<double>(2,2) * cx;\n        H.at<double>(1,0) -= H.at<double>(2,0) * cy;\n        H.at<double>(1,1) -= H.at<double>(2,1) * cy;\n        H.at<double>(1,2) -= H.at<double>(2,2) * cy;\n\n        double h[3], v[3], d1[3], d2[3];\n        double n[4] = {0,0,0,0};\n\n        for (int j = 0; j < 3; ++j)\n        {\n            double t0 = H.at<double>(j,0);\n            double t1 = H.at<double>(j,1);\n            h[j] = t0; v[j] = t1;\n            d1[j] = (t0 + t1) * 0.5;\n            d2[j] = (t0 - t1) * 0.5;\n            n[0] += t0 * t0; n[1] += t1 * t1;\n            n[2] += d1[j] * d1[j]; n[3] += d2[j] * d2[j];\n        }\n\n        for (int j = 0; j < 4; ++j)\n        {\n            n[j] = 1.0 / sqrt(n[j]);\n        }\n\n        for (int j = 0; j < 3; ++j)\n        {\n            h[j] *= n[0]; v[j] *= 
n[1];\n            d1[j] *= n[2]; d2[j] *= n[3];\n        }\n\n        A.at<double>(i * 2, 0) = h[0] * v[0];\n        A.at<double>(i * 2, 1) = h[1] * v[1];\n        A.at<double>(i * 2 + 1, 0) = d1[0] * d2[0];\n        A.at<double>(i * 2 + 1, 1) = d1[1] * d2[1];\n        b.at<double>(i * 2, 0) = -h[2] * v[2];\n        b.at<double>(i * 2 + 1, 0) = -d1[2] * d2[2];\n    }\n\n    cv::Mat f(2, 1, CV_64F);\n    cv::solve(A, b, f, cv::DECOMP_NORMAL | cv::DECOMP_LU);\n\n    params.fx() = sqrt(fabs(1.0 / f.at<double>(0)));\n    params.fy() = sqrt(fabs(1.0 / f.at<double>(1)));\n\n    setParameters(params);\n}\n\n/**\n * \\brief Lifts a point from the image plane to the unit sphere\n *\n * \\param p image coordinates\n * \\param P coordinates of the point on the sphere\n */\nvoid\nPinholeCamera::liftSphere(const Eigen::Vector2d& p, Eigen::Vector3d& P) const\n{\n    liftProjective(p, P);\n\n    P.normalize();\n}\n\n/**\n * \\brief Lifts a point from the image plane to its projective ray\n *\n * \\param p image coordinates\n * \\param P coordinates of the projective ray\n */\nvoid\nPinholeCamera::liftProjective(const Eigen::Vector2d& p, Eigen::Vector3d& P) const\n{\n    double mx_d, my_d,mx2_d, mxy_d, my2_d, mx_u, my_u;\n    double rho2_d, rho4_d, radDist_d, Dx_d, Dy_d, inv_denom_d;\n    //double lambda;\n\n    // Lift points to normalised plane\n    mx_d = m_inv_K11 * p(0) + m_inv_K13;\n    my_d = m_inv_K22 * p(1) + m_inv_K23;\n\n    if (m_noDistortion)\n    {\n        mx_u = mx_d;\n        my_u = my_d;\n    }\n    else\n    {\n        if (0)\n        {\n            double k1 = mParameters.k1();\n            double k2 = mParameters.k2();\n            double p1 = mParameters.p1();\n            double p2 = mParameters.p2();\n\n            // Apply inverse distortion model\n            // proposed by Heikkila\n            mx2_d = mx_d*mx_d;\n            my2_d = my_d*my_d;\n            mxy_d = mx_d*my_d;\n            rho2_d = mx2_d+my2_d;\n            rho4_d = rho2_d*rho2_d;\n      
      radDist_d = k1*rho2_d+k2*rho4_d;\n            Dx_d = mx_d*radDist_d + p2*(rho2_d+2*mx2_d) + 2*p1*mxy_d;\n            Dy_d = my_d*radDist_d + p1*(rho2_d+2*my2_d) + 2*p2*mxy_d;\n            inv_denom_d = 1/(1+4*k1*rho2_d+6*k2*rho4_d+8*p1*my_d+8*p2*mx_d);\n\n            mx_u = mx_d - inv_denom_d*Dx_d;\n            my_u = my_d - inv_denom_d*Dy_d;\n        }\n        else\n        {\n            // Recursive distortion model\n            int n = 8;\n            Eigen::Vector2d d_u;\n            distortion(Eigen::Vector2d(mx_d, my_d), d_u);\n            // Approximate value\n            mx_u = mx_d - d_u(0);\n            my_u = my_d - d_u(1);\n\n            for (int i = 1; i < n; ++i)\n            {\n                distortion(Eigen::Vector2d(mx_u, my_u), d_u);\n                mx_u = mx_d - d_u(0);\n                my_u = my_d - d_u(1);\n            }\n        }\n    }\n\n    // Obtain a projective ray\n    P << mx_u, my_u, 1.0;\n}\n\n\n/**\n * \\brief Project a 3D point (\\a x,\\a y,\\a z) to the image plane in (\\a u,\\a v)\n *\n * \\param P 3D point coordinates\n * \\param p return value, contains the image point coordinates\n */\nvoid\nPinholeCamera::spaceToPlane(const Eigen::Vector3d& P, Eigen::Vector2d& p) const\n{\n    Eigen::Vector2d p_u, p_d;\n\n    // Project points to the normalised plane\n    p_u << P(0) / P(2), P(1) / P(2);\n\n    if (m_noDistortion)\n    {\n        p_d = p_u;\n    }\n    else\n    {\n        // Apply distortion\n        Eigen::Vector2d d_u;\n        distortion(p_u, d_u);\n        p_d = p_u + d_u;\n    }\n\n    // Apply generalised projection matrix\n    p << mParameters.fx() * p_d(0) + mParameters.cx(),\n         mParameters.fy() * p_d(1) + mParameters.cy();\n}\n\n#if 0\n/**\n * \\brief Project a 3D point to the image plane and calculate Jacobian\n *\n * \\param P 3D point coordinates\n * \\param p return value, contains the image point coordinates\n */\nvoid\nPinholeCamera::spaceToPlane(const Eigen::Vector3d& P, Eigen::Vector2d& 
p,\n                            Eigen::Matrix<double,2,3>& J) const\n{\n    Eigen::Vector2d p_u, p_d;\n    double norm, inv_denom;\n    double dxdmx, dydmx, dxdmy, dydmy;\n\n    norm = P.norm();\n    // Project points to the normalised plane\n    inv_denom = 1.0 / P(2);\n    p_u << inv_denom * P(0), inv_denom * P(1);\n\n    // Calculate jacobian\n    double dudx = inv_denom;\n    double dvdx = 0.0;\n    double dudy = 0.0;\n    double dvdy = inv_denom;\n    inv_denom = - inv_denom * inv_denom;\n    double dudz = P(0) * inv_denom;\n    double dvdz = P(1) * inv_denom;\n\n    if (m_noDistortion)\n    {\n        p_d = p_u;\n    }\n    else\n    {\n        // Apply distortion\n        Eigen::Vector2d d_u;\n        distortion(p_u, d_u);\n        p_d = p_u + d_u;\n    }\n\n    double fx = mParameters.fx();\n    double fy = mParameters.fy();\n\n    // Make the product of the jacobians\n    // and add projection matrix jacobian\n    inv_denom = fx * (dudx * dxdmx + dvdx * dxdmy); // reuse\n    dvdx = fy * (dudx * dydmx + dvdx * dydmy);\n    dudx = inv_denom;\n\n    inv_denom = fx * (dudy * dxdmx + dvdy * dxdmy); // reuse\n    dvdy = fy * (dudy * dydmx + dvdy * dydmy);\n    dudy = inv_denom;\n\n    inv_denom = fx * (dudz * dxdmx + dvdz * dxdmy); // reuse\n    dvdz = fy * (dudz * dydmx + dvdz * dydmy);\n    dudz = inv_denom;\n\n    // Apply generalised projection matrix\n    p << fx * p_d(0) + mParameters.cx(),\n         fy * p_d(1) + mParameters.cy();\n\n    J << dudx, dudy, dudz,\n         dvdx, dvdy, dvdz;\n}\n#endif\n\n/**\n * \\brief Projects an undistorted 2D point p_u to the image plane\n *\n * \\param p_u 2D point coordinates\n * \\return image point coordinates\n */\nvoid\nPinholeCamera::undistToPlane(const Eigen::Vector2d& p_u, Eigen::Vector2d& p) const\n{\n    Eigen::Vector2d p_d;\n\n    if (m_noDistortion)\n    {\n        p_d = p_u;\n    }\n    else\n    {\n        // Apply distortion\n        Eigen::Vector2d d_u;\n        distortion(p_u, d_u);\n        p_d = p_u + 
d_u;\n    }\n\n    // Apply generalised projection matrix\n    p << mParameters.fx() * p_d(0) + mParameters.cx(),\n         mParameters.fy() * p_d(1) + mParameters.cy();\n}\n\n/**\n * \\brief Apply distortion to input point (from the normalised plane)\n *\n * \\param p_u undistorted coordinates of point on the normalised plane\n * \\return to obtain the distorted point: p_d = p_u + d_u\n */\nvoid\nPinholeCamera::distortion(const Eigen::Vector2d& p_u, Eigen::Vector2d& d_u) const\n{\n    double k1 = mParameters.k1();\n    double k2 = mParameters.k2();\n    double p1 = mParameters.p1();\n    double p2 = mParameters.p2();\n\n    double mx2_u, my2_u, mxy_u, rho2_u, rad_dist_u;\n\n    mx2_u = p_u(0) * p_u(0);\n    my2_u = p_u(1) * p_u(1);\n    mxy_u = p_u(0) * p_u(1);\n    rho2_u = mx2_u + my2_u;\n    rad_dist_u = k1 * rho2_u + k2 * rho2_u * rho2_u;\n    d_u << p_u(0) * rad_dist_u + 2.0 * p1 * mxy_u + p2 * (rho2_u + 2.0 * mx2_u),\n           p_u(1) * rad_dist_u + 2.0 * p2 * mxy_u + p1 * (rho2_u + 2.0 * my2_u);\n}\n\n/**\n * \\brief Apply distortion to input point (from the normalised plane)\n *        and calculate Jacobian\n *\n * \\param p_u undistorted coordinates of point on the normalised plane\n * \\return to obtain the distorted point: p_d = p_u + d_u\n */\nvoid\nPinholeCamera::distortion(const Eigen::Vector2d& p_u, Eigen::Vector2d& d_u,\n                          Eigen::Matrix2d& J) const\n{\n    double k1 = mParameters.k1();\n    double k2 = mParameters.k2();\n    double p1 = mParameters.p1();\n    double p2 = mParameters.p2();\n\n    double mx2_u, my2_u, mxy_u, rho2_u, rad_dist_u;\n\n    mx2_u = p_u(0) * p_u(0);\n    my2_u = p_u(1) * p_u(1);\n    mxy_u = p_u(0) * p_u(1);\n    rho2_u = mx2_u + my2_u;\n    rad_dist_u = k1 * rho2_u + k2 * rho2_u * rho2_u;\n    d_u << p_u(0) * rad_dist_u + 2.0 * p1 * mxy_u + p2 * (rho2_u + 2.0 * mx2_u),\n           p_u(1) * rad_dist_u + 2.0 * p2 * mxy_u + p1 * (rho2_u + 2.0 * my2_u);\n\n    double dxdmx = 1.0 + rad_dist_u + k1 * 
2.0 * mx2_u + k2 * rho2_u * 4.0 * mx2_u + 2.0 * p1 * p_u(1) + 6.0 * p2 * p_u(0);\n    double dydmx = k1 * 2.0 * p_u(0) * p_u(1) + k2 * 4.0 * rho2_u * p_u(0) * p_u(1) + p1 * 2.0 * p_u(0) + 2.0 * p2 * p_u(1);\n    double dxdmy = dydmx;\n    double dydmy = 1.0 + rad_dist_u + k1 * 2.0 * my2_u + k2 * rho2_u * 4.0 * my2_u + 6.0 * p1 * p_u(1) + 2.0 * p2 * p_u(0);\n\n    J << dxdmx, dxdmy,\n         dydmx, dydmy;\n}\n\nvoid\nPinholeCamera::initUndistortMap(cv::Mat& map1, cv::Mat& map2, double fScale) const\n{\n    cv::Size imageSize(mParameters.imageWidth(), mParameters.imageHeight());\n\n    cv::Mat mapX = cv::Mat::zeros(imageSize, CV_32F);\n    cv::Mat mapY = cv::Mat::zeros(imageSize, CV_32F);\n\n    for (int v = 0; v < imageSize.height; ++v)\n    {\n        for (int u = 0; u < imageSize.width; ++u)\n        {\n            double mx_u = m_inv_K11 / fScale * u + m_inv_K13 / fScale;\n            double my_u = m_inv_K22 / fScale * v + m_inv_K23 / fScale;\n\n            Eigen::Vector3d P;\n            P << mx_u, my_u, 1.0;\n\n            Eigen::Vector2d p;\n            spaceToPlane(P, p);\n\n            mapX.at<float>(v,u) = p(0);\n            mapY.at<float>(v,u) = p(1);\n        }\n    }\n\n    cv::convertMaps(mapX, mapY, map1, map2, CV_32FC1, false);\n}\n\ncv::Mat\nPinholeCamera::initUndistortRectifyMap(cv::Mat& map1, cv::Mat& map2,\n                                       float fx, float fy,\n                                       cv::Size imageSize,\n                                       float cx, float cy,\n                                       cv::Mat rmat) const\n{\n    if (imageSize == cv::Size(0, 0))\n    {\n        imageSize = cv::Size(mParameters.imageWidth(), mParameters.imageHeight());\n    }\n\n    cv::Mat mapX = cv::Mat::zeros(imageSize.height, imageSize.width, CV_32F);\n    cv::Mat mapY = cv::Mat::zeros(imageSize.height, imageSize.width, CV_32F);\n\n    Eigen::Matrix3f R, R_inv;\n    cv::cv2eigen(rmat, R);\n    R_inv = R.inverse();\n\n    // assume no skew\n 
   Eigen::Matrix3f K_rect;\n\n    if (cx == -1.0f || cy == -1.0f)\n    {\n        K_rect << fx, 0, imageSize.width / 2,\n                  0, fy, imageSize.height / 2,\n                  0, 0, 1;\n    }\n    else\n    {\n        K_rect << fx, 0, cx,\n                  0, fy, cy,\n                  0, 0, 1;\n    }\n\n    if (fx == -1.0f || fy == -1.0f)\n    {\n        K_rect(0,0) = mParameters.fx();\n        K_rect(1,1) = mParameters.fy();\n    }\n\n    Eigen::Matrix3f K_rect_inv = K_rect.inverse();\n\n    for (int v = 0; v < imageSize.height; ++v)\n    {\n        for (int u = 0; u < imageSize.width; ++u)\n        {\n            Eigen::Vector3f xo;\n            xo << u, v, 1;\n\n            Eigen::Vector3f uo = R_inv * K_rect_inv * xo;\n\n            Eigen::Vector2d p;\n            spaceToPlane(uo.cast<double>(), p);\n\n            mapX.at<float>(v,u) = p(0);\n            mapY.at<float>(v,u) = p(1);\n        }\n    }\n\n    cv::convertMaps(mapX, mapY, map1, map2, CV_32FC1, false);\n\n    cv::Mat K_rect_cv;\n    cv::eigen2cv(K_rect, K_rect_cv);\n    return K_rect_cv;\n}\n\nint\nPinholeCamera::parameterCount(void) const\n{\n    return 8;\n}\n\nconst PinholeCamera::Parameters&\nPinholeCamera::getParameters(void) const\n{\n    return mParameters;\n}\n\nvoid\nPinholeCamera::setParameters(const PinholeCamera::Parameters& parameters)\n{\n    mParameters = parameters;\n\n    if ((mParameters.k1() == 0.0) &&\n        (mParameters.k2() == 0.0) &&\n        (mParameters.p1() == 0.0) &&\n        (mParameters.p2() == 0.0))\n    {\n        m_noDistortion = true;\n    }\n    else\n    {\n        m_noDistortion = false;\n    }\n\n    m_inv_K11 = 1.0 / mParameters.fx();\n    m_inv_K13 = -mParameters.cx() / mParameters.fx();\n    m_inv_K22 = 1.0 / mParameters.fy();\n    m_inv_K23 = -mParameters.cy() / mParameters.fy();\n}\n\nvoid\nPinholeCamera::readParameters(const std::vector<double>& parameterVec)\n{\n    if ((int)parameterVec.size() != parameterCount())\n    {\n        return;\n   
 }\n\n    Parameters params = getParameters();\n\n    params.k1() = parameterVec.at(0);\n    params.k2() = parameterVec.at(1);\n    params.p1() = parameterVec.at(2);\n    params.p2() = parameterVec.at(3);\n    params.fx() = parameterVec.at(4);\n    params.fy() = parameterVec.at(5);\n    params.cx() = parameterVec.at(6);\n    params.cy() = parameterVec.at(7);\n\n    setParameters(params);\n}\n\nvoid\nPinholeCamera::writeParameters(std::vector<double>& parameterVec) const\n{\n    parameterVec.resize(parameterCount());\n    parameterVec.at(0) = mParameters.k1();\n    parameterVec.at(1) = mParameters.k2();\n    parameterVec.at(2) = mParameters.p1();\n    parameterVec.at(3) = mParameters.p2();\n    parameterVec.at(4) = mParameters.fx();\n    parameterVec.at(5) = mParameters.fy();\n    parameterVec.at(6) = mParameters.cx();\n    parameterVec.at(7) = mParameters.cy();\n}\n\nvoid\nPinholeCamera::writeParametersToYamlFile(const std::string& filename) const\n{\n    mParameters.writeToYamlFile(filename);\n}\n\nstd::string\nPinholeCamera::parametersToString(void) const\n{\n    std::ostringstream oss;\n    oss << mParameters;\n\n    return oss.str();\n}\n\n}\n"
  },
  {
    "path": "camera_model/src/camera_models/ScaramuzzaCamera.cc",
    "content": "#include \"camodocal/camera_models/ScaramuzzaCamera.h\"\n\n#include <cmath>\n#include <cstdio>\n#include <eigen3/Eigen/Dense>\n#include <eigen3/Eigen/SVD>\n#include <iomanip>\n#include <iostream>\n#include <opencv2/calib3d/calib3d.hpp>\n#include <opencv2/core/eigen.hpp>\n#include <opencv2/imgproc/imgproc.hpp>\n#include <boost/lexical_cast.hpp>\n#include <boost/algorithm/string.hpp>\n\n#include \"camodocal/gpl/gpl.h\"\n\n\nEigen::VectorXd polyfit(Eigen::VectorXd& xVec, Eigen::VectorXd& yVec, int poly_order) {\n    assert(poly_order > 0);\n    assert(xVec.size() > poly_order);\n    assert(xVec.size() == yVec.size());\n\n    Eigen::MatrixXd A(xVec.size(), poly_order+1);\n    Eigen::VectorXd B(xVec.size());\n\n    for(int i = 0; i < xVec.size(); ++i) {\n        const double x = xVec(i);\n        const double y = yVec(i);\n\n        double x_pow_k = 1.0;\n\n        for(int k=0; k<=poly_order; ++k) {\n            A(i,k) = x_pow_k;\n            x_pow_k *= x;\n        }\n\n        B(i) = y;\n    }\n\n    Eigen::JacobiSVD<Eigen::MatrixXd> svd(A, Eigen::ComputeThinU | Eigen::ComputeThinV);\n    Eigen::VectorXd x = svd.solve(B);\n\n    return x;\n}\n\nnamespace camodocal\n{\n\nOCAMCamera::Parameters::Parameters()\n : Camera::Parameters(SCARAMUZZA)\n , m_C(0.0)\n , m_D(0.0)\n , m_E(0.0)\n , m_center_x(0.0)\n , m_center_y(0.0)\n{\n    memset(m_poly, 0, sizeof(double) * SCARAMUZZA_POLY_SIZE);\n    memset(m_inv_poly, 0, sizeof(double) * SCARAMUZZA_INV_POLY_SIZE);\n}\n\n\n\nbool\nOCAMCamera::Parameters::readFromYamlFile(const std::string& filename)\n{\n    cv::FileStorage fs(filename, cv::FileStorage::READ);\n\n    if (!fs.isOpened())\n    {\n        return false;\n    }\n\n    if (!fs[\"model_type\"].isNone())\n    {\n        std::string sModelType;\n        fs[\"model_type\"] >> sModelType;\n\n        if (!boost::iequals(sModelType, \"scaramuzza\"))\n        {\n            return false;\n        }\n    }\n\n    m_modelType = SCARAMUZZA;\n    fs[\"camera_name\"] 
>> m_cameraName;\n    m_imageWidth = static_cast<int>(fs[\"image_width\"]);\n    m_imageHeight = static_cast<int>(fs[\"image_height\"]);\n\n    cv::FileNode n = fs[\"poly_parameters\"];\n    for(int i=0; i < SCARAMUZZA_POLY_SIZE; i++)\n        m_poly[i] = static_cast<double>(n[std::string(\"p\") + boost::lexical_cast<std::string>(i)]);\n\n    n = fs[\"inv_poly_parameters\"];\n    for(int i=0; i < SCARAMUZZA_INV_POLY_SIZE; i++)\n        m_inv_poly[i] = static_cast<double>(n[std::string(\"p\") + boost::lexical_cast<std::string>(i)]);\n\n    n = fs[\"affine_parameters\"];\n    m_C = static_cast<double>(n[\"ac\"]);\n    m_D = static_cast<double>(n[\"ad\"]);\n    m_E = static_cast<double>(n[\"ae\"]);\n\n    m_center_x = static_cast<double>(n[\"cx\"]);\n    m_center_y = static_cast<double>(n[\"cy\"]);\n\n    return true;\n}\n\nvoid\nOCAMCamera::Parameters::writeToYamlFile(const std::string& filename) const\n{\n    cv::FileStorage fs(filename, cv::FileStorage::WRITE);\n\n    fs << \"model_type\" << \"scaramuzza\";\n    fs << \"camera_name\" << m_cameraName;\n    fs << \"image_width\" << m_imageWidth;\n    fs << \"image_height\" << m_imageHeight;\n\n    fs << \"poly_parameters\";\n    fs << \"{\";\n    for(int i=0; i < SCARAMUZZA_POLY_SIZE; i++)\n        fs << std::string(\"p\") + boost::lexical_cast<std::string>(i) << m_poly[i];\n    fs << \"}\";\n\n    fs << \"inv_poly_parameters\";\n    fs << \"{\";\n    for(int i=0; i < SCARAMUZZA_INV_POLY_SIZE; i++)\n        fs << std::string(\"p\") + boost::lexical_cast<std::string>(i) << m_inv_poly[i];\n    fs << \"}\";\n\n    fs << \"affine_parameters\";\n    fs << \"{\" << \"ac\" << m_C\n              << \"ad\" << m_D\n              << \"ae\" << m_E\n              << \"cx\" << m_center_x\n              << \"cy\" << m_center_y << \"}\";\n\n    fs.release();\n}\n\nOCAMCamera::Parameters&\nOCAMCamera::Parameters::operator=(const OCAMCamera::Parameters& other)\n{\n    if (this != &other)\n    {\n        m_modelType = 
other.m_modelType;\n        m_cameraName = other.m_cameraName;\n        m_imageWidth = other.m_imageWidth;\n        m_imageHeight = other.m_imageHeight;\n        m_C = other.m_C;\n        m_D = other.m_D;\n        m_E = other.m_E;\n        m_center_x = other.m_center_x;\n        m_center_y = other.m_center_y;\n\n        memcpy(m_poly, other.m_poly, sizeof(double) * SCARAMUZZA_POLY_SIZE);\n        memcpy(m_inv_poly, other.m_inv_poly, sizeof(double) * SCARAMUZZA_INV_POLY_SIZE);\n    }\n\n    return *this;\n}\n\nstd::ostream&\noperator<< (std::ostream& out, const OCAMCamera::Parameters& params)\n{\n    out << \"Camera Parameters:\" << std::endl;\n    out << \"    model_type \" << \"scaramuzza\" << std::endl;\n    out << \"   camera_name \" << params.m_cameraName << std::endl;\n    out << \"   image_width \" << params.m_imageWidth << std::endl;\n    out << \"  image_height \" << params.m_imageHeight << std::endl;\n\n    out << std::fixed << std::setprecision(10);\n\n    out << \"Poly Parameters\" << std::endl;\n    for(int i=0; i < SCARAMUZZA_POLY_SIZE; i++)\n        out << std::string(\"p\") + boost::lexical_cast<std::string>(i) << \": \" << params.m_poly[i] << std::endl;\n\n    out << \"Inverse Poly Parameters\" << std::endl;\n    for(int i=0; i < SCARAMUZZA_INV_POLY_SIZE; i++)\n        out << std::string(\"p\") + boost::lexical_cast<std::string>(i) << \": \" << params.m_inv_poly[i] << std::endl;\n\n    out << \"Affine Parameters\" << std::endl;\n    out << \"            ac \" << params.m_C << std::endl\n        << \"            ad \" << params.m_D << std::endl\n        << \"            ae \" << params.m_E << std::endl;\n    out << \"            cx \" << params.m_center_x << std::endl\n        << \"            cy \" << params.m_center_y << std::endl;\n\n    return out;\n}\n\nOCAMCamera::OCAMCamera()\n : m_inv_scale(0.0)\n{\n\n}\n\nOCAMCamera::OCAMCamera(const OCAMCamera::Parameters& params)\n : mParameters(params)\n{\n    m_inv_scale = 1.0 / (params.C() - params.D() 
* params.E());\n}\n\nCamera::ModelType\nOCAMCamera::modelType(void) const\n{\n    return mParameters.modelType();\n}\n\nconst std::string&\nOCAMCamera::cameraName(void) const\n{\n    return mParameters.cameraName();\n}\n\nint\nOCAMCamera::imageWidth(void) const\n{\n    return mParameters.imageWidth();\n}\n\nint\nOCAMCamera::imageHeight(void) const\n{\n    return mParameters.imageHeight();\n}\n\nvoid\nOCAMCamera::estimateIntrinsics(const cv::Size& boardSize,\n                               const std::vector< std::vector<cv::Point3f> >& objectPoints,\n                               const std::vector< std::vector<cv::Point2f> >& imagePoints)\n{\n    // std::cout << \"OCAMCamera::estimateIntrinsics - NOT IMPLEMENTED\" << std::endl;\n    // throw std::string(\"OCAMCamera::estimateIntrinsics - NOT IMPLEMENTED\");\n\n    // Reference: Page 30 of\n    // \" Scaramuzza, D. Omnidirectional Vision: from Calibration to Robot Motion Estimation, ETH Zurich. Thesis no. 17635.\"\n    // http://e-collection.library.ethz.ch/eserv/eth:30301/eth-30301-02.pdf\n    // Matlab code: calibrate.m\n\n    // First, estimate every image's extrinsics parameters\n    std::vector< Eigen::Matrix3d > RList;\n    std::vector< Eigen::Vector3d > TList;\n\n    RList.reserve(imagePoints.size());\n    TList.reserve(imagePoints.size());\n\n    // i-th image\n    for (size_t image_index = 0; image_index < imagePoints.size(); ++image_index)\n    {\n        const std::vector<cv::Point3f>& objPts = objectPoints.at(image_index);\n        const std::vector<cv::Point2f>& imgPts = imagePoints.at(image_index);\n\n        assert(objPts.size() == imgPts.size());\n        assert(objPts.size() == static_cast<unsigned int>(boardSize.width * boardSize.height));\n\n        Eigen::MatrixXd M(objPts.size(), 6);\n\n        for(size_t corner_index = 0; corner_index < objPts.size(); ++corner_index) {\n            double X = objPts.at(corner_index).x;\n            double Y = objPts.at(corner_index).y;\n            
assert(objPts.at(corner_index).z==0.0);\n            \n            double u = imgPts.at(corner_index).x;\n            double v = imgPts.at(corner_index).y;\n\n            M(corner_index, 0) = -v * X;\n            M(corner_index, 1) = -v * Y;\n            M(corner_index, 2) =  u * X;\n            M(corner_index, 3) =  u * Y;\n            M(corner_index, 4) = -v;\n            M(corner_index, 5) =  u;\n        }\n\n        Eigen::JacobiSVD<Eigen::MatrixXd> svd(M, Eigen::ComputeFullU | Eigen::ComputeFullV);\n        assert(svd.matrixV().cols() == 6);\n        Eigen::VectorXd h = -svd.matrixV().col(5);\n\n        // scaled version of R and T\n        const double sr11 = h(0);\n        const double sr12 = h(1);\n        const double sr21 = h(2);\n        const double sr22 = h(3);\n        const double st1  = h(4);\n        const double st2  = h(5);\n\n        const double AA = square(sr11*sr12 + sr21*sr22);\n        const double BB = square(sr11) + square(sr21);\n        const double CC = square(sr12) + square(sr22);\n\n        const double sr32_squared_1 = (- (CC-BB) + sqrt(square(CC-BB) + 4.0 * AA)) / 2.0;\n        const double sr32_squared_2 = (- (CC-BB) - sqrt(square(CC-BB) + 4.0 * AA)) / 2.0;\n\n// printf(\"rst = %.12f\\n\", sr32_squared_1*sr32_squared_1 + (CC-BB)*sr32_squared_1 - AA);\n\n        std::vector<double> sr32_squared_values;\n        if (sr32_squared_1 > 0) sr32_squared_values.push_back(sr32_squared_1);\n        if (sr32_squared_2 > 0) sr32_squared_values.push_back(sr32_squared_2);\n        assert(!sr32_squared_values.empty());\n\n        std::vector<double> sr32_values;\n        std::vector<double> sr31_values;\n        for (auto sr32_squared : sr32_squared_values) {\n            for(int sign = -1; sign <= 1; sign += 2) {\n                const double sr32 = static_cast<double>(sign) * std::sqrt(sr32_squared);\n                sr32_values.push_back( sr32 );\n                if (sr32_squared == 0.0) {\n                    // sr31 can be calculated 
through norm equality, \n                    // but it has positive and negative possibilities\n                    // positive one\n                    sr31_values.push_back(std::sqrt(CC-BB));\n                    // negative one\n                    sr32_values.push_back( sr32 );\n                    sr31_values.push_back(-std::sqrt(CC-BB));\n                    \n                    break; // skip the same situation\n                } else {\n                    // sr31 can be calculated through dot product == 0\n                    sr31_values.push_back(- (sr11*sr12 + sr21*sr22) / sr32);\n                }\n            }\n        }\n\n        // std::cout << \"h= \" << std::setprecision(12) << h.transpose() << std::endl;\n        // std::cout << \"length: \" << sr32_values.size() << \" & \" << sr31_values.size() << std::endl;\n\n        assert(!sr31_values.empty());\n        assert(sr31_values.size() == sr32_values.size());\n        \n        std::vector<Eigen::Matrix3d> H_values;\n        for(size_t i=0;i<sr31_values.size(); ++i) {\n            const double sr31 = sr31_values.at(i);\n            const double sr32 = sr32_values.at(i);\n            const double lambda = 1.0 / sqrt(sr11*sr11 + sr21*sr21 + sr31*sr31);\n            Eigen::Matrix3d H;\n            H.setZero();\n            H(0,0) = sr11; H(0,1) = sr12; H(0,2) = st1;\n            H(1,0) = sr21; H(1,1) = sr22; H(1,2) = st2;\n            H(2,0) = sr31; H(2,1) = sr32; H(2,2) = 0;\n\n            H_values.push_back( lambda * H);\n            H_values.push_back(-lambda * H);\n        }\n\n        for(auto& H : H_values) {\n            // std::cout << \"H=\\n\" << H << std::endl;\n            Eigen::Matrix3d R;\n            R.col(0) = H.col(0);\n            R.col(1) = H.col(1);\n            R.col(2) = H.col(0).cross(H.col(1));\n            // std::cout << \"R33 = \" << R(2,2) << std::endl;\n        }\n        \n        std::vector<Eigen::Matrix3d> H_candidates;\n\n        for (auto& H : H_values)\n        
{\n            Eigen::MatrixXd A_mat(2 * imagePoints.at(image_index).size(), 4);\n            Eigen::VectorXd B_vec(2 * imagePoints.at(image_index).size());\n            A_mat.setZero();\n            B_vec.setZero();\n\n            size_t line_index = 0;\n\n            // iterate images\n            const double& r11 = H(0,0);\n            const double& r12 = H(0,1);\n            // const double& r13 = H(0,2);\n            const double& r21 = H(1,0);\n            const double& r22 = H(1,1);\n            // const double& r23 = H(1,2);\n            const double& r31 = H(2,0);\n            const double& r32 = H(2,1);\n            // const double& r33 = H(2,2);\n            const double& t1  = H(0);\n            const double& t2  = H(1);\n                \n            // iterate chessboard corners in the image\n            for(size_t j=0; j<imagePoints.at(image_index).size(); ++j) {\n                assert(line_index == 2 * j);\n\n                const double& X = objectPoints.at(image_index).at(j).x;\n                const double& Y = objectPoints.at(image_index).at(j).y;\n                const double& u = imagePoints.at(image_index).at(j).x;\n                const double& v = imagePoints.at(image_index).at(j).y;\n\n                double A = r21 * X + r22 * Y + t2;\n                double B = v * (r31 * X + r32 * Y);\n                double C = r11 * X + r12 * Y + t1;\n                double D = u * (r31 * X + r32 * Y);\n                double rou = std::sqrt(u*u + v*v);\n\n                \n                A_mat(line_index+0, 0) = A;\n                A_mat(line_index+1, 0) = C;\n                A_mat(line_index+0, 1) = A * rou;\n                A_mat(line_index+1, 1) = C * rou;\n                A_mat(line_index+0, 2) = A * rou * rou;\n                A_mat(line_index+1, 2) = C * rou * rou;\n                \n                A_mat(line_index+0, 3) = -v;\n                A_mat(line_index+1, 3) = -u;\n                B_vec(line_index+0) = B;\n                
B_vec(line_index+1) = D;\n\n                line_index += 2;\n            }\n\n            assert(line_index == static_cast<unsigned int>(A_mat.rows()));\n\n            // pseudo-inverse for polynomial parameters and all t3s\n            {\n                Eigen::JacobiSVD<Eigen::MatrixXd> svd(A_mat, Eigen::ComputeThinU | Eigen::ComputeThinV);\n\n                Eigen::VectorXd x = svd.solve(B_vec);\n\n                // std::cout << \"x(poly and t3) = \" << x << std::endl;\n\n                if (x(2) > 0 && x(3) > 0) {\n                    H_candidates.push_back(H);\n                }\n            }\n        }\n\n        // printf(\"H_candidates.size()=%zu\\n\", H_candidates.size());\n        assert(H_candidates.size()==1);\n\n        Eigen::Matrix3d& H = H_candidates.front();\n\n        Eigen::Matrix3d R; \n        R.col(0) = H.col(0); \n        R.col(1) = H.col(1); \n        R.col(2) = H.col(0).cross(H.col(1)); \n\n        Eigen::Vector3d T = H.col(2);\n        RList.push_back(R);\n        TList.push_back(T);\n\n        // std::cout << \"#\" << image_index << \" frame\" << \" R =\" << R << \" \\nT = \" << T.transpose() << std::endl;\n    }\n\n    // Second, estimate camera intrinsic parameters and all t3\n    Eigen::MatrixXd A_mat(2 * imagePoints.size() * imagePoints.at(0).size(), SCARAMUZZA_POLY_SIZE-1 + imagePoints.size());\n    Eigen::VectorXd B_vec(2 * imagePoints.size() * imagePoints.at(0).size());\n    A_mat.setZero();\n    B_vec.setZero();\n\n    size_t line_index = 0;\n\n    // iterate images\n    for(size_t i = 0; i < imagePoints.size(); ++i) {\n        const double& r11 = RList.at(i)(0,0);\n        const double& r12 = RList.at(i)(0,1);\n        // const double& r13 = RList.at(i)(0,2);\n        const double& r21 = RList.at(i)(1,0);\n        const double& r22 = RList.at(i)(1,1);\n        // const double& r23 = RList.at(i)(1,2);\n        const double& r31 = RList.at(i)(2,0);\n        const double& r32 = RList.at(i)(2,1);\n        // const double& r33 = 
RList.at(i)(2,2);\n        const double& t1  = TList.at(i)(0);\n        const double& t2  = TList.at(i)(1);\n        \n        // iterate chessboard corners in the image\n        for(size_t j=0; j<imagePoints.at(i).size(); ++j) {\n            assert(line_index == 2 * (i * imagePoints.at(0).size() + j));\n\n            const double& X = objectPoints.at(i).at(j).x;\n            const double& Y = objectPoints.at(i).at(j).y;\n            const double& u = imagePoints.at(i).at(j).x;\n            const double& v = imagePoints.at(i).at(j).y;\n\n            double A = r21 * X + r22 * Y + t2;\n            double B = v * (r31 * X + r32 * Y);\n            double C = r11 * X + r12 * Y + t1;\n            double D = u * (r31 * X + r32 * Y);\n            double rou = std::sqrt(u*u + v*v);\n\n            for(int k=1;k<=SCARAMUZZA_POLY_SIZE-1;++k) {\n                double pow_rou = 0.0;\n                if (k == 1) {\n                    pow_rou = 1.0;\n                }\n                else {\n                    pow_rou = std::pow(rou, k);\n                }\n\n                A_mat(line_index+0, k-1) = A * pow_rou;\n                A_mat(line_index+1, k-1) = C * pow_rou;\n            }\n            \n            A_mat(line_index+0, SCARAMUZZA_POLY_SIZE-1+i) = -v;\n            A_mat(line_index+1, SCARAMUZZA_POLY_SIZE-1+i) = -u;\n            B_vec(line_index+0) = B;\n            B_vec(line_index+1) = D;\n\n            line_index += 2;\n        }\n    }\n\n    assert(line_index == static_cast<unsigned int>(A_mat.rows()));\n\n    Eigen::Matrix<double, SCARAMUZZA_POLY_SIZE, 1> poly_coeff;\n    // pseudo-inverse for polynomial parameters and all t3s\n    {\n        Eigen::JacobiSVD<Eigen::MatrixXd> svd(A_mat, Eigen::ComputeThinU | Eigen::ComputeThinV);\n\n        Eigen::VectorXd x = svd.solve(B_vec);\n\n        poly_coeff[0] = x(0);\n        poly_coeff[1] = 0.0;\n        for(int i=1;i<poly_coeff.size()-1;++i) {\n            poly_coeff[i+1] = x(i);\n        }\n        assert(x.size() 
== static_cast<unsigned int>(SCARAMUZZA_POLY_SIZE-1+TList.size()));\n    }\n\n    Parameters params = getParameters();\n\n    // Affine matrix A is constructed as [C D; E 1]\n    params.C() = 1.0;\n    params.D() = 0.0;\n    params.E() = 0.0;\n\n    params.center_x() = params.imageWidth() / 2.0;\n    params.center_y() = params.imageHeight() / 2.0;\n    \n    for(size_t i=0; i<SCARAMUZZA_POLY_SIZE; ++i) {\n        params.poly(i) = poly_coeff[i];\n    }\n\n    // params.poly(0) = -216.9657476318;\n    // params.poly(1) = 0.0;\n    // params.poly(2) = 0.0017866911;\n    // params.poly(3) = -0.0000019866;\n    // params.poly(4) =  0.0000000077;\n\n    \n    // inv_poly\n    {\n        std::vector<double> rou_vec;\n        std::vector<double> z_vec;\n        for(double rou = 0.0; rou <= (params.imageWidth() + params.imageHeight())/2; rou += 0.1) {\n            double rou_pow_k = 1.0;\n            double z = 0.0;\n\n            for (int k = 0; k < SCARAMUZZA_POLY_SIZE; k++)\n            {\n                z += rou_pow_k * params.poly(k);\n                rou_pow_k *= rou;\n            }\n\n            rou_vec.push_back(rou);\n            z_vec.push_back(z);\n        }\n\n        assert(rou_vec.size() == z_vec.size());\n        Eigen::VectorXd xVec(rou_vec.size());\n        Eigen::VectorXd yVec(rou_vec.size());\n\n        for(size_t i=0; i<rou_vec.size(); ++i) {\n            xVec(i) = std::atan2(-z_vec.at(i), rou_vec.at(i));\n            yVec(i) = rou_vec.at(i);\n        }\n\n        // use a lower-order poly to eliminate over-fitting caused by noisy/inaccurate data\n        const int poly_fit_order = 4;\n        Eigen::VectorXd inv_poly_coeff = polyfit(xVec, yVec, poly_fit_order);\n        \n        for(int i=0; i<=poly_fit_order; ++i) {\n            params.inv_poly(i) = inv_poly_coeff(i);\n        }\n    }\n\n    setParameters(params);\n\n    std::cout << \"initial params:\\n\" << params << std::endl; \n}\n\n/** \n * \\brief Lifts a point from the image plane to the unit 
sphere\n *\n * \\param p image coordinates\n * \\param P coordinates of the point on the sphere\n */\nvoid\nOCAMCamera::liftSphere(const Eigen::Vector2d& p, Eigen::Vector3d& P) const\n{\n    liftProjective(p, P);\n    P.normalize();\n}\n\n/** \n * \\brief Lifts a point from the image plane to its projective ray\n *\n * \\param p image coordinates\n * \\param P coordinates of the projective ray\n */\nvoid\nOCAMCamera::liftProjective(const Eigen::Vector2d& p, Eigen::Vector3d& P) const\n{\n    // Relative to Center\n    Eigen::Vector2d xc(p[0] - mParameters.center_x(), p[1] - mParameters.center_y());\n\n    // Affine Transformation\n    // xc_a = inv(A) * xc;\n    Eigen::Vector2d xc_a(\n        m_inv_scale * (xc[0] - mParameters.D() * xc[1]),\n        m_inv_scale * (-mParameters.E() * xc[0] + mParameters.C() * xc[1])\n    );\n\n    double phi = std::sqrt(xc_a[0] * xc_a[0] + xc_a[1] * xc_a[1]);\n    double phi_i = 1.0;\n    double z = 0.0;\n\n    for (int i = 0; i < SCARAMUZZA_POLY_SIZE; i++)\n    {\n        z += phi_i * mParameters.poly(i);\n        phi_i *= phi;\n    }\n\n    P << xc[0], xc[1], -z;\n}\n\n\n/** \n * \\brief Project a 3D point (\\a x,\\a y,\\a z) to the image plane in (\\a u,\\a v)\n *\n * \\param P 3D point coordinates\n * \\param p return value, contains the image point coordinates\n */\nvoid\nOCAMCamera::spaceToPlane(const Eigen::Vector3d& P, Eigen::Vector2d& p) const\n{\n    double norm = std::sqrt(P[0] * P[0] + P[1] * P[1]);\n    double theta = std::atan2(-P[2], norm);\n    double rho = 0.0;\n    double theta_i = 1.0;\n\n    for (int i = 0; i < SCARAMUZZA_INV_POLY_SIZE; i++)\n    {\n        rho += theta_i * mParameters.inv_poly(i);\n        theta_i *= theta;\n    }\n\n    double invNorm = 1.0 / norm;\n    Eigen::Vector2d xn(\n        P[0] * invNorm * rho,\n        P[1] * invNorm * rho\n    );\n\n    p << xn[0] * mParameters.C() + xn[1] * mParameters.D() + mParameters.center_x(),\n         xn[0] * mParameters.E() + xn[1]                   + 
mParameters.center_y();\n}\n\n\n/** \n * \\brief Projects an undistorted 2D point p_u to the image plane\n *\n * \\param p_u 2D point coordinates\n * \\return image point coordinates\n */\nvoid\nOCAMCamera::undistToPlane(const Eigen::Vector2d& p_u, Eigen::Vector2d& p) const\n{\n    Eigen::Vector3d P(p_u[0], p_u[1], 1.0);\n    spaceToPlane(P, p);\n}\n\n\n#if 0\nvoid\nOCAMCamera::initUndistortMap(cv::Mat& map1, cv::Mat& map2, double fScale) const\n{\n    cv::Size imageSize(mParameters.imageWidth(), mParameters.imageHeight());\n\n    cv::Mat mapX = cv::Mat::zeros(imageSize, CV_32F);\n    cv::Mat mapY = cv::Mat::zeros(imageSize, CV_32F);\n\n    for (int v = 0; v < imageSize.height; ++v)\n    {\n        for (int u = 0; u < imageSize.width; ++u)\n        {\n            double mx_u = m_inv_K11 / fScale * u + m_inv_K13 / fScale;\n            double my_u = m_inv_K22 / fScale * v + m_inv_K23 / fScale;\n\n            double xi = mParameters.xi();\n            double d2 = mx_u * mx_u + my_u * my_u;\n\n            Eigen::Vector3d P;\n            P << mx_u, my_u, 1.0 - xi * (d2 + 1.0) / (xi + sqrt(1.0 + (1.0 - xi * xi) * d2));\n\n            Eigen::Vector2d p;\n            spaceToPlane(P, p);\n\n            mapX.at<float>(v,u) = p(0);\n            mapY.at<float>(v,u) = p(1);\n        }\n    }\n\n    cv::convertMaps(mapX, mapY, map1, map2, CV_32FC1, false);\n}\n#endif\n\ncv::Mat\nOCAMCamera::initUndistortRectifyMap(cv::Mat& map1, cv::Mat& map2,\n                                    float fx, float fy,\n                                    cv::Size imageSize,\n                                    float cx, float cy,\n                                    cv::Mat rmat) const\n{\n    if (imageSize == cv::Size(0, 0))\n    {\n        imageSize = cv::Size(mParameters.imageWidth(), mParameters.imageHeight());\n    }\n\n    cv::Mat mapX = cv::Mat::zeros(imageSize.height, imageSize.width, CV_32F);\n    cv::Mat mapY = cv::Mat::zeros(imageSize.height, imageSize.width, CV_32F);\n\n    
Eigen::Matrix3f K_rect;\n\n    K_rect << fx, 0, cx < 0 ? imageSize.width / 2 : cx,\n              0, fy, cy < 0 ? imageSize.height / 2 : cy,\n              0, 0, 1;\n\n    if (fx < 0 || fy < 0)\n    {\n        throw std::string(std::string(__FUNCTION__) + \": Focal length must be specified\");\n    }\n\n    Eigen::Matrix3f K_rect_inv = K_rect.inverse();\n\n    Eigen::Matrix3f R, R_inv;\n    cv::cv2eigen(rmat, R);\n    R_inv = R.inverse();\n\n    for (int v = 0; v < imageSize.height; ++v)\n    {\n        for (int u = 0; u < imageSize.width; ++u)\n        {\n            Eigen::Vector3f xo;\n            xo << u, v, 1;\n\n            Eigen::Vector3f uo = R_inv * K_rect_inv * xo;\n\n            Eigen::Vector2d p;\n            spaceToPlane(uo.cast<double>(), p);\n\n            mapX.at<float>(v,u) = p(0);\n            mapY.at<float>(v,u) = p(1);\n        }\n    }\n\n    cv::convertMaps(mapX, mapY, map1, map2, CV_32FC1, false);\n\n    cv::Mat K_rect_cv;\n    cv::eigen2cv(K_rect, K_rect_cv);\n    return K_rect_cv;\n}\n\nint\nOCAMCamera::parameterCount(void) const\n{\n    return SCARAMUZZA_CAMERA_NUM_PARAMS;\n}\n\nconst OCAMCamera::Parameters&\nOCAMCamera::getParameters(void) const\n{\n    return mParameters;\n}\n\nvoid\nOCAMCamera::setParameters(const OCAMCamera::Parameters& parameters)\n{\n    mParameters = parameters;\n\n    m_inv_scale = 1.0 / (parameters.C() - parameters.D() * parameters.E());\n}\n\nvoid\nOCAMCamera::readParameters(const std::vector<double>& parameterVec)\n{\n    if ((int)parameterVec.size() != parameterCount())\n    {\n        return;\n    }\n\n    Parameters params = getParameters();\n\n    params.C() = parameterVec.at(0);\n    params.D() = parameterVec.at(1);\n    params.E() = parameterVec.at(2);\n    params.center_x() = parameterVec.at(3);\n    params.center_y() = parameterVec.at(4);\n    for (int i=0; i < SCARAMUZZA_POLY_SIZE; i++)\n        params.poly(i) = parameterVec.at(5+i);\n    for (int i=0; i < SCARAMUZZA_INV_POLY_SIZE; i++)\n        
params.inv_poly(i) = parameterVec.at(5 + SCARAMUZZA_POLY_SIZE + i);\n\n    setParameters(params);\n}\n\nvoid\nOCAMCamera::writeParameters(std::vector<double>& parameterVec) const\n{\n    parameterVec.resize(parameterCount());\n    parameterVec.at(0) = mParameters.C();\n    parameterVec.at(1) = mParameters.D();\n    parameterVec.at(2) = mParameters.E();\n    parameterVec.at(3) = mParameters.center_x();\n    parameterVec.at(4) = mParameters.center_y();\n    for (int i=0; i < SCARAMUZZA_POLY_SIZE; i++)\n        parameterVec.at(5+i) = mParameters.poly(i);\n    for (int i=0; i < SCARAMUZZA_INV_POLY_SIZE; i++)\n        parameterVec.at(5 + SCARAMUZZA_POLY_SIZE + i) = mParameters.inv_poly(i);\n}\n\nvoid\nOCAMCamera::writeParametersToYamlFile(const std::string& filename) const\n{\n    mParameters.writeToYamlFile(filename);\n}\n\nstd::string\nOCAMCamera::parametersToString(void) const\n{\n    std::ostringstream oss;\n    oss << mParameters;\n\n    return oss.str();\n}\n\n}\n"
  },
  {
    "path": "camera_model/src/chessboard/Chessboard.cc",
    "content": "#include \"camodocal/chessboard/Chessboard.h\"\n\n#include <opencv2/calib3d/calib3d.hpp>\n#include <opencv2/imgproc/imgproc.hpp>\n\n#include \"camodocal/chessboard/ChessboardQuad.h\"\n#include \"camodocal/chessboard/Spline.h\"\n\n#define MAX_CONTOUR_APPROX  7\n\nnamespace camodocal\n{\n\nChessboard::Chessboard(cv::Size boardSize, cv::Mat& image)\n : mBoardSize(boardSize)\n , mCornersFound(false)\n{\n    if (image.channels() == 1)\n    {\n        cv::cvtColor(image, mSketch, CV_GRAY2BGR);\n        image.copyTo(mImage);\n    }\n    else\n    {\n        image.copyTo(mSketch);\n        cv::cvtColor(image, mImage, CV_BGR2GRAY);\n    }\n}\n\nvoid\nChessboard::findCorners(bool useOpenCV)\n{\n    mCornersFound = findChessboardCorners(mImage, mBoardSize, mCorners,\n                                          CV_CALIB_CB_ADAPTIVE_THRESH +\n                                          CV_CALIB_CB_NORMALIZE_IMAGE +\n                                          CV_CALIB_CB_FILTER_QUADS +\n                                          CV_CALIB_CB_FAST_CHECK,\n                                          useOpenCV);\n\n    if (mCornersFound)\n    {\n        // draw chessboard corners\n        cv::drawChessboardCorners(mSketch, mBoardSize, mCorners, mCornersFound);\n    }\n}\n\nconst std::vector<cv::Point2f>&\nChessboard::getCorners(void) const\n{\n    return mCorners;\n}\n\nbool\nChessboard::cornersFound(void) const\n{\n    return mCornersFound;\n}\n\nconst cv::Mat&\nChessboard::getImage(void) const\n{\n    return mImage;\n}\n\nconst cv::Mat&\nChessboard::getSketch(void) const\n{\n    return mSketch;\n}\n\nbool\nChessboard::findChessboardCorners(const cv::Mat& image,\n                                  const cv::Size& patternSize,\n                                  std::vector<cv::Point2f>& corners,\n                                  int flags, bool useOpenCV)\n{\n    if (useOpenCV)\n    {\n        return cv::findChessboardCorners(image, patternSize, corners, flags);\n    }\n    
else\n    {\n        return findChessboardCornersImproved(image, patternSize, corners, flags);\n    }\n}\n\nbool\nChessboard::findChessboardCornersImproved(const cv::Mat& image,\n                                          const cv::Size& patternSize,\n                                          std::vector<cv::Point2f>& corners,\n                                          int flags)\n{\n    /************************************************************************************\\\n        This is an improved variant of the chessboard corner detection algorithm that\n        uses a graph of connected quads. It is based on the code contributed\n        by Vladimir Vezhnevets and Philip Gruebele.\n        Here is the copyright notice from the original Vladimir's code:\n        ===============================================================\n\n        The algorithms developed and implemented by Vezhnevets Vladimir\n        aka Dead Moroz (vvp@graphics.cs.msu.ru)\n        See http://graphics.cs.msu.su/en/research/calibration/opencv.html\n        for detailed information.\n\n        Reliability additions and modifications made by Philip Gruebele.\n        <a href=\"mailto:pgruebele@cox.net\">pgruebele@cox.net</a>\n\n        Some improvements were made by:\n        1) Martin Rufli - increased chance of correct corner matching\n           Rufli, M., Scaramuzza, D., and Siegwart, R. 
(2008),\n           Automatic Detection of Checkerboards on Blurred and Distorted Images,\n           Proceedings of the IEEE/RSJ International Conference on\n           Intelligent Robots and Systems (IROS 2008), Nice, France, September 2008.\n        2) Lionel Heng - post-detection checks and better thresholding\n\n    \\************************************************************************************/\n\n    //int bestDilation        = -1;\n    const int minDilations    =  0;\n    const int maxDilations    =  7;\n\n    std::vector<ChessboardQuadPtr> outputQuadGroup;\n\n    if (image.depth() != CV_8U || image.channels() == 2)\n    {\n        return false;\n    }\n\n    if (patternSize.width < 2 || patternSize.height < 2)\n    {\n        return false;\n    }\n\n    if (patternSize.width > 127 || patternSize.height > 127)\n    {\n        return false;\n    }\n\n    cv::Mat img = image;\n\n    // Image histogram normalization and\n    // BGR to Grayscale image conversion (if applicable)\n    // MARTIN: Set to \"false\"\n    if (image.channels() != 1 || (flags & CV_CALIB_CB_NORMALIZE_IMAGE))\n    {\n        cv::Mat norm_img(image.rows, image.cols, CV_8UC1);\n\n        if (image.channels() != 1)\n        {\n            cv::cvtColor(image, norm_img, CV_BGR2GRAY);\n            img = norm_img;\n        }\n\n        if (flags & CV_CALIB_CB_NORMALIZE_IMAGE)\n        {\n            cv::equalizeHist(image, norm_img);\n            img = norm_img;\n        }\n    }\n\n    if (flags & CV_CALIB_CB_FAST_CHECK)\n    {\n        if (!checkChessboard(img, patternSize))\n        {\n            return false;\n        }\n    }\n\n    // PART 1: FIND LARGEST PATTERN\n    //-----------------------------------------------------------------------\n    // Checker patterns are tried to be found by dilating the background and\n    // then applying a canny edge finder on the closed contours (checkers).\n    // Try one dilation run, but if the pattern is not found, repeat until\n    // 
max_dilations is reached.\n\n    int prevSqrSize = 0;\n    bool found = false;\n    std::vector<ChessboardCornerPtr> outputCorners;\n\n    for (int k = 0; k < 6; ++k)\n    {\n        for (int dilations = minDilations; dilations <= maxDilations; ++dilations)\n        {\n            if (found)\n            {\n                break;\n            }\n\n            cv::Mat thresh_img;\n\n            // convert the input grayscale image to binary (black-n-white)\n            if (flags & CV_CALIB_CB_ADAPTIVE_THRESH)\n            {\n                int blockSize = lround(prevSqrSize == 0 ?\n                    std::min(img.cols,img.rows)*(k%2 == 0 ? 0.2 : 0.1): prevSqrSize*2)|1;\n\n                // convert to binary\n                cv::adaptiveThreshold(img, thresh_img, 255, CV_ADAPTIVE_THRESH_MEAN_C, CV_THRESH_BINARY, blockSize, (k/2)*5);\n            }\n            else\n            {\n                // empiric threshold level\n                double mean = (cv::mean(img))[0];\n                int thresh_level = lround(mean - 10);\n                thresh_level = std::max(thresh_level, 10);\n\n                cv::threshold(img, thresh_img, thresh_level, 255, CV_THRESH_BINARY);\n            }\n\n            // MARTIN's Code\n            // Use both a rectangular and a cross kernel. In this way, a more\n            // homogeneous dilation is performed, which is crucial for small,\n            // distorted checkers. 
Use the CROSS kernel first, since its action\n            // on the image is more subtle\n            cv::Mat kernel1 = cv::getStructuringElement(CV_SHAPE_CROSS, cv::Size(3,3), cv::Point(1,1));\n            cv::Mat kernel2 = cv::getStructuringElement(CV_SHAPE_RECT, cv::Size(3,3), cv::Point(1,1));\n\n            if (dilations >= 1)\n                cv::dilate(thresh_img, thresh_img, kernel1);\n            if (dilations >= 2)\n                cv::dilate(thresh_img, thresh_img, kernel2);\n            if (dilations >= 3)\n                cv::dilate(thresh_img, thresh_img, kernel1);\n            if (dilations >= 4)\n                cv::dilate(thresh_img, thresh_img, kernel2);\n            if (dilations >= 5)\n                cv::dilate(thresh_img, thresh_img, kernel1);\n            if (dilations >= 6)\n                cv::dilate(thresh_img, thresh_img, kernel2);\n\n            // In order to find rectangles that go to the edge, we draw a white\n            // line around the image edge. Otherwise FindContours will miss those\n            // clipped rectangle contours. The border color will be the image mean,\n            // because otherwise we risk screwing up filters like cvSmooth()\n            cv::rectangle(thresh_img, cv::Point(0,0),\n                          cv::Point(thresh_img.cols - 1, thresh_img.rows - 1),\n                          CV_RGB(255,255,255), 3, 8);\n\n            // Generate quadrangles in the following function\n            std::vector<ChessboardQuadPtr> quads;\n\n            generateQuads(quads, thresh_img, flags, dilations, true);\n            if (quads.empty())\n            {\n                continue;\n            }\n\n            // The following function finds and assigns neighbor quads to every\n            // quadrangle in the immediate vicinity fulfilling certain\n            // prerequisites\n            findQuadNeighbors(quads, dilations);\n\n            // The connected quads will be organized in groups. 
The following loop\n            // increases a \"group_idx\" identifier.\n            // The function \"findConnectedQuads assigns all connected quads\n            // a unique group ID.\n            // If more quadrangles were assigned to a given group (i.e. connected)\n            // than are expected by the input variable \"patternSize\", the\n            // function \"cleanFoundConnectedQuads\" erases the surplus\n            // quadrangles by minimizing the convex hull of the remaining pattern.\n\n            for (int group_idx = 0; ; ++group_idx)\n            {\n                std::vector<ChessboardQuadPtr> quadGroup;\n\n                findConnectedQuads(quads, quadGroup, group_idx, dilations);\n\n                if (quadGroup.empty())\n                {\n                    break;\n                }\n\n                cleanFoundConnectedQuads(quadGroup, patternSize);\n\n                // The following function labels all corners of every quad\n                // with a row and column entry.\n                // \"count\" specifies the number of found quads in \"quad_group\"\n                // with group identifier \"group_idx\"\n                // The last parameter is set to \"true\", because this is the\n                // first function call and some initializations need to be\n                // made.\n                labelQuadGroup(quadGroup, patternSize, true);\n\n                found = checkQuadGroup(quadGroup, outputCorners, patternSize);\n\n                float sumDist = 0;\n                int total = 0;\n\n                for (int i = 0; i < (int)outputCorners.size(); ++i)\n                {\n                    int ni = 0;\n                    float avgi = outputCorners.at(i)->meanDist(ni);\n                    sumDist += avgi * ni;\n                    total += ni;\n                }\n                prevSqrSize = lround(sumDist / std::max(total, 1));\n\n                if (found && !checkBoardMonotony(outputCorners, patternSize))\n          
      {\n                    found = false;\n                }\n            }\n        }\n    }\n\n    if (!found)\n    {\n        return false;\n    }\n    else\n    {\n        corners.clear();\n        corners.reserve(outputCorners.size());\n        for (size_t i = 0; i < outputCorners.size(); ++i)\n        {\n            corners.push_back(outputCorners.at(i)->pt);\n        }\n\n        cv::cornerSubPix(image, corners, cv::Size(11, 11), cv::Size(-1,-1),\n                         cv::TermCriteria(CV_TERMCRIT_EPS + CV_TERMCRIT_ITER, 30, 0.1));\n\n        return true;\n    }\n}\n\n//===========================================================================\n// ERASE OVERHEAD\n//===========================================================================\n// If we found too many connected quads, remove those which probably do not\n// belong.\nvoid\nChessboard::cleanFoundConnectedQuads(std::vector<ChessboardQuadPtr>& quadGroup,\n                                     cv::Size patternSize)\n{\n    cv::Point2f center(0.0f, 0.0f);\n\n    // Number of quads this pattern should contain\n    int count = ((patternSize.width + 1)*(patternSize.height + 1) + 1)/2;\n\n    // If we have more quadrangles than we should, try to eliminate duplicates\n    // or ones which don't belong to the pattern rectangle. 
Otherwise, go to the end\n    // of the function\n    if ((int)quadGroup.size() <= count)\n    {\n        return;\n    }\n\n    // Create an array of quadrangle centers\n    std::vector<cv::Point2f> centers;\n    centers.resize(quadGroup.size());\n\n    for (size_t i = 0; i < quadGroup.size(); ++i)\n    {\n        cv::Point2f ci(0.0f, 0.0f);\n        ChessboardQuadPtr& q = quadGroup[i];\n\n        for (int j = 0; j < 4; ++j)\n        {\n            ci += q->corners[j]->pt;\n        }\n\n        ci *= 0.25f;\n\n        // centers[i] is the geometric center of quad i;\n        // center is the center of all found quads\n        centers[i] = ci;\n\n        center += ci;\n    }\n\n    center *= 1.0f / quadGroup.size();\n\n    // If we have more quadrangles than we should, we try to eliminate bad\n    // ones based on minimizing the bounding box. We iteratively remove the\n    // point which reduces the size of the bounding box of the blobs the most\n    // (since we want the rectangle to be as small as possible); remove the\n    // quadrangle that causes the biggest reduction in pattern size until we\n    // have the correct number\n    while ((int)quadGroup.size() > count)\n    {\n        double minBoxArea = DBL_MAX;\n        int minBoxAreaIndex = -1;\n\n        // For each point, calculate box area without that point\n        for (size_t skip = 0; skip < quadGroup.size(); ++skip)\n        {\n            // get bounding rectangle\n            cv::Point2f temp = centers[skip];\n            centers[skip] = center;\n\n            std::vector<cv::Point2f> hull;\n            cv::convexHull(centers, hull, true, true);\n            centers[skip] = temp;\n\n            double hull_area = fabs(cv::contourArea(hull));\n\n            // remember smallest box area\n            if (hull_area < minBoxArea)\n            {\n                minBoxArea = hull_area;\n                minBoxAreaIndex = skip;\n            }\n        }\n\n        ChessboardQuadPtr& q0 = 
quadGroup[minBoxAreaIndex];\n\n        // remove any references to this quad as a neighbor\n        for (size_t i = 0; i < quadGroup.size(); ++i)\n        {\n            ChessboardQuadPtr& q = quadGroup.at(i);\n            for (int j = 0; j < 4; ++j)\n            {\n                if (q->neighbors[j] == q0)\n                {\n                    q->neighbors[j].reset();\n                    q->count--;\n                    for (int k = 0; k < 4; ++k)\n                    {\n                        if (q0->neighbors[k] == q)\n                        {\n                            q0->neighbors[k].reset();\n                            q0->count--;\n                            break;\n                        }\n                    }\n                    break;\n                }\n            }\n        }\n        // remove the quad\n        quadGroup.at(minBoxAreaIndex) = quadGroup.back();\n        centers.at(minBoxAreaIndex) = centers.back();\n        quadGroup.pop_back();\n        centers.pop_back();\n    }\n}\n\n//===========================================================================\n// FIND CONNECTED QUADS\n//===========================================================================\nvoid\nChessboard::findConnectedQuads(std::vector<ChessboardQuadPtr>& quads,\n                               std::vector<ChessboardQuadPtr>& group,\n                               int group_idx, int dilation)\n{\n    ChessboardQuadPtr q;\n\n    // Scan the array for a first unlabeled quad\n    for (size_t i = 0; i < quads.size(); ++i)\n    {\n        ChessboardQuadPtr& quad = quads.at(i);\n\n        if (quad->count > 0 && quad->group_idx < 0)\n        {\n            q = quad;\n            break;\n        }\n    }\n\n    if (q.get() == 0)\n    {\n        return;\n    }\n\n    // Recursively find a group of connected quads starting from the seed quad\n\n    std::vector<ChessboardQuadPtr> stack;\n    stack.push_back(q);\n\n    group.push_back(q);\n    q->group_idx = 
group_idx;\n\n    while (!stack.empty())\n    {\n        q = stack.back();\n        stack.pop_back();\n\n        for (int i = 0; i < 4; ++i)\n        {\n            ChessboardQuadPtr& neighbor = q->neighbors[i];\n\n            // If the neighbor exists, has more than 0 neighbors,\n            // and has not been classified yet.\n            if (neighbor.get() && neighbor->count > 0 && neighbor->group_idx < 0)\n            {\n                stack.push_back(neighbor);\n                group.push_back(neighbor);\n                neighbor->group_idx = group_idx;\n            }\n        }\n    }\n}\n\nvoid\nChessboard::labelQuadGroup(std::vector<ChessboardQuadPtr>& quadGroup,\n                           cv::Size patternSize, bool firstRun)\n{\n    // If this is the first function call, a seed quad needs to be selected\n    if (firstRun)\n    {\n        // Search for the (first) quad with the maximum number of neighbors\n        // (usually 4). This will be our starting point.\n        int mark = -1;\n        int maxNeighborCount = 0;\n        for (size_t i = 0; i < quadGroup.size(); ++i)\n        {\n            ChessboardQuadPtr& q = quadGroup.at(i);\n            if (q->count > maxNeighborCount)\n            {\n                mark = i;\n                maxNeighborCount = q->count;\n\n                if (maxNeighborCount == 4)\n                {\n                    break;\n                }\n            }\n        }\n\n        // Mark the starting quad's (by definition) upper-left corner with\n        // (0,0) and then proceed clockwise\n        // The following labeling sequence ensures a \"right coordinate system\"\n        ChessboardQuadPtr& q = quadGroup.at(mark);\n\n        q->labeled = true;\n\n        q->corners[0]->row = 0;\n        q->corners[0]->column = 0;\n        q->corners[1]->row = 0;\n        q->corners[1]->column = 1;\n        q->corners[2]->row = 1;\n        q->corners[2]->column = 1;\n        q->corners[3]->row = 1;\n     
   q->corners[3]->column = 0;\n    }\n\n\n    // Mark all other corners with their respective row and column\n    bool flagChanged = true;\n    while (flagChanged)\n    {\n        // First reset the flag to \"false\"\n        flagChanged = false;\n\n        // Go through all quads; top-down is faster, since unlabeled quads will\n        // be inserted at the end of the list\n        for (int i = quadGroup.size() - 1; i >= 0; --i)\n        {\n            ChessboardQuadPtr& quad = quadGroup.at(i);\n\n            // Check whether quad \"i\" has been labeled already\n            if (!quad->labeled)\n            {\n                // Check its neighbors, whether some of them have been labeled\n                // already\n                for (int j = 0; j < 4; j++)\n                {\n                    // Check whether the neighbor exists (i.e. is not the NULL\n                    // pointer)\n                    if (quad->neighbors[j])\n                    {\n                        ChessboardQuadPtr& quadNeighbor = quad->neighbors[j];\n\n                        // Only proceed if neighbor \"j\" was labeled\n                        if (quadNeighbor->labeled)\n                        {\n                            // For any given quad, control may pass through here\n                            // multiple times. 
We therefore \"break\" later.\n                            // Check which of the neighbor's corners is\n                            // connected to the current quad\n                            int connectedNeighborCornerId = -1;\n                            for (int k = 0; k < 4; ++k)\n                            {\n                                if (quadNeighbor->neighbors[k] == quad)\n                                {\n                                    connectedNeighborCornerId = k;\n\n                                    // there is only one, therefore\n                                    break;\n                                }\n                            }\n\n\n                            // For the following calculations we need the row\n                            // and column of the connected neighbor corner and\n                            // all other corners of the connected quad \"j\",\n                            // clockwise (CW)\n                            ChessboardCornerPtr& conCorner    = quadNeighbor->corners[connectedNeighborCornerId];\n                            ChessboardCornerPtr& conCornerCW1 = quadNeighbor->corners[(connectedNeighborCornerId+1)%4];\n                            ChessboardCornerPtr& conCornerCW2 = quadNeighbor->corners[(connectedNeighborCornerId+2)%4];\n                            ChessboardCornerPtr& conCornerCW3 = quadNeighbor->corners[(connectedNeighborCornerId+3)%4];\n\n                            quad->corners[j]->row          = conCorner->row;\n                            quad->corners[j]->column       = conCorner->column;\n                            quad->corners[(j+1)%4]->row    = conCorner->row - conCornerCW2->row + conCornerCW3->row;\n                            quad->corners[(j+1)%4]->column = conCorner->column - conCornerCW2->column + conCornerCW3->column;\n                            quad->corners[(j+2)%4]->row    = conCorner->row + conCorner->row - 
conCornerCW2->row;\n                            quad->corners[(j+2)%4]->column    =    conCorner->column + conCorner->column - conCornerCW2->column;\n                            quad->corners[(j+3)%4]->row        =    conCorner->row - conCornerCW2->row + conCornerCW1->row;\n                            quad->corners[(j+3)%4]->column    =    conCorner->column - conCornerCW2->column + conCornerCW1->column;\n\n                            // Mark this quad as labeled\n                            quad->labeled = true;\n\n                            // Changes have taken place, set the flag\n                            flagChanged = true;\n\n                            // once is enough!\n                            break;\n                        }\n                    }\n                }\n            }\n        }\n    }\n\n\n    // All corners are marked with row and column\n    // Record the minimal and maximal row and column indices\n    // It is unlikely that more than 8-bit checkers are used per dimension; if there\n    // were, an error would have been thrown at the beginning of \"cvFindChessboardCorners2\"\n    int min_row        =  127;\n    int max_row        = -127;\n    int min_column    =  127;\n    int max_column    = -127;\n\n    for (int i = 0; i < (int)quadGroup.size(); ++i)\n    {\n        ChessboardQuadPtr& q = quadGroup.at(i);\n\n        for (int j = 0; j < 4; ++j)\n        {\n            ChessboardCornerPtr& c = q->corners[j];\n\n            if (c->row > max_row)\n            {\n                max_row = c->row;\n            }\n            if (c->row < min_row)\n            {\n                min_row = c->row;\n            }\n            if (c->column > max_column)\n            {\n                max_column = c->column;\n            }\n            if (c->column < min_column)\n            {\n                min_column = c->column;\n            }\n        }\n    }\n\n    // Label all internal corners with \"needsNeighbor\" = false\n    // Label all 
external corners with \"needsNeighbor\" = true,\n    // except if in a given dimension the pattern size is reached\n    for (int i = min_row; i <= max_row; ++i)\n    {\n        for (int j = min_column; j <= max_column; ++j)\n        {\n            // A flag that indicates whether a row/column combination occurs\n            // multiple times\n            bool flag = false;\n\n            // Remember corner and quad\n            int cornerID;\n            int quadID;\n\n            for (int k = 0; k < (int)quadGroup.size(); ++k)\n            {\n                ChessboardQuadPtr& q = quadGroup.at(k);\n\n                for (int l = 0; l < 4; ++l)\n                {\n                    if ((q->corners[l]->row == i) && (q->corners[l]->column == j))\n                    {\n                        if (flag)\n                        {\n                            // Passed through here at least twice\n                            q->corners[l]->needsNeighbor = false;\n                            quadGroup[quadID]->corners[cornerID]->needsNeighbor = false;\n                        }\n                        else\n                        {\n                            // Mark as needing a neighbor, but note the\n                            // indices\n                            q->corners[l]->needsNeighbor = true;\n                            cornerID = l;\n                            quadID = k;\n                        }\n\n                        // set the flag to true\n                        flag = true;\n                    }\n                }\n            }\n        }\n    }\n\n    // Complete Linking:\n    // sometimes not all corners were properly linked in \"findQuadNeighbors\",\n    // but after labeling each corner with its respective row and column, it is\n    // possible to match them anyway.\n    for (int i = min_row; i <= max_row; ++i)\n    {\n        for (int j = min_column; j <= max_column; ++j)\n        {\n            // the following \"number\" 
indicates the number of corners which\n            // correspond to the given (i,j) value\n            // 1    is a border corner or a corner which still needs a neighbor\n            // 2    is a fully connected internal corner\n            // >2    something went wrong during labeling, report a warning\n            int number = 1;\n\n            // remember corner and quad\n            int cornerID;\n            int quadID;\n\n            for (int k = 0; k < (int)quadGroup.size(); ++k)\n            {\n                ChessboardQuadPtr& q = quadGroup.at(k);\n\n                for (int l = 0; l < 4; ++l)\n                {\n                    if ((q->corners[l]->row == i) && (q->corners[l]->column == j))\n                    {\n\n                        if (number == 1)\n                        {\n                            // First corner, note its ID\n                            cornerID = l;\n                            quadID = k;\n                        }\n\n                        else if (number == 2)\n                        {\n                            // Second corner, check whether this and the\n                            // first one have equal coordinates; if not,\n                            // interpolate\n                            cv::Point2f delta = q->corners[l]->pt - quadGroup[quadID]->corners[cornerID]->pt;\n\n                            if (delta.x != 0.0f || delta.y != 0.0f)\n                            {\n                                // Interpolate\n                                q->corners[l]->pt -= delta * 0.5f;\n\n                                quadGroup[quadID]->corners[cornerID]->pt += delta * 0.5f;\n                            }\n                        }\n                        else if (number > 2)\n                        {\n                            // Something went wrong during row/column labeling\n                            // Report a Warning\n                            // ->Implemented in the function 
\"mrWriteCorners\"\n                        }\n\n                        // increase the number by one\n                        ++number;\n                    }\n                }\n            }\n        }\n    }\n\n\n    // Border corners don't need any neighbors if the pattern size in the\n    // respective direction is reached\n    // The only time we can be sure that the target pattern size is reached in a given\n    // dimension is when the larger side has reached the target size in the maximal\n    // direction, or if the larger side is larger than the smaller target size and the\n    // smaller side equals the smaller target size\n    int largerDimPattern = std::max(patternSize.height,patternSize.width);\n    int smallerDimPattern = std::min(patternSize.height,patternSize.width);\n    bool flagSmallerDim1 = false;\n    bool flagSmallerDim2 = false;\n\n    if ((largerDimPattern + 1) == max_column - min_column)\n    {\n        flagSmallerDim1 = true;\n        // We found out that in the column direction the target pattern size is reached\n        // Therefore border column corners do not need a neighbor anymore\n        // Go through all corners\n        for (int k = 0; k < (int)quadGroup.size(); ++k)\n        {\n            ChessboardQuadPtr& q = quadGroup.at(k);\n\n            for (int l = 0; l < 4; ++l)\n            {\n                ChessboardCornerPtr& c = q->corners[l];\n\n                if (c->column == min_column || c->column == max_column)\n                {\n                    // Needs no neighbor anymore\n                    c->needsNeighbor = false;\n                }\n            }\n        }\n    }\n\n    if ((largerDimPattern + 1) == max_row - min_row)\n    {\n        flagSmallerDim2 = true;\n        // We found out that in the row direction the target pattern size is reached\n        // Therefore border row corners do not need a neighbor anymore\n        // Go through all corners\n        for (int k = 0; k < (int)quadGroup.size(); 
++k)\n        {\n            ChessboardQuadPtr& q = quadGroup.at(k);\n\n            for (int l = 0; l < 4; ++l)\n            {\n                ChessboardCornerPtr& c = q->corners[l];\n\n                if (c->row == min_row || c->row == max_row)\n                {\n                    // Needs no neighbor anymore\n                    c->needsNeighbor = false;\n                }\n            }\n        }\n    }\n\n\n    // Check the two flags:\n    //    -    If one is true and the other false, then the pattern target\n    //        size was reached in one direction -> We can check whether the target\n    //        pattern size is also reached in the other direction\n    //  -    If both are set to true, then we deal with a square board -> do nothing\n    //  -    If both are set to false -> There is a possibility that the larger side is\n    //        larger than the smaller target size -> Check and if true, then check whether\n    //        the other side has the same size as the smaller target size\n    if ((flagSmallerDim1 == false && flagSmallerDim2 == true))\n    {\n        // Larger target pattern size is in row direction, check whether smaller target\n        // pattern size is reached in column direction\n        if ((smallerDimPattern + 1) == max_column - min_column)\n        {\n            for (int k = 0; k < (int)quadGroup.size(); ++k)\n            {\n                ChessboardQuadPtr& q = quadGroup.at(k);\n\n                for (int l = 0; l < 4; ++l)\n                {\n                    ChessboardCornerPtr& c = q->corners[l];\n\n                    if (c->column == min_column || c->column == max_column)\n                    {\n                        // Needs no neighbor anymore\n                        c->needsNeighbor = false;\n                    }\n                }\n            }\n        }\n    }\n\n    if ((flagSmallerDim1 == true && flagSmallerDim2 == false))\n    {\n        // Larger target pattern size is in column direction, check 
whether smaller target\n        // pattern size is reached in row direction\n        if ((smallerDimPattern + 1) == max_row - min_row)\n        {\n            for (int k = 0; k < (int)quadGroup.size(); ++k)\n            {\n                ChessboardQuadPtr& q = quadGroup.at(k);\n\n                for (int l = 0; l < 4; ++l)\n                {\n                    ChessboardCornerPtr& c = q->corners[l];\n\n                    if (c->row == min_row || c->row == max_row)\n                    {\n                        // Needs no neighbor anymore\n                        c->needsNeighbor = false;\n                    }\n                }\n            }\n        }\n    }\n\n    if ((flagSmallerDim1 == false && flagSmallerDim2 == false) && smallerDimPattern + 1 < max_column - min_column)\n    {\n        // Larger target pattern size is in column direction, check whether smaller target\n        // pattern size is reached in row direction\n        if ((smallerDimPattern + 1) == max_row - min_row)\n        {\n            for (int k = 0; k < (int)quadGroup.size(); ++k)\n            {\n                ChessboardQuadPtr& q = quadGroup.at(k);\n\n                for (int l = 0; l < 4; ++l)\n                {\n                    ChessboardCornerPtr& c = q->corners[l];\n\n                    if (c->row == min_row || c->row == max_row)\n                    {\n                        // Needs no neighbor anymore\n                        c->needsNeighbor = false;\n                    }\n                }\n            }\n        }\n    }\n\n    if ((flagSmallerDim1 == false && flagSmallerDim2 == false) && smallerDimPattern + 1 < max_row - min_row)\n    {\n        // Larger target pattern size is in row direction, check whether smaller target\n        // pattern size is reached in column direction\n        if ((smallerDimPattern + 1) == max_column - min_column)\n        {\n            for (int k = 0; k < (int)quadGroup.size(); ++k)\n            {\n                ChessboardQuadPtr& q = 
quadGroup.at(k);\n\n                for (int l = 0; l < 4; ++l)\n                {\n                    ChessboardCornerPtr& c = q->corners[l];\n\n                    if (c->column == min_column || c->column == max_column)\n                    {\n                        // Needs no neighbor anymore\n                        c->needsNeighbor = false;\n                    }\n                }\n            }\n        }\n    }\n}\n\n//===========================================================================\n// FIND QUAD NEIGHBORS\n//===========================================================================\nvoid\nChessboard::findQuadNeighbors(std::vector<ChessboardQuadPtr>& quads, int dilation)\n{\n    // Thresh dilation is used to counter the effect of dilation on the\n    // distance between 2 neighboring corners. Since the distance below is\n    // computed as its square, we do here the same. Additionally, we take the\n    // conservative assumption that dilation was performed using the 3x3 CROSS\n    // kernel, which corresponds to the 4-neighborhood.\n    const float thresh_dilation = (float)(2*dilation+3)*(2*dilation+3)*2;    // the \"*2\" is for the x and y component\n                                                                            // the \"3\" is for initial corner mismatch\n\n    // Find quad neighbors\n    for (size_t idx = 0; idx < quads.size(); ++idx)\n    {\n        ChessboardQuadPtr& curQuad = quads.at(idx);\n\n        // Go through all quadrangles and label them in groups\n        // For each corner of this quadrangle\n        for (int i = 0; i < 4; ++i)\n        {\n            float minDist = FLT_MAX;\n            int closestCornerIdx = -1;\n            ChessboardQuadPtr closestQuad;\n\n            if (curQuad->neighbors[i])\n            {\n                continue;\n            }\n\n            cv::Point2f pt = curQuad->corners[i]->pt;\n\n            // Find the closest corner in all other quadrangles\n            for (size_t k = 0; k < 
quads.size(); ++k)\n            {\n                if (k == idx)\n                {\n                    continue;\n                }\n\n                ChessboardQuadPtr& quad = quads.at(k);\n\n                for (int j = 0; j < 4; ++j)\n                {\n                    // If it already has a neighbor\n                    if (quad->neighbors[j])\n                    {\n                        continue;\n                    }\n\n                    cv::Point2f dp = pt - quad->corners[j]->pt;\n                    float dist = dp.dot(dp);\n\n                    // The following \"if\" checks whether \"dist\" is the\n                    // shortest so far and smaller than the smallest\n                    // edge length of the current and target quads\n                    if (dist < minDist &&\n                        dist <= (curQuad->edge_len + thresh_dilation) &&\n                        dist <= (quad->edge_len + thresh_dilation)   )\n                    {\n                        // Check whether conditions are fulfilled\n                        if (matchCorners(curQuad, i, quad, j))\n                        {\n                            closestCornerIdx = j;\n                            closestQuad = quad;\n                            minDist = dist;\n                        }\n                    }\n                }\n            }\n\n            // Have we found a matching corner point?\n            if (closestCornerIdx >= 0 && minDist < FLT_MAX)\n            {\n                ChessboardCornerPtr closestCorner = closestQuad->corners[closestCornerIdx];\n\n                // Make sure that the closest quad does not have the current\n                // quad as neighbor already\n                bool valid = true;\n                for (int j = 0; j < 4; ++j)\n                {\n                    if (closestQuad->neighbors[j] == curQuad)\n                    {\n                        valid = false;\n                        break;\n                    }\n 
               }\n                if (!valid)\n                {\n                    continue;\n                }\n\n                // We've found one more corner - remember it\n                closestCorner->pt = (pt + closestCorner->pt) * 0.5f;\n\n                curQuad->count++;\n                curQuad->neighbors[i] = closestQuad;\n                curQuad->corners[i] = closestCorner;\n\n                closestQuad->count++;\n                closestQuad->neighbors[closestCornerIdx] = curQuad;\n                closestQuad->corners[closestCornerIdx] = closestCorner;\n            }\n        }\n    }\n}\n\n\n\n//===========================================================================\n// AUGMENT PATTERN WITH ADDITIONAL QUADS\n//===========================================================================\n// The first part of the function is basically a copy of\n// \"findQuadNeighbors\"\n// The comparisons between two points and two lines could be computed in their\n// own function\nint\nChessboard::augmentBestRun(std::vector<ChessboardQuadPtr>& candidateQuads, int candidateDilation,\n                           std::vector<ChessboardQuadPtr>& existingQuads, int existingDilation)\n{\n    // thresh dilation is used to counter the effect of dilation on the\n    // distance between 2 neighboring corners. Since the distance below is\n    // computed as its square, we do here the same. 
Additionally, we take the\n    // conservative assumption that dilation was performed using the 3x3 CROSS\n    // kernel, which corresponds to the 4-neighborhood.\n    const float thresh_dilation = (2*candidateDilation+3)*(2*existingDilation+3)*2;    // the \"*2\" is for the x and y component\n\n    // Search all old quads which have a neighbor that needs to be linked\n    for (size_t idx = 0; idx < existingQuads.size(); ++idx)\n    {\n        ChessboardQuadPtr& curQuad = existingQuads.at(idx);\n\n        // For each corner of this quadrangle\n        for (int i = 0; i < 4; ++i)\n        {\n            float minDist = FLT_MAX;\n            int closestCornerIdx = -1;\n            ChessboardQuadPtr closestQuad;\n\n            // If curQuad corner[i] is already linked, continue\n            if (!curQuad->corners[i]->needsNeighbor)\n            {\n                continue;\n            }\n\n            cv::Point2f pt = curQuad->corners[i]->pt;\n\n            // Look for a match in all candidateQuads' corners\n            for (size_t k = 0; k < candidateQuads.size(); ++k)\n            {\n                ChessboardQuadPtr& candidateQuad = candidateQuads.at(k);\n\n                // Only look at unlabeled new quads\n                if (candidateQuad->labeled)\n                {\n                    continue;\n                }\n\n                for (int j = 0; j < 4; ++j)\n                {\n                    // Only proceed if they are less than dist away from each\n                    // other\n                    cv::Point2f dp = pt - candidateQuad->corners[j]->pt;\n                    float dist = dp.dot(dp);\n\n                    if ((dist < minDist) &&\n                        dist <= (curQuad->edge_len + thresh_dilation) &&\n                        dist <= (candidateQuad->edge_len + thresh_dilation))\n                    {\n                        if (matchCorners(curQuad, i, candidateQuad, j))\n                        {\n                            
closestCornerIdx = j;\n                            closestQuad = candidateQuad;\n                            minDist = dist;\n                        }\n                    }\n                }\n            }\n\n            // Have we found a matching corner point?\n            if (closestCornerIdx >= 0 && minDist < FLT_MAX)\n            {\n                ChessboardCornerPtr closestCorner = closestQuad->corners[closestCornerIdx];\n                closestCorner->pt = (pt + closestCorner->pt) * 0.5f;\n\n                // We've found one more corner - remember it\n                // ATTENTION: write the corner x and y coordinates separately,\n                // else the crucial row/column entries will be overwritten !!!\n                curQuad->corners[i]->pt = closestCorner->pt;\n                curQuad->neighbors[i] = closestQuad;\n                closestQuad->corners[closestCornerIdx]->pt = closestCorner->pt;\n\n                // Label closest quad as labeled. In this way we exclude it\n                // being considered again during the next loop iteration\n                closestQuad->labeled = true;\n\n                // We have a new member of the final pattern, copy it over\n                ChessboardQuadPtr newQuad(new ChessboardQuad);\n                newQuad->count        = 1;\n                newQuad->edge_len    = closestQuad->edge_len;\n                newQuad->group_idx    = curQuad->group_idx;    //the same as the current quad\n                newQuad->labeled    = false;                //do it right afterwards\n\n                curQuad->neighbors[i] = newQuad;\n\n                // We only know one neighbor for sure\n                newQuad->neighbors[closestCornerIdx] = curQuad;\n\n                for (int j = 0; j < 4; j++)\n                {\n                    newQuad->corners[j].reset(new ChessboardCorner);\n                    newQuad->corners[j]->pt = closestQuad->corners[j]->pt;\n                }\n\n                
existingQuads.push_back(newQuad);\n\n                // Start the function again\n                return -1;\n            }\n        }\n    }\n\n    // Finished, don't start the function again\n    return 1;\n}\n\n\n\n//===========================================================================\n// GENERATE QUADRANGLES\n//===========================================================================\nvoid\nChessboard::generateQuads(std::vector<ChessboardQuadPtr>& quads,\n                          cv::Mat& image, int flags,\n                          int dilation, bool firstRun)\n{\n    // Empirical lower bound for the size of allowable quadrangles.\n    // MARTIN, modified: Added \"*0.1\" in order to find smaller quads.\n    int minSize = lround(image.cols * image.rows * .03 * 0.01 * 0.92 * 0.1);\n\n    std::vector< std::vector<cv::Point> > contours;\n    std::vector<cv::Vec4i> hierarchy;\n\n    // Initialize contour retrieving routine\n    cv::findContours(image, contours, hierarchy, CV_RETR_CCOMP, CV_CHAIN_APPROX_SIMPLE);\n\n    std::vector< std::vector<cv::Point> > quadContours;\n\n    for (size_t i = 0; i < contours.size(); ++i)\n    {\n        // Reject contours with a too small perimeter and contours which are\n        // completely surrounded by another contour\n        // MARTIN: If this function is called during PART 1, then the parameter \"first run\"\n        // is set to \"true\". 
This guarantees that only \"nice\" squares are detected.\n        // During PART 2, we allow the polygonal approximation function below to\n        // approximate more freely, which can result in recognized \"squares\" that are in\n        // reality multiple blurred squares stuck together.\n        std::vector<cv::Point>& contour = contours.at(i);\n\n        if (hierarchy[i][3] == -1 || cv::contourArea(contour) < minSize)\n        {\n            continue;\n        }\n\n        int min_approx_level = 1, max_approx_level;\n        if (firstRun)\n        {\n            max_approx_level = 3;\n        }\n        else\n        {\n            max_approx_level = MAX_CONTOUR_APPROX;\n        }\n\n        std::vector<cv::Point> approxContour;\n        for (int approx_level = min_approx_level; approx_level <= max_approx_level; approx_level++)\n        {\n            cv::approxPolyDP(contour, approxContour, approx_level, true);\n\n            if (approxContour.size() == 4)\n            {\n                break;\n            }\n        }\n\n        // Reject non-quadrangles\n        if (approxContour.size() == 4 && cv::isContourConvex(approxContour))\n        {\n            double p = cv::arcLength(approxContour, true);\n            double area = cv::contourArea(approxContour);\n\n            cv::Point pt[4];\n            for (int i = 0; i < 4; i++)\n            {\n                pt[i] = approxContour[i];\n            }\n\n            cv::Point dp = pt[0] - pt[2];\n            double d1 = sqrt(dp.dot(dp));\n\n            dp = pt[1] - pt[3];\n            double d2 = sqrt(dp.dot(dp));\n\n            // PHILIPG: Only accept those quadrangles which are more\n            // square than rectangular and which are big enough\n            dp = pt[0] - pt[1];\n            double d3 = sqrt(dp.dot(dp));\n            dp = pt[1] - pt[2];\n            double d4 = sqrt(dp.dot(dp));\n\n            if (!(flags & CV_CALIB_CB_FILTER_QUADS) ||\n                (d3*4 > d4 && d4*4 > d3 && 
d3*d4 < area*1.5 && area > minSize &&\n                d1 >= 0.15 * p && d2 >= 0.15 * p))\n            {\n                quadContours.push_back(approxContour);\n            }\n        }\n    }\n\n    // Allocate quad & corner buffers\n    quads.resize(quadContours.size());\n\n    // Create array of quads structures\n    for (size_t idx = 0; idx < quadContours.size(); ++idx)\n    {\n        ChessboardQuadPtr& q = quads.at(idx);\n        std::vector<cv::Point>& contour = quadContours.at(idx);\n\n        // Reset group ID\n        q.reset(new ChessboardQuad);\n        assert(contour.size() == 4);\n\n        for (int i = 0; i < 4; ++i)\n        {\n            cv::Point2f pt = contour.at(i);\n            ChessboardCornerPtr corner(new ChessboardCorner);\n            corner->pt = pt;\n            q->corners[i] = corner;\n        }\n\n        for (int i = 0; i < 4; ++i)\n        {\n            cv::Point2f dp = q->corners[i]->pt - q->corners[(i+1)&3]->pt;\n            float d = dp.dot(dp);\n            if (q->edge_len > d)\n            {\n                q->edge_len = d;\n            }\n        }\n    }\n}\n\nbool\nChessboard::checkQuadGroup(std::vector<ChessboardQuadPtr>& quads,\n                           std::vector<ChessboardCornerPtr>& corners,\n                           cv::Size patternSize)\n{\n    // Initialize\n    bool flagRow = false;\n    bool flagColumn = false;\n    int height = -1;\n    int width = -1;\n\n    // Compute the minimum and maximum row / column ID\n    // (it is unlikely that more than 8bit checkers are used per dimension)\n    int min_row    =  127;\n    int max_row    = -127;\n    int min_col    =  127;\n    int max_col    = -127;\n\n    for (size_t i = 0; i < quads.size(); ++i)\n    {\n        ChessboardQuadPtr& q = quads.at(i);\n\n        for (int j = 0; j < 4; ++j)\n        {\n            ChessboardCornerPtr& c = q->corners[j];\n\n            if (c->row > max_row)\n            {\n                max_row = c->row;\n            }\n           
 if (c->row < min_row)\n            {\n                min_row = c->row;\n            }\n            if (c->column > max_col)\n            {\n                max_col = c->column;\n            }\n            if (c->column < min_col)\n            {\n                min_col = c->column;\n            }\n        }\n    }\n\n    // If in a given direction the target pattern size is reached, we know exactly how\n    // the checkerboard is oriented.\n    // Else we need to prepare enough \"dummy\" corners for the worst case.\n    for (size_t i = 0; i < quads.size(); ++i)\n    {\n        ChessboardQuadPtr& q = quads.at(i);\n\n        for (int j = 0; j < 4; ++j)\n        {\n            ChessboardCornerPtr& c = q->corners[j];\n\n            if (c->column == max_col && c->row != min_row && c->row != max_row && !c->needsNeighbor)\n            {\n                flagColumn = true;\n            }\n            if (c->row == max_row && c->column != min_col && c->column != max_col && !c->needsNeighbor)\n            {\n                flagRow = true;\n            }\n        }\n    }\n\n    if (flagColumn)\n    {\n        if (max_col - min_col == patternSize.width + 1)\n        {\n            width = patternSize.width;\n            height = patternSize.height;\n        }\n        else\n        {\n            width = patternSize.height;\n            height = patternSize.width;\n        }\n    }\n    else if (flagRow)\n    {\n        if (max_row - min_row == patternSize.width + 1)\n        {\n            height = patternSize.width;\n            width = patternSize.height;\n        }\n        else\n        {\n            height = patternSize.height;\n            width = patternSize.width;\n        }\n    }\n    else\n    {\n        // If target pattern size is not reached in at least one of the two\n        // directions,  then we do not know where the remaining corners are\n        // located. 
Account for this.\n        width = std::max(patternSize.width, patternSize.height);\n        height = std::max(patternSize.width, patternSize.height);\n    }\n\n    ++min_row;\n    ++min_col;\n    max_row = min_row + height - 1;\n    max_col = min_col + width - 1;\n\n    corners.clear();\n\n    int linkedBorderCorners = 0;\n\n    // Write the corners in increasing order to the output vector\n    for (int i = min_row; i <= max_row; ++i)\n    {\n        for (int j = min_col; j <= max_col; ++j)\n        {\n            // Reset the counter\n            int iter = 1;\n\n            for (int k = 0; k < (int)quads.size(); ++k)\n            {\n                ChessboardQuadPtr& quad = quads.at(k);\n\n                for (int l = 0; l < 4; ++l)\n                {\n                    ChessboardCornerPtr& c = quad->corners[l];\n\n                    if (c->row == i && c->column == j)\n                    {\n                        bool boardEdge = false;\n                        if (i == min_row || i == max_row ||\n                            j == min_col || j == max_col)\n                        {\n                            boardEdge = true;\n                        }\n\n                        if ((iter == 1 && boardEdge) || (iter == 2 && !boardEdge))\n                        {\n                            // The respective row and column have been found\n                            corners.push_back(quad->corners[l]);\n                        }\n\n                        if (iter == 2 && boardEdge)\n                        {\n                            ++linkedBorderCorners;\n                        }\n\n                        // If the counter is larger than two, this means that more than\n                        // two corners have the same row / column entries. 
Then some\n                        // linking errors must have occurred and we should not use the found\n                        // pattern\n                        if (iter > 2)\n                        {\n                            return false;\n                        }\n\n                        iter++;\n                    }\n                }\n            }\n        }\n    }\n\n    if ((int)corners.size() != patternSize.width * patternSize.height ||\n        linkedBorderCorners < (patternSize.width * 2 + patternSize.height * 2 - 2) * 0.75f)\n    {\n        return false;\n    }\n\n    // check that no corners lie at the image boundary\n    float border = 5.0f;\n    for (int i = 0; i < (int)corners.size(); ++i)\n    {\n        ChessboardCornerPtr& c = corners.at(i);\n\n        if (c->pt.x < border || c->pt.x > mImage.cols - border ||\n            c->pt.y < border || c->pt.y > mImage.rows - border)\n        {\n            return false;\n        }\n    }\n\n    // check if we need to transpose the board\n    if (width != patternSize.width)\n    {\n        std::swap(width, height);\n\n        std::vector<ChessboardCornerPtr> outputCorners;\n        outputCorners.resize(corners.size());\n\n        for (int i = 0; i < height; ++i)\n        {\n            for (int j = 0; j < width; ++j)\n            {\n                outputCorners.at(i * width + j) = corners.at(j * height + i);\n            }\n        }\n\n        corners = outputCorners;\n    }\n\n    // check if we need to reverse the order in each row\n    cv::Point2f p0 = corners.at(0)->pt;\n    cv::Point2f p1 = corners.at(width-1)->pt;\n    cv::Point2f p2 = corners.at(width)->pt;\n\n    if ((p1 - p0).cross(p2 - p0) < 0.0f)\n    {\n        for (int i = 0; i < height; ++i)\n        {\n            for (int j = 0; j < width / 2; ++j)\n            {\n                std::swap(corners.at(i * width + j), corners.at(i * width + width - j - 1));\n            }\n        }\n    }\n\n    p0 = corners.at(0)->pt;\n    p2 = 
corners.at(width)->pt;\n\n    // check if we need to rotate the board\n    if (p2.y < p0.y)\n    {\n        std::vector<ChessboardCornerPtr> outputCorners;\n        outputCorners.resize(corners.size());\n\n        for (int i = 0; i < height; ++i)\n        {\n            for (int j = 0; j < width; ++j)\n            {\n                outputCorners.at(i * width + j) = corners.at((height - i - 1) * width + width - j - 1);\n            }\n        }\n\n        corners = outputCorners;\n    }\n\n    return true;\n}\n\nvoid\nChessboard::getQuadrangleHypotheses(const std::vector< std::vector<cv::Point> >& contours,\n                                    std::vector< std::pair<float, int> >& quads,\n                                    int classId) const\n{\n    const float minAspectRatio = 0.2f;\n    const float maxAspectRatio = 5.0f;\n    const float minBoxSize = 10.0f;\n\n    for (size_t i = 0; i < contours.size(); ++i)\n    {\n        cv::RotatedRect box = cv::minAreaRect(contours.at(i));\n        float boxSize = std::max(box.size.width, box.size.height);\n        if (boxSize < minBoxSize)\n        {\n            continue;\n        }\n\n        float aspectRatio = box.size.width / std::max(box.size.height, 1.0f);\n\n        if (aspectRatio < minAspectRatio || aspectRatio > maxAspectRatio)\n        {\n            continue;\n        }\n\n        quads.push_back(std::pair<float, int>(boxSize, classId));\n    }\n}\n\nbool less_pred(const std::pair<float, int>& p1, const std::pair<float, int>& p2)\n{\n    return p1.first < p2.first;\n}\n\nvoid countClasses(const std::vector<std::pair<float, int> >& pairs, size_t idx1, size_t idx2, std::vector<int>& counts)\n{\n    counts.assign(2, 0);\n    for (size_t i = idx1; i != idx2; ++i)\n    {\n        counts[pairs[i].second]++;\n    }\n}\n\nbool\nChessboard::checkChessboard(const cv::Mat& image, cv::Size patternSize) const\n{\n    const int erosionCount = 1;\n    const float blackLevel = 20.f;\n    const float whiteLevel = 130.f;\n    
const float blackWhiteGap = 70.f;\n\n    cv::Mat white = image.clone();\n\n    cv::Mat black = image.clone();\n\n    cv::erode(white, white, cv::Mat(), cv::Point(-1,-1), erosionCount);\n    cv::dilate(black, black, cv::Mat(), cv::Point(-1,-1), erosionCount);\n\n    cv::Mat thresh(image.rows, image.cols, CV_8UC1);\n\n    bool result = false;\n    for (float threshLevel = blackLevel; threshLevel < whiteLevel && !result; threshLevel += 20.0f)\n    {\n        cv::threshold(white, thresh, threshLevel + blackWhiteGap, 255, CV_THRESH_BINARY);\n\n        std::vector< std::vector<cv::Point> > contours;\n        std::vector<cv::Vec4i> hierarchy;\n\n        // Initialize contour retrieving routine\n        cv::findContours(thresh, contours, hierarchy, CV_RETR_CCOMP, CV_CHAIN_APPROX_SIMPLE);\n\n        std::vector<std::pair<float, int> > quads;\n        getQuadrangleHypotheses(contours, quads, 1);\n\n        cv::threshold(black, thresh, threshLevel, 255, CV_THRESH_BINARY_INV);\n\n        cv::findContours(thresh, contours, hierarchy, CV_RETR_CCOMP, CV_CHAIN_APPROX_SIMPLE);\n        getQuadrangleHypotheses(contours, quads, 0);\n\n        const size_t min_quads_count = patternSize.width * patternSize.height / 2;\n        std::sort(quads.begin(), quads.end(), less_pred);\n\n        // now check if there are many hypotheses with similar sizes\n        // do this by floodfill-style algorithm\n        const float sizeRelDev = 0.4f;\n\n        for (size_t i = 0; i < quads.size(); ++i)\n        {\n            size_t j = i + 1;\n            for (; j < quads.size(); ++j)\n            {\n                if (quads[j].first / quads[i].first > 1.0f + sizeRelDev)\n                {\n                    break;\n                }\n            }\n\n            if (j + 1 > min_quads_count + i)\n            {\n                // check the number of black and white squares\n                std::vector<int> counts;\n                countClasses(quads, i, j, counts);\n                const int 
blackCount = lroundf(ceilf(patternSize.width / 2.0f) * ceilf(patternSize.height / 2.0f));\n                const int whiteCount = lroundf(floorf(patternSize.width / 2.0f) * floorf(patternSize.height / 2.0f));\n                if (counts[0] < blackCount * 0.75f ||\n                    counts[1] < whiteCount * 0.75f)\n                {\n                    continue;\n                }\n                result = true;\n                break;\n            }\n        }\n    }\n\n    return result;\n}\n\nbool\nChessboard::checkBoardMonotony(std::vector<ChessboardCornerPtr>& corners,\n                               cv::Size patternSize)\n{\n    const float threshFactor = 0.2f;\n\n    Spline splineXY, splineYX;\n    splineXY.setLowBC(Spline::PARABOLIC_RUNOUT_BC);\n    splineXY.setHighBC(Spline::PARABOLIC_RUNOUT_BC);\n    splineYX.setLowBC(Spline::PARABOLIC_RUNOUT_BC);\n    splineYX.setHighBC(Spline::PARABOLIC_RUNOUT_BC);\n\n    // check if each row of corners approximates a cubic spline\n    for (int i = 0; i < patternSize.height; ++i)\n    {\n        splineXY.clear();\n        splineYX.clear();\n\n        cv::Point2f p[3];\n        p[0] = corners.at(i * patternSize.width)->pt;\n        p[1] = corners.at(i * patternSize.width + patternSize.width / 2)->pt;\n        p[2] = corners.at(i * patternSize.width + patternSize.width - 1)->pt;\n\n        for (int j = 0; j < 3; ++j)\n        {\n            splineXY.addPoint(p[j].x, p[j].y);\n            splineYX.addPoint(p[j].y, p[j].x);\n        }\n\n        for (int j = 1; j < patternSize.width - 1; ++j)\n        {\n            cv::Point2f& p_j = corners.at(i * patternSize.width + j)->pt;\n\n            float thresh = std::numeric_limits<float>::max();\n\n            // up-neighbor\n            if (i > 0)\n            {\n                cv::Point2f& neighbor = corners.at((i - 1) * patternSize.width + j)->pt;\n                thresh = fminf(thresh, cv::norm(neighbor - p_j));\n            }\n            // down-neighbor\n            if 
(i < patternSize.height - 1)\n            {\n                cv::Point2f& neighbor = corners.at((i + 1) * patternSize.width + j)->pt;\n                thresh = fminf(thresh, cv::norm(neighbor - p_j));\n            }\n            // left-neighbor\n            {\n                cv::Point2f& neighbor = corners.at(i * patternSize.width + j - 1)->pt;\n                thresh = fminf(thresh, cv::norm(neighbor - p_j));\n            }\n            // right-neighbor\n            {\n                cv::Point2f& neighbor = corners.at(i * patternSize.width + j + 1)->pt;\n                thresh = fminf(thresh, cv::norm(neighbor - p_j));\n            }\n\n            thresh *= threshFactor;\n\n            if (fminf(fabsf(splineXY(p_j.x) - p_j.y), fabsf(splineYX(p_j.y) - p_j.x)) > thresh)\n            {\n                return false;\n            }\n        }\n    }\n\n    // check if each column of corners approximates a cubic spline\n    for (int j = 0; j < patternSize.width; ++j)\n    {\n        splineXY.clear();\n        splineYX.clear();\n\n        cv::Point2f p[3];\n        p[0] = corners.at(j)->pt;\n        p[1] = corners.at(patternSize.height / 2 * patternSize.width + j)->pt;\n        p[2] = corners.at((patternSize.height - 1) * patternSize.width + j)->pt;\n\n        for (int i = 0; i < 3; ++i)\n        {\n            splineXY.addPoint(p[i].x, p[i].y);\n            splineYX.addPoint(p[i].y, p[i].x);\n        }\n\n        for (int i = 1; i < patternSize.height - 1; ++i)\n        {\n            cv::Point2f& p_i = corners.at(i * patternSize.width + j)->pt;\n\n            float thresh = std::numeric_limits<float>::max();\n\n            // up-neighbor\n            {\n                cv::Point2f& neighbor = corners.at((i - 1) * patternSize.width + j)->pt;\n                thresh = fminf(thresh, cv::norm(neighbor - p_i));\n            }\n            // down-neighbor\n            {\n                cv::Point2f& neighbor = corners.at((i + 1) * patternSize.width + j)->pt;\n         
       thresh = fminf(thresh, cv::norm(neighbor - p_i));\n            }\n            // left-neighbor\n            if (j > 0)\n            {\n                cv::Point2f& neighbor = corners.at(i * patternSize.width + j - 1)->pt;\n                thresh = fminf(thresh, cv::norm(neighbor - p_i));\n            }\n            // right-neighbor\n            if (j < patternSize.width - 1)\n            {\n                cv::Point2f& neighbor = corners.at(i * patternSize.width + j + 1)->pt;\n                thresh = fminf(thresh, cv::norm(neighbor - p_i));\n            }\n\n            thresh *= threshFactor;\n\n            if (fminf(fabsf(splineXY(p_i.x) - p_i.y), fabsf(splineYX(p_i.y) - p_i.x)) > thresh)\n            {\n                return false;\n            }\n        }\n    }\n\n    return true;\n}\n\nbool\nChessboard::matchCorners(ChessboardQuadPtr& quad1, int corner1,\n                         ChessboardQuadPtr& quad2, int corner2) const\n{\n    // First Check everything from the viewpoint of the\n    // current quad compute midpoints of \"parallel\" quad\n    // sides 1\n    float x1 = (quad1->corners[corner1]->pt.x + quad1->corners[(corner1+1)%4]->pt.x)/2;\n    float y1 = (quad1->corners[corner1]->pt.y + quad1->corners[(corner1+1)%4]->pt.y)/2;\n    float x2 = (quad1->corners[(corner1+2)%4]->pt.x + quad1->corners[(corner1+3)%4]->pt.x)/2;\n    float y2 = (quad1->corners[(corner1+2)%4]->pt.y + quad1->corners[(corner1+3)%4]->pt.y)/2;\n    // compute midpoints of \"parallel\" quad sides 2\n    float x3 = (quad1->corners[corner1]->pt.x + quad1->corners[(corner1+3)%4]->pt.x)/2;\n    float y3 = (quad1->corners[corner1]->pt.y + quad1->corners[(corner1+3)%4]->pt.y)/2;\n    float x4 = (quad1->corners[(corner1+1)%4]->pt.x + quad1->corners[(corner1+2)%4]->pt.x)/2;\n    float y4 = (quad1->corners[(corner1+1)%4]->pt.y + quad1->corners[(corner1+2)%4]->pt.y)/2;\n\n    // MARTIN: Heuristic\n    // For corner2 of quad2 to be considered,\n    // it needs to be on the same side of 
the two lines as\n    // corner1. This is given, if the cross product has\n    // the same sign for both computations below:\n    float a1 = x1 - x2;\n    float b1 = y1 - y2;\n    // the current corner\n    float c11 = quad1->corners[corner1]->pt.x - x2;\n    float d11 = quad1->corners[corner1]->pt.y - y2;\n    // the candidate corner\n    float c12 = quad2->corners[corner2]->pt.x - x2;\n    float d12 = quad2->corners[corner2]->pt.y - y2;\n    float sign11 = a1*d11 - c11*b1;\n    float sign12 = a1*d12 - c12*b1;\n\n    float a2 = x3 - x4;\n    float b2 = y3 - y4;\n    // the current corner\n    float c21 = quad1->corners[corner1]->pt.x - x4;\n    float d21 = quad1->corners[corner1]->pt.y - y4;\n    // the candidate corner\n    float c22 = quad2->corners[corner2]->pt.x - x4;\n    float d22 = quad2->corners[corner2]->pt.y - y4;\n    float sign21 = a2*d21 - c21*b2;\n    float sign22 = a2*d22 - c22*b2;\n\n    // Also make sure that two border quads of the same row or\n    // column don't link. Check from the current corner's view,\n    // whether the corner diagonal from the candidate corner\n    // is also on the same side of the two lines as the current\n    // corner and the candidate corner.\n    float c13 = quad2->corners[(corner2+2)%4]->pt.x - x2;\n    float d13 = quad2->corners[(corner2+2)%4]->pt.y - y2;\n    float c23 = quad2->corners[(corner2+2)%4]->pt.x - x4;\n    float d23 = quad2->corners[(corner2+2)%4]->pt.y - y4;\n    float sign13 = a1*d13 - c13*b1;\n    float sign23 = a2*d23 - c23*b2;\n\n\n    // Second: Then check everything from the viewpoint of\n    // the candidate quad. 
Compute midpoints of \"parallel\"\n    // quad sides 1\n    float u1 = (quad2->corners[corner2]->pt.x + quad2->corners[(corner2+1)%4]->pt.x)/2;\n    float v1 = (quad2->corners[corner2]->pt.y + quad2->corners[(corner2+1)%4]->pt.y)/2;\n    float u2 = (quad2->corners[(corner2+2)%4]->pt.x + quad2->corners[(corner2+3)%4]->pt.x)/2;\n    float v2 = (quad2->corners[(corner2+2)%4]->pt.y + quad2->corners[(corner2+3)%4]->pt.y)/2;\n    // compute midpoints of \"parallel\" quad sides 2\n    float u3 = (quad2->corners[corner2]->pt.x + quad2->corners[(corner2+3)%4]->pt.x)/2;\n    float v3 = (quad2->corners[corner2]->pt.y + quad2->corners[(corner2+3)%4]->pt.y)/2;\n    float u4 = (quad2->corners[(corner2+1)%4]->pt.x + quad2->corners[(corner2+2)%4]->pt.x)/2;\n    float v4 = (quad2->corners[(corner2+1)%4]->pt.y + quad2->corners[(corner2+2)%4]->pt.y)/2;\n\n    // MARTIN: Heuristic\n    // For corner2 of quad2 to be considered,\n    // it needs to be on the same side of the two lines as\n    // corner1. This is given, if the cross product has\n    // the same sign for both computations below:\n    float a3 = u1 - u2;\n    float b3 = v1 - v2;\n    // the current corner\n    float c31 = quad1->corners[corner1]->pt.x - u2;\n    float d31 = quad1->corners[corner1]->pt.y - v2;\n    // the candidate corner\n    float c32 = quad2->corners[corner2]->pt.x - u2;\n    float d32 = quad2->corners[corner2]->pt.y - v2;\n    float sign31 = a3*d31-c31*b3;\n    float sign32 = a3*d32-c32*b3;\n\n    float a4 = u3 - u4;\n    float b4 = v3 - v4;\n    // the current corner\n    float c41 = quad1->corners[corner1]->pt.x - u4;\n    float d41 = quad1->corners[corner1]->pt.y - v4;\n    // the candidate corner\n    float c42 = quad2->corners[corner2]->pt.x - u4;\n    float d42 = quad2->corners[corner2]->pt.y - v4;\n    float sign41 = a4*d41-c41*b4;\n    float sign42 = a4*d42-c42*b4;\n\n    // Also make sure that two border quads of the same row or\n    // column don't link. 
Check from the candidate corner's view,\n    // whether the corner diagonal from the current corner\n    // is also on the same side of the two lines as the current\n    // corner and the candidate corner.\n    float c33 = quad1->corners[(corner1+2)%4]->pt.x - u2;\n    float d33 = quad1->corners[(corner1+2)%4]->pt.y - v2;\n    float c43 = quad1->corners[(corner1+2)%4]->pt.x - u4;\n    float d43 = quad1->corners[(corner1+2)%4]->pt.y - v4;\n    float sign33 = a3*d33-c33*b3;\n    float sign43 = a4*d43-c43*b4;\n\n\n    // This time we also need to make sure that no quad\n    // is linked to a quad of another dilation run which\n    // may lie INSIDE it!!!\n    // Third: Therefore check everything from the viewpoint\n    // of the current quad compute midpoints of \"parallel\"\n    // quad sides 1\n    float x5 = quad1->corners[corner1]->pt.x;\n    float y5 = quad1->corners[corner1]->pt.y;\n    float x6 = quad1->corners[(corner1+1)%4]->pt.x;\n    float y6 = quad1->corners[(corner1+1)%4]->pt.y;\n    // compute midpoints of \"parallel\" quad sides 2\n    float x7 = x5;\n    float y7 = y5;\n    float x8 = quad1->corners[(corner1+3)%4]->pt.x;\n    float y8 = quad1->corners[(corner1+3)%4]->pt.y;\n\n    // MARTIN: Heuristic\n    // For corner2 of quad2 to be considered,\n    // it needs to be on the other side of the two lines than\n    // corner1. 
This is given, if the cross product has\n    // a different sign for both computations below:\n    float a5 = x6 - x5;\n    float b5 = y6 - y5;\n    // the current corner\n    float c51 = quad1->corners[(corner1+2)%4]->pt.x - x5;\n    float d51 = quad1->corners[(corner1+2)%4]->pt.y - y5;\n    // the candidate corner\n    float c52 = quad2->corners[corner2]->pt.x - x5;\n    float d52 = quad2->corners[corner2]->pt.y - y5;\n    float sign51 = a5*d51 - c51*b5;\n    float sign52 = a5*d52 - c52*b5;\n\n    float a6 = x8 - x7;\n    float b6 = y8 - y7;\n    // the current corner\n    float c61 = quad1->corners[(corner1+2)%4]->pt.x - x7;\n    float d61 = quad1->corners[(corner1+2)%4]->pt.y - y7;\n    // the candidate corner\n    float c62 = quad2->corners[corner2]->pt.x - x7;\n    float d62 = quad2->corners[corner2]->pt.y - y7;\n    float sign61 = a6*d61 - c61*b6;\n    float sign62 = a6*d62 - c62*b6;\n\n\n    // Fourth: Then check everything from the viewpoint of\n    // the candidate quad compute midpoints of \"parallel\"\n    // quad sides 1\n    float u5 = quad2->corners[corner2]->pt.x;\n    float v5 = quad2->corners[corner2]->pt.y;\n    float u6 = quad2->corners[(corner2+1)%4]->pt.x;\n    float v6 = quad2->corners[(corner2+1)%4]->pt.y;\n    // compute midpoints of \"parallel\" quad sides 2\n    float u7 = u5;\n    float v7 = v5;\n    float u8 = quad2->corners[(corner2+3)%4]->pt.x;\n    float v8 = quad2->corners[(corner2+3)%4]->pt.y;\n\n    // MARTIN: Heuristic\n    // For corner2 of quad2 to be considered,\n    // it needs to be on the other side of the two lines than\n    // corner1. 
This is given, if the cross product has\n    // a different sign for both computations below:\n    float a7 = u6 - u5;\n    float b7 = v6 - v5;\n    // the current corner\n    float c71 = quad1->corners[corner1]->pt.x - u5;\n    float d71 = quad1->corners[corner1]->pt.y - v5;\n    // the candidate corner\n    float c72 = quad2->corners[(corner2+2)%4]->pt.x - u5;\n    float d72 = quad2->corners[(corner2+2)%4]->pt.y - v5;\n    float sign71 = a7*d71-c71*b7;\n    float sign72 = a7*d72-c72*b7;\n\n    float a8 = u8 - u7;\n    float b8 = v8 - v7;\n    // the current corner\n    float c81 = quad1->corners[corner1]->pt.x - u7;\n    float d81 = quad1->corners[corner1]->pt.y - v7;\n    // the candidate corner\n    float c82 = quad2->corners[(corner2+2)%4]->pt.x - u7;\n    float d82 = quad2->corners[(corner2+2)%4]->pt.y - v7;\n    float sign81 = a8*d81-c81*b8;\n    float sign82 = a8*d82-c82*b8;\n\n    // Check whether conditions are fulfilled\n    if (((sign11 < 0 && sign12 < 0) || (sign11 > 0 && sign12 > 0))  &&\n        ((sign21 < 0 && sign22 < 0) || (sign21 > 0 && sign22 > 0))  &&\n        ((sign31 < 0 && sign32 < 0) || (sign31 > 0 && sign32 > 0))  &&\n        ((sign41 < 0 && sign42 < 0) || (sign41 > 0 && sign42 > 0))  &&\n        ((sign11 < 0 && sign13 < 0) || (sign11 > 0 && sign13 > 0))  &&\n        ((sign21 < 0 && sign23 < 0) || (sign21 > 0 && sign23 > 0))  &&\n        ((sign31 < 0 && sign33 < 0) || (sign31 > 0 && sign33 > 0))  &&\n        ((sign41 < 0 && sign43 < 0) || (sign41 > 0 && sign43 > 0))  &&\n        ((sign51 < 0 && sign52 > 0) || (sign51 > 0 && sign52 < 0))  &&\n        ((sign61 < 0 && sign62 > 0) || (sign61 > 0 && sign62 < 0))  &&\n        ((sign71 < 0 && sign72 > 0) || (sign71 > 0 && sign72 < 0))  &&\n        ((sign81 < 0 && sign82 > 0) || (sign81 > 0 && sign82 < 0)))\n    {\n        return true;\n    }\n    else\n    {\n        return false;\n    }\n}\n\n}\n"
  },
  {
    "path": "camera_model/src/gpl/EigenQuaternionParameterization.cc",
    "content": "#include \"camodocal/gpl/EigenQuaternionParameterization.h\"\n\n#include <cmath>\n\nnamespace camodocal\n{\n\nbool\nEigenQuaternionParameterization::Plus(const double* x,\n                                      const double* delta,\n                                      double* x_plus_delta) const\n{\n    const double norm_delta =\n        sqrt(delta[0] * delta[0] + delta[1] * delta[1] + delta[2] * delta[2]);\n    if (norm_delta > 0.0)\n    {\n        const double sin_delta_by_delta = (sin(norm_delta) / norm_delta);\n        double q_delta[4];\n        q_delta[0] = sin_delta_by_delta * delta[0];\n        q_delta[1] = sin_delta_by_delta * delta[1];\n        q_delta[2] = sin_delta_by_delta * delta[2];\n        q_delta[3] = cos(norm_delta);\n        EigenQuaternionProduct(q_delta, x, x_plus_delta);\n    }\n    else\n    {\n        for (int i = 0; i < 4; ++i)\n        {\n            x_plus_delta[i] = x[i];\n        }\n    }\n    return true;\n}\n\nbool\nEigenQuaternionParameterization::ComputeJacobian(const double* x,\n                                                 double* jacobian) const\n{\n    jacobian[0] =  x[3]; jacobian[1]  =  x[2]; jacobian[2]  = -x[1];  // NOLINT\n    jacobian[3] = -x[2]; jacobian[4]  =  x[3]; jacobian[5]  =  x[0];  // NOLINT\n    jacobian[6] =  x[1]; jacobian[7]  = -x[0]; jacobian[8]  =  x[3];  // NOLINT\n    jacobian[9] = -x[0]; jacobian[10] = -x[1]; jacobian[11] = -x[2];  // NOLINT\n    return true;\n}\n\n}\n"
  },
  {
    "path": "camera_model/src/gpl/gpl.cc",
    "content": "#include \"camodocal/gpl/gpl.h\"\r\n\r\n#include <set>\r\n#ifdef _WIN32\r\n#include <winsock.h>\r\n#else\r\n#include <time.h>\r\n#endif\r\n\r\n\r\n// source: https://stackoverflow.com/questions/5167269/clock-gettime-alternative-in-mac-os-x\r\n#ifdef __APPLE__\r\n#include <mach/mach_time.h>\r\n#define ORWL_NANO (+1.0E-9)\r\n#define ORWL_GIGA UINT64_C(1000000000)\r\n\r\nstatic double orwl_timebase = 0.0;\r\nstatic uint64_t orwl_timestart = 0;\r\n\r\nstruct timespec orwl_gettime(void) {\r\n  // be more careful in a multithreaded environment\r\n  if (!orwl_timestart) {\r\n    mach_timebase_info_data_t tb = { 0 };\r\n    mach_timebase_info(&tb);\r\n    orwl_timebase = tb.numer;\r\n    orwl_timebase /= tb.denom;\r\n    orwl_timestart = mach_absolute_time();\r\n  }\r\n  struct timespec t;\r\n  double diff = (mach_absolute_time() - orwl_timestart) * orwl_timebase;\r\n  t.tv_sec = diff * ORWL_NANO;\r\n  t.tv_nsec = diff - (t.tv_sec * ORWL_GIGA);\r\n  return t;\r\n}\r\n#endif // __APPLE__\r\n\r\n\r\nconst double WGS84_A = 6378137.0;\r\nconst double WGS84_ECCSQ = 0.00669437999013;\r\n\r\n// Windows lacks fminf\r\n#ifndef fminf\r\n#define fminf(x, y) (((x) < (y)) ? 
(x) : (y))\r\n#endif\r\n\r\nnamespace camodocal\r\n{\r\n\r\ndouble hypot3(double x, double y, double z)\r\n{\r\n\treturn sqrt(square(x) + square(y) + square(z));\r\n}\r\n\r\nfloat hypot3f(float x, float y, float z)\r\n{\r\n\treturn sqrtf(square(x) + square(y) + square(z));\r\n}\r\n\r\ndouble d2r(double deg)\r\n{\r\n\treturn deg / 180.0 * M_PI;\r\n}\r\n\r\nfloat d2r(float deg)\r\n{\r\n\treturn deg / 180.0f * M_PI;\r\n}\r\n\r\ndouble r2d(double rad)\r\n{\r\n\treturn rad / M_PI * 180.0;\r\n}\r\n\r\nfloat r2d(float rad)\r\n{\r\n\treturn rad / M_PI * 180.0f;\r\n}\r\n\r\ndouble sinc(double theta)\r\n{\r\n    return sin(theta) / theta;\r\n}\r\n\r\n#ifdef _WIN32\r\n#include <sys/timeb.h>\r\n#include <sys/types.h>\r\n#include <winsock.h>\r\nLARGE_INTEGER\r\ngetFILETIMEoffset()\r\n{\r\n    SYSTEMTIME s;\r\n    FILETIME f;\r\n    LARGE_INTEGER t;\r\n\r\n    s.wYear = 1970;\r\n    s.wMonth = 1;\r\n    s.wDay = 1;\r\n    s.wHour = 0;\r\n    s.wMinute = 0;\r\n    s.wSecond = 0;\r\n    s.wMilliseconds = 0;\r\n    SystemTimeToFileTime(&s, &f);\r\n    t.QuadPart = f.dwHighDateTime;\r\n    t.QuadPart <<= 32;\r\n    t.QuadPart |= f.dwLowDateTime;\r\n    return (t);\r\n}\r\n\r\nint\r\nclock_gettime(int X, struct timespec *tp)\r\n{\r\n    LARGE_INTEGER           t;\r\n    FILETIME            f;\r\n    double                  microseconds;\r\n    static LARGE_INTEGER    offset;\r\n    static double           frequencyToMicroseconds;\r\n    static int              initialized = 0;\r\n    static BOOL             usePerformanceCounter = 0;\r\n\r\n    if (!initialized) {\r\n        LARGE_INTEGER performanceFrequency;\r\n        initialized = 1;\r\n        usePerformanceCounter = QueryPerformanceFrequency(&performanceFrequency);\r\n        if (usePerformanceCounter) {\r\n            QueryPerformanceCounter(&offset);\r\n            frequencyToMicroseconds = (double)performanceFrequency.QuadPart / 1000000.;\r\n        } else {\r\n            offset = getFILETIMEoffset();\r\n            
frequencyToMicroseconds = 10.;\r\n        }\r\n    }\r\n    if (usePerformanceCounter) QueryPerformanceCounter(&t);\r\n    else {\r\n        GetSystemTimeAsFileTime(&f);\r\n        t.QuadPart = f.dwHighDateTime;\r\n        t.QuadPart <<= 32;\r\n        t.QuadPart |= f.dwLowDateTime;\r\n    }\r\n\r\n    t.QuadPart -= offset.QuadPart;\r\n    microseconds = (double)t.QuadPart / frequencyToMicroseconds;\r\n    t.QuadPart = microseconds;\r\n    tp->tv_sec = t.QuadPart / 1000000;\r\n    tp->tv_nsec = (t.QuadPart % 1000000) * 1000;\r\n    return (0);\r\n}\r\n#endif\r\n\r\nunsigned long long timeInMicroseconds(void)\r\n{\r\n\t struct timespec tp;\r\n#ifdef __APPLE__\r\n     tp = orwl_gettime();\r\n#else\r\n\t clock_gettime(CLOCK_REALTIME, &tp);\r\n#endif\r\n\r\n\t return tp.tv_sec * 1000000 + tp.tv_nsec / 1000;\r\n}\r\n\r\ndouble timeInSeconds(void)\r\n{\r\n    struct timespec tp;\r\n#ifdef __APPLE__\r\n     tp = orwl_gettime();\r\n#else\r\n\t clock_gettime(CLOCK_REALTIME, &tp);\r\n#endif\r\n\r\n\t return static_cast<double>(tp.tv_sec) +\r\n\t\t\t static_cast<double>(tp.tv_nsec) / 1000000000.0;\r\n}\r\n\r\nfloat colormapAutumn[128][3] 
=\r\n{\r\n\t\t{1.0f,0.f,0.f},\r\n\t\t{1.0f,0.007874f,0.f},\r\n\t\t{1.0f,0.015748f,0.f},\r\n\t\t{1.0f,0.023622f,0.f},\r\n\t\t{1.0f,0.031496f,0.f},\r\n\t\t{1.0f,0.03937f,0.f},\r\n\t\t{1.0f,0.047244f,0.f},\r\n\t\t{1.0f,0.055118f,0.f},\r\n\t\t{1.0f,0.062992f,0.f},\r\n\t\t{1.0f,0.070866f,0.f},\r\n\t\t{1.0f,0.07874f,0.f},\r\n\t\t{1.0f,0.086614f,0.f},\r\n\t\t{1.0f,0.094488f,0.f},\r\n\t\t{1.0f,0.10236f,0.f},\r\n\t\t{1.0f,0.11024f,0.f},\r\n\t\t{1.0f,0.11811f,0.f},\r\n\t\t{1.0f,0.12598f,0.f},\r\n\t\t{1.0f,0.13386f,0.f},\r\n\t\t{1.0f,0.14173f,0.f},\r\n\t\t{1.0f,0.14961f,0.f},\r\n\t\t{1.0f,0.15748f,0.f},\r\n\t\t{1.0f,0.16535f,0.f},\r\n\t\t{1.0f,0.17323f,0.f},\r\n\t\t{1.0f,0.1811f,0.f},\r\n\t\t{1.0f,0.18898f,0.f},\r\n\t\t{1.0f,0.19685f,0.f},\r\n\t\t{1.0f,0.20472f,0.f},\r\n\t\t{1.0f,0.2126f,0.f},\r\n\t\t{1.0f,0.22047f,0.f},\r\n\t\t{1.0f,0.22835f,0.f},\r\n\t\t{1.0f,0.23622f,0.f},\r\n\t\t{1.0f,0.24409f,0.f},\r\n\t\t{1.0f,0.25197f,0.f},\r\n\t\t{1.0f,0.25984f,0.f},\r\n\t\t{1.0f,0.26772f,0.f},\r\n\t\t{1.0f,0.27559f,0.f},\r\n\t\t{1.0f,0.28346f,0.f},\r\n\t\t{1.0f,0.29134f,0.f},\r\n\t\t{1.0f,0.29921f,0.f},\r\n\t\t{1.0f,0.30709f,0.f},\r\n\t\t{1.0f,0.31496f,0.f},\r\n\t\t{1.0f,0.32283f,0.f},\r\n\t\t{1.0f,0.33071f,0.f},\r\n\t\t{1.0f,0.33858f,0.f},\r\n\t\t{1.0f,0.34646f,0.f},\r\n\t\t{1.0f,0.35433f,0.f},\r\n\t\t{1.0f,0.3622f,0.f},\r\n\t\t{1.0f,0.37008f,0.f},\r\n\t\t{1.0f,0.37795f,0.f},\r\n\t\t{1.0f,0.38583f,0.f},\r\n\t\t{1.0f,0.3937f,0.f},\r\n\t\t{1.0f,0.40157f,0.f},\r\n\t\t{1.0f,0.40945f,0.f},\r\n\t\t{1.0f,0.41732f,0.f},\r\n\t\t{1.0f,0.4252f,0.f},\r\n\t\t{1.0f,0.43307f,0.f},\r\n\t\t{1.0f,0.44094f,0.f},\r\n\t\t{1.0f,0.44882f,0.f},\r\n\t\t{1.0f,0.45669f,0.f},\r\n\t\t{1.0f,0.46457f,0.f},\r\n\t\t{1.0f,0.47244f,0.f},\r\n\t\t{1.0f,0.48031f,0.f},\r\n\t\t{1.0f,0.48819f,0.f},\r\n\t\t{1.0f,0.49606f,0.f},\r\n\t\t{1.0f,0.50394f,0.f},\r\n\t\t{1.0f,0.51181f,0.f},\r\n\t\t{1.0f,0.51969f,0.f},\r\n\t\t{1.0f,0.52756f,0.f},\r\n\t\t{1.0f,0.53543f,0.f},\r\n\t\t{1.0f,0.54331f,0.f},\r\n\t\t{1.0f,0.55118f,0.f},\r\n\t
\t{1.0f,0.55906f,0.f},\r\n\t\t{1.0f,0.56693f,0.f},\r\n\t\t{1.0f,0.5748f,0.f},\r\n\t\t{1.0f,0.58268f,0.f},\r\n\t\t{1.0f,0.59055f,0.f},\r\n\t\t{1.0f,0.59843f,0.f},\r\n\t\t{1.0f,0.6063f,0.f},\r\n\t\t{1.0f,0.61417f,0.f},\r\n\t\t{1.0f,0.62205f,0.f},\r\n\t\t{1.0f,0.62992f,0.f},\r\n\t\t{1.0f,0.6378f,0.f},\r\n\t\t{1.0f,0.64567f,0.f},\r\n\t\t{1.0f,0.65354f,0.f},\r\n\t\t{1.0f,0.66142f,0.f},\r\n\t\t{1.0f,0.66929f,0.f},\r\n\t\t{1.0f,0.67717f,0.f},\r\n\t\t{1.0f,0.68504f,0.f},\r\n\t\t{1.0f,0.69291f,0.f},\r\n\t\t{1.0f,0.70079f,0.f},\r\n\t\t{1.0f,0.70866f,0.f},\r\n\t\t{1.0f,0.71654f,0.f},\r\n\t\t{1.0f,0.72441f,0.f},\r\n\t\t{1.0f,0.73228f,0.f},\r\n\t\t{1.0f,0.74016f,0.f},\r\n\t\t{1.0f,0.74803f,0.f},\r\n\t\t{1.0f,0.75591f,0.f},\r\n\t\t{1.0f,0.76378f,0.f},\r\n\t\t{1.0f,0.77165f,0.f},\r\n\t\t{1.0f,0.77953f,0.f},\r\n\t\t{1.0f,0.7874f,0.f},\r\n\t\t{1.0f,0.79528f,0.f},\r\n\t\t{1.0f,0.80315f,0.f},\r\n\t\t{1.0f,0.81102f,0.f},\r\n\t\t{1.0f,0.8189f,0.f},\r\n\t\t{1.0f,0.82677f,0.f},\r\n\t\t{1.0f,0.83465f,0.f},\r\n\t\t{1.0f,0.84252f,0.f},\r\n\t\t{1.0f,0.85039f,0.f},\r\n\t\t{1.0f,0.85827f,0.f},\r\n\t\t{1.0f,0.86614f,0.f},\r\n\t\t{1.0f,0.87402f,0.f},\r\n\t\t{1.0f,0.88189f,0.f},\r\n\t\t{1.0f,0.88976f,0.f},\r\n\t\t{1.0f,0.89764f,0.f},\r\n\t\t{1.0f,0.90551f,0.f},\r\n\t\t{1.0f,0.91339f,0.f},\r\n\t\t{1.0f,0.92126f,0.f},\r\n\t\t{1.0f,0.92913f,0.f},\r\n\t\t{1.0f,0.93701f,0.f},\r\n\t\t{1.0f,0.94488f,0.f},\r\n\t\t{1.0f,0.95276f,0.f},\r\n\t\t{1.0f,0.96063f,0.f},\r\n\t\t{1.0f,0.9685f,0.f},\r\n\t\t{1.0f,0.97638f,0.f},\r\n\t\t{1.0f,0.98425f,0.f},\r\n\t\t{1.0f,0.99213f,0.f},\r\n\t\t{1.0f,1.0f,0.0f}\r\n};\r\n\r\n\r\nfloat colormapJet[128][3] 
=\r\n{\r\n\t\t{0.0f,0.0f,0.53125f},\r\n\t\t{0.0f,0.0f,0.5625f},\r\n\t\t{0.0f,0.0f,0.59375f},\r\n\t\t{0.0f,0.0f,0.625f},\r\n\t\t{0.0f,0.0f,0.65625f},\r\n\t\t{0.0f,0.0f,0.6875f},\r\n\t\t{0.0f,0.0f,0.71875f},\r\n\t\t{0.0f,0.0f,0.75f},\r\n\t\t{0.0f,0.0f,0.78125f},\r\n\t\t{0.0f,0.0f,0.8125f},\r\n\t\t{0.0f,0.0f,0.84375f},\r\n\t\t{0.0f,0.0f,0.875f},\r\n\t\t{0.0f,0.0f,0.90625f},\r\n\t\t{0.0f,0.0f,0.9375f},\r\n\t\t{0.0f,0.0f,0.96875f},\r\n\t\t{0.0f,0.0f,1.0f},\r\n\t\t{0.0f,0.03125f,1.0f},\r\n\t\t{0.0f,0.0625f,1.0f},\r\n\t\t{0.0f,0.09375f,1.0f},\r\n\t\t{0.0f,0.125f,1.0f},\r\n\t\t{0.0f,0.15625f,1.0f},\r\n\t\t{0.0f,0.1875f,1.0f},\r\n\t\t{0.0f,0.21875f,1.0f},\r\n\t\t{0.0f,0.25f,1.0f},\r\n\t\t{0.0f,0.28125f,1.0f},\r\n\t\t{0.0f,0.3125f,1.0f},\r\n\t\t{0.0f,0.34375f,1.0f},\r\n\t\t{0.0f,0.375f,1.0f},\r\n\t\t{0.0f,0.40625f,1.0f},\r\n\t\t{0.0f,0.4375f,1.0f},\r\n\t\t{0.0f,0.46875f,1.0f},\r\n\t\t{0.0f,0.5f,1.0f},\r\n\t\t{0.0f,0.53125f,1.0f},\r\n\t\t{0.0f,0.5625f,1.0f},\r\n\t\t{0.0f,0.59375f,1.0f},\r\n\t\t{0.0f,0.625f,1.0f},\r\n\t\t{0.0f,0.65625f,1.0f},\r\n\t\t{0.0f,0.6875f,1.0f},\r\n\t\t{0.0f,0.71875f,1.0f},\r\n\t\t{0.0f,0.75f,1.0f},\r\n\t\t{0.0f,0.78125f,1.0f},\r\n\t\t{0.0f,0.8125f,1.0f},\r\n\t\t{0.0f,0.84375f,1.0f},\r\n\t\t{0.0f,0.875f,1.0f},\r\n\t\t{0.0f,0.90625f,1.0f},\r\n\t\t{0.0f,0.9375f,1.0f},\r\n\t\t{0.0f,0.96875f,1.0f},\r\n\t\t{0.0f,1.0f,1.0f},\r\n\t\t{0.03125f,1.0f,0.96875f},\r\n\t\t{0.0625f,1.0f,0.9375f},\r\n\t\t{0.09375f,1.0f,0.90625f},\r\n\t\t{0.125f,1.0f,0.875f},\r\n\t\t{0.15625f,1.0f,0.84375f},\r\n\t\t{0.1875f,1.0f,0.8125f},\r\n\t\t{0.21875f,1.0f,0.78125f},\r\n\t\t{0.25f,1.0f,0.75f},\r\n\t\t{0.28125f,1.0f,0.71875f},\r\n\t\t{0.3125f,1.0f,0.6875f},\r\n\t\t{0.34375f,1.0f,0.65625f},\r\n\t\t{0.375f,1.0f,0.625f},\r\n\t\t{0.40625f,1.0f,0.59375f},\r\n\t\t{0.4375f,1.0f,0.5625f},\r\n\t\t{0.46875f,1.0f,0.53125f},\r\n\t\t{0.5f,1.0f,0.5f},\r\n\t\t{0.53125f,1.0f,0.46875f},\r\n\t\t{0.5625f,1.0f,0.4375f},\r\n\t\t{0.59375f,1.0f,0.40625f},\r\n\t\t{0.625f,1.0f,0.375f},\r\n\t\t{0.65625f,1.0f,
0.34375f},\r\n\t\t{0.6875f,1.0f,0.3125f},\r\n\t\t{0.71875f,1.0f,0.28125f},\r\n\t\t{0.75f,1.0f,0.25f},\r\n\t\t{0.78125f,1.0f,0.21875f},\r\n\t\t{0.8125f,1.0f,0.1875f},\r\n\t\t{0.84375f,1.0f,0.15625f},\r\n\t\t{0.875f,1.0f,0.125f},\r\n\t\t{0.90625f,1.0f,0.09375f},\r\n\t\t{0.9375f,1.0f,0.0625f},\r\n\t\t{0.96875f,1.0f,0.03125f},\r\n\t\t{1.0f,1.0f,0.0f},\r\n\t\t{1.0f,0.96875f,0.0f},\r\n\t\t{1.0f,0.9375f,0.0f},\r\n\t\t{1.0f,0.90625f,0.0f},\r\n\t\t{1.0f,0.875f,0.0f},\r\n\t\t{1.0f,0.84375f,0.0f},\r\n\t\t{1.0f,0.8125f,0.0f},\r\n\t\t{1.0f,0.78125f,0.0f},\r\n\t\t{1.0f,0.75f,0.0f},\r\n\t\t{1.0f,0.71875f,0.0f},\r\n\t\t{1.0f,0.6875f,0.0f},\r\n\t\t{1.0f,0.65625f,0.0f},\r\n\t\t{1.0f,0.625f,0.0f},\r\n\t\t{1.0f,0.59375f,0.0f},\r\n\t\t{1.0f,0.5625f,0.0f},\r\n\t\t{1.0f,0.53125f,0.0f},\r\n\t\t{1.0f,0.5f,0.0f},\r\n\t\t{1.0f,0.46875f,0.0f},\r\n\t\t{1.0f,0.4375f,0.0f},\r\n\t\t{1.0f,0.40625f,0.0f},\r\n\t\t{1.0f,0.375f,0.0f},\r\n\t\t{1.0f,0.34375f,0.0f},\r\n\t\t{1.0f,0.3125f,0.0f},\r\n\t\t{1.0f,0.28125f,0.0f},\r\n\t\t{1.0f,0.25f,0.0f},\r\n\t\t{1.0f,0.21875f,0.0f},\r\n\t\t{1.0f,0.1875f,0.0f},\r\n\t\t{1.0f,0.15625f,0.0f},\r\n\t\t{1.0f,0.125f,0.0f},\r\n\t\t{1.0f,0.09375f,0.0f},\r\n\t\t{1.0f,0.0625f,0.0f},\r\n\t\t{1.0f,0.03125f,0.0f},\r\n\t\t{1.0f,0.0f,0.0f},\r\n\t\t{0.96875f,0.0f,0.0f},\r\n\t\t{0.9375f,0.0f,0.0f},\r\n\t\t{0.90625f,0.0f,0.0f},\r\n\t\t{0.875f,0.0f,0.0f},\r\n\t\t{0.84375f,0.0f,0.0f},\r\n\t\t{0.8125f,0.0f,0.0f},\r\n\t\t{0.78125f,0.0f,0.0f},\r\n\t\t{0.75f,0.0f,0.0f},\r\n\t\t{0.71875f,0.0f,0.0f},\r\n\t\t{0.6875f,0.0f,0.0f},\r\n\t\t{0.65625f,0.0f,0.0f},\r\n\t\t{0.625f,0.0f,0.0f},\r\n\t\t{0.59375f,0.0f,0.0f},\r\n\t\t{0.5625f,0.0f,0.0f},\r\n\t\t{0.53125f,0.0f,0.0f},\r\n\t\t{0.5f,0.0f,0.0f}\r\n};\r\n\r\nvoid colorDepthImage(cv::Mat& imgDepth, cv::Mat& imgColoredDepth,\r\n\t\t\t\t\t float minRange, float maxRange)\r\n{\r\n\timgColoredDepth = cv::Mat::zeros(imgDepth.size(), CV_8UC3);\r\n\r\n\tfor (int i = 0; i < imgColoredDepth.rows; ++i)\r\n\t{\r\n\t\tconst float* depth = 
imgDepth.ptr<float>(i);\r\n\t\tunsigned char* pixel = imgColoredDepth.ptr<unsigned char>(i);\r\n\t\tfor (int j = 0; j < imgColoredDepth.cols; ++j)\r\n\t\t{\r\n\t\t\tif (depth[j] != 0)\r\n\t\t\t{\r\n\t\t\t\tint idx = fminf(depth[j] - minRange, maxRange - minRange) / (maxRange - minRange) * 127.0f;\r\n\t\t\t\tidx = 127 - idx;\r\n\r\n\t\t\t\tpixel[0] = colormapJet[idx][2] * 255.0f;\r\n\t\t\t\tpixel[1] = colormapJet[idx][1] * 255.0f;\r\n\t\t\t\tpixel[2] = colormapJet[idx][0] * 255.0f;\r\n\t\t\t}\r\n\r\n\t\t\tpixel += 3;\r\n\t\t}\r\n\t}\r\n}\r\n\r\nbool colormap(const std::string& name, unsigned char idx,\r\n\t\t\t  float& r, float& g, float& b)\r\n{\r\n\tif (name.compare(\"jet\") == 0)\r\n\t{\r\n\t\tfloat* color = colormapJet[idx];\r\n\r\n\t\tr = color[0];\r\n\t\tg = color[1];\r\n\t\tb = color[2];\r\n\r\n\t\treturn true;\r\n\t}\r\n\telse if (name.compare(\"autumn\") == 0)\r\n\t{\r\n\t\tfloat* color = colormapAutumn[idx];\r\n\r\n\t\tr = color[0];\r\n\t\tg = color[1];\r\n\t\tb = color[2];\r\n\r\n\t\treturn true;\r\n\t}\r\n\r\n\treturn false;\r\n}\r\n\r\nstd::vector<cv::Point2i> bresLine(int x0, int y0, int x1, int y1)\r\n{\r\n\t// Bresenham's line algorithm\r\n\t// Find cells intersected by line between (x0,y0) and (x1,y1)\r\n\r\n\tstd::vector<cv::Point2i> cells;\r\n\r\n\tint dx = std::abs(x1 - x0);\r\n\tint dy = std::abs(y1 - y0);\r\n\r\n\tint sx = (x0 < x1) ? 1 : -1;\r\n\tint sy = (y0 < y1) ? 
1 : -1;\r\n\r\n\tint err = dx - dy;\r\n\r\n\twhile (1)\r\n\t{\r\n\t\tcells.push_back(cv::Point2i(x0, y0));\r\n\r\n\t\tif (x0 == x1 && y0 == y1)\r\n\t\t{\r\n\t\t\tbreak;\r\n\t\t}\r\n\r\n\t\tint e2 = 2 * err;\r\n\t\tif (e2 > -dy)\r\n\t\t{\r\n\t\t\terr -= dy;\r\n\t\t\tx0 += sx;\r\n\t\t}\r\n\t\tif (e2 < dx)\r\n\t\t{\r\n\t\t\terr += dx;\r\n\t\t\ty0 += sy;\r\n\t\t}\r\n\t}\r\n\r\n\treturn cells;\r\n}\r\n\r\nstd::vector<cv::Point2i> bresCircle(int x0, int y0, int r)\r\n{\r\n\t// Bresenham's circle algorithm\r\n\t// Find cells intersected by circle with center (x0,y0) and radius r\r\n\r\n\tstd::vector< std::vector<bool> > mask(2 * r + 1);\r\n\t\r\n\tfor (int i = 0; i < 2 * r + 1; ++i)\r\n\t{\r\n\t\tmask[i].resize(2 * r + 1);\r\n\t\tfor (int j = 0; j < 2 * r + 1; ++j)\r\n\t\t{\r\n\t\t\tmask[i][j] = false;\r\n\t\t}\r\n\t}\r\n\r\n\tint f = 1 - r;\r\n\tint ddF_x = 1;\r\n\tint ddF_y = -2 * r;\r\n\tint x = 0;\r\n\tint y = r;\r\n\r\n\tstd::vector<cv::Point2i> line;\r\n\r\n\tline = bresLine(x0, y0 - r, x0, y0 + r);\r\n\tfor (std::vector<cv::Point2i>::iterator it = line.begin(); it != line.end(); ++it)\r\n\t{\r\n\t\tmask[it->x - x0 + r][it->y - y0 + r] = true;\r\n\t}\r\n\r\n\tline = bresLine(x0 - r, y0, x0 + r, y0);\r\n\tfor (std::vector<cv::Point2i>::iterator it = line.begin(); it != line.end(); ++it)\r\n\t{\r\n\t\tmask[it->x - x0 + r][it->y - y0 + r] = true;\r\n\t}\r\n\r\n\twhile (x < y)\r\n\t{\r\n\t\tif (f >= 0)\r\n\t\t{\r\n\t\t\ty--;\r\n\t\t\tddF_y += 2;\r\n\t\t\tf += ddF_y;\r\n\t\t}\r\n\r\n\t\tx++;\r\n\t\tddF_x += 2;\r\n\t\tf += ddF_x;\r\n\r\n\t\tline = bresLine(x0 - x, y0 + y, x0 + x, y0 + y);\r\n\t\tfor (std::vector<cv::Point2i>::iterator it = line.begin(); it != line.end(); ++it)\r\n\t\t{\r\n\t\t\tmask[it->x - x0 + r][it->y - y0 + r] = true;\r\n\t\t}\r\n\r\n\t\tline = bresLine(x0 - x, y0 - y, x0 + x, y0 - y);\r\n\t\tfor (std::vector<cv::Point2i>::iterator it = line.begin(); it != line.end(); ++it)\r\n\t\t{\r\n\t\t\tmask[it->x - x0 + r][it->y - y0 + r] = 
true;\r\n\t\t}\r\n\r\n\t\tline = bresLine(x0 - y, y0 + x, x0 + y, y0 + x);\r\n\t\tfor (std::vector<cv::Point2i>::iterator it = line.begin(); it != line.end(); ++it)\r\n\t\t{\r\n\t\t\tmask[it->x - x0 + r][it->y - y0 + r] = true;\r\n\t\t}\r\n\r\n\t\tline = bresLine(x0 - y, y0 - x, x0 + y, y0 - x);\r\n\t\tfor (std::vector<cv::Point2i>::iterator it = line.begin(); it != line.end(); ++it)\r\n\t\t{\r\n\t\t\tmask[it->x - x0 + r][it->y - y0 + r] = true;\r\n\t\t}\r\n\t}\r\n\r\n\tstd::vector<cv::Point2i> cells;\r\n\tfor (int i = 0; i < 2 * r + 1; ++i)\r\n\t{\r\n\t\tfor (int j = 0; j < 2 * r + 1; ++j)\r\n\t\t{\r\n\t\t\tif (mask[i][j])\r\n\t\t\t{\r\n\t\t\t\tcells.push_back(cv::Point2i(i - r + x0, j - r + y0));\r\n\t\t\t}\r\n\t\t}\r\n\t}\r\n\r\n\treturn cells;\r\n}\r\n\r\nvoid\r\nfitCircle(const std::vector<cv::Point2d>& points,\r\n          double& centerX, double& centerY, double& radius)\r\n{\r\n    // D. Umbach, and K. Jones, A Few Methods for Fitting Circles to Data,\r\n    // IEEE Transactions on Instrumentation and Measurement, 2000\r\n    // We use the modified least squares method.\r\n    double sum_x = 0.0;\r\n    double sum_y = 0.0;\r\n    double sum_xx = 0.0;\r\n    double sum_xy = 0.0;\r\n    double sum_yy = 0.0;\r\n    double sum_xxx = 0.0;\r\n    double sum_xxy = 0.0;\r\n    double sum_xyy = 0.0;\r\n    double sum_yyy = 0.0;\r\n\r\n    int n = points.size();\r\n    for (int i = 0; i < n; ++i)\r\n    {\r\n        double x = points.at(i).x;\r\n        double y = points.at(i).y;\r\n\r\n        sum_x += x;\r\n        sum_y += y;\r\n        sum_xx += x * x;\r\n        sum_xy += x * y;\r\n        sum_yy += y * y;\r\n        sum_xxx += x * x * x;\r\n        sum_xxy += x * x * y;\r\n        sum_xyy += x * y * y;\r\n        sum_yyy += y * y * y;\r\n    }\r\n\r\n    double A = n * sum_xx - square(sum_x);\r\n    double B = n * sum_xy - sum_x * sum_y;\r\n    double C = n * sum_yy - square(sum_y);\r\n    double D = 0.5 * (n * sum_xyy - sum_x * sum_yy + n * sum_xxx - sum_x * 
sum_xx);\r\n    double E = 0.5 * (n * sum_xxy - sum_y * sum_xx + n * sum_yyy - sum_y * sum_yy);\r\n\r\n    centerX = (D * C - B * E) / (A * C - square(B));\r\n    centerY = (A * E - B * D) / (A * C - square(B));\r\n\r\n    double sum_r = 0.0;\r\n    for (int i = 0; i < n; ++i)\r\n    {\r\n        double x = points.at(i).x;\r\n        double y = points.at(i).y;\r\n\r\n        sum_r += hypot(x - centerX, y - centerY);\r\n    }\r\n\r\n    radius = sum_r / n;\r\n}\r\n\r\nstd::vector<cv::Point2d>\r\nintersectCircles(double x1, double y1, double r1,\r\n                 double x2, double y2, double r2)\r\n{\r\n    std::vector<cv::Point2d> ipts;\r\n\r\n    double d = hypot(x1 - x2, y1 - y2);\r\n    if (d > r1 + r2)\r\n    {\r\n        // circles are separate\r\n        return ipts;\r\n    }\r\n    if (d < fabs(r1 - r2))\r\n    {\r\n        // one circle is contained within the other\r\n        return ipts;\r\n    }\r\n\r\n    double a = (square(r1) - square(r2) + square(d)) / (2.0 * d);\r\n    double h = sqrt(square(r1) - square(a));\r\n\r\n    double x3 = x1 + a * (x2 - x1) / d;\r\n    double y3 = y1 + a * (y2 - y1) / d;\r\n\r\n    if (h < 1e-10)\r\n    {\r\n        // two circles touch at one point\r\n        ipts.push_back(cv::Point2d(x3, y3));\r\n        return ipts;\r\n    }\r\n\r\n    ipts.push_back(cv::Point2d(x3 + h * (y2 - y1) / d,\r\n                               y3 - h * (x2 - x1) / d));\r\n    ipts.push_back(cv::Point2d(x3 - h * (y2 - y1) / d,\r\n                               y3 + h * (x2 - x1) / d));\r\n    return ipts;\r\n}\r\n\r\nchar\r\nUTMLetterDesignator(double latitude)\r\n{\r\n    // This routine determines the correct UTM letter designator for the given latitude\r\n    // returns 'Z' if latitude is outside the UTM limits of 84N to 80S\r\n    // Written by Chuck Gantz- chuck.gantz@globalstar.com\r\n    char letterDesignator;\r\n\r\n    if ((84.0 >= latitude) && (latitude >= 72.0)) letterDesignator = 'X';\r\n    else if ((72.0 > latitude) && (latitude 
>= 64.0)) letterDesignator = 'W';\r\n    else if ((64.0 > latitude) && (latitude >= 56.0)) letterDesignator = 'V';\r\n    else if ((56.0 > latitude) && (latitude >= 48.0)) letterDesignator = 'U';\r\n    else if ((48.0 > latitude) && (latitude >= 40.0)) letterDesignator = 'T';\r\n    else if ((40.0 > latitude) && (latitude >= 32.0)) letterDesignator = 'S';\r\n    else if ((32.0 > latitude) && (latitude >= 24.0)) letterDesignator = 'R';\r\n    else if ((24.0 > latitude) && (latitude >= 16.0)) letterDesignator = 'Q';\r\n    else if ((16.0 > latitude) && (latitude >= 8.0)) letterDesignator = 'P';\r\n    else if (( 8.0 > latitude) && (latitude >= 0.0)) letterDesignator = 'N';\r\n    else if (( 0.0 > latitude) && (latitude >= -8.0)) letterDesignator = 'M';\r\n    else if ((-8.0 > latitude) && (latitude >= -16.0)) letterDesignator = 'L';\r\n    else if ((-16.0 > latitude) && (latitude >= -24.0)) letterDesignator = 'K';\r\n    else if ((-24.0 > latitude) && (latitude >= -32.0)) letterDesignator = 'J';\r\n    else if ((-32.0 > latitude) && (latitude >= -40.0)) letterDesignator = 'H';\r\n    else if ((-40.0 > latitude) && (latitude >= -48.0)) letterDesignator = 'G';\r\n    else if ((-48.0 > latitude) && (latitude >= -56.0)) letterDesignator = 'F';\r\n    else if ((-56.0 > latitude) && (latitude >= -64.0)) letterDesignator = 'E';\r\n    else if ((-64.0 > latitude) && (latitude >= -72.0)) letterDesignator = 'D';\r\n    else if ((-72.0 > latitude) && (latitude >= -80.0)) letterDesignator = 'C';\r\n    else letterDesignator = 'Z'; //This is here as an error flag to show that the Latitude is outside the UTM limits\r\n\r\n    return letterDesignator;\r\n}\r\n\r\nvoid\r\nLLtoUTM(double latitude, double longitude,\r\n        double& utmNorthing, double& utmEasting, std::string& utmZone)\r\n{\r\n    // converts lat/long to UTM coords.  
Equations from USGS Bulletin 1532\r\n    // East Longitudes are positive, West longitudes are negative.\r\n    // North latitudes are positive, South latitudes are negative\r\n    // Lat and Long are in decimal degrees\r\n    // Written by Chuck Gantz- chuck.gantz@globalstar.com\r\n\r\n    double k0 = 0.9996;\r\n\r\n    double LongOrigin;\r\n    double eccPrimeSquared;\r\n    double N, T, C, A, M;\r\n\r\n    double LatRad = latitude * M_PI / 180.0;\r\n    double LongRad = longitude * M_PI / 180.0;\r\n    double LongOriginRad;\r\n\r\n    int ZoneNumber = static_cast<int>((longitude + 180.0) / 6.0) + 1;\r\n\r\n    if (latitude >= 56.0 && latitude < 64.0 &&\r\n            longitude >= 3.0 && longitude < 12.0) {\r\n        ZoneNumber = 32;\r\n    }\r\n\r\n    // Special zones for Svalbard\r\n    if (latitude >= 72.0 && latitude < 84.0) {\r\n        if (     longitude >= 0.0  && longitude <  9.0) ZoneNumber = 31;\r\n        else if (longitude >= 9.0  && longitude < 21.0) ZoneNumber = 33;\r\n        else if (longitude >= 21.0 && longitude < 33.0) ZoneNumber = 35;\r\n        else if (longitude >= 33.0 && longitude < 42.0) ZoneNumber = 37;\r\n    }\r\n    LongOrigin = static_cast<double>((ZoneNumber - 1) * 6 - 180 + 3);  //+3 puts origin in middle of zone\r\n    LongOriginRad = LongOrigin * M_PI / 180.0;\r\n\r\n    // compute the UTM Zone from the latitude and longitude\r\n    std::ostringstream oss;\r\n    oss << ZoneNumber << UTMLetterDesignator(latitude);\r\n    utmZone = oss.str();\r\n\r\n    eccPrimeSquared = WGS84_ECCSQ / (1.0 - WGS84_ECCSQ);\r\n\r\n    N = WGS84_A / sqrt(1.0 - WGS84_ECCSQ * sin(LatRad) * sin(LatRad));\r\n    T = tan(LatRad) * tan(LatRad);\r\n    C = eccPrimeSquared * cos(LatRad) * cos(LatRad);\r\n    A = cos(LatRad) * (LongRad - LongOriginRad);\r\n\r\n    M = WGS84_A * ((1.0 - WGS84_ECCSQ / 4.0\r\n                    - 3.0 * WGS84_ECCSQ * WGS84_ECCSQ / 64.0\r\n                    - 5.0 * WGS84_ECCSQ * WGS84_ECCSQ * WGS84_ECCSQ / 256.0)\r\n           
        * LatRad\r\n                   - (3.0 * WGS84_ECCSQ / 8.0\r\n                      + 3.0 * WGS84_ECCSQ * WGS84_ECCSQ / 32.0\r\n                      + 45.0 * WGS84_ECCSQ * WGS84_ECCSQ * WGS84_ECCSQ / 1024.0)\r\n                   * sin(2.0 * LatRad)\r\n                   + (15.0 * WGS84_ECCSQ * WGS84_ECCSQ / 256.0\r\n                      + 45.0 * WGS84_ECCSQ * WGS84_ECCSQ * WGS84_ECCSQ / 1024.0)\r\n                   * sin(4.0 * LatRad)\r\n                   - (35.0 * WGS84_ECCSQ * WGS84_ECCSQ * WGS84_ECCSQ / 3072.0)\r\n                   * sin(6.0 * LatRad));\r\n\r\n    utmEasting = k0 * N * (A + (1.0 - T + C) * A * A * A / 6.0\r\n                           + (5.0 - 18.0 * T + T * T + 72.0 * C\r\n                              - 58.0 * eccPrimeSquared)\r\n                           * A * A * A * A * A / 120.0)\r\n                 + 500000.0;\r\n\r\n    utmNorthing = k0 * (M + N * tan(LatRad) *\r\n                        (A * A / 2.0 +\r\n                         (5.0 - T + 9.0 * C + 4.0 * C * C) * A * A * A * A / 24.0\r\n                         + (61.0 - 58.0 * T + T * T + 600.0 * C\r\n                            - 330.0 * eccPrimeSquared)\r\n                         * A * A * A * A * A * A / 720.0));\r\n    if (latitude < 0.0) {\r\n        utmNorthing += 10000000.0; //10000000 meter offset for southern hemisphere\r\n    }\r\n}\r\n\r\nvoid\r\nUTMtoLL(double utmNorthing, double utmEasting, const std::string& utmZone,\r\n        double& latitude, double& longitude)\r\n{\r\n    // converts UTM coords to lat/long.  
Equations from USGS Bulletin 1532\r\n    // East Longitudes are positive, West longitudes are negative.\r\n    // North latitudes are positive, South latitudes are negative\r\n    // Lat and Long are in decimal degrees.\r\n    // Written by Chuck Gantz- chuck.gantz@globalstar.com\r\n\r\n    double k0 = 0.9996;\r\n    double eccPrimeSquared;\r\n    double e1 = (1.0 - sqrt(1.0 - WGS84_ECCSQ)) / (1.0 + sqrt(1.0 - WGS84_ECCSQ));\r\n    double N1, T1, C1, R1, D, M;\r\n    double LongOrigin;\r\n    double mu, phi1, phi1Rad;\r\n    double x, y;\r\n    int ZoneNumber;\r\n    char ZoneLetter;\r\n    bool NorthernHemisphere;\r\n\r\n    x = utmEasting - 500000.0; //remove 500,000 meter offset for longitude\r\n    y = utmNorthing;\r\n\r\n    std::istringstream iss(utmZone);\r\n    iss >> ZoneNumber >> ZoneLetter;\r\n    if ((static_cast<int>(ZoneLetter) - static_cast<int>('N')) >= 0) {\r\n        NorthernHemisphere = true;//point is in northern hemisphere\r\n    } else {\r\n        NorthernHemisphere = false;//point is in southern hemisphere\r\n        y -= 10000000.0;//remove 10,000,000 meter offset used for southern hemisphere\r\n    }\r\n\r\n    LongOrigin = (ZoneNumber - 1.0) * 6.0 - 180.0 + 3.0;  //+3 puts origin in middle of zone\r\n\r\n    eccPrimeSquared = WGS84_ECCSQ / (1.0 - WGS84_ECCSQ);\r\n\r\n    M = y / k0;\r\n    mu = M / (WGS84_A * (1.0 - WGS84_ECCSQ / 4.0\r\n                         - 3.0 * WGS84_ECCSQ * WGS84_ECCSQ / 64.0\r\n                         - 5.0 * WGS84_ECCSQ * WGS84_ECCSQ * WGS84_ECCSQ / 256.0));\r\n\r\n    phi1Rad = mu + (3.0 * e1 / 2.0 - 27.0 * e1 * e1 * e1 / 32.0) * sin(2.0 * mu)\r\n              + (21.0 * e1 * e1 / 16.0 - 55.0 * e1 * e1 * e1 * e1 / 32.0)\r\n              * sin(4.0 * mu)\r\n              + (151.0 * e1 * e1 * e1 / 96.0) * sin(6.0 * mu);\r\n    phi1 = phi1Rad / M_PI * 180.0;\r\n\r\n    N1 = WGS84_A / sqrt(1.0 - WGS84_ECCSQ * sin(phi1Rad) * sin(phi1Rad));\r\n    T1 = tan(phi1Rad) * tan(phi1Rad);\r\n    C1 = eccPrimeSquared * 
cos(phi1Rad) * cos(phi1Rad);\r\n    R1 = WGS84_A * (1.0 - WGS84_ECCSQ) /\r\n         pow(1.0 - WGS84_ECCSQ * sin(phi1Rad) * sin(phi1Rad), 1.5);\r\n    D = x / (N1 * k0);\r\n\r\n    latitude = phi1Rad - (N1 * tan(phi1Rad) / R1)\r\n               * (D * D / 2.0 - (5.0 + 3.0 * T1 + 10.0 * C1 - 4.0 * C1 * C1\r\n                                 - 9.0 * eccPrimeSquared) * D * D * D * D / 24.0\r\n                  + (61.0 + 90.0 * T1 + 298.0 * C1 + 45.0 * T1 * T1\r\n                     - 252.0 * eccPrimeSquared - 3.0 * C1 * C1)\r\n                  * D * D * D * D * D * D / 720.0);\r\n    latitude *= 180.0 / M_PI;\r\n\r\n    longitude = (D - (1.0 + 2.0 * T1 + C1) * D * D * D / 6.0\r\n                 + (5.0 - 2.0 * C1 + 28.0 * T1 - 3.0 * C1 * C1\r\n                    + 8.0 * eccPrimeSquared + 24.0 * T1 * T1)\r\n                 * D * D * D * D * D / 120.0) / cos(phi1Rad);\r\n    longitude = LongOrigin + longitude / M_PI * 180.0;\r\n}\r\n\r\nlong int\r\ntimestampDiff(uint64_t t1, uint64_t t2)\r\n{\r\n    if (t2 > t1)\r\n    {\r\n        uint64_t d = t2 - t1;\r\n\r\n        if (d > std::numeric_limits<long int>::max())\r\n        {\r\n            return std::numeric_limits<long int>::max();\r\n        }\r\n        else\r\n        {\r\n            return d;\r\n        }\r\n    }\r\n    else\r\n    {\r\n        uint64_t d = t1 - t2;\r\n\r\n        if (d > std::numeric_limits<long int>::max())\r\n        {\r\n            return std::numeric_limits<long int>::min();\r\n        }\r\n        else\r\n        {\r\n            return - static_cast<long int>(d);\r\n        }\r\n    }\r\n}\r\n\r\n}\r\n"
  },
  {
    "path": "camera_model/src/intrinsic_calib.cc",
    "content": "#include <boost/algorithm/string.hpp>\n#include <boost/filesystem.hpp>\n#include <boost/program_options.hpp>\n#include <iomanip>\n#include <iostream>\n#include <algorithm>\n#include <opencv2/core/core.hpp>\n#include <opencv2/imgproc/imgproc.hpp>\n#include <opencv2/highgui/highgui.hpp>\n\n#include \"camodocal/chessboard/Chessboard.h\"\n#include \"camodocal/calib/CameraCalibration.h\"\n#include \"camodocal/gpl/gpl.h\"\n\nint main(int argc, char** argv)\n{\n    cv::Size boardSize;\n    float squareSize;\n    std::string inputDir;\n    std::string cameraModel;\n    std::string cameraName;\n    std::string prefix;\n    std::string fileExtension;\n    bool useOpenCV;\n    bool viewResults;\n    bool verbose;\n\n    //========= Handling Program options =========\n    boost::program_options::options_description desc(\"Allowed options\");\n    desc.add_options()\n        (\"help\", \"produce help message\")\n        (\"width,w\", boost::program_options::value<int>(&boardSize.width)->default_value(8), \"Number of inner corners on the chessboard pattern in x direction\")\n        (\"height,h\", boost::program_options::value<int>(&boardSize.height)->default_value(12), \"Number of inner corners on the chessboard pattern in y direction\")\n        (\"size,s\", boost::program_options::value<float>(&squareSize)->default_value(7.f), \"Size of one square in mm\")\n        (\"input,i\", boost::program_options::value<std::string>(&inputDir)->default_value(\"calibrationdata\"), \"Input directory containing chessboard images\")\n        (\"prefix,p\", boost::program_options::value<std::string>(&prefix)->default_value(\"left-\"), \"Prefix of images\")\n        (\"file-extension,e\", boost::program_options::value<std::string>(&fileExtension)->default_value(\".png\"), \"File extension of images\")\n        (\"camera-model\", boost::program_options::value<std::string>(&cameraModel)->default_value(\"mei\"), \"Camera model: kannala-brandt | mei | pinhole\")\n        
(\"camera-name\", boost::program_options::value<std::string>(&cameraName)->default_value(\"camera\"), \"Name of camera\")\n        (\"opencv\", boost::program_options::bool_switch(&useOpenCV)->default_value(true), \"Use OpenCV to detect corners\")\n        (\"view-results\", boost::program_options::bool_switch(&viewResults)->default_value(false), \"View results\")\n        (\"verbose,v\", boost::program_options::bool_switch(&verbose)->default_value(true), \"Verbose output\")\n        ;\n\n    boost::program_options::positional_options_description pdesc;\n    pdesc.add(\"input\", 1);\n\n    boost::program_options::variables_map vm;\n    boost::program_options::store(boost::program_options::command_line_parser(argc, argv).options(desc).positional(pdesc).run(), vm);\n    boost::program_options::notify(vm);\n\n    if (vm.count(\"help\"))\n    {\n        std::cout << desc << std::endl;\n        return 1;\n    }\n\n    if (!boost::filesystem::exists(inputDir) && !boost::filesystem::is_directory(inputDir))\n    {\n        std::cerr << \"# ERROR: Cannot find input directory \" << inputDir << \".\" << std::endl;\n        return 1;\n    }\n\n    camodocal::Camera::ModelType modelType;\n    if (boost::iequals(cameraModel, \"kannala-brandt\"))\n    {\n        modelType = camodocal::Camera::KANNALA_BRANDT;\n    }\n    else if (boost::iequals(cameraModel, \"mei\"))\n    {\n        modelType = camodocal::Camera::MEI;\n    }\n    else if (boost::iequals(cameraModel, \"pinhole\"))\n    {\n        modelType = camodocal::Camera::PINHOLE;\n    }\n    else if (boost::iequals(cameraModel, \"scaramuzza\"))\n    {\n        modelType = camodocal::Camera::SCARAMUZZA;\n    }\n    else\n    {\n        std::cerr << \"# ERROR: Unknown camera model: \" << cameraModel << std::endl;\n        return 1;\n    }\n\n    switch (modelType)\n    {\n    case camodocal::Camera::KANNALA_BRANDT:\n        std::cout << \"# INFO: Camera model: Kannala-Brandt\" << std::endl;\n        break;\n    case 
camodocal::Camera::MEI:\n        std::cout << \"# INFO: Camera model: Mei\" << std::endl;\n        break;\n    case camodocal::Camera::PINHOLE:\n        std::cout << \"# INFO: Camera model: Pinhole\" << std::endl;\n        break;\n    case camodocal::Camera::SCARAMUZZA:\n        std::cout << \"# INFO: Camera model: Scaramuzza-Omnidirect\" << std::endl;\n        break;\n    }\n\n    // look for images in input directory\n    std::vector<std::string> imageFilenames;\n    for (boost::filesystem::directory_iterator itr(inputDir); itr != boost::filesystem::directory_iterator(); ++itr)\n    {\n        if (!boost::filesystem::is_regular_file(itr->status()))\n        {\n            continue;\n        }\n\n        std::string filename = itr->path().filename().string();\n\n        // check if prefix matches\n        if (!prefix.empty())\n        {\n            if (filename.compare(0, prefix.length(), prefix) != 0)\n            {\n                continue;\n            }\n        }\n\n        // check if file extension matches\n        if (filename.length() < fileExtension.length() ||\n            filename.compare(filename.length() - fileExtension.length(), fileExtension.length(), fileExtension) != 0)\n        {\n            continue;\n        }\n\n        imageFilenames.push_back(itr->path().string());\n\n        if (verbose)\n        {\n            std::cerr << \"# INFO: Adding \" << imageFilenames.back() << std::endl;\n        }\n    }\n\n    if (imageFilenames.empty())\n    {\n        std::cerr << \"# ERROR: No chessboard images found.\" << std::endl;\n        return 1;\n    }\n\n    if (verbose)\n    {\n        std::cerr << \"# INFO: # images: \" << imageFilenames.size() << std::endl;\n    }\n\n    std::sort(imageFilenames.begin(), imageFilenames.end());\n\n    cv::Mat image = cv::imread(imageFilenames.front(), -1);\n    const cv::Size frameSize = image.size();\n\n    camodocal::CameraCalibration calibration(modelType, cameraName, frameSize, boardSize, squareSize);\n    
calibration.setVerbose(verbose);\n\n    std::vector<bool> chessboardFound(imageFilenames.size(), false);\n    for (size_t i = 0; i < imageFilenames.size(); ++i)\n    {\n        image = cv::imread(imageFilenames.at(i), -1);\n\n        camodocal::Chessboard chessboard(boardSize, image);\n\n        chessboard.findCorners(useOpenCV);\n        if (chessboard.cornersFound())\n        {\n            if (verbose)\n            {\n                std::cerr << \"# INFO: Detected chessboard in image \" << i + 1 << \", \" << imageFilenames.at(i) << std::endl;\n            }\n\n            calibration.addChessboardData(chessboard.getCorners());\n\n            cv::Mat sketch;\n            chessboard.getSketch().copyTo(sketch);\n\n            cv::imshow(\"Image\", sketch);\n            cv::waitKey(50);\n        }\n        else if (verbose)\n        {\n            std::cerr << \"# INFO: Did not detect chessboard in image \" << i + 1 << std::endl;\n        }\n        chessboardFound.at(i) = chessboard.cornersFound();\n    }\n    cv::destroyWindow(\"Image\");\n\n    if (calibration.sampleCount() < 10)\n    {\n        std::cerr << \"# ERROR: Insufficient number of detected chessboards.\" << std::endl;\n        return 1;\n    }\n\n    if (verbose)\n    {\n        std::cerr << \"# INFO: Calibrating...\" << std::endl;\n    }\n\n    double startTime = camodocal::timeInSeconds();\n\n    calibration.calibrate();\n    calibration.writeParams(cameraName + \"_camera_calib.yaml\");\n    calibration.writeChessboardData(cameraName + \"_chessboard_data.dat\");\n\n    if (verbose)\n    {\n        std::cout << \"# INFO: Calibration took a total time of \"\n                  << std::fixed << std::setprecision(3) << camodocal::timeInSeconds() - startTime\n                  << \" sec.\\n\";\n    }\n\n    if (verbose)\n    {\n        std::cerr << \"# INFO: Wrote calibration file to \" << cameraName + \"_camera_calib.yaml\" << std::endl;\n    }\n\n    if (viewResults)\n    {\n        std::vector<cv::Mat> 
cbImages;\n        std::vector<std::string> cbImageFilenames;\n\n        for (size_t i = 0; i < imageFilenames.size(); ++i)\n        {\n            if (!chessboardFound.at(i))\n            {\n                continue;\n            }\n\n            cbImages.push_back(cv::imread(imageFilenames.at(i), -1));\n            cbImageFilenames.push_back(imageFilenames.at(i));\n        }\n\n        // visualize observed and reprojected points\n        calibration.drawResults(cbImages);\n\n        for (size_t i = 0; i < cbImages.size(); ++i)\n        {\n            cv::putText(cbImages.at(i), cbImageFilenames.at(i), cv::Point(10,20),\n                        cv::FONT_HERSHEY_COMPLEX, 0.5, cv::Scalar(255, 255, 255),\n                        1, CV_AA);\n            cv::imshow(\"Image\", cbImages.at(i));\n            cv::waitKey(0);\n        }\n    }\n\n    return 0;\n}\n"
  },
  {
    "path": "camera_model/src/sparse_graph/Transform.cc",
    "content": "#include <camodocal/sparse_graph/Transform.h>\n\nnamespace camodocal\n{\n\nTransform::Transform()\n{\n    m_q.setIdentity();\n    m_t.setZero();\n}\n\nTransform::Transform(const Eigen::Matrix4d& H)\n{\n   m_q = Eigen::Quaterniond(H.block<3,3>(0,0));\n   m_t = H.block<3,1>(0,3);\n}\n\nEigen::Quaterniond&\nTransform::rotation(void)\n{\n    return m_q;\n}\n\nconst Eigen::Quaterniond&\nTransform::rotation(void) const\n{\n    return m_q;\n}\n\ndouble*\nTransform::rotationData(void)\n{\n    return m_q.coeffs().data();\n}\n\nconst double* const\nTransform::rotationData(void) const\n{\n    return m_q.coeffs().data();\n}\n\nEigen::Vector3d&\nTransform::translation(void)\n{\n    return m_t;\n}\n\nconst Eigen::Vector3d&\nTransform::translation(void) const\n{\n    return m_t;\n}\n\ndouble*\nTransform::translationData(void)\n{\n    return m_t.data();\n}\n\nconst double* const\nTransform::translationData(void) const\n{\n    return m_t.data();\n}\n\nEigen::Matrix4d\nTransform::toMatrix(void) const\n{\n    Eigen::Matrix4d H;\n    H.setIdentity();\n    H.block<3,3>(0,0) = m_q.toRotationMatrix();\n    H.block<3,1>(0,3) = m_t;\n\n    return H;\n}\n\n}\n"
  },
  {
    "path": "config/advio_12_config.yaml",
    "content": "%YAML:1.0\n\n#common parameters\nimu_topic: \"/imu0\"\nimage_topic: \"/cam0/image_raw\"\nmask_topic: \"/cam0/mask\"\noutput_path: \"~/output/\"\n\n#camera calibration \nmodel_type: PINHOLE\ncamera_name: camera\nimage_width: 720\nimage_height: 1280\ndistortion_parameters:\n   k1: 0.0478\n   k2: 0.0339\n   p1: -0.00033\n   p2: -0.00091\nprojection_parameters:\n   fx: 1077.2\n   fy: 1079.3\n   cx: 362.14\n   cy: 636.39\n\nestimate_extrinsic: 0\n#Rotation from camera frame to imu frame, imu^R_cam\nextrinsicRotation: !!opencv-matrix\n   rows: 3\n   cols: 3\n   dt: d\n   data: [0.9999763379093255, -0.004079205042965442, -0.005539287650170447, -0.004066386342107199, -0.9999890330121858, 0.0023234365646622014, -0.00554870467502187, -0.0023008567036498766, -0.9999819588046867]\n\n#Translation from camera frame to imu frame, imu^T_cam\nextrinsicTranslation: !!opencv-matrix\n   rows: 3\n   cols: 1\n   dt: d\n   data: [-0.008977668364731128, 0.07557012320238939, -0.005545773942541918]\n\n#feature traker parameters\nmax_cnt: 250            # max feature number in feature tracking\nmin_dist: 30            # min distance between two features \nfreq: 10                # frequence (Hz) of publish tracking result. At least 10Hz for good estimation. If set 0, the frequence will be same as raw image \nH_threshold: 1.0        # ransac threshold (pixel)\nshow_track: 1           # publish tracking image as topic\nequalize: 1             # if image is too dark or light, trun on equalize to find enough features\nfisheye: 0              # if using fisheye, trun on it. 
A circle mask will be loaded to remove noisy points near the edges\n\n#optimization parameters\nmax_solver_time: 0.04   # max solver time (s), to guarantee real time\nmax_num_iterations: 8   # max number of solver iterations, to guarantee real time\nkeyframe_parallax: 10.0 # keyframe selection threshold (pixel)\n\n#discrete-time imu noise parameters (as provided in dataset)\nacc_n: 0.1517      # accelerometer measurement noise standard deviation.\ngyr_n: 0.0758      # gyroscope measurement noise standard deviation.\nacc_w: 6.6407e-6   # accelerometer bias random walk noise standard deviation.\ngyr_w: 1.6127e-6   # gyroscope bias random walk noise standard deviation.\ng_norm: 9.8067     # gravity magnitude\n\n#unsynchronization parameters\nestimate_td: 0\ntd: 0.0            # initial value of the time offset. unit: s. image timestamp + td = real image timestamp (IMU clock)\n\n#rolling shutter parameters\nrolling_shutter: 0                  # 0: global shutter camera, 1: rolling shutter camera\nrolling_shutter_tr: 0               # unit: s. rolling shutter readout time per frame (from data sheet).\n\n#visualization parameters\nsave_image: 1                   # save images in the pose graph for visualization purposes; set to 0 to disable\nvisualize_imu_forward: 0        # output imu forward propagation to achieve low-latency and high-frequency results\nvisualize_camera_size: 0.4      # size of camera marker in RVIZ\n"
  },
  {
    "path": "config/ol_market1_config.yaml",
    "content": "%YAML:1.0\n\n#common parameters\nimu_topic: \"/d400/imu0\"\nimage_topic: \"/d400/color/image_raw\"\nmask_topic: \"/planes/segments_crf\"\noutput_path: \"~/market1\"\n\n#camera calibration \nmodel_type: PINHOLE\ncamera_name: camera\nimage_width: 848\nimage_height: 480\ndistortion_parameters:\n   k1: 0.0\n   k2: 0.0\n   p1: 0.0\n   p2: 0.0\nprojection_parameters:\n   fx: 616.802734375\n   fy: 616.7510375976562\n   cx: 435.0341796875\n   cy: 242.90113830566406\n\nestimate_extrinsic: 0\n#Rotation from camera frame to imu frame, imu^R_cam\nextrinsicRotation: !!opencv-matrix\n   rows: 3\n   cols: 3\n   dt: d\n   data: [9.99946615e-01, 5.27582295e-03, 8.88440156e-03,\n           -5.28478975e-03, 9.99985549e-01, 9.86100143e-04,\n           -8.87907068e-03, -1.03299969e-03, 9.99960047e-01]\n#Translation from camera frame to imu frame, imu^T_cam\nextrinsicTranslation: !!opencv-matrix\n   rows: 3\n   cols: 1\n   dt: d\n   data: [ 0.020096886903,-0.00507855368778,-0.0115051260218]\n\n#feature traker paprameters\nmax_cnt: 250            # max feature number in feature tracking\nmin_dist: 30            # min distance between two features \nfreq: 10                # frequence (Hz) of publish tracking result. At least 10Hz for good estimation. If set 0, the frequence will be same as raw image \nH_threshold: 1.0        # ransac threshold (pixel)\nshow_track: 1           # publish tracking image as topic\nequalize: 1             # if image is too dark or light, trun on equalize to find enough features\nfisheye: 0              # if using fisheye, trun on it. 
A circle mask will be loaded to remove noisy points near the edges\n\n#optimization parameters\nmax_solver_time: 0.04   # max solver time (s), to guarantee real time\nmax_num_iterations: 8   # max number of solver iterations, to guarantee real time\nkeyframe_parallax: 10.0 # keyframe selection threshold (pixel)\n\n#discrete-time imu noise parameters (as provided in dataset)\nacc_n: 0.0163       # accelerometer measurement noise standard deviation.\ngyr_n: 0.0032       # gyroscope measurement noise standard deviation.\nacc_w: 0.0049       # accelerometer bias random walk noise standard deviation.\ngyr_w: 0.0004       # gyroscope bias random walk noise standard deviation.\ng_norm: 9.81007     # gravity magnitude\n\n#unsynchronization parameters\nestimate_td: 0\ntd: 0.0             # initial value of the time offset. unit: s. image timestamp + td = real image timestamp (IMU clock)\n\n#rolling shutter parameters\nrolling_shutter: 0                  # 0: global shutter camera, 1: rolling shutter camera\nrolling_shutter_tr: 0               # unit: s. rolling shutter readout time per frame (from data sheet).\n\n#visualization parameters\nsave_image: 1                   # save images in the pose graph for visualization purposes; set to 0 to disable\nvisualize_imu_forward: 0        # output imu forward propagation to achieve low-latency and high-frequency results\nvisualize_camera_size: 0.4      # size of camera marker in RVIZ\n"
  },
  {
    "path": "config/rpvio_rviz_config.rviz",
    "content": "Panels:\n  - Class: rviz/Displays\n    Help Height: 0\n    Name: Displays\n    Property Tree Widget:\n      Expanded:\n        - /Global Options1\n        - /VIO1\n        - /Path1\n      Splitter Ratio: 0.4651159942150116\n    Tree Height: 140\n  - Class: rviz/Selection\n    Name: Selection\n  - Class: rviz/Tool Properties\n    Expanded:\n      - /2D Pose Estimate1\n      - /2D Nav Goal1\n      - /Publish Point1\n    Name: Tool Properties\n    Splitter Ratio: 0.5886790156364441\n  - Class: rviz/Views\n    Expanded:\n      - /Current View1\n    Name: Views\n    Splitter Ratio: 0.5\n  - Class: rviz/Time\n    Experimental: false\n    Name: Time\n    SyncMode: 0\n    SyncSource: tracked image\n  - Class: rviz/Displays\n    Help Height: 78\n    Name: Displays\n    Property Tree Widget:\n      Expanded: ~\n      Splitter Ratio: 0.5\n    Tree Height: 363\nPreferences:\n  PromptSaveOnExit: true\nToolbars:\n  toolButtonStyle: 2\nVisualization Manager:\n  Class: \"\"\n  Displays:\n    - Alpha: 0.5\n      Cell Size: 1\n      Class: rviz/Grid\n      Color: 130; 130; 130\n      Enabled: false\n      Line Style:\n        Line Width: 0.029999999329447746\n        Value: Lines\n      Name: Grid\n      Normal Cell Count: 0\n      Offset:\n        X: 0\n        Y: 0\n        Z: 0\n      Plane: XY\n      Plane Cell Count: 10\n      Reference Frame: <Fixed Frame>\n      Value: false\n    - Class: rviz/Axes\n      Enabled: false\n      Length: 1\n      Name: Axes\n      Radius: 0.10000000149011612\n      Reference Frame: <Fixed Frame>\n      Value: false\n    - Alpha: 1\n      Buffer Length: 1\n      Class: rviz/Path\n      Color: 255; 0; 0\n      Enabled: true\n      Head Diameter: 0.30000001192092896\n      Head Length: 0.20000000298023224\n      Length: 0.30000001192092896\n      Line Style: Lines\n      Line Width: 0.5\n      Name: ground_truth_path\n      Offset:\n        X: 0\n        Y: 0\n        Z: 0\n      Pose Color: 255; 85; 255\n      Pose Style: None\n    
  Radius: 0.029999999329447746\n      Shaft Diameter: 0.10000000149011612\n      Shaft Length: 0.10000000149011612\n      Topic: /benchmark_publisher/path\n      Unreliable: false\n      Value: true\n    - Class: rviz/Image\n      Enabled: true\n      Image Topic: /rpvio_feature_tracker/feature_img\n      Max Value: 1\n      Median window: 5\n      Min Value: 0\n      Name: tracked image\n      Normalize Range: true\n      Queue Size: 2\n      Transport Hint: raw\n      Unreliable: false\n      Value: true\n    - Class: rviz/Image\n      Enabled: false\n      Image Topic: /cam0/image_raw\n      Max Value: 1\n      Median window: 5\n      Min Value: 0\n      Name: raw_image\n      Normalize Range: true\n      Queue Size: 2\n      Transport Hint: raw\n      Unreliable: false\n      Value: false\n    - Class: rviz/Group\n      Displays:\n        - Alpha: 1\n          Buffer Length: 1\n          Class: rviz/Path\n          Color: 0; 255; 0\n          Enabled: true\n          Head Diameter: 0.30000001192092896\n          Head Length: 0.20000000298023224\n          Length: 0.30000001192092896\n          Line Style: Lines\n          Line Width: 0.029999999329447746\n          Name: Path\n          Offset:\n            X: 0\n            Y: 0\n            Z: 0\n          Pose Color: 255; 85; 255\n          Pose Style: None\n          Radius: 0.029999999329447746\n          Shaft Diameter: 0.10000000149011612\n          Shaft Length: 0.10000000149011612\n          Topic: /vins_estimator/path\n          Unreliable: false\n          Value: true\n        - Class: rviz/MarkerArray\n          Enabled: true\n          Marker Topic: /vins_estimator/camera_pose_visual\n          Name: camera_visual\n          Namespaces:\n            CameraPoseVisualization: true\n          Queue Size: 100\n          Value: true\n        - Alpha: 1\n          Autocompute Intensity Bounds: true\n          Autocompute Value Bounds:\n            Max Value: 10\n            Min Value: -10\n            
Value: true\n          Axis: Z\n          Channel Name: intensity\n          Class: rviz/PointCloud\n          Color: 0; 255; 0\n          Color Transformer: FlatColor\n          Decay Time: 0\n          Enabled: true\n          Invert Rainbow: false\n          Max Color: 255; 255; 255\n          Max Intensity: 4096\n          Min Color: 0; 0; 0\n          Min Intensity: 0\n          Name: current_point\n          Position Transformer: XYZ\n          Queue Size: 10\n          Selectable: true\n          Size (Pixels): 1\n          Size (m): 0.009999999776482582\n          Style: Points\n          Topic: /vins_estimator/point_cloud\n          Unreliable: false\n          Use Fixed Frame: true\n          Use rainbow: true\n          Value: true\n        - Alpha: 1\n          Autocompute Intensity Bounds: false\n          Autocompute Value Bounds:\n            Max Value: 10\n            Min Value: -10\n            Value: true\n          Axis: Z\n          Channel Name: intensity\n          Class: rviz/PointCloud\n          Color: 255; 255; 255\n          Color Transformer: FlatColor\n          Decay Time: 3000\n          Enabled: true\n          Invert Rainbow: true\n          Max Color: 0; 0; 0\n          Max Intensity: 4096\n          Min Color: 0; 0; 0\n          Min Intensity: 0\n          Name: history_point\n          Position Transformer: XYZ\n          Queue Size: 10\n          Selectable: true\n          Size (Pixels): 1\n          Size (m): 0.5\n          Style: Points\n          Topic: /vins_estimator/history_cloud\n          Unreliable: false\n          Use Fixed Frame: true\n          Use rainbow: false\n          Value: true\n        - Class: rviz/TF\n          Enabled: true\n          Frame Timeout: 15\n          Frames:\n            All Enabled: true\n            body:\n              Value: true\n            camera:\n              Value: true\n            world:\n              Value: true\n          Marker Scale: 1\n          Name: TF\n          Show 
Arrows: true\n          Show Axes: true\n          Show Names: true\n          Tree:\n            world:\n              body:\n                camera:\n                  {}\n          Update Interval: 0\n          Value: true\n      Enabled: true\n      Name: VIO\n    - Class: rviz/Group\n      Displays:\n        - Alpha: 1\n          Buffer Length: 1\n          Class: rviz/Path\n          Color: 0; 255; 0\n          Enabled: true\n          Head Diameter: 0.30000001192092896\n          Head Length: 0.20000000298023224\n          Length: 0.30000001192092896\n          Line Style: Lines\n          Line Width: 0.029999999329447746\n          Name: pose_graph_path\n          Offset:\n            X: 0\n            Y: 0\n            Z: 0\n          Pose Color: 255; 85; 255\n          Pose Style: None\n          Radius: 0.029999999329447746\n          Shaft Diameter: 0.10000000149011612\n          Shaft Length: 0.10000000149011612\n          Topic: /pose_graph/pose_graph_path\n          Unreliable: false\n          Value: true\n        - Alpha: 1\n          Buffer Length: 1\n          Class: rviz/Path\n          Color: 255; 170; 0\n          Enabled: true\n          Head Diameter: 0.30000001192092896\n          Head Length: 0.20000000298023224\n          Length: 0.30000001192092896\n          Line Style: Lines\n          Line Width: 0.029999999329447746\n          Name: base_path\n          Offset:\n            X: 0\n            Y: 0\n            Z: 0\n          Pose Color: 255; 85; 255\n          Pose Style: None\n          Radius: 0.029999999329447746\n          Shaft Diameter: 0.10000000149011612\n          Shaft Length: 0.10000000149011612\n          Topic: /pose_graph/base_path\n          Unreliable: false\n          Value: true\n        - Class: rviz/MarkerArray\n          Enabled: true\n          Marker Topic: /pose_graph/pose_graph\n          Name: loop_visual\n          Namespaces:\n            {}\n          Queue Size: 100\n          Value: true\n        - 
Class: rviz/MarkerArray\n          Enabled: true\n          Marker Topic: /pose_graph/camera_pose_visual\n          Name: camera_visual\n          Namespaces:\n            {}\n          Queue Size: 100\n          Value: true\n        - Class: rviz/Image\n          Enabled: false\n          Image Topic: /pose_graph/match_image\n          Max Value: 1\n          Median window: 5\n          Min Value: 0\n          Name: loop_match_image\n          Normalize Range: true\n          Queue Size: 2\n          Transport Hint: raw\n          Unreliable: false\n          Value: false\n        - Class: rviz/Marker\n          Enabled: true\n          Marker Topic: /pose_graph/key_odometrys\n          Name: Marker\n          Namespaces:\n            {}\n          Queue Size: 100\n          Value: true\n        - Alpha: 1\n          Buffer Length: 1\n          Class: rviz/Path\n          Color: 25; 255; 0\n          Enabled: true\n          Head Diameter: 0.30000001192092896\n          Head Length: 0.20000000298023224\n          Length: 0.30000001192092896\n          Line Style: Lines\n          Line Width: 0.30000001192092896\n          Name: Sequence1\n          Offset:\n            X: 0\n            Y: 0\n            Z: 0\n          Pose Color: 255; 85; 255\n          Pose Style: None\n          Radius: 0.029999999329447746\n          Shaft Diameter: 0.10000000149011612\n          Shaft Length: 0.10000000149011612\n          Topic: /pose_graph/path_1\n          Unreliable: false\n          Value: true\n        - Alpha: 1\n          Buffer Length: 1\n          Class: rviz/Path\n          Color: 255; 0; 0\n          Enabled: true\n          Head Diameter: 0.30000001192092896\n          Head Length: 0.20000000298023224\n          Length: 0.30000001192092896\n          Line Style: Lines\n          Line Width: 0.30000001192092896\n          Name: Sequence2\n          Offset:\n            X: 0\n            Y: 0\n            Z: 0\n          Pose Color: 255; 85; 255\n          Pose 
Style: None\n          Radius: 0.029999999329447746\n          Shaft Diameter: 0.10000000149011612\n          Shaft Length: 0.10000000149011612\n          Topic: /pose_graph/path_2\n          Unreliable: false\n          Value: true\n        - Alpha: 1\n          Buffer Length: 1\n          Class: rviz/Path\n          Color: 0; 85; 255\n          Enabled: true\n          Head Diameter: 0.30000001192092896\n          Head Length: 0.20000000298023224\n          Length: 0.30000001192092896\n          Line Style: Lines\n          Line Width: 0.30000001192092896\n          Name: Sequence3\n          Offset:\n            X: 0\n            Y: 0\n            Z: 0\n          Pose Color: 255; 85; 255\n          Pose Style: None\n          Radius: 0.029999999329447746\n          Shaft Diameter: 0.10000000149011612\n          Shaft Length: 0.10000000149011612\n          Topic: /pose_graph/path_3\n          Unreliable: false\n          Value: true\n        - Alpha: 1\n          Buffer Length: 1\n          Class: rviz/Path\n          Color: 255; 170; 0\n          Enabled: true\n          Head Diameter: 0.30000001192092896\n          Head Length: 0.20000000298023224\n          Length: 0.30000001192092896\n          Line Style: Lines\n          Line Width: 0.30000001192092896\n          Name: Sequence4\n          Offset:\n            X: 0\n            Y: 0\n            Z: 0\n          Pose Color: 255; 85; 255\n          Pose Style: None\n          Radius: 0.029999999329447746\n          Shaft Diameter: 0.10000000149011612\n          Shaft Length: 0.10000000149011612\n          Topic: /pose_graph/path_4\n          Unreliable: false\n          Value: true\n        - Alpha: 1\n          Buffer Length: 1\n          Class: rviz/Path\n          Color: 255; 170; 255\n          Enabled: true\n          Head Diameter: 0.30000001192092896\n          Head Length: 0.20000000298023224\n          Length: 0.30000001192092896\n          Line Style: Lines\n          Line Width: 
0.029999999329447746\n          Name: Sequence5\n          Offset:\n            X: 0\n            Y: 0\n            Z: 0\n          Pose Color: 255; 85; 255\n          Pose Style: None\n          Radius: 0.029999999329447746\n          Shaft Diameter: 0.10000000149011612\n          Shaft Length: 0.10000000149011612\n          Topic: /pose_graph/path_5\n          Unreliable: false\n          Value: true\n        - Alpha: 1\n          Buffer Length: 1\n          Class: rviz/Path\n          Color: 25; 255; 0\n          Enabled: true\n          Head Diameter: 0.30000001192092896\n          Head Length: 0.20000000298023224\n          Length: 0.30000001192092896\n          Line Style: Lines\n          Line Width: 0.029999999329447746\n          Name: no_loop_path\n          Offset:\n            X: 0\n            Y: 0\n            Z: 0\n          Pose Color: 255; 85; 255\n          Pose Style: None\n          Radius: 0.029999999329447746\n          Shaft Diameter: 0.10000000149011612\n          Shaft Length: 0.10000000149011612\n          Topic: /pose_graph/no_loop_path\n          Unreliable: false\n          Value: true\n      Enabled: true\n      Name: pose_graph\n    - Alpha: 1\n      Buffer Length: 1\n      Class: rviz/Path\n      Color: 164; 0; 0\n      Enabled: true\n      Head Diameter: 0.30000001192092896\n      Head Length: 0.20000000298023224\n      Length: 0.30000001192092896\n      Line Style: Lines\n      Line Width: 0.029999999329447746\n      Name: Path\n      Offset:\n        X: 0\n        Y: 0\n        Z: 0\n      Pose Color: 255; 85; 255\n      Pose Style: None\n      Radius: 0.029999999329447746\n      Shaft Diameter: 0.10000000149011612\n      Shaft Length: 0.10000000149011612\n      Topic: /benchmark_publisher/path\n      Unreliable: false\n      Value: true\n  Enabled: true\n  Global Options:\n    Background Color: 255; 255; 255\n    Default Light: true\n    Fixed Frame: world\n    Frame Rate: 30\n  Name: root\n  Tools:\n    - Class: 
rviz/MoveCamera\n    - Class: rviz/Select\n    - Class: rviz/FocusCamera\n    - Class: rviz/Measure\n    - Class: rviz/SetInitialPose\n      Theta std deviation: 0.2617993950843811\n      Topic: /initialpose\n      X std deviation: 0.5\n      Y std deviation: 0.5\n    - Class: rviz/SetGoal\n      Topic: /move_base_simple/goal\n    - Class: rviz/PublishPoint\n      Single click: true\n      Topic: /clicked_point\n  Value: true\n  Views:\n    Current:\n      Class: rviz/XYOrbit\n      Distance: 63.901485443115234\n      Enable Stereo Rendering:\n        Stereo Eye Separation: 0.05999999865889549\n        Stereo Focal Distance: 1\n        Swap Stereo Eyes: false\n        Value: false\n      Focal Point:\n        X: 7.038261890411377\n        Y: -13.610830307006836\n        Z: 2.141062395821791e-5\n      Focal Shape Fixed Size: true\n      Focal Shape Size: 0.05000000074505806\n      Invert Z Axis: false\n      Name: Current View\n      Near Clip Distance: 0.009999999776482582\n      Pitch: 0.5597963333129883\n      Target Frame: <Fixed Frame>\n      Value: XYOrbit (rviz)\n      Yaw: 4.052229404449463\n    Saved: ~\nWindow Geometry:\n  Displays:\n    collapsed: false\n  Height: 711\n  Hide Left Dock: false\n  Hide Right Dock: true\n  QMainWindow State: 
000000ff00000000fd0000000400000000000001fb0000022cfc0200000010fb0000000a0049006d00610067006501000000280000013d0000000000000000fb0000001200530065006c0065006300740069006f006e00000001e10000009b0000005c00fffffffb0000001e0054006f006f006c002000500072006f007000650072007400690065007302000001ed000001df00000185000000a3fb000000120056006900650077007300200054006f006f02000001df000002110000018500000122fb000000200054006f006f006c002000500072006f0070006500720074006900650073003203000002880000011d000002210000017afb0000002000730065006c0065006300740069006f006e00200062007500660066006500720200000138000000aa0000023a00000294fb00000014005700690064006500530074006500720065006f02000000e6000000d2000003ee0000030bfb0000000c004b0069006e0065006300740200000186000001060000030c00000261fb0000001200720061007700200049006d0061006700650000000028000000300000000000000000fb00000012007200610077005f0069006d0061006700650000000028000000f90000001600fffffffb0000001a0074007200610063006b0065006400200069006d006100670065010000003d0000015d0000001600fffffffb00000020006c006f006f0070005f006d0061007400630068005f0069006d0061006700650000000121000000850000001600fffffffb000000100044006900730070006c00610079007301000001a0000000c9000000c900fffffffc000000280000011e0000000000fffffffa000000000100000002fb0000001200720061007700200049006d0061006700650000000000ffffffff0000000000000000fb0000001a0074007200610063006b0065006400200069006d0061006700650100000000000002370000000000000000fb0000001000410052005f0069006d0061006700650100000373000000160000000000000000fb0000001200720061007700200069006d006100670065010000038f000000160000000000000000000000010000020800000399fc0200000003fb0000001e0054006f006f006c002000500072006f00700065007200740069006500730100000041000000780000000000000000fc00000028000003990000000000fffffffaffffffff0100000002fb000000100044006900730070006c0061007900730000000000ffffffff0000015600fffffffb0000000a00560069006500770073000000023f0000016a0000010000fffffffb0000001200530065006c0065006300740069006f006e010000025a000000b2000000000000000000
00000200000490000000a9fc0100000001fb0000000a00560069006500770073030000004e00000080000002e10000019700000003000005130000003bfc0100000002fb0000000800540069006d0065010000000000000513000002eb00fffffffb0000000800540069006d00650100000000000004500000000000000000000003120000022c00000004000000040000000800000008fc0000000100000002000000010000000a0054006f006f006c00730100000000ffffffff0000000000000000\n  Selection:\n    collapsed: false\n  Time:\n    collapsed: false\n  Tool Properties:\n    collapsed: false\n  Views:\n    collapsed: true\n  Width: 1299\n  X: 67\n  Y: 27\n  loop_match_image:\n    collapsed: false\n  raw_image:\n    collapsed: false\n  tracked image:\n    collapsed: false\n"
  },
  {
    "path": "config/rpvio_sim_config.yaml",
    "content": "%YAML:1.0\n\n#common parameters\nimu_topic: \"/imu_throttled\"\nimage_topic: \"/image\"\nmask_topic: \"/mask\"\noutput_path: \"~/output/\"\n\n#camera calibration \nmodel_type: PINHOLE\ncamera_name: camera\nimage_width: 640\nimage_height: 480\ndistortion_parameters:\n   k1: 0\n   k2: 0\n   p1: 0\n   p2: 0\nprojection_parameters:\n   fx: 320\n   fy: 320\n   cx: 320\n   cy: 240\n\nestimate_extrinsic: 0\n#Rotation from camera frame to imu frame, imu^R_cam\nextrinsicRotation: !!opencv-matrix\n   rows: 3\n   cols: 3\n   dt: d\n   data: [0,0,1,1,0,0,0,1,0]\n\n#Translation from camera frame to imu frame, imu^T_cam\nextrinsicTranslation: !!opencv-matrix\n   rows: 3\n   cols: 1\n   dt: d\n   data: [0.50, 0, 0]\n\n#feature traker parameters\nmax_cnt: 250            # max feature number in feature tracking\nmin_dist: 30            # min distance between two features \nfreq: 10                # frequence (Hz) of publish tracking result. At least 10Hz for good estimation. If set 0, the frequence will be same as raw image \nH_threshold: 1.0        # ransac threshold (pixel)\nshow_track: 1           # publish tracking image as topic\nequalize: 1             # if image is too dark or light, trun on equalize to find enough features\nfisheye: 0              # if using fisheye, trun on it. 
A circle mask will be loaded to remove noisy edge points\n\n#optimization parameters\nmax_solver_time: 0.04   # max solver iteration time (s), to guarantee real time\nmax_num_iterations: 8   # max solver iterations, to guarantee real time\nkeyframe_parallax: 10.0 # keyframe selection threshold (pixel)\n\n#discrete-time imu noise parameters       \nacc_n: 0.074427240     # accelerometer measurement noise standard deviation.\ngyr_n: 0.002759607     # gyroscope measurement noise standard deviation.\nacc_w: 3.9471004e-7    # accelerometer bias random walk noise standard deviation.\ngyr_w: 3.1538983e-8    # gyroscope bias random walk noise standard deviation.\ng_norm: 9.80665        # gravity magnitude\n\n#unsynchronization parameters\nestimate_td: 0                  \ntd: -0.03079132514112524        # initial value of the time offset. unit: s. read image clock + td = real image clock (IMU clock)\n\n#rolling shutter parameters\nrolling_shutter: 0                  # 0: global shutter camera, 1: rolling shutter camera\nrolling_shutter_tr: 0               # unit: s. rolling shutter readout time per frame (from the data sheet). \n\n#visualization parameters\nsave_image: 1                   # save images in the pose graph for visualization purposes; set to 0 to disable \nvisualize_imu_forward: 0        # output imu forward propagation to achieve low-latency and high-frequency results\nvisualize_camera_size: 0.4      # size of the camera marker in RVIZ\n"
  },
  {
    "path": "plane_segmentation/RecoverPlane_perpendicular.py",
    "content": "from __future__ import division\nimport os\nimport time\nimport math\nfrom data_loader_new import DataLoader\nfrom net import *\nfrom utils import *\nimport scipy.misc\nimport random\nimport math\nseed = 8964\ntf.set_random_seed(seed)\nnp.random.seed(seed)\n\nclass RecoverPlane(object):\n    def __init__(self):\n        pass   #do nothing\n    # note in python all self.X are class members. And they can be used without declaration.\n    def build_train_graph(self):\n        opt = self.opt\n        self.num_plane = opt.num_plane\n\n        loader = DataLoader(opt.dataset_dir,\n                            opt.batch_size,\n                            opt.img_height,\n                            opt.img_width,\n                            self.num_scales)                           # a class defined in data_loader\n        with tf.name_scope(\"data_loading\"):\n            tgt_image, tgt_depth_stack, tgt_label_stack, intrinsics = loader.load_train_batch() #, tgt2src\n            \n            tgt_image = self.preprocess_image(tgt_image)              #4*192*320*3\n\n\n        with tf.name_scope(\"planeMask_and_planeParam_prediction\"):\n            pred_param, pred_mask, plane_net_endpoints = \\\n                plane_pred_net(tgt_image,\n                             opt.num_plane,\n                             is_training=True)\n            new_shape  = (3,3,8)\n\n\n        with tf.name_scope(\"mask_color\"):\n            # Generate random colors\n            random.seed(seed)\n            colors = random_colors(opt.num_plane)\n            for i in range(opt.num_plane):\n                colors[i] = [(m*255) for m in colors[i]]\n\n        with tf.name_scope(\"compute_loss\"):\n            plane_loss = 0\n            depth_loss = 0\n            perpendicular_loss = 0\n            tgt_image_all = []\n\n            plane_mask_stack_all = []\n            color_plane_mask_stack_all = []\n            tgt_depth_all = []\n            depth_error_all = []\n         
   non_plane_mask_stack_all = []\n\n\n            for s in range(self.num_scales):\n                # Scale the target images, depth, and labels for computing the loss at the corresponding scale.\n                curr_tgt_image = tf.image.resize_area(tgt_image, \n                    [int(opt.img_height/(2**s)), int(opt.img_width/(2**s))])\n                curr_depth_stack = tf.image.resize_area(tgt_depth_stack,\n                    [int(opt.img_height / (2 ** s)),int(opt.img_width / (2 ** s))])\n                curr_label_stack = tf.image.resize_area(tgt_label_stack,\n                    [int(opt.img_height / (2 ** s)), int(opt.img_width / (2 ** s))])\n\n                # calculate the plane_loss using cross-entropy,\n                # assuming every pixel potentially belongs to a plane (mask = 1)\n                ref_plane_mask = tf.concat([tf.ones(shape=curr_label_stack.get_shape())- curr_label_stack , curr_label_stack], axis=-1)\n                plane_loss += opt.plane_weight * \\\n                              self.compute_plane_reg_loss(pred_mask[s], ref_plane_mask)  # compare with 'ref_plane_mask'\n                              \n                pred_mask_s = tf.nn.softmax(pred_mask[s])\n\n                # get the unscaled ray, K^(-1)q in Eq. 3\n                unscaled_ray =  compute_unscaled_ray(curr_tgt_image, intrinsics[:, s, :, :])\n\n                for p in range(opt.num_plane):\n                    # the left-hand side of Eq. 3\n                    left_plane_eq = compute_plane_equation(curr_tgt_image, pred_param[:,p:p+1,:], unscaled_ray, curr_depth_stack)\n                    depth_error = self.compute_depth_error(left_plane_eq, 1.)\n\n\n\n                    # extract each plane_mask\n                    curr_plane = tf.slice(pred_mask_s,\n                                          [0, 0, 0, p],\n                                          [-1, -1, -1, 1])\n\n                    # depth_loss accumulates the depth_error within the predicted plane region\n     
               depth_loss += tf.reduce_mean(depth_error  * curr_plane)\n\n                    if p == 0:\n                        depth_error_stack = val2uint8(depth_error,0.3)\n                        plane_mask_stack = curr_plane\n                    else:\n                        depth_error_stack = tf.concat([depth_error_stack,\n                                                       val2uint8(depth_error, 0.3) ], axis=-1)\n\n                        plane_mask_stack = tf.concat([plane_mask_stack,\n                                                      curr_plane], axis=-1)\n\n\n                # normalized depth for visualization\n                norm_tgt_depth = ((curr_depth_stack - tf.reduce_min(curr_depth_stack)) /\n                                        (tf.reduce_max(curr_depth_stack) - tf.reduce_min(curr_depth_stack))) * 255\n\n                color_plane_mask = color_mask(self.deprocess_image(curr_tgt_image), pred_mask_s, colors, alpha=0.3)\n\n\n                # stack the results from all scales together\n                tgt_image_all.append(curr_tgt_image)\n                tgt_depth_all.append(norm_tgt_depth)\n                depth_error_all.append(depth_error_stack)\n                plane_mask_stack_all.append(plane_mask_stack)\n                color_plane_mask_stack_all.append(color_plane_mask)\n                non_plane_mask_stack_all.append(pred_mask_s[:,:,:,-1:])\n\n            perpendicular_loss = self.compute_perpendicular_error(pred_param , 1)\n\n\n\n            total_loss = depth_loss + plane_loss  + (perpendicular_loss*0.1)\n\n\n\n        with tf.name_scope(\"train_op\"):\n            train_vars = [var for var in tf.trainable_variables()]\n            optim = tf.train.AdamOptimizer(opt.learning_rate,\n                                           opt.beta1,\n                                           opt.beta2)\n\n            self.train_op = slim.learning.create_train_op(total_loss, optim)\n            self.global_step = tf.Variable(0, \n           
                                name='global_step', \n                                           trainable=False)\n\n            self.incr_global_step = tf.assign(self.global_step, \n                                              self.global_step+1)\n\n\n        # Collect tensors that are useful later (e.g. tf summary)\n        self.pred_param = pred_param\n        self.steps_per_epoch = loader.steps_per_epoch # how many steps are needed to finish one epoch\n        self.total_loss = total_loss\n        self.plane_loss = plane_loss\n        self.depth_loss = depth_loss\n        self.perpendicular_loss = perpendicular_loss\n        self.tgt_image_all = tgt_image_all\n        self.tgt_depth_all = tgt_depth_all\n        self.depth_error_all = depth_error_all\n        self.plane_mask_stack_all = plane_mask_stack_all\n        self.color_plane_mask = color_plane_mask_stack_all\n        self.non_plane_mask_stack_all = non_plane_mask_stack_all\n\n\n\n    def compute_plane_reg_loss(self, pred_in, ref):\n        # Eq. 5 in the paper\n        # subtract the max so exp() cannot overflow to inf\n        pred = pred_in - tf.reduce_max(pred_in, axis=-1,keep_dims=True)\n        pred_plane_only = pred[:, :, :, :-1]\n        # numerically stable implementation of\n        # plane_mask = tf.reduce_logsumexp(pred_plane_only, axis=-1) - tf.reduce_logsumexp(pred, axis=-1)\n        # ensures log() will not explode to -inf\n        pred_plane_only_max = tf.reduce_max(pred_plane_only, axis=-1,keep_dims=True)\n\n        plane_mask = tf.reduce_logsumexp(pred_plane_only - pred_plane_only_max, axis=-1,keep_dims=True) + \\\n                            pred_plane_only_max - tf.reduce_logsumexp(pred, axis=-1,keep_dims=True)\n        # combine the non-plane and plane log(P_pred) together\n        non_plane_mask = pred[:, :, :, -1:] - tf.reduce_logsumexp(pred, axis=-1,keep_dims=True)\n        curr_pred_mask = tf.concat([non_plane_mask, plane_mask], axis=3)\n        \n        # calculate the cross entropy and return\n       
 return  -tf.reduce_mean(tf.reduce_sum(ref * curr_pred_mask, axis=-1) )\n\n\n    def compute_depth_error(self, proj_homo, proj_depth):\n\n        diff = proj_homo - proj_depth\n\n        l1_diff = tf.reduce_sum(\n            tf.abs(diff), axis=-1, keep_dims=True\n        )\n\n        depth_error = l1_diff\n\n        return depth_error\n\n\n    def compute_perpendicular_error(self,  normal_vectors, num):\n\n        # Computes the inter-plane (perpendicularity) loss, vectorized over all plane pairs\n\n        new_normal_vector = tf.transpose(normal_vectors , perm=[0, 2,1 ]) \n\n        result = tf.matmul(new_normal_vector, normal_vectors)  # pairwise dot products between plane normals; shape [8, 3, 3]\n        angle_result = tf.acos(result)\n        zero = tf.zeros([8, 3, 3])\n        ones = tf.ones([8, 3, 3])\n        # target: 0 where the angle between normals exceeds 0.785 rad (~45 deg), 1 otherwise\n        label = tf.where(angle_result > 0.785, zero, ones)\n        mid_res = tf.squared_difference(result, label)\n        res_error = tf.reduce_sum(mid_res, [0, 1, 2])\n        perp_loss = res_error / 8\n\n        return perp_loss\n\n\n\n    def collect_summaries(self):   # tf.summary exports model stats for TensorBoard\n        opt = self.opt\n        tf.summary.scalar(\"total_loss\", self.total_loss)\n        tf.summary.scalar(\"depth_loss\", self.depth_loss)\n        tf.summary.scalar(\"perpendicular_loss\", self.perpendicular_loss)\n        tf.summary.scalar(\"plane_loss\", self.plane_loss)\n\n        for s in range(self.num_scales):\n            tf.summary.image('scale%d_target_image' % s, \\\n                             self.deprocess_image(self.tgt_image_all[s]), max_outputs=opt.batch_size)\n\n\n            tf.summary.image('scale%d_norm_depth_image' % s, \\\n                             self.tgt_depth_all[s], max_outputs=opt.batch_size)\n\n            tf.summary.image('scale%d_color_masks' % s, \\\n                             
self.color_plane_mask[s], max_outputs=opt.batch_size)\n\n            tf.summary.image('scale%d_non_plane_mask' % s, \\\n                             self.non_plane_mask_stack_all[s], max_outputs=opt.batch_size)\n\n            for p in range(opt.num_plane):\n                tf.summary.image('scale%d_plane_mask_num_%d' % (s, p),\n                                 self.plane_mask_stack_all[s][:, :, :, p: p + 1], max_outputs=opt.batch_size)\n\n                tf.summary.image('scale%d_depth_error_%d' % (s, p),\n                                 self.depth_error_all[s][:, :, :, p: p + 1], max_outputs=opt.batch_size)\n\n                if s == 0:\n                    tf.summary.text(\"plane_num_%d_n\" % (p), tf.as_string(self.pred_param[:, p, :]))\n\n\n    def train(self, opt):\n        # TODO: currently fixed to 4\n        self.num_scales = 4\n        self.opt = opt\n        self.build_train_graph()\n        self.collect_summaries()                        # export the results to TensorBoard\n        self.i = 0 \n        with tf.name_scope(\"parameter_count\"):\n            # tf.reduce_prod: computes the product of elements across the dimensions of a tensor;\n            # parameter_count is the total number of parameters\n            parameter_count = tf.reduce_sum([tf.reduce_prod(tf.shape(v)) \\\n                                            for v in tf.trainable_variables()])\n\n        self.saver = tf.train.Saver([var for var in tf.model_variables()] + \\\n                                    [self.global_step],\n                                     max_to_keep=10)\n\n        # save the model variables, keeping at most the 10 most recent checkpoint files\n        sv = tf.train.Supervisor(logdir=opt.checkpoint_dir, \n                                 save_summaries_secs=0, \n                                 saver=None)\n\n        os.environ['CUDA_VISIBLE_DEVICES'] = opt.gpu\n\n        config = tf.ConfigProto()\n        config.gpu_options.allow_growth = True\n        with 
sv.managed_session(config=config) as sess:\n            print('Trainable variables: ')\n            for var in tf.trainable_variables():\n                print(var.name)\n            print(\"parameter_count =\", sess.run(parameter_count))\n            if opt.continue_train:\n                if opt.init_checkpoint_file is None:\n                    checkpoint = tf.train.latest_checkpoint(opt.checkpoint_dir)\n                else:\n                    checkpoint = opt.init_checkpoint_file\n                print(\"Resume training from previous checkpoint: %s\" % checkpoint)\n                self.saver.restore(sess, checkpoint)\n            start_time = time.time()\n            for step in range(1, opt.max_steps):\n                fetches = {\n                    \"train\": self.train_op,\n                    \"global_step\": self.global_step,\n                    \"incr_global_step\": self.incr_global_step,\n                }\n                print(\"step %d/%d\" % (step, opt.max_steps))\n                self.i += 1\n\n\n                if step % opt.summary_freq == 0:\n                    fetches[\"total_loss\"] = self.total_loss\n                    fetches[\"summary\"] = sv.summary_op\n\n                if step >= opt.max_steps - 100:\n                    fetches[\"masked_res\"] = self.color_plane_mask\n\n                results = sess.run(fetches)\n                gs = results[\"global_step\"]\n\n                if step >= opt.max_steps - 100:\n                    last_res = results[\"masked_res\"]\n                    for i in range(opt.batch_size):\n                        scipy.misc.imsave(opt.checkpoint_dir + \"/res_\" + str(step) +\"_\" + str(i) + \".jpg\", last_res[0][i,:,:,:])\n\n                if step % opt.summary_freq == 0:\n                    sv.summary_writer.add_summary(results[\"summary\"], gs)\n                    train_epoch = 
math.ceil(gs / self.steps_per_epoch)    # one epoch means all the data in training set is explored once\n                                                                          # steps_per_epoch is the len of the data_batch,\n                                                                          # gs is the time fetch has been run\n                    train_step = gs - (train_epoch - 1) * self.steps_per_epoch\n\n                    # print the progress of every 100 iterations\n                    print(\"Epoch: [%2d] [%5d/%5d] time: %4.4f/it total_loss: %.3f\" \\\n                            % (train_epoch, train_step, self.steps_per_epoch, \\\n                                (time.time() - start_time)/opt.summary_freq,\\\n                                results[\"total_loss\"]\\\n                               ))\n\n                    start_time = time.time()\n\n                if step % opt.save_latest_freq == 0:\n                    self.save(sess, opt.checkpoint_dir, 'latest')\n\n                if step % self.steps_per_epoch == 0:\n                    self.save(sess, opt.checkpoint_dir, gs)\n\n\n    # build network for testing\n    def build_plane_test_graph(self):\n        input_uint8 = tf.placeholder(tf.uint8, [self.batch_size,\n                                                self.img_height,\n                                                self.img_width, 3], name='raw_input')\n        input_mc = self.preprocess_image(input_uint8)\n        with tf.name_scope(\"plane_predication\"):\n            pred_param, pred_masks, plane_net_endpoints = plane_pred_net(\n                input_mc, num_plane=self.num_plane, is_training=False)\n            pred_mask_0 = tf.nn.softmax(pred_masks[0])\n        pred_mask = pred_mask_0\n        self.inputs = input_uint8\n        self.pred_mask = pred_mask\n        self.pred_param = pred_param\n        self.plane_epts = plane_net_endpoints\n\n\n    def preprocess_image(self, image):\n        # Assuming input image is uint8\n 
       image = tf.image.convert_image_dtype(image, dtype=tf.float32)\n        return image * 2. -1.    # centeralize\n\n    def deprocess_image(self, image):\n        # Assuming input image is float32\n        image = (image + 1.)/2.\n        return tf.image.convert_image_dtype(image, dtype=tf.uint8)\n\n\n    def setup_inference(self,\n                        img_height,\n                        img_width,\n                        num_plane,\n                        batch_size=1):\n        self.img_height = img_height\n        self.img_width = img_width\n        self.num_plane = num_plane\n        self.batch_size = batch_size\n        self.build_plane_test_graph()\n\n\n\n    def inference(self, inputs, sess): #, mode='depth'\n        fetches = {'pred_param':self.pred_param,\n                   'pred_mask':self.pred_mask}\n\n        results = sess.run(fetches, feed_dict={self.inputs:inputs})\n        return results\n\n    def save(self, sess, checkpoint_dir, step):\n        model_name = 'model'\n        print(\" [*] Saving checkpoint to %s...\" % checkpoint_dir)\n        if step == 'latest':\n            self.saver.save(sess, \n                            os.path.join(checkpoint_dir, model_name + '.latest'))\n        else:\n            self.saver.save(sess, \n                            os.path.join(checkpoint_dir, model_name),\n                            global_step=step)\n\n"
  },
  {
    "path": "plane_segmentation/crf_inference.py",
"content": "# Author: Sudarshan\n\nimport sys\nimport numpy as np\nimport pydensecrf.densecrf as dcrf\nimport matplotlib.pyplot as plt\nimport matplotlib.image as mpimg\nimport cv2\n\nimport os\nimport argparse\n\ntry:\n    from cv2 import imread, imwrite\nexcept ImportError:\n    from skimage.io import imread, imsave\n    imwrite = imsave\n\nfrom pydensecrf.utils import unary_from_labels, create_pairwise_bilateral, create_pairwise_gaussian\n\n\ndef CRF_act(fn_im, fn_anno, fn_output, fn_output2):\n\n  anno_rgb = imread(fn_anno).astype(np.uint8)\n  anno_lbl = anno_rgb[:,:,0] + (anno_rgb[:,:,1] << 8) + (anno_rgb[:,:,2] << 16)\n\n  colors_not_used, labels = np.unique(anno_lbl, return_inverse=True)\n\n  img = imread(fn_im)\n  img = cv2.resize(img, (320, 192), interpolation=cv2.INTER_AREA)\n\n  n_labels = 4\n\n  # Unpack the packed 24-bit label values back into separate R, G, B channels\n  colorize = np.empty((len(colors_not_used), 3), np.uint8)\n  colorize[:,0] = (colors_not_used & 0x0000FF)\n  colorize[:,1] = (colors_not_used & 0x00FF00) >> 8\n  colorize[:,2] = (colors_not_used & 0xFF0000) >> 16\n\n  d = dcrf.DenseCRF2D(img.shape[1], img.shape[0], n_labels)\n\n  U = unary_from_labels(labels, n_labels, gt_prob=0.8, zero_unsure=False)\n  d.setUnaryEnergy(U)\n\n  feats = create_pairwise_gaussian(sdims=(5, 5), shape=img.shape[:2])\n  d.addPairwiseEnergy(feats, compat=2,\n                        kernel=dcrf.DIAG_KERNEL,\n                        normalization=dcrf.NORMALIZE_SYMMETRIC)\n\n  feats = create_pairwise_bilateral(sdims=(100, 100), schan=(13, 13, 13),\n                                      img=img, chdim=2)\n  d.addPairwiseEnergy(feats, compat=7,\n                        kernel=dcrf.DIAG_KERNEL,\n                        normalization=dcrf.NORMALIZE_SYMMETRIC)\n\n  Q, tmp1, tmp2 = d.startInference()\n  for i in range(6):\n      print(\"KL-divergence at {}: {}\".format(i, d.klDivergence(Q)))\n      d.stepInference(Q, tmp1, tmp2)\n  MAP = np.argmax(Q, axis=0)\n  MAP = colorize[MAP,:]\n\n  MAP = 
MAP.reshape(img.shape)\n  MAP = cv2.morphologyEx(MAP, cv2.MORPH_CLOSE, kernel)\n\n  print(np.shape(MAP))\n  imwrite(fn_output, MAP)\n\n  # Binarize: any pixel assigned a non-background color becomes foreground\n  for rows in range(np.shape(MAP)[0]):\n      for cols in range(np.shape(MAP)[1]):\n          if MAP[rows][cols][0] > 0:\n              MAP[rows][cols] = 255\n          else:\n              MAP[rows][cols] = 0\n\n  imwrite(fn_output2, MAP)\n\n\n# structuring element for the morphological closing above\nkernel = np.ones((10, 10), np.uint8)\n\nparser = argparse.ArgumentParser()\nparser.add_argument('rgb_image_dir', help='path to original rgb images')\nparser.add_argument('labels_dir', help='path to plane_sgmts_modified output dir from planerecover')\nparser.add_argument('output_dir', help='path to output refined masks')\n\nargs = parser.parse_args()\n\nRGB_image_dir = args.rgb_image_dir\nlabels_dir = args.labels_dir\noutput_dir = args.output_dir\noutput_dir2 = \"/home/sudarshan/planerecover/CRF/Results_advio_single\"\n\nimages = sorted(os.listdir(RGB_image_dir))\nlabels = sorted(os.listdir(labels_dir))\n\nk = 0\nfor i, j in zip(images, labels):\n    k += 1\n\n    if k > 8948:\n        image_name = i  # \"img\" + str(i) + \".jpg\"\n        image_path = os.path.join(RGB_image_dir, image_name)\n\n        labels_name = j  # \"OpenLORIS_images_img\" + str(i) + \".png\"\n        labels_path = os.path.join(labels_dir, labels_name)\n\n        output_name = \"img\" + str(i) + \"_crf_res\" + \".png\"\n        output_path = os.path.join(output_dir, output_name)\n        output_path2 = os.path.join(output_dir2, output_name)\n\n        CRF_act(image_path, labels_path, output_path, output_path2)\n\n        print(\"***********\" + str(image_name) + \"/\" + \"****** \" + str(labels_name) + \"*****\")\n"
  },
  {
    "path": "plane_segmentation/data_loader_new.py",
    "content": "from __future__ import division\nimport os\nimport random\nimport tensorflow as tf\n\nclass DataLoader(object):\n    def __init__(self, \n                 dataset_dir=None, \n                 batch_size=None, \n                 img_height=None, \n                 img_width=None,\n                 num_scales=None):\n        self.dataset_dir = dataset_dir\n        self.batch_size = batch_size\n        self.img_height = img_height\n        self.img_width = img_width\n        self.num_scales = num_scales\n\n    def load_train_batch(self):\n        \"\"\"Load a batch of training instances.\n        \"\"\"\n        seed = random.randint(0, 2**31 - 1)\n\n        # Load the list of training files into queues and shuffle it\n        file_list = self.format_file_list(self.dataset_dir, 'train_8000_recent_working')\n\n        print(\"**************\" + \"LOADED NEW DATA train_8000_recent_working\" + \"***********\")\n        image_paths_queue = tf.train.string_input_producer(\n            file_list['image_file_list'], \n            seed=seed, \n            shuffle=True)\n\n        cam_paths_queue = tf.train.string_input_producer(\n            file_list['cam_file_list'], \n            seed=seed, \n            shuffle=True)\n\n        depth_paths_queue = tf.train.string_input_producer(\n            file_list['depth_file_list'],\n            seed=seed,\n            shuffle=True)\n\n        label_paths_queue = tf.train.string_input_producer(\n            file_list['label_file_list'],\n            seed=seed,\n            shuffle=True)\n\n        self.steps_per_epoch = int(\n            len(file_list['image_file_list'])//self.batch_size)\n\n        # Load images\n        img_reader = tf.WholeFileReader()\n        _, image_contents = img_reader.read(image_paths_queue)\n        image_seq = tf.image.decode_image(image_contents, channels=3)\n        #\n        tgt_image = tf.reshape(image_seq, [self.img_height, self.img_width, 3])\n\t#tgt_image= tf.cast(tgt_image, 
tf.float32)\n\t#print( \" **** images loaded ******************\")\n\n        # Load labels\n        label_reader = tf.WholeFileReader()\n        _, label_contents = label_reader.read(label_paths_queue)\n        label_seq = tf.image.decode_png(label_contents)\n        tgt_label = tf.reshape(label_seq, [self.img_height, self.img_width, 1])\n\n        print(\" CHECKS PASSED   \")\n\n        # Load depths\n        depth_reader = tf.WholeFileReader()\n        _, depth_contents = depth_reader.read(depth_paths_queue)\n        # image_seq = tf.image.decode_jpeg(image_contents)\n        tgt_image_detph = tf.image.decode_png(depth_contents,dtype=tf.uint16 ,channels=1)[:,:, 0]\n        tgt_detph = tf.cast(tgt_image_detph, dtype=tf.float32)\n        tgt_detph = tf.reshape(tgt_detph, [ self.img_height, self.img_width, 1]) \\\n                          / tf.constant(100., dtype=tf.float32,shape=[self.img_height, self.img_width, 1])\n\n        # Load camera intrinsics\n        print(\"------------------- intrininsics inncluded ----------\" )\n        cam_reader = tf.TextLineReader()\n        _, raw_cam_contents = cam_reader.read(cam_paths_queue)\n        rec_def = []\n        for i in range(9):\n            rec_def.append([1.])\n        raw_cam_vec = tf.decode_csv(raw_cam_contents,record_defaults=rec_def)\n        raw_cam_vec = tf.stack(raw_cam_vec)\n        intrinsics = tf.reshape(raw_cam_vec, [3, 3])\n        #intrinsics = [ [1169.621094 , 0.000000,  646.295044] , [ 0.000000, 1167.105103 ,489.927032 ] , [ 0.000000 ,0.000000, 1.000000] ]\n\n        # Form training batches\n        tgt_image, tgt_detph, tgt_label, intrinsics = \\\n                tf.train.batch([tgt_image, tgt_detph,tgt_label, intrinsics],\n                               batch_size=self.batch_size) #it will upload 4 batch from the dataset\n\n        # Data augmentation\n        tgt_image, tgt_detph, tgt_label, intrinsics = self.data_augmentation(\n            tgt_image, tgt_detph, tgt_label,intrinsics, 
self.img_height, self.img_width)\n\n        intrinsics = self.get_multi_scale_intrinsics(\n            intrinsics, self.num_scales)\n\n        return tgt_image, tgt_detph, tgt_label, intrinsics\n\n    def make_intrinsics_matrix(self, fx, fy, cx, cy):\n        # Assumes batch input\n        batch_size = fx.get_shape().as_list()[0]\n        zeros = tf.zeros_like(fx)\n        r1 = tf.stack([fx, zeros, cx], axis=1)\n        r2 = tf.stack([zeros, fy, cy], axis=1)\n        r3 = tf.constant([0.,0.,1.], shape=[1, 3])\n        r3 = tf.tile(r3, [batch_size, 1])\n        intrinsics = tf.stack([r1, r2, r3], axis=1)\n        return intrinsics\n\n    def data_augmentation(self, im, depth, label, intrinsics, out_h, out_w):\n        # Random scaling\n        def random_scaling(im, depth, label, intrinsics):\n            batch_size, in_h, in_w, _ = im.get_shape().as_list()\n            scaling = tf.random_uniform([2], 1, 1.15)\n            x_scaling = scaling[0]\n            y_scaling = scaling[1]\n            out_h = tf.cast(in_h * y_scaling, dtype=tf.int32)\n            out_w = tf.cast(in_w * x_scaling, dtype=tf.int32)\n            im = tf.image.resize_area(im, [out_h, out_w])\n            depth = tf.image.resize_area(depth, [out_h, out_w])\n            label = tf.image.resize_area(label, [out_h, out_w])\n            fx = intrinsics[:,0,0] * x_scaling\n            fy = intrinsics[:,1,1] * y_scaling\n            cx = intrinsics[:,0,2] * x_scaling\n            cy = intrinsics[:,1,2] * y_scaling\n            intrinsics = self.make_intrinsics_matrix(fx, fy, cx, cy)\n            return im, depth, label, intrinsics\n\n        # Random cropping\n        def random_cropping(im, depth, label, intrinsics, out_h, out_w):\n            # batch_size, in_h, in_w, _ = im.get_shape().as_list()\n            batch_size, in_h, in_w, _ = tf.unstack(tf.shape(im))\n            offset_y = tf.random_uniform([1], 0, in_h - out_h + 1, dtype=tf.int32)[0]  #the scale of in_h and out_h can be different\n      
      offset_x = tf.random_uniform([1], 0, in_w - out_w + 1, dtype=tf.int32)[0]  # because of the scaling process runs before it\n            im = tf.image.crop_to_bounding_box(im, offset_y, offset_x, out_h, out_w)\n            depth = tf.image.crop_to_bounding_box(depth, offset_y, offset_x, out_h, out_w)\n            label = tf.image.crop_to_bounding_box(label, offset_y, offset_x, out_h, out_w)\n            fx = intrinsics[:,0,0]\n            fy = intrinsics[:,1,1]\n            cx = intrinsics[:,0,2] - tf.cast(offset_x, dtype=tf.float32)\n            cy = intrinsics[:,1,2] - tf.cast(offset_y, dtype=tf.float32)\n            intrinsics = self.make_intrinsics_matrix(fx, fy, cx, cy)\n            return im, depth, label, intrinsics\n        im, depth, label, intrinsics = random_scaling(im, depth,label, intrinsics)\n        im, depth, label, intrinsics = random_cropping(im, depth, label, intrinsics, out_h, out_w)\n        im = tf.cast(im, dtype=tf.uint8)\n        return im,depth,label, intrinsics\n\n    def format_file_list(self, data_root, split):\n        with open(data_root + '/%s.txt' % split, 'r') as f:\n            frames = f.readlines()\n        subfolders = [x.split(' ')[0] for x in frames]\n        frame_ids = [x.split(' ')[1][:-1] for x in frames]\n        image_file_list = [os.path.join(data_root, subfolders[i], \n            frame_ids[i] + '.jpg') for i in range(len(frames))]\n        #print(image_file_list)\n\n        cam_file_list = [os.path.join(data_root, subfolders[i], \n            frame_ids[i] + '_cam.txt') for i in range(len(frames))]\n\n        depth_file_list = [os.path.join(data_root, subfolders[i],\n            frame_ids[i] + '_depth.png') for i in range(len(frames))]\n\n        label_file_list = [os.path.join(data_root, subfolders[i],\n            frame_ids[i] + '_label.png') for i in range(len(frames))]\n\n        all_list = {}\n        all_list['image_file_list'] = image_file_list\n        all_list['cam_file_list'] = cam_file_list\n        
all_list['depth_file_list'] = depth_file_list\n        all_list['label_file_list']= label_file_list\n        return all_list\n\n\n    def get_multi_scale_intrinsics(self, intrinsics, num_scales):\n        intrinsics_mscale = []\n        # Scale the intrinsics accordingly for each scale\n        for s in range(num_scales):\n            fx = intrinsics[:,0,0]/(2 ** s)\n            fy = intrinsics[:,1,1]/(2 ** s)\n            cx = intrinsics[:,0,2]/(2 ** s)\n            cy = intrinsics[:,1,2]/(2 ** s)\n            intrinsics_mscale.append(\n                self.make_intrinsics_matrix(fx, fy, cx, cy))\n        intrinsics_mscale = tf.stack(intrinsics_mscale, axis=1)\n        return intrinsics_mscale\n"
  },
  {
    "path": "plane_segmentation/inference.py",
"content": "from __future__ import division\nimport tensorflow as tf\nimport numpy as np\nimport os\nimport random\nimport colorsys\nimport time\nimport scipy.misc\nimport PIL.Image as pil\nimport cv2\n\n\n# from utils_depth_only import *\nfrom RecoverPlane_perpendicular import RecoverPlane\n\n'''\nTest the trained result on SYNTHIA (for plane params)\n@author -- Fengting Yang\n\n@modified by Sudarshan\n\n@usage:\n    evaluate the trained plane parameters with the depth-prediction metrics.\n\n@output:\n   1. plane masks and their visualization\n   2. predicted depth maps\n   3. depth-prediction statistics (printed to the terminal)\n\n\n@parameters:\n   1. the main parameters can be seen in the FLAGS below\n   2. intrinsics:       the camera intrinsics; if the image is resized, reset the intrinsics accordingly\n'''\n\n\nflags = tf.app.flags\nflags.DEFINE_integer(\"batch_size\", 4, \"The size of a sample batch\")\nflags.DEFINE_integer(\"img_height\", 192, \"Image height\")\nflags.DEFINE_integer(\"img_width\", 320, \"Image width\")\nflags.DEFINE_integer(\"num_plane\", 3, \"Number of planes\")\nflags.DEFINE_boolean(\"use_preprocessed\", True, 'whether to use the preprocessed data we provide for testing')\nflags.DEFINE_string(\"dataset_dir\", '', \"Filtered Dataset directory\")\nflags.DEFINE_string(\"output_dir\", '', \"Output directory\")\nflags.DEFINE_string(\"gpu\", \"0\", \"GPU ID\")\nflags.DEFINE_string(\"test_list\", 'data_pre_processing/SYNTHIA/tst_100.txt', \"Test list\")\nflags.DEFINE_string(\"ckpt_file\", 'pre_trained_model/synthia_498000', \"checkpoint file\")\nFLAGS = flags.FLAGS\n\nos.environ['CUDA_VISIBLE_DEVICES'] = FLAGS.gpu\n\nprint(FLAGS.gpu)\n\n#intrinsics =  np.array(([[133.185088,0.,160.000000], [ 0., 134.587036,96.000000], [0., 0., 1.]]))\nfocalLength_x =133.185075\nfocalLength_y = 134.587036\ncenterX = 160.000000\ncenterY =  96.000000\n\nTEST_LIST = str(FLAGS.test_list)\nnum_test = 100\nMAX_DEPTH = 100.\nMIN_DEPTH = 0.1\n\nseed = 999\n\n\n# 
****************************colorful mask part*************************************\n# Usage: apply different color to each plane\n#        the plane determination is based on the plane_threshold = 0.5 now\n#        and the area without additional color are belong to non-plane\n#\n\ndef random_colors(N, bright=True):\n    \"\"\"\n    Generate random colors.\n    To get visually distinct colors, generate them in HSV space then\n    convert to RGB.\n    \"\"\"\n    brightness = 1.0 if bright else 0.7\n    hsv = [(i / N, 1, brightness) for i in range(N)]\n    colors = list(map(lambda c: colorsys.hsv_to_rgb(*c), hsv))\n    random.seed(seed)\n    random.shuffle(colors)\n    return colors\n\n\ndef apply_mask(image, mask, max_mask, color, alpha=0.5):\n    \"\"\"Apply the given mask to the image.\n    \"\"\"\n    for c in range(3):\n        image[:, :, c] = np.where(mask == max_mask,\n                                  image[:, :, c] *\n                                  (1 - alpha) + alpha * color[c] * 255,\n                                  image[:, :, c])\n    return image\n\ndef color_mask(image, pred_masks, colors, alpha=0.5 ):\n    '''do iteration to assign the color to the corrosponding mask\n       Based on experiment, the first plane will be red, and the second one will be green-blue, when num=2\n       and drak blue ,red ,green for num=3\n    '''\n    N = FLAGS.num_plane\n    masked_image = np.copy(image)\n    #new_colors = [ [ 255,125,255 ] , [255,255,255] , [125,255,0]  ] \n   #  masked_image = image  # change the original color as well, so the model could have the plane color when we visualize it\n    max_mask = np.max(pred_masks, axis=-1)\n    for i in range(N):\n        color = colors[i]\n        mask = pred_masks[:,:,i]\n        masked_image = apply_mask(masked_image, mask, max_mask, color, alpha)\n\n    return masked_image\n\n\n#**************************************get the predicted plane mask *******************\ndef thres_mask(pred_masks, num_plane):\n    
'''\n        find each plane's mask using the argmax(prob) approach\n    '''\n    thres_mask = np.zeros(pred_masks.shape)                 # num_plane + 1 channels; the last one is non-plane\n    max_mask = np.max(pred_masks, axis=-1)  # for depth prediction, only the plane possibility is used\n    for p in range(num_plane + 1):\n        plane_mask = pred_masks[:, :, p]\n        # in each channel, the region with value 1 corresponds to the area of that plane\n        thres_mask[:,:,p] = np.where(plane_mask == max_mask, thres_mask[:,:,p]+1., thres_mask[:,:,p])\n\n    return thres_mask\n\n#************************************get depth from plane parameters****************************\ndef meshgrid(height, width, is_homogeneous=True):\n  \"\"\"Construct a 2D meshgrid.\n\n  Args:\n    height: height of the grid\n    width: width of the grid\n    is_homogeneous: whether to return in homogeneous coordinates\n  Returns:\n    x,y grid coordinates [2 (3 if homogeneous), height, width]\n  \"\"\"\n  x_t = np.matmul(np.ones(shape=[height,1]),np.expand_dims(np.linspace(-1.,1,width),1).T)\n  y_t = np.matmul(np.expand_dims(np.linspace(-1.0, 1.0, height), 1),np.ones(shape=np.stack([1, width])))\n\n  x_t = (x_t + 1.0) * 0.5 * (width - 1)\n  y_t = (y_t + 1.0) * 0.5 * (height - 1)\n  if is_homogeneous:\n    ones = np.ones_like(x_t)\n    coords = np.stack([x_t, y_t, ones], axis=0)\n  else:\n    coords = np.stack([x_t, y_t], axis=0)\n  return coords\n\ndef compute_depth(img, pred_param, num_plane, intrinsics):\n    height = img.shape[0]\n    width  = img.shape[1]\n\n    # Construct pixel grid coordinates, [3, height, width]\n    pixel_coords = meshgrid(height, width)\n\n    cam_coords = np.reshape(pixel_coords, [3, -1])\n\n    unscaled_ray_Q = np.matmul(np.linalg.inv(intrinsics), cam_coords)\n\n    for k in range(num_plane):\n        n_div_d = np.expand_dims(pred_param[ k, :], axis=0)\n        scale =  1./ (np.matmul(n_div_d, np.matmul(np.linalg.inv(intrinsics), cam_coords)) + 
1e-10)\n\n        plane_based_Q = scale * unscaled_ray_Q\n        plane_based_Q = np.reshape(plane_based_Q, [3, height, width])\n        plane_based_Q = np.transpose(plane_based_Q, [1, 2, 0])\n\n        if k == 0:\n            plane_depth_stack = plane_based_Q[:,:,-1:]\n        else:\n            plane_depth_stack = np.concatenate([plane_depth_stack,\n                                         plane_based_Q[ :,:, -1:]], axis=-1)\n\n    return plane_depth_stack\n\n#*********************************compute depth error**************************************\ndef compute_errors(gt, pred):\n    b_empty = (gt.size == 0)        # no ground truth available\n    b_non_zero = np.all(gt * pred)  # False if any entry of gt or pred is 0\n    if b_empty or not b_non_zero:\n        return [-100, -100, -100, -100, -100, -100, -100]   # this test image will be ignored\n    thresh = np.maximum((gt / pred), (pred / gt))\n    a1 = (thresh < 1.25   ).mean()\n    a2 = (thresh < 1.25 ** 2).mean()\n    a3 = (thresh < 1.25 ** 3).mean()\n\n    rmse = (gt - pred) ** 2\n    rmse = np.sqrt(rmse.mean())\n\n    rmse_log = (np.log(gt) - np.log(pred)) ** 2\n    rmse_log = np.sqrt(rmse_log.mean())\n\n    abs_rel = np.mean(np.abs(gt - pred) / gt)\n\n    sq_rel = np.mean(((gt - pred)**2) / gt)\n\n    return abs_rel, sq_rel, rmse, rmse_log, a1, a2, a3\n\n\n#*****************************************************************************************************\n\ndef main(_):\n    with open(TEST_LIST, 'r') as f:\n        test_files_list = []\n        #depth_file_list = []\n        test_files = f.readlines()\n        for t in test_files:\n            t_split = t[:-1].split()\n\n            if not FLAGS.use_preprocessed:\n                # use these two lines only if you preprocessed the dataset from scratch\n                test_files_list.append(FLAGS.dataset_dir + '/' +  t_split[-1] )\n                #depth_file_list.append(FLAGS.dataset_dir + '/' + t_split[0] +'/'+ t_split[-1] + 
'_depth.png')\n            else:\n                # use these two lines if you use our preprocessed dataset\n                if t_split[0] == '22': # seq 22 is not available in our preprocessed dataset, see README for more details\n                    continue\n                test_files_list.append(FLAGS.dataset_dir + '/' + t_split[0] +'/'+ t_split[-1] )\n                #depth_file_list.append(FLAGS.dataset_dir + '/' + t_split[0] +'/'+ t_split[-1] + '_depth.png')\n\n    if not os.path.exists(FLAGS.output_dir):\n        os.makedirs(FLAGS.output_dir)\n    basename = os.path.basename(FLAGS.ckpt_file)\n\n    # to ensure the consistant color map\n    default_top_five_colors = [(0.8, 0.0, 1.0), (0.8, 1.0, 0.0), (0.0, 1.0, 0.4), (1.0, 0.0, 0.0), (0.0, 0.4, 1.0)]\n    if FLAGS.num_plane <= 5:\n        colors = default_top_five_colors\n    else:\n        # Generate random colors\n        colors = random_colors( FLAGS.num_plane)\n        cnt = 0\n        for i in default_top_five_colors:\n            if i not in colors:\n                colors[cnt] = i\n                cnt += 1\n\n\n\n    planeRecover = RecoverPlane()\n    planeRecover.setup_inference(img_height=FLAGS.img_height,\n                        img_width=FLAGS.img_width,\n                        batch_size=FLAGS.batch_size,\n                        num_plane=FLAGS.num_plane\n                       )\n\n    rms = np.zeros(num_test, np.float32)\n    log_rms = np.zeros(num_test, np.float32)\n    abs_rel = np.zeros(num_test, np.float32)\n    sq_rel = np.zeros(num_test, np.float32)\n    a1 = np.zeros(num_test, np.float32)\n    a2 = np.zeros(num_test, np.float32)\n    a3 = np.zeros(num_test, np.float32)\n\n    avg_time = 0.\n    saver = tf.train.Saver([var for var in tf.model_variables()])\n    config = tf.ConfigProto()\n    config.gpu_options.allow_growth = True\n    with tf.Session(config=config) as sess:\n        saver.restore(sess, FLAGS.ckpt_file)\n\n        pred_all_masks = []\n        pred_all_param = []\n       
 for t in range(0, len(test_files_list), FLAGS.batch_size):\n            if t % 100 == 0:\n                print('processing %s: %d/%d' % (basename, t, len(test_files_list)))\n            inputs = np.zeros(\n                (FLAGS.batch_size, FLAGS.img_height, FLAGS.img_width, 3),\n                dtype=np.uint8)\n            for b in range(FLAGS.batch_size):\n                idx = t + b\n                if idx >= len(test_files_list):\n                    break\n                fh = open(test_files_list[idx], 'r')\n                raw_im =  cv2.imread(test_files_list[idx]) #pil.open(fh)\n                try:\n                    scaled_im = cv2.resize(raw_im , (FLAGS.img_width, FLAGS.img_height), interpolation = cv2.INTER_AREA)\n                    inputs[b] = np.array(scaled_im)\n\n                except:\n\n                    print( str(test_files_list[idx]) + \"is not read\")\n\n            start_time = time.time()\n            pred = planeRecover.inference(inputs, sess)\n            cost_time = time.time() - start_time\n            avg_time += cost_time\n            print(\"No.%d batch cost_time: %f/img\" % (int(t / FLAGS.batch_size), cost_time / FLAGS.batch_size))\n\n\n            for b in range(FLAGS.batch_size):\n                idx = t + b\n                if idx >= len(test_files_list):\n                    break\n\n                color_plane_mask = color_mask(inputs[b], pred['pred_mask'][b, :, :, :], colors, alpha=0.6)\n                \n                thres_masks = thres_mask(pred['pred_mask'][b, :, :, :],\n                                         FLAGS.num_plane)  # this will include non-plane mask at the last channle for depth\n\n                #pred_depth = compute_depth(inputs[b], pred['pred_param'][b, :, :], FLAGS.num_plane, intrinsics)\n                #masked_pred_depth = np.zeros([FLAGS.img_height, FLAGS.img_width, 1])\n                combined_mask = np.zeros([FLAGS.img_height, FLAGS.img_width, 1])\n\n\n                for p in 
range(FLAGS.num_plane):\n                    #masked_pred_depth += pred_depth[:, :, p: p + 1] * thres_masks[:, :, p: p + 1]\n                    combined_mask += (p + 1) * thres_masks[:, :, p: p + 1]  # for matlab to eval all the plane should start from 1, and the last channle will be 0 as non-plane\n\n\n                name = test_files_list[idx].split('/')\n                if not FLAGS.use_preprocessed:\n                    pic_name = name[-3] + '_' + name[-1]\n                else:\n                    pic_name = name[-2] + '_' + name[-1].replace('.jpg', '.png')\n\n                visual_path = FLAGS.output_dir + '/plane_sgmts_vis/'\n                eval_mask_path = FLAGS.output_dir + '/plane_sgmts/'\n                modified_eval_mask_path = FLAGS.output_dir + '/plane_sgmts_modfied/'\n\n                if not os.path.exists(visual_path):\n                    os.makedirs(visual_path)\n\n                if not os.path.exists(eval_mask_path):\n                    os.makedirs(eval_mask_path)\n                    os.makedirs(modified_eval_mask_path)\n\n                scipy.misc.imsave(visual_path + pic_name, color_plane_mask)\n                #combined_mask = combined_mask\n                \n                cv2.imwrite(eval_mask_path + pic_name, combined_mask)  # misc will normalize the number to 255 not good\n                modified_combined_mask = combined_mask*60\n                cv2.imwrite(modified_eval_mask_path + pic_name, modified_combined_mask)\n                '''\n\n                #*****************************depth eval************************\n\t\t#print(str(depth_file_list[idx]) ) \n\n                gt_depth = cv2.imread(depth_file_list[idx], cv2.IMREAD_ANYCOLOR | cv2.IMREAD_ANYDEPTH)[:, :, 0] / 100.\n\n                gt_height, gt_width = gt_depth.shape\n                masked_pred_depth = cv2.resize(masked_pred_depth, dsize=( gt_width, gt_height),interpolation=cv2.INTER_NEAREST)  \n                plane_area = cv2.resize(combined_mask, 
dsize=(gt_width, gt_height),interpolation=cv2.INTER_NEAREST)\n                pred_depth_save = masked_pred_depth\n\n\n                mask = np.logical_and(gt_depth > MIN_DEPTH,\n                                      gt_depth < MAX_DEPTH)\n\n                mask = np.logical_and(mask, plane_area > 0) # only eval the plane area\n\n                pred_depth_save = pred_depth_save * mask * 100\n\n                seq = depth_file_list[idx].split('/')\n                if not FLAGS.use_preprocessed:\n                    seq_id = seq[-3]\n                    name_id = seq[-1]\n                else:\n                    seq_id = seq[-2]\n                    name_id = seq[-1].replace('_depth.png', '.png')\n\n                if not os.path.exists(FLAGS.output_dir + '/depth/' ):\n                    os.makedirs(FLAGS.output_dir + '/depth/' )\n                save_pred_path = FLAGS.output_dir + '/depth/' + seq_id + '_' + name_id[:-4] + '_pred.png'\n\n                # save pred_depth and error\n                cv2.imwrite(save_pred_path, pred_depth_save.astype(np.uint16))\n\n                masked_pred_depth[masked_pred_depth < MIN_DEPTH] = MIN_DEPTH\n                masked_pred_depth[masked_pred_depth > MAX_DEPTH] = MAX_DEPTH\n\n                gt_depth_validate = gt_depth[mask]\n                pred_depth_validate = masked_pred_depth[mask]\n\n                abs_rel[idx], sq_rel[idx], rms[idx], log_rms[idx], a1[idx], a2[idx], a3[idx] = \\\n                    compute_errors(gt_depth_validate, pred_depth_validate)\n\n\n                # test if the thres is correct\n                # for p in range(FLAGS.num_plane):\n                #     thres_name = name[-3] + '_' + name[-1][:-4] + '_' + str(p) + '.png'\n                #     scipy.misc.imsave(visual_path + thres_name, thres_masks[:,:,p])\n\n                pred_all_masks.append(pred['pred_mask'][b,:,:,:])\n                pred_all_param.append(pred['pred_param'][b,:,:])\n\n\n        abs_rel = abs_rel[abs_rel != -100]\n   
     sq_rel = sq_rel[sq_rel != -100]\n        rms = rms[rms != -100]\n        log_rms = log_rms[log_rms != -100]\n        a1 = a1[a1 != -100]\n        a2 = a2[a2 != -100]\n        a3 = a3[a3 != -100]\n\n        print(\"{:>10}, {:>10}, {:>10}, {:>10}, {:>10}, {:>10}, {:>10}\".format('abs_rel', 'sq_rel', 'rms',\n                                                                              'log_rms', 'a1', 'a2', 'a3'))\n        print(\"{:10.4f}, {:10.4f}, {:10.4f}, {:10.4f}, {:10.4f}, {:10.4f}, {:10.4f}\".format(\n            abs_rel.mean(), sq_rel.mean(), rms.mean(), log_rms.mean(), a1.mean(), a2.mean(), a3.mean()))\n        print('avg_time %.4f/img' %(avg_time/num_test))\n\n\t'''\n\nif __name__ == '__main__':\n    tf.app.run()\n"
  },
  {
    "path": "plane_segmentation/net.py",
    "content": "from __future__ import division\nimport tensorflow as tf\nimport tensorflow.contrib.slim as slim\nfrom tensorflow.contrib.layers.python.layers import utils\nimport numpy as np\nSCALING = 1.0\nBIAS = 0.\n\ndef resize_like(inputs, ref):\n    iH, iW = inputs.get_shape()[1], inputs.get_shape()[2]\n    rH, rW = ref.get_shape()[1], ref.get_shape()[2]\n    if iH == rH and iW == rW:\n        return inputs\n    return tf.image.resize_nearest_neighbor(inputs, [rH.value, rW.value])\n\n\ndef plane_pred_net(tgt_image, num_plane, is_training=True):\n    H = tgt_image.get_shape()[1].value\n    W = tgt_image.get_shape()[2].value\n    with tf.variable_scope('depth_net') as sc:                          # design the nn architecture for the depth network\n        end_points_collection = sc.original_name_scope + '_end_points'\n        with slim.arg_scope([slim.conv2d, slim.conv2d_transpose],       #define a conv2d operator with fixed params shown below\n                            normalizer_fn=None,\n                            weights_regularizer=slim.l2_regularizer(0.05), # using l2 regularizer with 0.05 weight\n                            activation_fn=tf.nn.relu,\n                            outputs_collections=end_points_collection):\n\n            #for slim.conv2d the default padding mode = 'same'\n            cnv1  = slim.conv2d(tgt_image, 32,  [7, 7], stride=2, scope='cnv1') #4*96*160*32\n            cnv1b = slim.conv2d(cnv1,  32,  [7, 7], stride=1, scope='cnv1b')\n            cnv2  = slim.conv2d(cnv1b, 64,  [5, 5], stride=2, scope='cnv2')     #4*48*80*64\n            cnv2b = slim.conv2d(cnv2,  64,  [5, 5], stride=1, scope='cnv2b')\n            cnv3  = slim.conv2d(cnv2b, 128, [3, 3], stride=2, scope='cnv3')     #4*24*40*128\n            cnv3b = slim.conv2d(cnv3,  128, [3, 3], stride=1, scope='cnv3b')\n            cnv4  = slim.conv2d(cnv3b, 256, [3, 3], stride=2, scope='cnv4')     #4*12*20*256\n            cnv4b = slim.conv2d(cnv4,  256, [3, 3], stride=1, 
scope='cnv4b')\n            cnv5  = slim.conv2d(cnv4b, 512, [3, 3], stride=2, scope='cnv5')\n            cnv5b = slim.conv2d(cnv5,  512, [3, 3], stride=1, scope='cnv5b')    # 4*6*10*256\n\n            with tf.variable_scope('param'):\n                cnv6_plane  = slim.conv2d(cnv5b, 512, [3, 3], stride=2, scope='cnv6_plane')         # 4*3*5*256\n                cnv7_plane  = slim.conv2d(cnv6_plane, 512, [3, 3], stride=2, scope='cnv7_plane')    # 4*2*3*256\n                param_pred = slim.conv2d(cnv7_plane, 3*(num_plane), [1, 1], scope='param',          # 4*2*3*3n\n                    stride=1, normalizer_fn=None, activation_fn=None)\n                param_avg = tf.reduce_mean(param_pred, [1, 2])  #4*3n\n                # Empirically we found that scaling by a small constant facilitates training.\n                param_final = 0.01 * tf.reshape(param_avg, [-1, (num_plane), 3]) #4*n*3, B*n*num_param for n planes in tgt\n\n            with tf.variable_scope('mask'):\n                upcnv5 = slim.conv2d_transpose(cnv5b, 256, [3, 3], stride=2, scope='upcnv5')\n                i5_in  = tf.concat([upcnv5, cnv4b], axis=3)\n                icnv5  = slim.conv2d(i5_in, 256, [3, 3], stride=1, scope='icnv5')                      # 4*12*20*256\n\n                upcnv4 = slim.conv2d_transpose(icnv5, 128, [3, 3], stride=2, scope='upcnv4')           # 4*24*40*128\n                i4_in  = tf.concat([upcnv4, cnv3b], axis=3)\n                icnv4  = slim.conv2d(i4_in, 128, [3, 3], stride=1, scope='icnv4')\n                segm4  = SCALING * slim.conv2d(icnv4, num_plane + 1,   [3, 3], stride=1,\n                        activation_fn=None, normalizer_fn=None, scope='disp4') + BIAS                 # 4*24*40*(1+n)\n                segm4_up = tf.image.resize_bilinear(segm4, [int(H/4), int(W/4)])  # np.int removed in NumPy >= 1.24; use builtin int\n\n                upcnv3 = slim.conv2d_transpose(icnv4, 64,  [3, 3], stride=2, scope='upcnv3')           # 4*48*80*64\n                i3_in  = tf.concat([upcnv3, cnv2b, segm4_up], 
axis=3)\n                icnv3  = slim.conv2d(i3_in, 64,  [3, 3], stride=1, scope='icnv3')\n                segm3  = SCALING * slim.conv2d(icnv3, num_plane + 1,   [3, 3], stride=1,               #4*48*80*(1+n)\n                    activation_fn=None, normalizer_fn=None, scope='disp3') + BIAS\n                segm3_up = tf.image.resize_bilinear(segm3, [int(H/2), int(W/2)])  # np.int removed in NumPy >= 1.24; use builtin int\n\n                upcnv2 = slim.conv2d_transpose(icnv3, 32,  [3, 3], stride=2, scope='upcnv2')            # 4*96*160*32\n                i2_in  = tf.concat([upcnv2, cnv1b, segm3_up], axis=3)\n                icnv2  = slim.conv2d(i2_in, 32,  [3, 3], stride=1, scope='icnv2')\n                segm2  = SCALING * slim.conv2d(icnv2, num_plane + 1,   [3, 3], stride=1,                 #4*96*160*(n+1)\n                    activation_fn=None, normalizer_fn=None, scope='disp2') + BIAS\n                segm2_up = tf.image.resize_bilinear(segm2, [H, W])\n\n                upcnv1 = slim.conv2d_transpose(icnv2, 16,  [3, 3], stride=2, scope='upcnv1')            #4*192*320*16\n                i1_in  = tf.concat([upcnv1, segm2_up], axis=3)\n                icnv1  = slim.conv2d(i1_in, 16,  [3, 3], stride=1, scope='icnv1')                       #4*192*320*(n+1)\n                segm1  = SCALING * slim.conv2d(icnv1, num_plane + 1,   [3, 3], stride=1,\n                    activation_fn=None, normalizer_fn=None, scope='disp1') + BIAS\n\n            end_points = utils.convert_collection_to_dict(end_points_collection)\n            return param_final, [segm1, segm2, segm3, segm4], end_points\n"
  },
  {
    "path": "plane_segmentation/openloris.txt",
    "content": "OpenLORIS_images img5314.jpg\nOpenLORIS_images img96.jpg\nOpenLORIS_images img268.jpg\nOpenLORIS_images img4489.jpg\nOpenLORIS_images img4539.jpg\nOpenLORIS_images img870.jpg\nOpenLORIS_images img197.jpg\nOpenLORIS_images img5397.jpg\nOpenLORIS_images img666.jpg\nOpenLORIS_images img1147.jpg\nOpenLORIS_images img113.jpg\nOpenLORIS_images img2986.jpg\nOpenLORIS_images img1138.jpg\nOpenLORIS_images img2658.jpg\nOpenLORIS_images img2920.jpg\nOpenLORIS_images img360.jpg\nOpenLORIS_images img1074.jpg\nOpenLORIS_images img5622.jpg\nOpenLORIS_images img219.jpg\nOpenLORIS_images img4272.jpg\nOpenLORIS_images img1308.jpg\nOpenLORIS_images img5827.jpg\nOpenLORIS_images img3532.jpg\nOpenLORIS_images img140.jpg\nOpenLORIS_images img3453.jpg\nOpenLORIS_images img2149.jpg\nOpenLORIS_images img891.jpg\nOpenLORIS_images img4264.jpg\nOpenLORIS_images img6088.jpg\nOpenLORIS_images img1665.jpg\nOpenLORIS_images img3305.jpg\nOpenLORIS_images img2530.jpg\nOpenLORIS_images img878.jpg\nOpenLORIS_images img1559.jpg\nOpenLORIS_images img4105.jpg\nOpenLORIS_images img4566.jpg\nOpenLORIS_images img942.jpg\nOpenLORIS_images img1382.jpg\nOpenLORIS_images img480.jpg\nOpenLORIS_images img167.jpg\nOpenLORIS_images img3691.jpg\nOpenLORIS_images img4288.jpg\nOpenLORIS_images img5014.jpg\nOpenLORIS_images img1798.jpg\nOpenLORIS_images img1436.jpg\nOpenLORIS_images img5788.jpg\nOpenLORIS_images img1045.jpg\nOpenLORIS_images img3274.jpg\nOpenLORIS_images img4418.jpg\nOpenLORIS_images img4196.jpg\nOpenLORIS_images img3359.jpg\nOpenLORIS_images img3111.jpg\nOpenLORIS_images img657.jpg\nOpenLORIS_images img4777.jpg\nOpenLORIS_images img807.jpg\nOpenLORIS_images img2580.jpg\nOpenLORIS_images img2800.jpg\nOpenLORIS_images img4918.jpg\nOpenLORIS_images img642.jpg\nOpenLORIS_images img2153.jpg\nOpenLORIS_images img6023.jpg\nOpenLORIS_images img1506.jpg\nOpenLORIS_images img439.jpg\nOpenLORIS_images img5945.jpg\nOpenLORIS_images img2272.jpg\nOpenLORIS_images img3606.jpg\nOpenLORIS_images 
img1723.jpg\nOpenLORIS_images img3041.jpg\nOpenLORIS_images img2644.jpg\nOpenLORIS_images img2870.jpg\nOpenLORIS_images img1160.jpg\nOpenLORIS_images img1268.jpg\nOpenLORIS_images img1175.jpg\nOpenLORIS_images img998.jpg\nOpenLORIS_images img5307.jpg\nOpenLORIS_images img554.jpg\nOpenLORIS_images img5395.jpg\nOpenLORIS_images img4286.jpg\nOpenLORIS_images img4500.jpg\nOpenLORIS_images img6026.jpg\nOpenLORIS_images img6097.jpg\nOpenLORIS_images img969.jpg\nOpenLORIS_images img810.jpg\nOpenLORIS_images img1377.jpg\nOpenLORIS_images img329.jpg\nOpenLORIS_images img1272.jpg\nOpenLORIS_images img3459.jpg\nOpenLORIS_images img5401.jpg\nOpenLORIS_images img4271.jpg\nOpenLORIS_images img2087.jpg\nOpenLORIS_images img4952.jpg\nOpenLORIS_images img4445.jpg\nOpenLORIS_images img2290.jpg\nOpenLORIS_images img3419.jpg\nOpenLORIS_images img3525.jpg\nOpenLORIS_images img5643.jpg\nOpenLORIS_images img463.jpg\nOpenLORIS_images img5576.jpg\nOpenLORIS_images img3045.jpg\nOpenLORIS_images img6138.jpg\nOpenLORIS_images img4001.jpg\nOpenLORIS_images img6078.jpg\nOpenLORIS_images img3962.jpg\nOpenLORIS_images img3235.jpg\nOpenLORIS_images img4744.jpg\nOpenLORIS_images img66.jpg\nOpenLORIS_images img3429.jpg\nOpenLORIS_images img1444.jpg\nOpenLORIS_images img2902.jpg\nOpenLORIS_images img3772.jpg\nOpenLORIS_images img367.jpg\nOpenLORIS_images img5527.jpg\nOpenLORIS_images img1445.jpg\nOpenLORIS_images img4532.jpg\nOpenLORIS_images img1530.jpg\nOpenLORIS_images img3421.jpg\nOpenLORIS_images img4354.jpg\nOpenLORIS_images img4753.jpg\nOpenLORIS_images img3572.jpg\nOpenLORIS_images img57.jpg\nOpenLORIS_images img412.jpg\nOpenLORIS_images img4799.jpg\nOpenLORIS_images img3783.jpg\nOpenLORIS_images img2500.jpg\nOpenLORIS_images img3537.jpg\nOpenLORIS_images img1705.jpg\nOpenLORIS_images img5201.jpg\nOpenLORIS_images img5507.jpg\nOpenLORIS_images img5960.jpg\nOpenLORIS_images img4234.jpg\nOpenLORIS_images img5849.jpg\nOpenLORIS_images img2685.jpg\nOpenLORIS_images img3275.jpg\nOpenLORIS_images 
img4832.jpg\nOpenLORIS_images img5173.jpg\nOpenLORIS_images img1985.jpg\nOpenLORIS_images img3257.jpg\nOpenLORIS_images img266.jpg\nOpenLORIS_images img5211.jpg\nOpenLORIS_images img601.jpg\nOpenLORIS_images img2547.jpg\nOpenLORIS_images img5853.jpg\nOpenLORIS_images img1402.jpg\nOpenLORIS_images img1821.jpg\nOpenLORIS_images img1936.jpg\nOpenLORIS_images img2472.jpg\nOpenLORIS_images img4168.jpg\nOpenLORIS_images img3473.jpg\nOpenLORIS_images img4290.jpg\nOpenLORIS_images img1536.jpg\nOpenLORIS_images img1021.jpg\nOpenLORIS_images img2907.jpg\nOpenLORIS_images img3838.jpg\nOpenLORIS_images img1527.jpg\nOpenLORIS_images img5499.jpg\nOpenLORIS_images img5105.jpg\nOpenLORIS_images img5226.jpg\nOpenLORIS_images img4507.jpg\nOpenLORIS_images img372.jpg\nOpenLORIS_images img3671.jpg\nOpenLORIS_images img911.jpg\nOpenLORIS_images img4130.jpg\nOpenLORIS_images img1385.jpg\nOpenLORIS_images img5342.jpg\nOpenLORIS_images img761.jpg\nOpenLORIS_images img819.jpg\nOpenLORIS_images img586.jpg\nOpenLORIS_images img3547.jpg\nOpenLORIS_images img882.jpg\nOpenLORIS_images img1637.jpg\nOpenLORIS_images img3583.jpg\nOpenLORIS_images img3776.jpg\nOpenLORIS_images img481.jpg\nOpenLORIS_images img1575.jpg\nOpenLORIS_images img5620.jpg\nOpenLORIS_images img2332.jpg\nOpenLORIS_images img704.jpg\nOpenLORIS_images img5991.jpg\nOpenLORIS_images img5510.jpg\nOpenLORIS_images img2985.jpg\nOpenLORIS_images img4894.jpg\nOpenLORIS_images img2311.jpg\nOpenLORIS_images img38.jpg\nOpenLORIS_images img3103.jpg\nOpenLORIS_images img4201.jpg\nOpenLORIS_images img4411.jpg\nOpenLORIS_images img3593.jpg\nOpenLORIS_images img5807.jpg\nOpenLORIS_images img2731.jpg\nOpenLORIS_images img3754.jpg\nOpenLORIS_images img4645.jpg\nOpenLORIS_images img3941.jpg\nOpenLORIS_images img3734.jpg\nOpenLORIS_images img1466.jpg\nOpenLORIS_images img2343.jpg\nOpenLORIS_images img5974.jpg\nOpenLORIS_images img2539.jpg\nOpenLORIS_images img3032.jpg\nOpenLORIS_images img4928.jpg\nOpenLORIS_images img4031.jpg\nOpenLORIS_images 
img445.jpg\nOpenLORIS_images img1831.jpg\nOpenLORIS_images img5585.jpg\nOpenLORIS_images img5791.jpg\nOpenLORIS_images img3983.jpg\nOpenLORIS_images img3637.jpg\nOpenLORIS_images img1778.jpg\nOpenLORIS_images img2952.jpg\nOpenLORIS_images img582.jpg\nOpenLORIS_images img1199.jpg\nOpenLORIS_images img4658.jpg\nOpenLORIS_images img3607.jpg\nOpenLORIS_images img2399.jpg\nOpenLORIS_images img2329.jpg\nOpenLORIS_images img2026.jpg\nOpenLORIS_images img5570.jpg\nOpenLORIS_images img5126.jpg\nOpenLORIS_images img1468.jpg\nOpenLORIS_images img4819.jpg\nOpenLORIS_images img3628.jpg\nOpenLORIS_images img4217.jpg\nOpenLORIS_images img654.jpg\nOpenLORIS_images img860.jpg\nOpenLORIS_images img410.jpg\nOpenLORIS_images img2691.jpg\nOpenLORIS_images img5119.jpg\nOpenLORIS_images img680.jpg\nOpenLORIS_images img3914.jpg\nOpenLORIS_images img5645.jpg\nOpenLORIS_images img1406.jpg\nOpenLORIS_images img5323.jpg\nOpenLORIS_images img418.jpg\nOpenLORIS_images img4270.jpg\nOpenLORIS_images img5487.jpg\nOpenLORIS_images img3969.jpg\nOpenLORIS_images img6048.jpg\nOpenLORIS_images img3324.jpg\nOpenLORIS_images img3599.jpg\nOpenLORIS_images img4811.jpg\nOpenLORIS_images img2327.jpg\nOpenLORIS_images img2361.jpg\nOpenLORIS_images img505.jpg\nOpenLORIS_images img2548.jpg\nOpenLORIS_images img2098.jpg\nOpenLORIS_images img2181.jpg\nOpenLORIS_images img4737.jpg\nOpenLORIS_images img1396.jpg\nOpenLORIS_images img2876.jpg\nOpenLORIS_images img5363.jpg\nOpenLORIS_images img2080.jpg\nOpenLORIS_images img228.jpg\nOpenLORIS_images img1718.jpg\nOpenLORIS_images img4027.jpg\nOpenLORIS_images img3787.jpg\nOpenLORIS_images img2879.jpg\nOpenLORIS_images img1650.jpg\nOpenLORIS_images img5752.jpg\nOpenLORIS_images img5721.jpg\nOpenLORIS_images img1603.jpg\nOpenLORIS_images img632.jpg\nOpenLORIS_images img2968.jpg\nOpenLORIS_images img2219.jpg\nOpenLORIS_images img4451.jpg\nOpenLORIS_images img4429.jpg\nOpenLORIS_images img907.jpg\nOpenLORIS_images img4885.jpg\nOpenLORIS_images img1866.jpg\nOpenLORIS_images 
img5477.jpg\nOpenLORIS_images img1515.jpg\nOpenLORIS_images img3812.jpg\nOpenLORIS_images img5930.jpg\nOpenLORIS_images img798.jpg\nOpenLORIS_images img4200.jpg\nOpenLORIS_images img2247.jpg\nOpenLORIS_images img2606.jpg\nOpenLORIS_images img598.jpg\nOpenLORIS_images img3574.jpg\nOpenLORIS_images img2498.jpg\nOpenLORIS_images img1302.jpg\nOpenLORIS_images img4736.jpg\nOpenLORIS_images img4881.jpg\nOpenLORIS_images img1817.jpg\nOpenLORIS_images img4559.jpg\nOpenLORIS_images img1130.jpg\nOpenLORIS_images img861.jpg\nOpenLORIS_images img1471.jpg\nOpenLORIS_images img3594.jpg\nOpenLORIS_images img4542.jpg\nOpenLORIS_images img4564.jpg\nOpenLORIS_images img3095.jpg\nOpenLORIS_images img2549.jpg\nOpenLORIS_images img568.jpg\nOpenLORIS_images img2167.jpg\nOpenLORIS_images img2948.jpg\nOpenLORIS_images img3677.jpg\nOpenLORIS_images img5521.jpg\nOpenLORIS_images img2944.jpg\nOpenLORIS_images img550.jpg\nOpenLORIS_images img1340.jpg\nOpenLORIS_images img5346.jpg\nOpenLORIS_images img3524.jpg\nOpenLORIS_images img1314.jpg\nOpenLORIS_images img3864.jpg\nOpenLORIS_images img1967.jpg\nOpenLORIS_images img4497.jpg\nOpenLORIS_images img212.jpg\nOpenLORIS_images img3480.jpg\nOpenLORIS_images img2161.jpg\nOpenLORIS_images img4584.jpg\nOpenLORIS_images img5770.jpg\nOpenLORIS_images img4471.jpg\nOpenLORIS_images img2286.jpg\nOpenLORIS_images img6067.jpg\nOpenLORIS_images img4640.jpg\nOpenLORIS_images img1706.jpg\nOpenLORIS_images img2335.jpg\nOpenLORIS_images img4654.jpg\nOpenLORIS_images img3877.jpg\nOpenLORIS_images img2425.jpg\nOpenLORIS_images img4941.jpg\nOpenLORIS_images img4828.jpg\nOpenLORIS_images img6089.jpg\nOpenLORIS_images img4009.jpg\nOpenLORIS_images img2524.jpg\nOpenLORIS_images img2645.jpg\nOpenLORIS_images img5890.jpg\nOpenLORIS_images img618.jpg\nOpenLORIS_images img5149.jpg\nOpenLORIS_images img4842.jpg\nOpenLORIS_images img472.jpg\nOpenLORIS_images img2018.jpg\nOpenLORIS_images img4848.jpg\nOpenLORIS_images img5804.jpg\nOpenLORIS_images 
img1484.jpg\nOpenLORIS_images img5528.jpg\nOpenLORIS_images img4227.jpg\nOpenLORIS_images img5864.jpg\nOpenLORIS_images img4981.jpg\nOpenLORIS_images img672.jpg\nOpenLORIS_images img3160.jpg\nOpenLORIS_images img2519.jpg\nOpenLORIS_images img2626.jpg\nOpenLORIS_images img3082.jpg\nOpenLORIS_images img5843.jpg\nOpenLORIS_images img3260.jpg\nOpenLORIS_images img296.jpg\nOpenLORIS_images img3620.jpg\nOpenLORIS_images img3369.jpg\nOpenLORIS_images img5172.jpg\nOpenLORIS_images img5413.jpg\nOpenLORIS_images img3503.jpg\nOpenLORIS_images img5114.jpg\nOpenLORIS_images img2947.jpg\nOpenLORIS_images img2997.jpg\nOpenLORIS_images img6112.jpg\nOpenLORIS_images img1880.jpg\nOpenLORIS_images img2430.jpg\nOpenLORIS_images img417.jpg\nOpenLORIS_images img1085.jpg\nOpenLORIS_images img5188.jpg\nOpenLORIS_images img4249.jpg\nOpenLORIS_images img3775.jpg\nOpenLORIS_images img4207.jpg\nOpenLORIS_images img4275.jpg\nOpenLORIS_images img3999.jpg\nOpenLORIS_images img3793.jpg\nOpenLORIS_images img817.jpg\nOpenLORIS_images img1940.jpg\nOpenLORIS_images img4661.jpg\nOpenLORIS_images img3885.jpg\nOpenLORIS_images img4754.jpg\nOpenLORIS_images img5468.jpg\nOpenLORIS_images img4649.jpg\nOpenLORIS_images img5744.jpg\nOpenLORIS_images img5706.jpg\nOpenLORIS_images img1965.jpg\nOpenLORIS_images img3826.jpg\nOpenLORIS_images img5618.jpg\nOpenLORIS_images img46.jpg\nOpenLORIS_images img800.jpg\nOpenLORIS_images img5666.jpg\nOpenLORIS_images img5580.jpg\nOpenLORIS_images img5443.jpg\nOpenLORIS_images img1494.jpg\nOpenLORIS_images img658.jpg\nOpenLORIS_images img4515.jpg\nOpenLORIS_images img2256.jpg\nOpenLORIS_images img3167.jpg\nOpenLORIS_images img5942.jpg\nOpenLORIS_images img2512.jpg\nOpenLORIS_images img3135.jpg\nOpenLORIS_images img5596.jpg\nOpenLORIS_images img4895.jpg\nOpenLORIS_images img1150.jpg\nOpenLORIS_images img4746.jpg\nOpenLORIS_images img793.jpg\nOpenLORIS_images img3522.jpg\nOpenLORIS_images img264.jpg\nOpenLORIS_images img1740.jpg\nOpenLORIS_images img985.jpg\nOpenLORIS_images 
img786.jpg\nOpenLORIS_images img249.jpg\nOpenLORIS_images img3785.jpg\nOpenLORIS_images img5391.jpg\nOpenLORIS_images img2196.jpg\nOpenLORIS_images img1295.jpg\nOpenLORIS_images img1561.jpg\nOpenLORIS_images img220.jpg\nOpenLORIS_images img4040.jpg\nOpenLORIS_images img1025.jpg\nOpenLORIS_images img5992.jpg\nOpenLORIS_images img2801.jpg\nOpenLORIS_images img2410.jpg\nOpenLORIS_images img1422.jpg\nOpenLORIS_images img5777.jpg\nOpenLORIS_images img5099.jpg\nOpenLORIS_images img5179.jpg\nOpenLORIS_images img287.jpg\nOpenLORIS_images img2700.jpg\nOpenLORIS_images img3489.jpg\nOpenLORIS_images img3986.jpg\nOpenLORIS_images img5980.jpg\nOpenLORIS_images img150.jpg\nOpenLORIS_images img5293.jpg\nOpenLORIS_images img597.jpg\nOpenLORIS_images img636.jpg\nOpenLORIS_images img3151.jpg\nOpenLORIS_images img4898.jpg\nOpenLORIS_images img3212.jpg\nOpenLORIS_images img4137.jpg\nOpenLORIS_images img715.jpg\nOpenLORIS_images img4607.jpg\nOpenLORIS_images img1545.jpg\nOpenLORIS_images img4548.jpg\nOpenLORIS_images img76.jpg\nOpenLORIS_images img1984.jpg\nOpenLORIS_images img58.jpg\nOpenLORIS_images img5880.jpg\nOpenLORIS_images img1301.jpg\nOpenLORIS_images img5322.jpg\nOpenLORIS_images img4969.jpg\nOpenLORIS_images img4339.jpg\nOpenLORIS_images img1262.jpg\nOpenLORIS_images img627.jpg\nOpenLORIS_images img2825.jpg\nOpenLORIS_images img4120.jpg\nOpenLORIS_images img145.jpg\nOpenLORIS_images img5602.jpg\nOpenLORIS_images img5200.jpg\nOpenLORIS_images img2897.jpg\nOpenLORIS_images img5202.jpg\nOpenLORIS_images img4015.jpg\nOpenLORIS_images img2415.jpg\nOpenLORIS_images img2291.jpg\nOpenLORIS_images img1048.jpg\nOpenLORIS_images img1572.jpg\nOpenLORIS_images img4395.jpg\nOpenLORIS_images img3371.jpg\nOpenLORIS_images img4006.jpg\nOpenLORIS_images img750.jpg\nOpenLORIS_images img1946.jpg\nOpenLORIS_images img5012.jpg\nOpenLORIS_images img1079.jpg\nOpenLORIS_images img5967.jpg\nOpenLORIS_images img3645.jpg\nOpenLORIS_images img4947.jpg\nOpenLORIS_images img974.jpg\nOpenLORIS_images 
img1814.jpg\nOpenLORIS_images img448.jpg\nOpenLORIS_images img2827.jpg\nOpenLORIS_images img2132.jpg\nOpenLORIS_images img5539.jpg\nOpenLORIS_images img4259.jpg\nOpenLORIS_images img5525.jpg\nOpenLORIS_images img1139.jpg\nOpenLORIS_images img4315.jpg\nOpenLORIS_images img1294.jpg\nOpenLORIS_images img5124.jpg\nOpenLORIS_images img4318.jpg\nOpenLORIS_images img2831.jpg\nOpenLORIS_images img379.jpg\nOpenLORIS_images img1037.jpg\nOpenLORIS_images img2484.jpg\nOpenLORIS_images img3216.jpg\nOpenLORIS_images img3472.jpg\nOpenLORIS_images img602.jpg\nOpenLORIS_images img4123.jpg\nOpenLORIS_images img5497.jpg\nOpenLORIS_images img2354.jpg\nOpenLORIS_images img1989.jpg\nOpenLORIS_images img5356.jpg\nOpenLORIS_images img5533.jpg\nOpenLORIS_images img1057.jpg\nOpenLORIS_images img853.jpg\nOpenLORIS_images img2518.jpg\nOpenLORIS_images img1780.jpg\nOpenLORIS_images img888.jpg\nOpenLORIS_images img3786.jpg\nOpenLORIS_images img2486.jpg\nOpenLORIS_images img223.jpg\nOpenLORIS_images img1350.jpg\nOpenLORIS_images img4362.jpg\nOpenLORIS_images img3273.jpg\nOpenLORIS_images img3742.jpg\nOpenLORIS_images img449.jpg\nOpenLORIS_images img1977.jpg\nOpenLORIS_images img899.jpg\nOpenLORIS_images img3299.jpg\nOpenLORIS_images img4177.jpg\nOpenLORIS_images img5359.jpg\nOpenLORIS_images img3562.jpg\nOpenLORIS_images img5613.jpg\nOpenLORIS_images img422.jpg\nOpenLORIS_images img3121.jpg\nOpenLORIS_images img2793.jpg\nOpenLORIS_images img2762.jpg\nOpenLORIS_images img6125.jpg\nOpenLORIS_images img813.jpg\nOpenLORIS_images img936.jpg\nOpenLORIS_images img1801.jpg\nOpenLORIS_images img1846.jpg\nOpenLORIS_images img5476.jpg\nOpenLORIS_images img5837.jpg\nOpenLORIS_images img4707.jpg\nOpenLORIS_images img365.jpg\nOpenLORIS_images img826.jpg\nOpenLORIS_images img4254.jpg\nOpenLORIS_images img5892.jpg\nOpenLORIS_images img2684.jpg\nOpenLORIS_images img4066.jpg\nOpenLORIS_images img1775.jpg\nOpenLORIS_images img2506.jpg\nOpenLORIS_images img3469.jpg\nOpenLORIS_images img573.jpg\nOpenLORIS_images 
img386.jpg\nOpenLORIS_images img2072.jpg\nOpenLORIS_images img5517.jpg\nOpenLORIS_images img62.jpg\nOpenLORIS_images img5883.jpg\nOpenLORIS_images img3973.jpg\nOpenLORIS_images img3505.jpg\nOpenLORIS_images img3224.jpg\nOpenLORIS_images img3700.jpg\nOpenLORIS_images img4624.jpg\nOpenLORIS_images img5268.jpg\nOpenLORIS_images img5233.jpg\nOpenLORIS_images img124.jpg\nOpenLORIS_images img2422.jpg\nOpenLORIS_images img4398.jpg\nOpenLORIS_images img3470.jpg\nOpenLORIS_images img1334.jpg\nOpenLORIS_images img2511.jpg\nOpenLORIS_images img965.jpg\nOpenLORIS_images img4750.jpg\nOpenLORIS_images img1945.jpg\nOpenLORIS_images img2651.jpg\nOpenLORIS_images img1349.jpg\nOpenLORIS_images img4678.jpg\nOpenLORIS_images img5327.jpg\nOpenLORIS_images img2611.jpg\nOpenLORIS_images img1426.jpg\nOpenLORIS_images img2744.jpg\nOpenLORIS_images img4237.jpg\nOpenLORIS_images img2244.jpg\nOpenLORIS_images img2582.jpg\nOpenLORIS_images img4487.jpg\nOpenLORIS_images img3022.jpg\nOpenLORIS_images img199.jpg\nOpenLORIS_images img5370.jpg\nOpenLORIS_images img5831.jpg\nOpenLORIS_images img68.jpg\nOpenLORIS_images img1472.jpg\nOpenLORIS_images img4492.jpg\nOpenLORIS_images img5496.jpg\nOpenLORIS_images img1619.jpg\nOpenLORIS_images img2287.jpg\nOpenLORIS_images img758.jpg\nOpenLORIS_images img1476.jpg\nOpenLORIS_images img4148.jpg\nOpenLORIS_images img1848.jpg\nOpenLORIS_images img1276.jpg\nOpenLORIS_images img788.jpg\nOpenLORIS_images img2192.jpg\nOpenLORIS_images img3567.jpg\nOpenLORIS_images img5367.jpg\nOpenLORIS_images img5313.jpg\nOpenLORIS_images img3394.jpg\nOpenLORIS_images img958.jpg\nOpenLORIS_images img237.jpg\nOpenLORIS_images img1644.jpg\nOpenLORIS_images img1832.jpg\nOpenLORIS_images img1787.jpg\nOpenLORIS_images img5351.jpg\nOpenLORIS_images img2302.jpg\nOpenLORIS_images img2980.jpg\nOpenLORIS_images img3071.jpg\nOpenLORIS_images img5915.jpg\nOpenLORIS_images img2101.jpg\nOpenLORIS_images img5402.jpg\nOpenLORIS_images img383.jpg\nOpenLORIS_images img191.jpg\nOpenLORIS_images 
img6091.jpg\nOpenLORIS_images img3831.jpg\nOpenLORIS_images img4195.jpg\nOpenLORIS_images img4547.jpg\nOpenLORIS_images img2806.jpg\nOpenLORIS_images img3579.jpg\nOpenLORIS_images img2303.jpg\nOpenLORIS_images img3884.jpg\nOpenLORIS_images img4034.jpg\nOpenLORIS_images img3902.jpg\nOpenLORIS_images img168.jpg\nOpenLORIS_images img4024.jpg\nOpenLORIS_images img1087.jpg\nOpenLORIS_images img4767.jpg\nOpenLORIS_images img4953.jpg\nOpenLORIS_images img3454.jpg\nOpenLORIS_images img6056.jpg\nOpenLORIS_images img2996.jpg\nOpenLORIS_images img3017.jpg\nOpenLORIS_images img1080.jpg\nOpenLORIS_images img4555.jpg\nOpenLORIS_images img442.jpg\nOpenLORIS_images img5720.jpg\nOpenLORIS_images img949.jpg\nOpenLORIS_images img5844.jpg\nOpenLORIS_images img4977.jpg\nOpenLORIS_images img2924.jpg\nOpenLORIS_images img3277.jpg\nOpenLORIS_images img2189.jpg\nOpenLORIS_images img101.jpg\nOpenLORIS_images img4454.jpg\nOpenLORIS_images img5485.jpg\nOpenLORIS_images img2151.jpg\nOpenLORIS_images img5054.jpg\nOpenLORIS_images img4038.jpg\nOpenLORIS_images img3254.jpg\nOpenLORIS_images img5414.jpg\nOpenLORIS_images img2250.jpg\nOpenLORIS_images img3733.jpg\nOpenLORIS_images img4477.jpg\nOpenLORIS_images img1157.jpg\nOpenLORIS_images img192.jpg\nOpenLORIS_images img1208.jpg\nOpenLORIS_images img1697.jpg\nOpenLORIS_images img3449.jpg\nOpenLORIS_images img4365.jpg\nOpenLORIS_images img208.jpg\nOpenLORIS_images img321.jpg\nOpenLORIS_images img5338.jpg\nOpenLORIS_images img5100.jpg\nOpenLORIS_images img4216.jpg\nOpenLORIS_images img4091.jpg\nOpenLORIS_images img1239.jpg\nOpenLORIS_images img2673.jpg\nOpenLORIS_images img898.jpg\nOpenLORIS_images img2239.jpg\nOpenLORIS_images img348.jpg\nOpenLORIS_images img3431.jpg\nOpenLORIS_images img664.jpg\nOpenLORIS_images img1763.jpg\nOpenLORIS_images img2843.jpg\nOpenLORIS_images img4541.jpg\nOpenLORIS_images img5652.jpg\nOpenLORIS_images img617.jpg\nOpenLORIS_images img1904.jpg\nOpenLORIS_images img794.jpg\nOpenLORIS_images img3332.jpg\nOpenLORIS_images 
img4021.jpg\nOpenLORIS_images img631.jpg\nOpenLORIS_images img5852.jpg\nOpenLORIS_images img4738.jpg\nOpenLORIS_images img5558.jpg\nOpenLORIS_images img5065.jpg\nOpenLORIS_images img613.jpg\nOpenLORIS_images img1997.jpg\nOpenLORIS_images img1235.jpg\nOpenLORIS_images img5747.jpg\nOpenLORIS_images img530.jpg\nOpenLORIS_images img5020.jpg\nOpenLORIS_images img3891.jpg\nOpenLORIS_images img3866.jpg\nOpenLORIS_images img1870.jpg\nOpenLORIS_images img2496.jpg\nOpenLORIS_images img5912.jpg\nOpenLORIS_images img895.jpg\nOpenLORIS_images img4887.jpg\nOpenLORIS_images img358.jpg\nOpenLORIS_images img984.jpg\nOpenLORIS_images img6100.jpg\nOpenLORIS_images img1890.jpg\nOpenLORIS_images img310.jpg\nOpenLORIS_images img4745.jpg\nOpenLORIS_images img4505.jpg\nOpenLORIS_images img341.jpg\nOpenLORIS_images img5899.jpg\nOpenLORIS_images img4709.jpg\nOpenLORIS_images img3183.jpg\nOpenLORIS_images img5855.jpg\nOpenLORIS_images img3443.jpg\nOpenLORIS_images img4229.jpg\nOpenLORIS_images img250.jpg\nOpenLORIS_images img4699.jpg\nOpenLORIS_images img4713.jpg\nOpenLORIS_images img3303.jpg\nOpenLORIS_images img5425.jpg\nOpenLORIS_images img4396.jpg\nOpenLORIS_images img1304.jpg\nOpenLORIS_images img2574.jpg\nOpenLORIS_images img3361.jpg\nOpenLORIS_images img3344.jpg\nOpenLORIS_images img5285.jpg\nOpenLORIS_images img5198.jpg\nOpenLORIS_images img4861.jpg\nOpenLORIS_images img5735.jpg\nOpenLORIS_images img5937.jpg\nOpenLORIS_images img4636.jpg\nOpenLORIS_images img285.jpg\nOpenLORIS_images img1852.jpg\nOpenLORIS_images img1253.jpg\nOpenLORIS_images img3368.jpg\nOpenLORIS_images img4221.jpg\nOpenLORIS_images img236.jpg\nOpenLORIS_images img4138.jpg\nOpenLORIS_images img2315.jpg\nOpenLORIS_images img1613.jpg\nOpenLORIS_images img3501.jpg\nOpenLORIS_images img5897.jpg\nOpenLORIS_images img1423.jpg\nOpenLORIS_images img564.jpg\nOpenLORIS_images img98.jpg\nOpenLORIS_images img4870.jpg\nOpenLORIS_images img3389.jpg\nOpenLORIS_images img4440.jpg\nOpenLORIS_images img5832.jpg\nOpenLORIS_images 
img5495.jpg\nOpenLORIS_images img849.jpg\nOpenLORIS_images img323.jpg\nOpenLORIS_images img5361.jpg\nOpenLORIS_images img3638.jpg\nOpenLORIS_images img3373.jpg\nOpenLORIS_images img1730.jpg\nOpenLORIS_images img78.jpg\nOpenLORIS_images img4206.jpg\nOpenLORIS_images img4533.jpg\nOpenLORIS_images img4638.jpg\nOpenLORIS_images img2245.jpg\nOpenLORIS_images img32.jpg\nOpenLORIS_images img3323.jpg\nOpenLORIS_images img5702.jpg\nOpenLORIS_images img6140.jpg\nOpenLORIS_images img2892.jpg\nOpenLORIS_images img2185.jpg\nOpenLORIS_images img5315.jpg\nOpenLORIS_images img6057.jpg\nOpenLORIS_images img120.jpg\nOpenLORIS_images img1456.jpg\nOpenLORIS_images img4470.jpg\nOpenLORIS_images img1693.jpg\nOpenLORIS_images img1062.jpg\nOpenLORIS_images img4344.jpg\nOpenLORIS_images img343.jpg\nOpenLORIS_images img1621.jpg\nOpenLORIS_images img4248.jpg\nOpenLORIS_images img4436.jpg\nOpenLORIS_images img3291.jpg\nOpenLORIS_images img2465.jpg\nOpenLORIS_images img6042.jpg\nOpenLORIS_images img451.jpg\nOpenLORIS_images img2717.jpg\nOpenLORIS_images img1512.jpg\nOpenLORIS_images img322.jpg\nOpenLORIS_images img698.jpg\nOpenLORIS_images img2918.jpg\nOpenLORIS_images img5362.jpg\nOpenLORIS_images img131.jpg\nOpenLORIS_images img933.jpg\nOpenLORIS_images img291.jpg\nOpenLORIS_images img910.jpg\nOpenLORIS_images img1361.jpg\nOpenLORIS_images img2783.jpg\nOpenLORIS_images img4762.jpg\nOpenLORIS_images img5500.jpg\nOpenLORIS_images img1546.jpg\nOpenLORIS_images img5155.jpg\nOpenLORIS_images img5934.jpg\nOpenLORIS_images img4734.jpg\nOpenLORIS_images img4037.jpg\nOpenLORIS_images img4159.jpg\nOpenLORIS_images img3426.jpg\nOpenLORIS_images img411.jpg\nOpenLORIS_images img5708.jpg\nOpenLORIS_images img692.jpg\nOpenLORIS_images img4121.jpg\nOpenLORIS_images img1920.jpg\nOpenLORIS_images img163.jpg\nOpenLORIS_images img1297.jpg\nOpenLORIS_images img461.jpg\nOpenLORIS_images img5369.jpg\nOpenLORIS_images img2434.jpg\nOpenLORIS_images img4796.jpg\nOpenLORIS_images img2602.jpg\nOpenLORIS_images 
img5219.jpg\nOpenLORIS_images img4018.jpg\nOpenLORIS_images img3585.jpg\nOpenLORIS_images img4166.jpg\nOpenLORIS_images img4957.jpg\nOpenLORIS_images img580.jpg\nOpenLORIS_images img1091.jpg\nOpenLORIS_images img3413.jpg\nOpenLORIS_images img5080.jpg\nOpenLORIS_images img1585.jpg\nOpenLORIS_images img3977.jpg\nOpenLORIS_images img479.jpg\nOpenLORIS_images img6064.jpg\nOpenLORIS_images img5096.jpg\nOpenLORIS_images img36.jpg\nOpenLORIS_images img4257.jpg\nOpenLORIS_images img2358.jpg\nOpenLORIS_images img2395.jpg\nOpenLORIS_images img2270.jpg\nOpenLORIS_images img3127.jpg\nOpenLORIS_images img3790.jpg\nOpenLORIS_images img4023.jpg\nOpenLORIS_images img3780.jpg\nOpenLORIS_images img1252.jpg\nOpenLORIS_images img4020.jpg\nOpenLORIS_images img1975.jpg\nOpenLORIS_images img2817.jpg\nOpenLORIS_images img2865.jpg\nOpenLORIS_images img5898.jpg\nOpenLORIS_images img3391.jpg\nOpenLORIS_images img1044.jpg\nOpenLORIS_images img5465.jpg\nOpenLORIS_images img2310.jpg\nOpenLORIS_images img634.jpg\nOpenLORIS_images img5834.jpg\nOpenLORIS_images img4698.jpg\nOpenLORIS_images img105.jpg\nOpenLORIS_images img1653.jpg\nOpenLORIS_images img1128.jpg\nOpenLORIS_images img5344.jpg\nOpenLORIS_images img5297.jpg\nOpenLORIS_images img3670.jpg\nOpenLORIS_images img1479.jpg\nOpenLORIS_images img2653.jpg\nOpenLORIS_images img2176.jpg\nOpenLORIS_images img5779.jpg\nOpenLORIS_images img873.jpg\nOpenLORIS_images img2118.jpg\nOpenLORIS_images img304.jpg\nOpenLORIS_images img926.jpg\nOpenLORIS_images img1356.jpg\nOpenLORIS_images img6039.jpg\nOpenLORIS_images img2635.jpg\nOpenLORIS_images img4550.jpg\nOpenLORIS_images img143.jpg\nOpenLORIS_images img3157.jpg\nOpenLORIS_images img486.jpg\nOpenLORIS_images img419.jpg\nOpenLORIS_images img3679.jpg\nOpenLORIS_images img3407.jpg\nOpenLORIS_images img626.jpg\nOpenLORIS_images img3630.jpg\nOpenLORIS_images img3695.jpg\nOpenLORIS_images img587.jpg\nOpenLORIS_images img6120.jpg\nOpenLORIS_images img5385.jpg\nOpenLORIS_images img5197.jpg\nOpenLORIS_images 
img1368.jpg\nOpenLORIS_images img4163.jpg\nOpenLORIS_images img3072.jpg\nOpenLORIS_images img490.jpg\nOpenLORIS_images img4152.jpg\nOpenLORIS_images img5464.jpg\nOpenLORIS_images img2333.jpg\nOpenLORIS_images img241.jpg\nOpenLORIS_images img4213.jpg\nOpenLORIS_images img2394.jpg\nOpenLORIS_images img3242.jpg\nOpenLORIS_images img1155.jpg\nOpenLORIS_images img2646.jpg\nOpenLORIS_images img5358.jpg\nOpenLORIS_images img5773.jpg\nOpenLORIS_images img4578.jpg\nOpenLORIS_images img4999.jpg\nOpenLORIS_images img1833.jpg\nOpenLORIS_images img366.jpg\nOpenLORIS_images img915.jpg\nOpenLORIS_images img2234.jpg\nOpenLORIS_images img3420.jpg\nOpenLORIS_images img2160.jpg\nOpenLORIS_images img3769.jpg\nOpenLORIS_images img524.jpg\nOpenLORIS_images img5364.jpg\nOpenLORIS_images img1198.jpg\nOpenLORIS_images img842.jpg\nOpenLORIS_images img540.jpg\nOpenLORIS_images img211.jpg\nOpenLORIS_images img3251.jpg\nOpenLORIS_images img6047.jpg\nOpenLORIS_images img6060.jpg\nOpenLORIS_images img2295.jpg\nOpenLORIS_images img2872.jpg\nOpenLORIS_images img1407.jpg\nOpenLORIS_images img1112.jpg\nOpenLORIS_images img493.jpg\nOpenLORIS_images img659.jpg\nOpenLORIS_images img4154.jpg\nOpenLORIS_images img4715.jpg\nOpenLORIS_images img777.jpg\nOpenLORIS_images img2759.jpg\nOpenLORIS_images img181.jpg\nOpenLORIS_images img3314.jpg\nOpenLORIS_images img88.jpg\nOpenLORIS_images img2382.jpg\nOpenLORIS_images img3278.jpg\nOpenLORIS_images img1012.jpg\nOpenLORIS_images img2491.jpg\nOpenLORIS_images img4549.jpg\nOpenLORIS_images img5006.jpg\nOpenLORIS_images img248.jpg\nOpenLORIS_images img1854.jpg\nOpenLORIS_images img3719.jpg\nOpenLORIS_images img5848.jpg\nOpenLORIS_images img5066.jpg\nOpenLORIS_images img2170.jpg\nOpenLORIS_images img968.jpg\nOpenLORIS_images img5787.jpg\nOpenLORIS_images img4510.jpg\nOpenLORIS_images img72.jpg\nOpenLORIS_images img3494.jpg\nOpenLORIS_images img543.jpg\nOpenLORIS_images img28.jpg\nOpenLORIS_images img606.jpg\nOpenLORIS_images img3057.jpg\nOpenLORIS_images 
img2241.jpg\nOpenLORIS_images img2941.jpg\nOpenLORIS_images img1687.jpg\nOpenLORIS_images img2397.jpg\nOpenLORIS_images img1109.jpg\nOpenLORIS_images img1229.jpg\nOpenLORIS_images img5384.jpg\nOpenLORIS_images img1196.jpg\nOpenLORIS_images img4891.jpg\nOpenLORIS_images img5116.jpg\nOpenLORIS_images img5625.jpg\nOpenLORIS_images img5933.jpg\nOpenLORIS_images img1813.jpg\nOpenLORIS_images img1939.jpg\nOpenLORIS_images img4169.jpg\nOpenLORIS_images img3106.jpg\nOpenLORIS_images img4534.jpg\nOpenLORIS_images img3711.jpg\nOpenLORIS_images img1072.jpg\nOpenLORIS_images img2747.jpg\nOpenLORIS_images img2963.jpg\nOpenLORIS_images img3339.jpg\nOpenLORIS_images img1086.jpg\nOpenLORIS_images img2741.jpg\nOpenLORIS_images img835.jpg\nOpenLORIS_images img2757.jpg\nOpenLORIS_images img3448.jpg\nOpenLORIS_images img5208.jpg\nOpenLORIS_images img5512.jpg\nOpenLORIS_images img4983.jpg\nOpenLORIS_images img4448.jpg\nOpenLORIS_images img5597.jpg\nOpenLORIS_images img977.jpg\nOpenLORIS_images img4732.jpg\nOpenLORIS_images img363.jpg\nOpenLORIS_images img1188.jpg\nOpenLORIS_images img3357.jpg\nOpenLORIS_images img4373.jpg\nOpenLORIS_images img5559.jpg\nOpenLORIS_images img416.jpg\nOpenLORIS_images img144.jpg\nOpenLORIS_images img4774.jpg\nOpenLORIS_images img6117.jpg\nOpenLORIS_images img1906.jpg\nOpenLORIS_images img4901.jpg\nOpenLORIS_images img4464.jpg\nOpenLORIS_images img1222.jpg\nOpenLORIS_images img663.jpg\nOpenLORIS_images img1376.jpg\nOpenLORIS_images img4727.jpg\nOpenLORIS_images img4079.jpg\nOpenLORIS_images img3887.jpg\nOpenLORIS_images img4089.jpg\nOpenLORIS_images img4496.jpg\nOpenLORIS_images img2450.jpg\nOpenLORIS_images img2112.jpg\nOpenLORIS_images img3416.jpg\nOpenLORIS_images img5475.jpg\nOpenLORIS_images img4112.jpg\nOpenLORIS_images img1748.jpg\nOpenLORIS_images img1815.jpg\nOpenLORIS_images img16.jpg\nOpenLORIS_images img2862.jpg\nOpenLORIS_images img1971.jpg\nOpenLORIS_images img3298.jpg\nOpenLORIS_images img4972.jpg\nOpenLORIS_images 
img1449.jpg\nOpenLORIS_images img2095.jpg\nOpenLORIS_images img1145.jpg\nOpenLORIS_images img2228.jpg\nOpenLORIS_images img17.jpg\nOpenLORIS_images img4665.jpg\nOpenLORIS_images img2324.jpg\nOpenLORIS_images img10.jpg\nOpenLORIS_images img2068.jpg\nOpenLORIS_images img2976.jpg\nOpenLORIS_images img4740.jpg\nOpenLORIS_images img2293.jpg\nOpenLORIS_images img2585.jpg\nOpenLORIS_images img2128.jpg\nOpenLORIS_images img4453.jpg\nOpenLORIS_images img2378.jpg\nOpenLORIS_images img5568.jpg\nOpenLORIS_images img1234.jpg\nOpenLORIS_images img4333.jpg\nOpenLORIS_images img1627.jpg\nOpenLORIS_images img4370.jpg\nOpenLORIS_images img1026.jpg\nOpenLORIS_images img5858.jpg\nOpenLORIS_images img271.jpg\nOpenLORIS_images img1065.jpg\nOpenLORIS_images img5588.jpg\nOpenLORIS_images img608.jpg\nOpenLORIS_images img5802.jpg\nOpenLORIS_images img1887.jpg\nOpenLORIS_images img3478.jpg\nOpenLORIS_images img1544.jpg\nOpenLORIS_images img3846.jpg\nOpenLORIS_images img3346.jpg\nOpenLORIS_images img1244.jpg\nOpenLORIS_images img4342.jpg\nOpenLORIS_images img6122.jpg\nOpenLORIS_images img1076.jpg\nOpenLORIS_images img4184.jpg\nOpenLORIS_images img3644.jpg\nOpenLORIS_images img3325.jpg\nOpenLORIS_images img204.jpg\nOpenLORIS_images img1336.jpg\nOpenLORIS_images img4922.jpg\nOpenLORIS_images img3109.jpg\nOpenLORIS_images img5311.jpg\nOpenLORIS_images img1010.jpg\nOpenLORIS_images img5365.jpg\nOpenLORIS_images img1893.jpg\nOpenLORIS_images img2819.jpg\nOpenLORIS_images img1467.jpg\nOpenLORIS_images img693.jpg\nOpenLORIS_images img4338.jpg\nOpenLORIS_images img6043.jpg\nOpenLORIS_images img3725.jpg\nOpenLORIS_images img4632.jpg\nOpenLORIS_images img4818.jpg\nOpenLORIS_images img67.jpg\nOpenLORIS_images img2786.jpg\nOpenLORIS_images img3622.jpg\nOpenLORIS_images img1285.jpg\nOpenLORIS_images img1383.jpg\nOpenLORIS_images img5282.jpg\nOpenLORIS_images img1991.jpg\nOpenLORIS_images img5203.jpg\nOpenLORIS_images img2435.jpg\nOpenLORIS_images img3993.jpg\nOpenLORIS_images img5216.jpg\nOpenLORIS_images 
img764.jpg\nOpenLORIS_images img3263.jpg\nOpenLORIS_images img5916.jpg\nOpenLORIS_images img3980.jpg\nOpenLORIS_images img5811.jpg\nOpenLORIS_images img2761.jpg\nOpenLORIS_images img5986.jpg\nOpenLORIS_images img973.jpg\nOpenLORIS_images img869.jpg\nOpenLORIS_images img1905.jpg\nOpenLORIS_images img4378.jpg\nOpenLORIS_images img1192.jpg\nOpenLORIS_images img3805.jpg\nOpenLORIS_images img3873.jpg\nOpenLORIS_images img5064.jpg\nOpenLORIS_images img1795.jpg\nOpenLORIS_images img4979.jpg\nOpenLORIS_images img660.jpg\nOpenLORIS_images img5348.jpg\nOpenLORIS_images img2225.jpg\nOpenLORIS_images img2476.jpg\nOpenLORIS_images img3687.jpg\nOpenLORIS_images img2453.jpg\nOpenLORIS_images img1500.jpg\nOpenLORIS_images img1878.jpg\nOpenLORIS_images img1363.jpg\nOpenLORIS_images img1127.jpg\nOpenLORIS_images img4087.jpg\nOpenLORIS_images img4045.jpg\nOpenLORIS_images img2106.jpg\nOpenLORIS_images img61.jpg\nOpenLORIS_images img2221.jpg\nOpenLORIS_images img1678.jpg\nOpenLORIS_images img5256.jpg\nOpenLORIS_images img1264.jpg\nOpenLORIS_images img4804.jpg\nOpenLORIS_images img4760.jpg\nOpenLORIS_images img4516.jpg\nOpenLORIS_images img1824.jpg\nOpenLORIS_images img2693.jpg\nOpenLORIS_images img1800.jpg\nOpenLORIS_images img2557.jpg\nOpenLORIS_images img1154.jpg\nOpenLORIS_images img948.jpg\nOpenLORIS_images img177.jpg\nOpenLORIS_images img912.jpg\nOpenLORIS_images img2141.jpg\nOpenLORIS_images img4663.jpg\nOpenLORIS_images img4993.jpg\nOpenLORIS_images img2150.jpg\nOpenLORIS_images img1418.jpg\nOpenLORIS_images img3618.jpg\nOpenLORIS_images img5727.jpg\nOpenLORIS_images img1416.jpg\nOpenLORIS_images img5783.jpg\nOpenLORIS_images img1766.jpg\nOpenLORIS_images img1452.jpg\nOpenLORIS_images img2433.jpg\nOpenLORIS_images img2901.jpg\nOpenLORIS_images img443.jpg\nOpenLORIS_images img5136.jpg\nOpenLORIS_images img1243.jpg\nOpenLORIS_images img2283.jpg\nOpenLORIS_images img5519.jpg\nOpenLORIS_images img1116.jpg\nOpenLORIS_images img1177.jpg\nOpenLORIS_images img3699.jpg\nOpenLORIS_images 
img2526.jpg\nOpenLORIS_images img3834.jpg\nOpenLORIS_images img3531.jpg\nOpenLORIS_images img5554.jpg\nOpenLORIS_images img2735.jpg\nOpenLORIS_images img5630.jpg\nOpenLORIS_images img4392.jpg\nOpenLORIS_images img4942.jpg\nOpenLORIS_images img1306.jpg\nOpenLORIS_images img4414.jpg\nOpenLORIS_images img374.jpg\nOpenLORIS_images img1871.jpg\nOpenLORIS_images img5594.jpg\nOpenLORIS_images img5686.jpg\nOpenLORIS_images img3475.jpg\nOpenLORIS_images img4835.jpg\nOpenLORIS_images img3034.jpg\nOpenLORIS_images img3129.jpg\nOpenLORIS_images img749.jpg\nOpenLORIS_images img396.jpg\nOpenLORIS_images img180.jpg\nOpenLORIS_images img5529.jpg\nOpenLORIS_images img5053.jpg\nOpenLORIS_images img3410.jpg\nOpenLORIS_images img217.jpg\nOpenLORIS_images img4757.jpg\nOpenLORIS_images img258.jpg\nOpenLORIS_images img5851.jpg\nOpenLORIS_images img2401.jpg\nOpenLORIS_images img5372.jpg\nOpenLORIS_images img4484.jpg\nOpenLORIS_images img3358.jpg\nOpenLORIS_images img5252.jpg\nOpenLORIS_images img4681.jpg\nOpenLORIS_images img5766.jpg\nOpenLORIS_images img992.jpg\nOpenLORIS_images img5685.jpg\nOpenLORIS_images img2888.jpg\nOpenLORIS_images img3661.jpg\nOpenLORIS_images img2974.jpg\nOpenLORIS_images img190.jpg\nOpenLORIS_images img5147.jpg\nOpenLORIS_images img4307.jpg\nOpenLORIS_images img3571.jpg\nOpenLORIS_images img3485.jpg\nOpenLORIS_images img1845.jpg\nOpenLORIS_images img2822.jpg\nOpenLORIS_images img2054.jpg\nOpenLORIS_images img2043.jpg\nOpenLORIS_images img2044.jpg\nOpenLORIS_images img4126.jpg\nOpenLORIS_images img5876.jpg\nOpenLORIS_images img3249.jpg\nOpenLORIS_images img3513.jpg\nOpenLORIS_images img4551.jpg\nOpenLORIS_images img1309.jpg\nOpenLORIS_images img4801.jpg\nOpenLORIS_images img3086.jpg\nOpenLORIS_images img1053.jpg\nOpenLORIS_images img3860.jpg\nOpenLORIS_images img225.jpg\nOpenLORIS_images img1437.jpg\nOpenLORIS_images img763.jpg\nOpenLORIS_images img2909.jpg\nOpenLORIS_images img164.jpg\nOpenLORIS_images img3065.jpg\nOpenLORIS_images img1166.jpg\nOpenLORIS_images 
img2381.jpg\nOpenLORIS_images img6141.jpg\nOpenLORIS_images img2660.jpg\nOpenLORIS_images img5410.jpg\nOpenLORIS_images img5423.jpg\nOpenLORIS_images img3948.jpg\nOpenLORIS_images img4735.jpg\nOpenLORIS_images img2038.jpg\nOpenLORIS_images img5177.jpg\nOpenLORIS_images img1570.jpg\nOpenLORIS_images img3631.jpg\nOpenLORIS_images img3600.jpg\nOpenLORIS_images img6015.jpg\nOpenLORIS_images img2490.jpg\nOpenLORIS_images img5753.jpg\nOpenLORIS_images img4616.jpg\nOpenLORIS_images img1562.jpg\nOpenLORIS_images img5798.jpg\nOpenLORIS_images img2593.jpg\nOpenLORIS_images img2047.jpg\nOpenLORIS_images img5310.jpg\nOpenLORIS_images img56.jpg\nOpenLORIS_images img2428.jpg\nOpenLORIS_images img5491.jpg\nOpenLORIS_images img3457.jpg\nOpenLORIS_images img1590.jpg\nOpenLORIS_images img4404.jpg\nOpenLORIS_images img3543.jpg\nOpenLORIS_images img3715.jpg\nOpenLORIS_images img2147.jpg\nOpenLORIS_images img5341.jpg\nOpenLORIS_images img1159.jpg\nOpenLORIS_images img3718.jpg\nOpenLORIS_images img1659.jpg\nOpenLORIS_images img4840.jpg\nOpenLORIS_images img2598.jpg\nOpenLORIS_images img5083.jpg\nOpenLORIS_images img5722.jpg\nOpenLORIS_images img364.jpg\nOpenLORIS_images img4476.jpg\nOpenLORIS_images img584.jpg\nOpenLORIS_images img3374.jpg\nOpenLORIS_images img1020.jpg\nOpenLORIS_images img5867.jpg\nOpenLORIS_images img2889.jpg\nOpenLORIS_images img4810.jpg\nOpenLORIS_images img5456.jpg\nOpenLORIS_images img4466.jpg\nOpenLORIS_images img4765.jpg\nOpenLORIS_images img5697.jpg\nOpenLORIS_images img578.jpg\nOpenLORIS_images img1929.jpg\nOpenLORIS_images img1387.jpg\nOpenLORIS_images img1560.jpg\nOpenLORIS_images img6.jpg\nOpenLORIS_images img4046.jpg\nOpenLORIS_images img1003.jpg\nOpenLORIS_images img4849.jpg\nOpenLORIS_images img6065.jpg\nOpenLORIS_images img628.jpg\nOpenLORIS_images img1428.jpg\nOpenLORIS_images img1333.jpg\nOpenLORIS_images img1776.jpg\nOpenLORIS_images img1370.jpg\nOpenLORIS_images img3960.jpg\nOpenLORIS_images img2483.jpg\nOpenLORIS_images img119.jpg\nOpenLORIS_images 
img1664.jpg\nOpenLORIS_images img4421.jpg\nOpenLORIS_images img2319.jpg\nOpenLORIS_images img5174.jpg\nOpenLORIS_images img2654.jpg\nOpenLORIS_images img4186.jpg\nOpenLORIS_images img5140.jpg\nOpenLORIS_images img468.jpg\nOpenLORIS_images img3586.jpg\nOpenLORIS_images img5090.jpg\nOpenLORIS_images img1529.jpg\nOpenLORIS_images img1704.jpg\nOpenLORIS_images img491.jpg\nOpenLORIS_images img6105.jpg\nOpenLORIS_images img1772.jpg\nOpenLORIS_images img4621.jpg\nOpenLORIS_images img4293.jpg\nOpenLORIS_images img2370.jpg\nOpenLORIS_images img5415.jpg\nOpenLORIS_images img5814.jpg\nOpenLORIS_images img3404.jpg\nOpenLORIS_images img5040.jpg\nOpenLORIS_images img863.jpg\nOpenLORIS_images img3692.jpg\nOpenLORIS_images img5029.jpg\nOpenLORIS_images img4531.jpg\nOpenLORIS_images img2016.jpg\nOpenLORIS_images img791.jpg\nOpenLORIS_images img1789.jpg\nOpenLORIS_images img3659.jpg\nOpenLORIS_images img795.jpg\nOpenLORIS_images img1933.jpg\nOpenLORIS_images img5566.jpg\nOpenLORIS_images img2084.jpg\nOpenLORIS_images img3716.jpg\nOpenLORIS_images img6084.jpg\nOpenLORIS_images img679.jpg\nOpenLORIS_images img1631.jpg\nOpenLORIS_images img4866.jpg\nOpenLORIS_images img897.jpg\nOpenLORIS_images img3182.jpg\nOpenLORIS_images img1236.jpg\nOpenLORIS_images img5840.jpg\nOpenLORIS_images img4529.jpg\nOpenLORIS_images img5070.jpg\nOpenLORIS_images img3432.jpg\nOpenLORIS_images img4830.jpg\nOpenLORIS_images img4383.jpg\nOpenLORIS_images img6119.jpg\nOpenLORIS_images img2749.jpg\nOpenLORIS_images img166.jpg\nOpenLORIS_images img2396.jpg\nOpenLORIS_images img3747.jpg\nOpenLORIS_images img4986.jpg\nOpenLORIS_images img3516.jpg\nOpenLORIS_images img2108.jpg\nOpenLORIS_images img2605.jpg\nOpenLORIS_images img4954.jpg\nOpenLORIS_images img2567.jpg\nOpenLORIS_images img1522.jpg\nOpenLORIS_images img5584.jpg\nOpenLORIS_images img3452.jpg\nOpenLORIS_images img4287.jpg\nOpenLORIS_images img6114.jpg\nOpenLORIS_images img69.jpg\nOpenLORIS_images img1580.jpg\nOpenLORIS_images img4439.jpg\nOpenLORIS_images 
img1646.jpg\nOpenLORIS_images img1429.jpg\nOpenLORIS_images img5416.jpg\nOpenLORIS_images img5176.jpg\nOpenLORIS_images img2254.jpg\nOpenLORIS_images img4987.jpg\nOpenLORIS_images img744.jpg\nOpenLORIS_images img3982.jpg\nOpenLORIS_images img5067.jpg\nOpenLORIS_images img5320.jpg\nOpenLORIS_images img5653.jpg\nOpenLORIS_images img2277.jpg\nOpenLORIS_images img3309.jpg\nOpenLORIS_images img344.jpg\nOpenLORIS_images img2178.jpg\nOpenLORIS_images img1720.jpg\nOpenLORIS_images img3493.jpg\nOpenLORIS_images img3995.jpg\nOpenLORIS_images img1864.jpg\nOpenLORIS_images img792.jpg\nOpenLORIS_images img4651.jpg\nOpenLORIS_images img751.jpg\nOpenLORIS_images img3229.jpg\nOpenLORIS_images img1565.jpg\nOpenLORIS_images img1543.jpg\nOpenLORIS_images img2586.jpg\nOpenLORIS_images img4581.jpg\nOpenLORIS_images img5870.jpg\nOpenLORIS_images img4730.jpg\nOpenLORIS_images img4770.jpg\nOpenLORIS_images img5008.jpg\nOpenLORIS_images img5145.jpg\nOpenLORIS_images img1141.jpg\nOpenLORIS_images img2734.jpg\nOpenLORIS_images img5265.jpg\nOpenLORIS_images img3981.jpg\nOpenLORIS_images img3759.jpg\nOpenLORIS_images img286.jpg\nOpenLORIS_images img340.jpg\nOpenLORIS_images img5039.jpg\nOpenLORIS_images img4565.jpg\nOpenLORIS_images img954.jpg\nOpenLORIS_images img5170.jpg\nOpenLORIS_images img94.jpg\nOpenLORIS_images img319.jpg\nOpenLORIS_images img4049.jpg\nOpenLORIS_images img691.jpg\nOpenLORIS_images img3791.jpg\nOpenLORIS_images img1008.jpg\nOpenLORIS_images img1018.jpg\nOpenLORIS_images img104.jpg\nOpenLORIS_images img2268.jpg\nOpenLORIS_images img3029.jpg\nOpenLORIS_images img1980.jpg\nOpenLORIS_images img1751.jpg\nOpenLORIS_images img4710.jpg\nOpenLORIS_images img5762.jpg\nOpenLORIS_images img885.jpg\nOpenLORIS_images img187.jpg\nOpenLORIS_images img2919.jpg\nOpenLORIS_images img126.jpg\nOpenLORIS_images img532.jpg\nOpenLORIS_images img1818.jpg\nOpenLORIS_images img3558.jpg\nOpenLORIS_images img4684.jpg\nOpenLORIS_images img1270.jpg\nOpenLORIS_images img2803.jpg\nOpenLORIS_images 
img1379.jpg\nOpenLORIS_images img5031.jpg\nOpenLORIS_images img4173.jpg\nOpenLORIS_images img3581.jpg\nOpenLORIS_images img3651.jpg\nOpenLORIS_images img4176.jpg\nOpenLORIS_images img6118.jpg\nOpenLORIS_images img4330.jpg\nOpenLORIS_images img2236.jpg\nOpenLORIS_images img2402.jpg\nOpenLORIS_images img5642.jpg\nOpenLORIS_images img5923.jpg\nOpenLORIS_images img2155.jpg\nOpenLORIS_images img5872.jpg\nOpenLORIS_images img5259.jpg\nOpenLORIS_images img5286.jpg\nOpenLORIS_images img5590.jpg\nOpenLORIS_images img1915.jpg\nOpenLORIS_images img3172.jpg\nOpenLORIS_images img5741.jpg\nOpenLORIS_images img518.jpg\nOpenLORIS_images img5705.jpg\nOpenLORIS_images img1081.jpg\nOpenLORIS_images img3285.jpg\nOpenLORIS_images img576.jpg\nOpenLORIS_images img3788.jpg\nOpenLORIS_images img3740.jpg\nOpenLORIS_images img3855.jpg\nOpenLORIS_images img2763.jpg\nOpenLORIS_images img3025.jpg\nOpenLORIS_images img3012.jpg\nOpenLORIS_images img980.jpg\nOpenLORIS_images img6121.jpg\nOpenLORIS_images img4352.jpg\nOpenLORIS_images img5106.jpg\nOpenLORIS_images img1695.jpg\nOpenLORIS_images img1577.jpg\nOpenLORIS_images img389.jpg\nOpenLORIS_images img4701.jpg\nOpenLORIS_images img3612.jpg\nOpenLORIS_images img5925.jpg\nOpenLORIS_images img1401.jpg\nOpenLORIS_images img5571.jpg\nOpenLORIS_images img19.jpg\nOpenLORIS_images img5400.jpg\nOpenLORIS_images img2847.jpg\nOpenLORIS_images img4482.jpg\nOpenLORIS_images img4110.jpg\nOpenLORIS_images img3149.jpg\nOpenLORIS_images img1979.jpg\nOpenLORIS_images img5724.jpg\nOpenLORIS_images img1326.jpg\nOpenLORIS_images img916.jpg\nOpenLORIS_images img4834.jpg\nOpenLORIS_images img2505.jpg\nOpenLORIS_images img198.jpg\nOpenLORIS_images img4291.jpg\nOpenLORIS_images img5237.jpg\nOpenLORIS_images img2531.jpg\nOpenLORIS_images img3213.jpg\nOpenLORIS_images img2917.jpg\nOpenLORIS_images img5085.jpg\nOpenLORIS_images img2366.jpg\nOpenLORIS_images img5754.jpg\nOpenLORIS_images img2107.jpg\nOpenLORIS_images img1287.jpg\nOpenLORIS_images 
img3997.jpg\nOpenLORIS_images img5778.jpg\nOpenLORIS_images img1715.jpg\nOpenLORIS_images img3667.jpg\nOpenLORIS_images img2316.jpg\nOpenLORIS_images img2233.jpg\nOpenLORIS_images img2854.jpg\nOpenLORIS_images img6028.jpg\nOpenLORIS_images img194.jpg\nOpenLORIS_images img2398.jpg\nOpenLORIS_images img4268.jpg\nOpenLORIS_images img2714.jpg\nOpenLORIS_images img2922.jpg\nOpenLORIS_images img5427.jpg\nOpenLORIS_images img5772.jpg\nOpenLORIS_images img5962.jpg\nOpenLORIS_images img4965.jpg\nOpenLORIS_images img4108.jpg\nOpenLORIS_images img1371.jpg\nOpenLORIS_images img774.jpg\nOpenLORIS_images img3313.jpg\nOpenLORIS_images img5676.jpg\nOpenLORIS_images img4313.jpg\nOpenLORIS_images img1969.jpg\nOpenLORIS_images img5016.jpg\nOpenLORIS_images img2960.jpg\nOpenLORIS_images img1925.jpg\nOpenLORIS_images img5659.jpg\nOpenLORIS_images img4587.jpg\nOpenLORIS_images img5205.jpg\nOpenLORIS_images img2652.jpg\nOpenLORIS_images img3861.jpg\nOpenLORIS_images img1844.jpg\nOpenLORIS_images img2711.jpg\nOpenLORIS_images img3221.jpg\nOpenLORIS_images img829.jpg\nOpenLORIS_images img2818.jpg\nOpenLORIS_images img2950.jpg\nOpenLORIS_images img1782.jpg\nOpenLORIS_images img4252.jpg\nOpenLORIS_images img4081.jpg\nOpenLORIS_images img4620.jpg\nOpenLORIS_images img650.jpg\nOpenLORIS_images img4530.jpg\nOpenLORIS_images img498.jpg\nOpenLORIS_images img2367.jpg\nOpenLORIS_images img2939.jpg\nOpenLORIS_images img3577.jpg\nOpenLORIS_images img41.jpg\nOpenLORIS_images img5312.jpg\nOpenLORIS_images img5612.jpg\nOpenLORIS_images img5928.jpg\nOpenLORIS_images img5143.jpg\nOpenLORIS_images img4923.jpg\nOpenLORIS_images img514.jpg\nOpenLORIS_images img2687.jpg\nOpenLORIS_images img4325.jpg\nOpenLORIS_images img2861.jpg\nOpenLORIS_images img63.jpg\nOpenLORIS_images img1129.jpg\nOpenLORIS_images img1218.jpg\nOpenLORIS_images img4927.jpg\nOpenLORIS_images img3904.jpg\nOpenLORIS_images img1729.jpg\nOpenLORIS_images img1872.jpg\nOpenLORIS_images img2391.jpg\nOpenLORIS_images img5032.jpg\nOpenLORIS_images 
img431.jpg\nOpenLORIS_images img5408.jpg\nOpenLORIS_images img6033.jpg\nOpenLORIS_images img5328.jpg\nOpenLORIS_images img5887.jpg\nOpenLORIS_images img4068.jpg\nOpenLORIS_images img5437.jpg\nOpenLORIS_images img769.jpg\nOpenLORIS_images img5371.jpg\nOpenLORIS_images img1413.jpg\nOpenLORIS_images img938.jpg\nOpenLORIS_images img3610.jpg\nOpenLORIS_images img2408.jpg\nOpenLORIS_images img3741.jpg\nOpenLORIS_images img6052.jpg\nOpenLORIS_images img4569.jpg\nOpenLORIS_images img4407.jpg\nOpenLORIS_images img2377.jpg\nOpenLORIS_images img6104.jpg\nOpenLORIS_images img2742.jpg\nOpenLORIS_images img3840.jpg\nOpenLORIS_images img1549.jpg\nOpenLORIS_images img1791.jpg\nOpenLORIS_images img2823.jpg\nOpenLORIS_images img1736.jpg\nOpenLORIS_images img4673.jpg\nOpenLORIS_images img3233.jpg\nOpenLORIS_images img1764.jpg\nOpenLORIS_images img3031.jpg\nOpenLORIS_images img2575.jpg\nOpenLORIS_images img3591.jpg\nOpenLORIS_images img1884.jpg\nOpenLORIS_images img2452.jpg\nOpenLORIS_images img370.jpg\nOpenLORIS_images img5157.jpg\nOpenLORIS_images img2727.jpg\nOpenLORIS_images img5884.jpg\nOpenLORIS_images img5447.jpg\nOpenLORIS_images img4393.jpg\nOpenLORIS_images img4459.jpg\nOpenLORIS_images img6133.jpg\nOpenLORIS_images img494.jpg\nOpenLORIS_images img5508.jpg\nOpenLORIS_images img1574.jpg\nOpenLORIS_images img2503.jpg\nOpenLORIS_images img3686.jpg\nOpenLORIS_images img1408.jpg\nOpenLORIS_images img5036.jpg\nOpenLORIS_images img3239.jpg\nOpenLORIS_images img4856.jpg\nOpenLORIS_images img1226.jpg\nOpenLORIS_images img4859.jpg\nOpenLORIS_images img5739.jpg\nOpenLORIS_images img4400.jpg\nOpenLORIS_images img3027.jpg\nOpenLORIS_images img1757.jpg\nOpenLORIS_images img721.jpg\nOpenLORIS_images img4441.jpg\nOpenLORIS_images img4490.jpg\nOpenLORIS_images img5920.jpg\nOpenLORIS_images img4962.jpg\nOpenLORIS_images img2011.jpg\nOpenLORIS_images img1271.jpg\nOpenLORIS_images img4179.jpg\nOpenLORIS_images img1934.jpg\nOpenLORIS_images img333.jpg\nOpenLORIS_images 
img1125.jpg\nOpenLORIS_images img2716.jpg\nOpenLORIS_images img2674.jpg\nOpenLORIS_images img130.jpg\nOpenLORIS_images img1779.jpg\nOpenLORIS_images img1944.jpg\nOpenLORIS_images img3582.jpg\nOpenLORIS_images img3910.jpg\nOpenLORIS_images img993.jpg\nOpenLORIS_images img3744.jpg\nOpenLORIS_images img4479.jpg\nOpenLORIS_images img53.jpg\nOpenLORIS_images img4611.jpg\nOpenLORIS_images img4662.jpg\nOpenLORIS_images img2364.jpg\nOpenLORIS_images img1011.jpg\nOpenLORIS_images img5538.jpg\nOpenLORIS_images img5056.jpg\nOpenLORIS_images img877.jpg\nOpenLORIS_images img5037.jpg\nOpenLORIS_images img4764.jpg\nOpenLORIS_images img2797.jpg\nOpenLORIS_images img2104.jpg\nOpenLORIS_images img80.jpg\nOpenLORIS_images img2551.jpg\nOpenLORIS_images img2159.jpg\nOpenLORIS_images img2752.jpg\nOpenLORIS_images img1534.jpg\nOpenLORIS_images img2029.jpg\nOpenLORIS_images img743.jpg\nOpenLORIS_images img1851.jpg\nOpenLORIS_images img1912.jpg\nOpenLORIS_images img2954.jpg\nOpenLORIS_images img5919.jpg\nOpenLORIS_images img1029.jpg\nOpenLORIS_images img700.jpg\nOpenLORIS_images img2058.jpg\nOpenLORIS_images img3918.jpg\nOpenLORIS_images img5885.jpg\nOpenLORIS_images img2834.jpg\nOpenLORIS_images img2156.jpg\nOpenLORIS_images img2927.jpg\nOpenLORIS_images img1734.jpg\nOpenLORIS_images img5552.jpg\nOpenLORIS_images img3748.jpg\nOpenLORIS_images img3445.jpg\nOpenLORIS_images img5560.jpg\nOpenLORIS_images img3192.jpg\nOpenLORIS_images img5835.jpg\nOpenLORIS_images img4706.jpg\nOpenLORIS_images img4759.jpg\nOpenLORIS_images img1803.jpg\nOpenLORIS_images img5236.jpg\nOpenLORIS_images img3849.jpg\nOpenLORIS_images img6018.jpg\nOpenLORIS_images img881.jpg\nOpenLORIS_images img4787.jpg\nOpenLORIS_images img4050.jpg\nOpenLORIS_images img2821.jpg\nOpenLORIS_images img2973.jpg\nOpenLORIS_images img1352.jpg\nOpenLORIS_images img736.jpg\nOpenLORIS_images img1679.jpg\nOpenLORIS_images img2090.jpg\nOpenLORIS_images img1947.jpg\nOpenLORIS_images img5646.jpg\nOpenLORIS_images img2134.jpg\nOpenLORIS_images 
img648.jpg\nOpenLORIS_images img3322.jpg\nOpenLORIS_images img1296.jpg\nOpenLORIS_images img1405.jpg\nOpenLORIS_images img1042.jpg\nOpenLORIS_images img5904.jpg\nOpenLORIS_images img2965.jpg\nOpenLORIS_images img2129.jpg\nOpenLORIS_images img5079.jpg\nOpenLORIS_images img97.jpg\nOpenLORIS_images img4.jpg\nOpenLORIS_images img1654.jpg\nOpenLORIS_images img4422.jpg\nOpenLORIS_images img5335.jpg\nOpenLORIS_images img5042.jpg\nOpenLORIS_images img2119.jpg\nOpenLORIS_images img730.jpg\nOpenLORIS_images img884.jpg\nOpenLORIS_images img2376.jpg\nOpenLORIS_images img2588.jpg\nOpenLORIS_images img827.jpg\nOpenLORIS_images img5658.jpg\nOpenLORIS_images img506.jpg\nOpenLORIS_images img4538.jpg\nOpenLORIS_images img2014.jpg\nOpenLORIS_images img1223.jpg\nOpenLORIS_images img2683.jpg\nOpenLORIS_images img702.jpg\nOpenLORIS_images img1414.jpg\nOpenLORIS_images img3252.jpg\nOpenLORIS_images img2926.jpg\nOpenLORIS_images img1438.jpg\nOpenLORIS_images img2163.jpg\nOpenLORIS_images img1178.jpg\nOpenLORIS_images img5586.jpg\nOpenLORIS_images img951.jpg\nOpenLORIS_images img3142.jpg\nOpenLORIS_images img4266.jpg\nOpenLORIS_images img1015.jpg\nOpenLORIS_images img5664.jpg\nOpenLORIS_images img2609.jpg\nOpenLORIS_images img4708.jpg\nOpenLORIS_images img5063.jpg\nOpenLORIS_images img4824.jpg\nOpenLORIS_images img1760.jpg\nOpenLORIS_images img5592.jpg\nOpenLORIS_images img4317.jpg\nOpenLORIS_images img2074.jpg\nOpenLORIS_images img4224.jpg\nOpenLORIS_images img2298.jpg\nOpenLORIS_images img1374.jpg\nOpenLORIS_images img4808.jpg\nOpenLORIS_images img2373.jpg\nOpenLORIS_images img5956.jpg\nOpenLORIS_images img2488.jpg\nOpenLORIS_images img4907.jpg\nOpenLORIS_images img3047.jpg\nOpenLORIS_images img2632.jpg\nOpenLORIS_images img1883.jpg\nOpenLORIS_images img3830.jpg\nOpenLORIS_images img4158.jpg\nOpenLORIS_images img3161.jpg\nOpenLORIS_images img1752.jpg\nOpenLORIS_images img3474.jpg\nOpenLORIS_images img2671.jpg\nOpenLORIS_images img535.jpg\nOpenLORIS_images img1743.jpg\nOpenLORIS_images 
img3402.jpg\nOpenLORIS_images img3345.jpg\nOpenLORIS_images img2755.jpg\nOpenLORIS_images img1171.jpg\nOpenLORIS_images img3376.jpg\nOpenLORIS_images img4462.jpg\nOpenLORIS_images img5343.jpg\nOpenLORIS_images img4308.jpg\nOpenLORIS_images img1922.jpg\nOpenLORIS_images img1283.jpg\nOpenLORIS_images img5865.jpg\nOpenLORIS_images img4084.jpg\nOpenLORIS_images img2883.jpg\nOpenLORIS_images img5123.jpg\nOpenLORIS_images img4520.jpg\nOpenLORIS_images img4285.jpg\nOpenLORIS_images img4310.jpg\nOpenLORIS_images img4653.jpg\nOpenLORIS_images img85.jpg\nOpenLORIS_images img4687.jpg\nOpenLORIS_images img6031.jpg\nOpenLORIS_images img4951.jpg\nOpenLORIS_images img6073.jpg\nOpenLORIS_images img4452.jpg\nOpenLORIS_images img5663.jpg\nOpenLORIS_images img5021.jpg\nOpenLORIS_images img178.jpg\nOpenLORIS_images img812.jpg\nOpenLORIS_images img787.jpg\nOpenLORIS_images img4129.jpg\nOpenLORIS_images img2925.jpg\nOpenLORIS_images img4643.jpg\nOpenLORIS_images img4614.jpg\nOpenLORIS_images img3968.jpg\nOpenLORIS_images img3295.jpg\nOpenLORIS_images img3580.jpg\nOpenLORIS_images img2784.jpg\nOpenLORIS_images img2978.jpg\nOpenLORIS_images img3749.jpg\nOpenLORIS_images img1503.jpg\nOpenLORIS_images img1683.jpg\nOpenLORIS_images img2923.jpg\nOpenLORIS_images img2046.jpg\nOpenLORIS_images img6055.jpg\nOpenLORIS_images img1993.jpg\nOpenLORIS_images img4780.jpg\nOpenLORIS_images img2001.jpg\nOpenLORIS_images img5988.jpg\nOpenLORIS_images img3929.jpg\nOpenLORIS_images img4172.jpg\nOpenLORIS_images img5003.jpg\nOpenLORIS_images img2565.jpg\nOpenLORIS_images img4423.jpg\nOpenLORIS_images img4609.jpg\nOpenLORIS_images img3271.jpg\nOpenLORIS_images img1935.jpg\nOpenLORIS_images img1640.jpg\nOpenLORIS_images img1446.jpg\nOpenLORIS_images img5101.jpg\nOpenLORIS_images img1960.jpg\nOpenLORIS_images img841.jpg\nOpenLORIS_images img4702.jpg\nOpenLORIS_images img6130.jpg\nOpenLORIS_images img4592.jpg\nOpenLORIS_images img1001.jpg\nOpenLORIS_images img82.jpg\nOpenLORIS_images 
img5010.jpg\nOpenLORIS_images img1733.jpg\nOpenLORIS_images img1699.jpg\nOpenLORIS_images img2507.jpg\nOpenLORIS_images img1762.jpg\nOpenLORIS_images img2120.jpg\nOpenLORIS_images img1328.jpg\nOpenLORIS_images img3611.jpg\nOpenLORIS_images img3335.jpg\nOpenLORIS_images img3755.jpg\nOpenLORIS_images img6012.jpg\nOpenLORIS_images img3238.jpg\nOpenLORIS_images img1242.jpg\nOpenLORIS_images img87.jpg\nOpenLORIS_images img1897.jpg\nOpenLORIS_images img5820.jpg\nOpenLORIS_images img4474.jpg\nOpenLORIS_images img2647.jpg\nOpenLORIS_images img4676.jpg\nOpenLORIS_images img3145.jpg\nOpenLORIS_images img3603.jpg\nOpenLORIS_images img3085.jpg\nOpenLORIS_images img1686.jpg\nOpenLORIS_images img3091.jpg\nOpenLORIS_images img1313.jpg\nOpenLORIS_images img4625.jpg\nOpenLORIS_images img5616.jpg\nOpenLORIS_images img4070.jpg\nOpenLORIS_images img4245.jpg\nOpenLORIS_images img3803.jpg\nOpenLORIS_images img5567.jpg\nOpenLORIS_images img5279.jpg\nOpenLORIS_images img3155.jpg\nOpenLORIS_images img3136.jpg\nOpenLORIS_images img2812.jpg\nOpenLORIS_images img5637.jpg\nOpenLORIS_images img279.jpg\nOpenLORIS_images img567.jpg\nOpenLORIS_images img2400.jpg\nOpenLORIS_images img2855.jpg\nOpenLORIS_images img2942.jpg\nOpenLORIS_images img4743.jpg\nOpenLORIS_images img946.jpg\nOpenLORIS_images img176.jpg\nOpenLORIS_images img353.jpg\nOpenLORIS_images img4491.jpg\nOpenLORIS_images img5326.jpg\nOpenLORIS_images img18.jpg\nOpenLORIS_images img3520.jpg\nOpenLORIS_images img420.jpg\nOpenLORIS_images img3892.jpg\nOpenLORIS_images img3743.jpg\nOpenLORIS_images img1630.jpg\nOpenLORIS_images img1605.jpg\nOpenLORIS_images img4784.jpg\nOpenLORIS_images img1677.jpg\nOpenLORIS_images img4855.jpg\nOpenLORIS_images img4341.jpg\nOpenLORIS_images img607.jpg\nOpenLORIS_images img3767.jpg\nOpenLORIS_images img3223.jpg\nOpenLORIS_images img1606.jpg\nOpenLORIS_images img3854.jpg\nOpenLORIS_images img2751.jpg\nOpenLORIS_images img3392.jpg\nOpenLORIS_images img3267.jpg\nOpenLORIS_images img4949.jpg\nOpenLORIS_images 
img3481.jpg\nOpenLORIS_images img1937.jpg\nOpenLORIS_images img3427.jpg\nOpenLORIS_images img3266.jpg\nOpenLORIS_images img1521.jpg\nOpenLORIS_images img3761.jpg\nOpenLORIS_images img1636.jpg\nOpenLORIS_images img3397.jpg\nOpenLORIS_images img3798.jpg\nOpenLORIS_images img1594.jpg\nOpenLORIS_images img6001.jpg\nOpenLORIS_images img5631.jpg\nOpenLORIS_images img4332.jpg\nOpenLORIS_images img5183.jpg\nOpenLORIS_images img1440.jpg\nOpenLORIS_images img2226.jpg\nOpenLORIS_images img5454.jpg\nOpenLORIS_images img3365.jpg\nOpenLORIS_images img3590.jpg\nOpenLORIS_images img4694.jpg\nOpenLORIS_images img5953.jpg\nOpenLORIS_images img5634.jpg\nOpenLORIS_images img2269.jpg\nOpenLORIS_images img5137.jpg\nOpenLORIS_images img6072.jpg\nOpenLORIS_images img1820.jpg\nOpenLORIS_images img1415.jpg\nOpenLORIS_images img6041.jpg\nOpenLORIS_images img4910.jpg\nOpenLORIS_images img3190.jpg\nOpenLORIS_images img5838.jpg\nOpenLORIS_images img4432.jpg\nOpenLORIS_images img2237.jpg\nOpenLORIS_images img5562.jpg\nOpenLORIS_images img3890.jpg\nOpenLORIS_images img5394.jpg\nOpenLORIS_images img1369.jpg\nOpenLORIS_images img5057.jpg\nOpenLORIS_images img1311.jpg\nOpenLORIS_images img460.jpg\nOpenLORIS_images img2666.jpg\nOpenLORIS_images img3828.jpg\nOpenLORIS_images img1451.jpg\nOpenLORIS_images img801.jpg\nOpenLORIS_images img4692.jpg\nOpenLORIS_images img1496.jpg\nOpenLORIS_images img1114.jpg\nOpenLORIS_images img252.jpg\nOpenLORIS_images img5561.jpg\nOpenLORIS_images img1094.jpg\nOpenLORIS_images img5393.jpg\nOpenLORIS_images img1463.jpg\nOpenLORIS_images img963.jpg\nOpenLORIS_images img5058.jpg\nOpenLORIS_images img720.jpg\nOpenLORIS_images img4630.jpg\nOpenLORIS_images img5321.jpg\nOpenLORIS_images img4729.jpg\nOpenLORIS_images img3336.jpg\nOpenLORIS_images img52.jpg\nOpenLORIS_images img2624.jpg\nOpenLORIS_images img2212.jpg\nOpenLORIS_images img31.jpg\nOpenLORIS_images img3839.jpg\nOpenLORIS_images img4262.jpg\nOpenLORIS_images img260.jpg\nOpenLORIS_images img3364.jpg\nOpenLORIS_images 
img132.jpg\nOpenLORIS_images img3959.jpg\nOpenLORIS_images img3996.jpg\nOpenLORIS_images img4890.jpg\nOpenLORIS_images img5492.jpg\nOpenLORIS_images img5710.jpg\nOpenLORIS_images img3848.jpg\nOpenLORIS_images img2197.jpg\nOpenLORIS_images img5650.jpg\nOpenLORIS_images img625.jpg\nOpenLORIS_images img1903.jpg\nOpenLORIS_images img2839.jpg\nOpenLORIS_images img3664.jpg\nOpenLORIS_images img2955.jpg\nOpenLORIS_images img139.jpg\nOpenLORIS_images img5972.jpg\nOpenLORIS_images img2750.jpg\nOpenLORIS_images img5810.jpg\nOpenLORIS_images img1700.jpg\nOpenLORIS_images img93.jpg\nOpenLORIS_images img473.jpg\nOpenLORIS_images img3310.jpg\nOpenLORIS_images img464.jpg\nOpenLORIS_images img600.jpg\nOpenLORIS_images img3061.jpg\nOpenLORIS_images img5127.jpg\nOpenLORIS_images img2535.jpg\nOpenLORIS_images img1520.jpg\nOpenLORIS_images img4772.jpg\nOpenLORIS_images img3613.jpg\nOpenLORIS_images img4078.jpg\nOpenLORIS_images img2372.jpg\nOpenLORIS_images img3512.jpg\nOpenLORIS_images img1320.jpg\nOpenLORIS_images img4475.jpg\nOpenLORIS_images img688.jpg\nOpenLORIS_images img925.jpg\nOpenLORIS_images img4528.jpg\nOpenLORIS_images img3944.jpg\nOpenLORIS_images img8.jpg\nOpenLORIS_images img1464.jpg\nOpenLORIS_images img976.jpg\nOpenLORIS_images img1203.jpg\nOpenLORIS_images img722.jpg\nOpenLORIS_images img1739.jpg\nOpenLORIS_images img1796.jpg\nOpenLORIS_images img2057.jpg\nOpenLORIS_images img3912.jpg\nOpenLORIS_images img1620.jpg\nOpenLORIS_images img4447.jpg\nOpenLORIS_images img2578.jpg\nOpenLORIS_images img3455.jpg\nOpenLORIS_images img4149.jpg\nOpenLORIS_images img2419.jpg\nOpenLORIS_images img3627.jpg\nOpenLORIS_images img678.jpg\nOpenLORIS_images img768.jpg\nOpenLORIS_images img3465.jpg\nOpenLORIS_images img2993.jpg\nOpenLORIS_images img2516.jpg\nOpenLORIS_images img3568.jpg\nOpenLORIS_images img2146.jpg\nOpenLORIS_images img2092.jpg\nOpenLORIS_images img5267.jpg\nOpenLORIS_images img3675.jpg\nOpenLORIS_images img5511.jpg\nOpenLORIS_images img294.jpg\nOpenLORIS_images 
img5589.jpg\nOpenLORIS_images img1525.jpg\nOpenLORIS_images img1582.jpg\nOpenLORIS_images img6102.jpg\nOpenLORIS_images img1569.jpg\nOpenLORIS_images img1504.jpg\nOpenLORIS_images img2770.jpg\nOpenLORIS_images img3736.jpg\nOpenLORIS_images img3438.jpg\nOpenLORIS_images img2726.jpg\nOpenLORIS_images img4394.jpg\nOpenLORIS_images img4622.jpg\nOpenLORIS_images img1773.jpg\nOpenLORIS_images img324.jpg\nOpenLORIS_images img5026.jpg\nOpenLORIS_images img2282.jpg\nOpenLORIS_images img1316.jpg\nOpenLORIS_images img5224.jpg\nOpenLORIS_images img3853.jpg\nOpenLORIS_images img4187.jpg\nOpenLORIS_images img709.jpg\nOpenLORIS_images img1528.jpg\nOpenLORIS_images img4691.jpg\nOpenLORIS_images img5309.jpg\nOpenLORIS_images img1675.jpg\nOpenLORIS_images img719.jpg\nOpenLORIS_images img361.jpg\nOpenLORIS_images img1508.jpg\nOpenLORIS_images img3218.jpg\nOpenLORIS_images img5532.jpg\nOpenLORIS_images img1354.jpg\nOpenLORIS_images img2130.jpg\nOpenLORIS_images img1000.jpg\nOpenLORIS_images img205.jpg\nOpenLORIS_images img6054.jpg\nOpenLORIS_images img1224.jpg\nOpenLORIS_images img6046.jpg\nOpenLORIS_images img1923.jpg\nOpenLORIS_images img1784.jpg\nOpenLORIS_images img3589.jpg\nOpenLORIS_images img3110.jpg\nOpenLORIS_images img3370.jpg\nOpenLORIS_images img3518.jpg\nOpenLORIS_images img920.jpg\nOpenLORIS_images img5349.jpg\nOpenLORIS_images img3847.jpg\nOpenLORIS_images img4157.jpg\nOpenLORIS_images img2667.jpg\nOpenLORIS_images img4135.jpg\nOpenLORIS_images img394.jpg\nOpenLORIS_images img4872.jpg\nOpenLORIS_images img5654.jpg\nOpenLORIS_images img5218.jpg\nOpenLORIS_images img3262.jpg\nOpenLORIS_images img5287.jpg\nOpenLORIS_images img4008.jpg\nOpenLORIS_images img3070.jpg\nOpenLORIS_images img55.jpg\nOpenLORIS_images img3560.jpg\nOpenLORIS_images img1918.jpg\nOpenLORIS_images img3113.jpg\nOpenLORIS_images img2957.jpg\nOpenLORIS_images img3833.jpg\nOpenLORIS_images img2934.jpg\nOpenLORIS_images img4375.jpg\nOpenLORIS_images img4030.jpg\nOpenLORIS_images img159.jpg\nOpenLORIS_images 
img6129.jpg\nOpenLORIS_images img4679.jpg\nOpenLORIS_images img1847.jpg\nOpenLORIS_images img871.jpg\nOpenLORIS_images img4806.jpg\nOpenLORIS_images img3334.jpg\nOpenLORIS_images img2110.jpg\nOpenLORIS_images img4600.jpg\nOpenLORIS_images img453.jpg\nOpenLORIS_images img5574.jpg\nOpenLORIS_images img6126.jpg\nOpenLORIS_images img2307.jpg\nOpenLORIS_images img5672.jpg\nOpenLORIS_images img5520.jpg\nOpenLORIS_images img2171.jpg\nOpenLORIS_images img2480.jpg\nOpenLORIS_images img3852.jpg\nOpenLORIS_images img815.jpg\nOpenLORIS_images img2051.jpg\nOpenLORIS_images img561.jpg\nOpenLORIS_images img3814.jpg\nOpenLORIS_images img5452.jpg\nOpenLORIS_images img1865.jpg\nOpenLORIS_images img4019.jpg\nOpenLORIS_images img413.jpg\nOpenLORIS_images img5167.jpg\nOpenLORIS_images img4919.jpg\nOpenLORIS_images img5220.jpg\nOpenLORIS_images img2114.jpg\nOpenLORIS_images img5248.jpg\nOpenLORIS_images img816.jpg\nOpenLORIS_images img1460.jpg\nOpenLORIS_images img1838.jpg\nOpenLORIS_images img1427.jpg\nOpenLORIS_images img4238.jpg\nOpenLORIS_images img3137.jpg\nOpenLORIS_images img4864.jpg\nOpenLORIS_images img3490.jpg\nOpenLORIS_images img5823.jpg\nOpenLORIS_images img5678.jpg\nOpenLORIS_images img1550.jpg\nOpenLORIS_images img5043.jpg\nOpenLORIS_images img2690.jpg\nOpenLORIS_images img2183.jpg\nOpenLORIS_images img557.jpg\nOpenLORIS_images img3746.jpg\nOpenLORIS_images img2613.jpg\nOpenLORIS_images img879.jpg\nOpenLORIS_images img1513.jpg\nOpenLORIS_images img4067.jpg\nOpenLORIS_images img2374.jpg\nOpenLORIS_images img3991.jpg\nOpenLORIS_images img805.jpg\nOpenLORIS_images img6139.jpg\nOpenLORIS_images img3804.jpg\nOpenLORIS_images img1325.jpg\nOpenLORIS_images img923.jpg\nOpenLORIS_images img3802.jpg\nOpenLORIS_images img3060.jpg\nOpenLORIS_images img917.jpg\nOpenLORIS_images img5420.jpg\nOpenLORIS_images img784.jpg\nOpenLORIS_images img1373.jpg\nOpenLORIS_images img3510.jpg\nOpenLORIS_images img170.jpg\nOpenLORIS_images img640.jpg\nOpenLORIS_images img5822.jpg\nOpenLORIS_images 
img1708.jpg\nOpenLORIS_images img2172.jpg\nOpenLORIS_images img5516.jpg\nOpenLORIS_images img5164.jpg\nOpenLORIS_images img3412.jpg\nOpenLORIS_images img2882.jpg\nOpenLORIS_images img703.jpg\nOpenLORIS_images img369.jpg\nOpenLORIS_images img1986.jpg\nOpenLORIS_images img5290.jpg\nOpenLORIS_images img4631.jpg\nOpenLORIS_images img3857.jpg\nOpenLORIS_images img504.jpg\nOpenLORIS_images img11.jpg\nOpenLORIS_images img154.jpg\nOpenLORIS_images img5940.jpg\nOpenLORIS_images img919.jpg\nOpenLORIS_images img6127.jpg\nOpenLORIS_images img316.jpg\nOpenLORIS_images img706.jpg\nOpenLORIS_images img4463.jpg\nOpenLORIS_images img2805.jpg\nOpenLORIS_images img5757.jpg\nOpenLORIS_images img1395.jpg\nOpenLORIS_images img2679.jpg\nOpenLORIS_images img746.jpg\nOpenLORIS_images img979.jpg\nOpenLORIS_images img6049.jpg\nOpenLORIS_images img2349.jpg\nOpenLORIS_images img3990.jpg\nOpenLORIS_images img4655.jpg\nOpenLORIS_images img3681.jpg\nOpenLORIS_images img5555.jpg\nOpenLORIS_images img4690.jpg\nOpenLORIS_images img1359.jpg\nOpenLORIS_images img3689.jpg\nOpenLORIS_images img1254.jpg\nOpenLORIS_images img3966.jpg\nOpenLORIS_images img5984.jpg\nOpenLORIS_images img3909.jpg\nOpenLORIS_images img4428.jpg\nOpenLORIS_images img2048.jpg\nOpenLORIS_images img1310.jpg\nOpenLORIS_images img5373.jpg\nOpenLORIS_images img1070.jpg\nOpenLORIS_images img2033.jpg\nOpenLORIS_images img1671.jpg\nOpenLORIS_images img1491.jpg\nOpenLORIS_images img950.jpg\nOpenLORIS_images img5151.jpg\nOpenLORIS_images img5731.jpg\nOpenLORIS_images img2808.jpg\nOpenLORIS_images img5.jpg\nOpenLORIS_images img1781.jpg\nOpenLORIS_images img1187.jpg\nOpenLORIS_images img5061.jpg\nOpenLORIS_images img5979.jpg\nOpenLORIS_images img156.jpg\nOpenLORIS_images img4719.jpg\nOpenLORIS_images img4851.jpg\nOpenLORIS_images img1698.jpg\nOpenLORIS_images img5502.jpg\nOpenLORIS_images img526.jpg\nOpenLORIS_images img1538.jpg\nOpenLORIS_images img5522.jpg\nOpenLORIS_images img4170.jpg\nOpenLORIS_images img5330.jpg\nOpenLORIS_images 
img4956.jpg\nOpenLORIS_images img2062.jpg\nOpenLORIS_images img5769.jpg\nOpenLORIS_images img4813.jpg\nOpenLORIS_images img1737.jpg\nOpenLORIS_images img1120.jpg\nOpenLORIS_images img327.jpg\nOpenLORIS_images img5603.jpg\nOpenLORIS_images img5981.jpg\nOpenLORIS_images img5729.jpg\nOpenLORIS_images img2055.jpg\nOpenLORIS_images img4230.jpg\nOpenLORIS_images img3152.jpg\nOpenLORIS_images img2758.jpg\nOpenLORIS_images img1597.jpg\nOpenLORIS_images img6006.jpg\nOpenLORIS_images img3604.jpg\nOpenLORIS_images img4716.jpg\nOpenLORIS_images img1389.jpg\nOpenLORIS_images img3851.jpg\nOpenLORIS_images img4667.jpg\nOpenLORIS_images img3669.jpg\nOpenLORIS_images img3265.jpg\nOpenLORIS_images img1458.jpg\nOpenLORIS_images img4402.jpg\nOpenLORIS_images img3561.jpg\nOpenLORIS_images img2990.jpg\nOpenLORIS_images img141.jpg\nOpenLORIS_images img2860.jpg\nOpenLORIS_images img1526.jpg\nOpenLORIS_images img4633.jpg\nOpenLORIS_images img1043.jpg\nOpenLORIS_images img852.jpg\nOpenLORIS_images img641.jpg\nOpenLORIS_images img1842.jpg\nOpenLORIS_images img3964.jpg\nOpenLORIS_images img4236.jpg\nOpenLORIS_images img224.jpg\nOpenLORIS_images img2544.jpg\nOpenLORIS_images img203.jpg\nOpenLORIS_images img3311.jpg\nOpenLORIS_images img4151.jpg\nOpenLORIS_images img982.jpg\nOpenLORIS_images img3506.jpg\nOpenLORIS_images img2796.jpg\nOpenLORIS_images img1667.jpg\nOpenLORIS_images img2089.jpg\nOpenLORIS_images img1439.jpg\nOpenLORIS_images img5019.jpg\nOpenLORIS_images img3942.jpg\nOpenLORIS_images img3958.jpg\nOpenLORIS_images img2930.jpg\nOpenLORIS_images img1995.jpg\nOpenLORIS_images img2995.jpg\nOpenLORIS_images img834.jpg\nOpenLORIS_images img404.jpg\nOpenLORIS_images img4558.jpg\nOpenLORIS_images img4219.jpg\nOpenLORIS_images img3845.jpg\nOpenLORIS_images img3458.jpg\nOpenLORIS_images img1900.jpg\nOpenLORIS_images img5891.jpg\nOpenLORIS_images img5103.jpg\nOpenLORIS_images img3935.jpg\nOpenLORIS_images img5905.jpg\nOpenLORIS_images img2656.jpg\nOpenLORIS_images 
img4000.jpg\nOpenLORIS_images img4345.jpg\nOpenLORIS_images img4016.jpg\nOpenLORIS_images img4299.jpg\nOpenLORIS_images img1111.jpg\nOpenLORIS_images img5985.jpg\nOpenLORIS_images img5869.jpg\nOpenLORIS_images img3782.jpg\nOpenLORIS_images img25.jpg\nOpenLORIS_images img2649.jpg\nOpenLORIS_images img776.jpg\nOpenLORIS_images img3673.jpg\nOpenLORIS_images img2698.jpg\nOpenLORIS_images img2597.jpg\nOpenLORIS_images img4502.jpg\nOpenLORIS_images img502.jpg\nOpenLORIS_images img971.jpg\nOpenLORIS_images img2105.jpg\nOpenLORIS_images img2633.jpg\nOpenLORIS_images img1531.jpg\nOpenLORIS_images img1221.jpg\nOpenLORIS_images img3317.jpg\nOpenLORIS_images img4888.jpg\nOpenLORIS_images img2991.jpg\nOpenLORIS_images img476.jpg\nOpenLORIS_images img149.jpg\nOpenLORIS_images img2131.jpg\nOpenLORIS_images img401.jpg\nOpenLORIS_images img4988.jpg\nOpenLORIS_images img2875.jpg\nOpenLORIS_images img2732.jpg\nOpenLORIS_images img4902.jpg\nOpenLORIS_images img5007.jpg\nOpenLORIS_images img5524.jpg\nOpenLORIS_images img3597.jpg\nOpenLORIS_images img4779.jpg\nOpenLORIS_images img2203.jpg\nOpenLORIS_images img4175.jpg\nOpenLORIS_images img943.jpg\nOpenLORIS_images img3717.jpg\nOpenLORIS_images img922.jpg\nOpenLORIS_images img313.jpg\nOpenLORIS_images img5742.jpg\nOpenLORIS_images img5745.jpg\nOpenLORIS_images img3016.jpg\nOpenLORIS_images img2662.jpg\nOpenLORIS_images img2030.jpg\nOpenLORIS_images img3608.jpg\nOpenLORIS_images img3201.jpg\nOpenLORIS_images img3495.jpg\nOpenLORIS_images img1505.jpg\nOpenLORIS_images img2916.jpg\nOpenLORIS_images img5969.jpg\nOpenLORIS_images img5895.jpg\nOpenLORIS_images img3546.jpg\nOpenLORIS_images img22.jpg\nOpenLORIS_images img4494.jpg\nOpenLORIS_images img2082.jpg\nOpenLORIS_images img5977.jpg\nOpenLORIS_images img2579.jpg\nOpenLORIS_images img3081.jpg\nOpenLORIS_images img4127.jpg\nOpenLORIS_images img4356.jpg\nOpenLORIS_images img4789.jpg\nOpenLORIS_images img2807.jpg\nOpenLORIS_images img4839.jpg\nOpenLORIS_images img5129.jpg\nOpenLORIS_images 
img1424.jpg\nOpenLORIS_images img2021.jpg\nOpenLORIS_images img2385.jpg\nOpenLORIS_images img4950.jpg\nOpenLORIS_images img1981.jpg\nOpenLORIS_images img3425.jpg\nOpenLORIS_images img2515.jpg\nOpenLORIS_images img5353.jpg\nOpenLORIS_images img3946.jpg\nOpenLORIS_images img1282.jpg\nOpenLORIS_images img821.jpg\nOpenLORIS_images img525.jpg\nOpenLORIS_images img4827.jpg\nOpenLORIS_images img499.jpg\nOpenLORIS_images img482.jpg\nOpenLORIS_images img3138.jpg\nOpenLORIS_images img90.jpg\nOpenLORIS_images img4147.jpg\nOpenLORIS_images img3674.jpg\nOpenLORIS_images img336.jpg\nOpenLORIS_images img3987.jpg\nOpenLORIS_images img4896.jpg\nOpenLORIS_images img4415.jpg\nOpenLORIS_images img4669.jpg\nOpenLORIS_images img5929.jpg\nOpenLORIS_images img5601.jpg\nOpenLORIS_images img4795.jpg\nOpenLORIS_images img1201.jpg\nOpenLORIS_images img5550.jpg\nOpenLORIS_images img838.jpg\nOpenLORIS_images img4258.jpg\nOpenLORIS_images img1263.jpg\nOpenLORIS_images img347.jpg\nOpenLORIS_images img712.jpg\nOpenLORIS_images img3724.jpg\nOpenLORIS_images img785.jpg\nOpenLORIS_images img5148.jpg\nOpenLORIS_images img184.jpg\nOpenLORIS_images img1966.jpg\nOpenLORIS_images img3067.jpg\nOpenLORIS_images img3653.jpg\nOpenLORIS_images img444.jpg\nOpenLORIS_images img4944.jpg\nOpenLORIS_images img4897.jpg\nOpenLORIS_images img1859.jpg\nOpenLORIS_images img4989.jpg\nOpenLORIS_images img5225.jpg\nOpenLORIS_images img3905.jpg\nOpenLORIS_images img1394.jpg\nOpenLORIS_images img5263.jpg\nOpenLORIS_images img4425.jpg\nOpenLORIS_images img4936.jpg\nOpenLORIS_images img4155.jpg\nOpenLORIS_images img5196.jpg\nOpenLORIS_images img3329.jpg\nOpenLORIS_images img91.jpg\nOpenLORIS_images img2495.jpg\nOpenLORIS_images img857.jpg\nOpenLORIS_images img1786.jpg\nOpenLORIS_images img782.jpg\nOpenLORIS_images img4508.jpg\nOpenLORIS_images img3395.jpg\nOpenLORIS_images img5186.jpg\nOpenLORIS_images img1658.jpg\nOpenLORIS_images img3656.jpg\nOpenLORIS_images img2009.jpg\nOpenLORIS_images img40.jpg\nOpenLORIS_images 
img5579.jpg\nOpenLORIS_images img4156.jpg\nOpenLORIS_images img727.jpg\nOpenLORIS_images img477.jpg\nOpenLORIS_images img5354.jpg\nOpenLORIS_images img2777.jpg\nOpenLORIS_images img2175.jpg\nOpenLORIS_images img2133.jpg\nOpenLORIS_images img3259.jpg\nOpenLORIS_images img3214.jpg\nOpenLORIS_images img2330.jpg\nOpenLORIS_images img2297.jpg\nOpenLORIS_images img685.jpg\nOpenLORIS_images img1122.jpg\nOpenLORIS_images img2802.jpg\nOpenLORIS_images img931.jpg\nOpenLORIS_images img5435.jpg\nOpenLORIS_images img3550.jpg\nOpenLORIS_images img5175.jpg\nOpenLORIS_images img3337.jpg\nOpenLORIS_images img3203.jpg\nOpenLORIS_images img1317.jpg\nOpenLORIS_images img5565.jpg\nOpenLORIS_images img5661.jpg\nOpenLORIS_images img5378.jpg\nOpenLORIS_images img4460.jpg\nOpenLORIS_images img3643.jpg\nOpenLORIS_images img3342.jpg\nOpenLORIS_images img2989.jpg\nOpenLORIS_images img1344.jpg\nOpenLORIS_images img3244.jpg\nOpenLORIS_images img3751.jpg\nOpenLORIS_images img5995.jpg\nOpenLORIS_images img5514.jpg\nOpenLORIS_images img1181.jpg\nOpenLORIS_images img978.jpg\nOpenLORIS_images img864.jpg\nOpenLORIS_images img3101.jpg\nOpenLORIS_images img2086.jpg\nOpenLORIS_images img4253.jpg\nOpenLORIS_images img4133.jpg\nOpenLORIS_images img5463.jpg\nOpenLORIS_images img2195.jpg\nOpenLORIS_images img5111.jpg\nOpenLORIS_images img4946.jpg\nOpenLORIS_images img1331.jpg\nOpenLORIS_images img4247.jpg\nOpenLORIS_images img399.jpg\nOpenLORIS_images img4825.jpg\nOpenLORIS_images img2070.jpg\nOpenLORIS_images img2514.jpg\nOpenLORIS_images img5340.jpg\nOpenLORIS_images img6036.jpg\nOpenLORIS_images img3048.jpg\nOpenLORIS_images img406.jpg\nOpenLORIS_images img3936.jpg\nOpenLORIS_images img246.jpg\nOpenLORIS_images img2768.jpg\nOpenLORIS_images img4593.jpg\nOpenLORIS_images img4320.jpg\nOpenLORIS_images img4359.jpg\nOpenLORIS_images img4590.jpg\nOpenLORIS_images img5633.jpg\nOpenLORIS_images img3781.jpg\nOpenLORIS_images img2280.jpg\nOpenLORIS_images img5483.jpg\nOpenLORIS_images 
img4629.jpg\nOpenLORIS_images img3033.jpg\nOpenLORIS_images img5958.jpg\nOpenLORIS_images img4430.jpg\nOpenLORIS_images img1881.jpg\nOpenLORIS_images img5331.jpg\nOpenLORIS_images img4568.jpg\nOpenLORIS_images img513.jpg\nOpenLORIS_images img4517.jpg\nOpenLORIS_images img2525.jpg\nOpenLORIS_images img2877.jpg\nOpenLORIS_images img5806.jpg\nOpenLORIS_images img2368.jpg\nOpenLORIS_images img2351.jpg\nOpenLORIS_images img596.jpg\nOpenLORIS_images img808.jpg\nOpenLORIS_images img4090.jpg\nOpenLORIS_images img3408.jpg\nOpenLORIS_images img4557.jpg\nOpenLORIS_images img4961.jpg\nOpenLORIS_images img1735.jpg\nOpenLORIS_images img5619.jpg\nOpenLORIS_images img4647.jpg\nOpenLORIS_images img1063.jpg\nOpenLORIS_images img4688.jpg\nOpenLORIS_images img3043.jpg\nOpenLORIS_images img2462.jpg\nOpenLORIS_images img5734.jpg\nOpenLORIS_images img6044.jpg\nOpenLORIS_images img1106.jpg\nOpenLORIS_images img3605.jpg\nOpenLORIS_images img5569.jpg\nOpenLORIS_images img5376.jpg\nOpenLORIS_images img2874.jpg\nOpenLORIS_images img4438.jpg\nOpenLORIS_images img409.jpg\nOpenLORIS_images img2460.jpg\nOpenLORIS_images img1137.jpg\nOpenLORIS_images img2265.jpg\nOpenLORIS_images img1421.jpg\nOpenLORIS_images img3949.jpg\nOpenLORIS_images img929.jpg\nOpenLORIS_images img5626.jpg\nOpenLORIS_images img2257.jpg\nOpenLORIS_images img3957.jpg\nOpenLORIS_images img3040.jpg\nOpenLORIS_images img4444.jpg\nOpenLORIS_images img1384.jpg\nOpenLORIS_images img6009.jpg\nOpenLORIS_images img2059.jpg\nOpenLORIS_images img677.jpg\nOpenLORIS_images img4948.jpg\nOpenLORIS_images img3466.jpg\nOpenLORIS_images img4775.jpg\nOpenLORIS_images img3207.jpg\nOpenLORIS_images img1746.jpg\nOpenLORIS_images img3286.jpg\nOpenLORIS_images img1692.jpg\nOpenLORIS_images img3165.jpg\nOpenLORIS_images img3050.jpg\nOpenLORIS_images img991.jpg\nOpenLORIS_images img6136.jpg\nOpenLORIS_images img1956.jpg\nOpenLORIS_images img3122.jpg\nOpenLORIS_images img4057.jpg\nOpenLORIS_images img732.jpg\nOpenLORIS_images 
img4125.jpg\nOpenLORIS_images img240.jpg\nOpenLORIS_images img4975.jpg\nOpenLORIS_images img5563.jpg\nOpenLORIS_images img5484.jpg\nOpenLORIS_images img1805.jpg\nOpenLORIS_images img5469.jpg\nOpenLORIS_images img5771.jpg\nOpenLORIS_images img408.jpg\nOpenLORIS_images img5299.jpg\nOpenLORIS_images img99.jpg\nOpenLORIS_images img4305.jpg\nOpenLORIS_images img5453.jpg\nOpenLORIS_images img956.jpg\nOpenLORIS_images img339.jpg\nOpenLORIS_images img2404.jpg\nOpenLORIS_images img5122.jpg\nOpenLORIS_images img3194.jpg\nOpenLORIS_images img3665.jpg\nOpenLORIS_images img4792.jpg\nOpenLORIS_images img4334.jpg\nOpenLORIS_images img781.jpg\nOpenLORIS_images img102.jpg\nOpenLORIS_images img1622.jpg\nOpenLORIS_images img4061.jpg\nOpenLORIS_images img1098.jpg\nOpenLORIS_images img5449.jpg\nOpenLORIS_images img4377.jpg\nOpenLORIS_images img5882.jpg\nOpenLORIS_images img4675.jpg\nOpenLORIS_images img4863.jpg\nOpenLORIS_images img2894.jpg\nOpenLORIS_images img3823.jpg\nOpenLORIS_images img1899.jpg\nOpenLORIS_images img2621.jpg\nOpenLORIS_images img2253.jpg\nOpenLORIS_images img6037.jpg\nOpenLORIS_images img5135.jpg\nOpenLORIS_images img1602.jpg\nOpenLORIS_images img4626.jpg\nOpenLORIS_images img2251.jpg\nOpenLORIS_images img2006.jpg\nOpenLORIS_images img579.jpg\nOpenLORIS_images img1480.jpg\nOpenLORIS_images img3237.jpg\nOpenLORIS_images img3955.jpg\nOpenLORIS_images img3529.jpg\nOpenLORIS_images img4413.jpg\nOpenLORIS_images img2765.jpg\nOpenLORIS_images img4666.jpg\nOpenLORIS_images img5715.jpg\nOpenLORIS_images img3384.jpg\nOpenLORIS_images img5537.jpg\nOpenLORIS_images img5381.jpg\nOpenLORIS_images img3319.jpg\nOpenLORIS_images img2464.jpg\nOpenLORIS_images img4442.jpg\nOpenLORIS_images img2359.jpg\nOpenLORIS_images img371.jpg\nOpenLORIS_images img2591.jpg\nOpenLORIS_images img5623.jpg\nOpenLORIS_images img3953.jpg\nOpenLORIS_images img1398.jpg\nOpenLORIS_images img1507.jpg\nOpenLORIS_images img4483.jpg\nOpenLORIS_images img757.jpg\nOpenLORIS_images img5227.jpg\nOpenLORIS_images 
img297.jpg\nOpenLORIS_images img2184.jpg\nOpenLORIS_images img566.jpg\nOpenLORIS_images img5192.jpg\nOpenLORIS_images img1604.jpg\nOpenLORIS_images img3255.jpg\nOpenLORIS_images img3134.jpg\nOpenLORIS_images img3343.jpg\nOpenLORIS_images img5818.jpg\nOpenLORIS_images img2473.jpg\nOpenLORIS_images img1616.jpg\nOpenLORIS_images img4124.jpg\nOpenLORIS_images img3302.jpg\nOpenLORIS_images img300.jpg\nOpenLORIS_images img3174.jpg\nOpenLORIS_images img3820.jpg\nOpenLORIS_images img288.jpg\nOpenLORIS_images img2610.jpg\nOpenLORIS_images img1586.jpg\nOpenLORIS_images img325.jpg\nOpenLORIS_images img331.jpg\nOpenLORIS_images img4563.jpg\nOpenLORIS_images img3069.jpg\nOpenLORIS_images img2201.jpg\nOpenLORIS_images img2529.jpg\nOpenLORIS_images img1694.jpg\nOpenLORIS_images img5479.jpg\nOpenLORIS_images img376.jpg\nOpenLORIS_images img2615.jpg\nOpenLORIS_images img1645.jpg\nOpenLORIS_images img3883.jpg\nOpenLORIS_images img4095.jpg\nOpenLORIS_images img4174.jpg\nOpenLORIS_images img934.jpg\nOpenLORIS_images img635.jpg\nOpenLORIS_images img638.jpg\nOpenLORIS_images img2100.jpg\nOpenLORIS_images img5478.jpg\nOpenLORIS_images img3171.jpg\nOpenLORIS_images img42.jpg\nOpenLORIS_images img620.jpg\nOpenLORIS_images img2977.jpg\nOpenLORIS_images img649.jpg\nOpenLORIS_images img5911.jpg\nOpenLORIS_images img5171.jpg\nOpenLORIS_images img1614.jpg\nOpenLORIS_images img2556.jpg\nOpenLORIS_images img5736.jpg\nOpenLORIS_images img5210.jpg\nOpenLORIS_images img5023.jpg\nOpenLORIS_images img3502.jpg\nOpenLORIS_images img5829.jpg\nOpenLORIS_images img3886.jpg\nOpenLORIS_images img318.jpg\nOpenLORIS_images img896.jpg\nOpenLORIS_images img1792.jpg\nOpenLORIS_images img4573.jpg\nOpenLORIS_images img6143.jpg\nOpenLORIS_images img4865.jpg\nOpenLORIS_images img4278.jpg\nOpenLORIS_images img4718.jpg\nOpenLORIS_images img2536.jpg\nOpenLORIS_images img5599.jpg\nOpenLORIS_images img2828.jpg\nOpenLORIS_images img2641.jpg\nOpenLORIS_images img2458.jpg\nOpenLORIS_images img1016.jpg\nOpenLORIS_images 
img147.jpg\nOpenLORIS_images img1576.jpg\nOpenLORIS_images img1535.jpg\nOpenLORIS_images img3415.jpg\nOpenLORIS_images img2260.jpg\nOpenLORIS_images img4094.jpg\nOpenLORIS_images img1633.jpg\nOpenLORIS_images img37.jpg\nOpenLORIS_images img3760.jpg\nOpenLORIS_images img2572.jpg\nOpenLORIS_images img2463.jpg\nOpenLORIS_images img5448.jpg\nOpenLORIS_images img3433.jpg\nOpenLORIS_images img5812.jpg\nOpenLORIS_images img745.jpg\nOpenLORIS_images img4674.jpg\nOpenLORIS_images img1703.jpg\nOpenLORIS_images img4363.jpg\nOpenLORIS_images img2951.jpg\nOpenLORIS_images img2792.jpg\nOpenLORIS_images img6076.jpg\nOpenLORIS_images img1194.jpg\nOpenLORIS_images img2121.jpg\nOpenLORIS_images img5639.jpg\nOpenLORIS_images img1674.jpg\nOpenLORIS_images img2903.jpg\nOpenLORIS_images img1165.jpg\nOpenLORIS_images img4798.jpg\nOpenLORIS_images img3778.jpg\nOpenLORIS_images img4223.jpg\nOpenLORIS_images img242.jpg\nOpenLORIS_images img4660.jpg\nOpenLORIS_images img45.jpg\nOpenLORIS_images img883.jpg\nOpenLORIS_images img3898.jpg\nOpenLORIS_images img1110.jpg\nOpenLORIS_images img2493.jpg\nOpenLORIS_images img160.jpg\nOpenLORIS_images img3000.jpg\nOpenLORIS_images img4644.jpg\nOpenLORIS_images img4461.jpg\nOpenLORIS_images img1835.jpg\nOpenLORIS_images img5556.jpg\nOpenLORIS_images img3170.jpg\nOpenLORIS_images img1047.jpg\nOpenLORIS_images img277.jpg\nOpenLORIS_images img3542.jpg\nOpenLORIS_images img5433.jpg\nOpenLORIS_images img811.jpg\nOpenLORIS_images img2064.jpg\nOpenLORIS_images img299.jpg\nOpenLORIS_images img1180.jpg\nOpenLORIS_images img3094.jpg\nOpenLORIS_images img1195.jpg\nOpenLORIS_images img4146.jpg\nOpenLORIS_images img1176.jpg\nOpenLORIS_images img3943.jpg\nOpenLORIS_images img274.jpg\nOpenLORIS_images img3463.jpg\nOpenLORIS_images img3829.jpg\nOpenLORIS_images img4899.jpg\nOpenLORIS_images img972.jpg\nOpenLORIS_images img5387.jpg\nOpenLORIS_images img503.jpg\nOpenLORIS_images img3042.jpg\nOpenLORIS_images img1245.jpg\nOpenLORIS_images img4980.jpg\nOpenLORIS_images 
img1691.jpg\nOpenLORIS_images img512.jpg\nOpenLORIS_images img3710.jpg\nOpenLORIS_images img5276.jpg\nOpenLORIS_images img5573.jpg\nOpenLORIS_images img3001.jpg\nOpenLORIS_images img3595.jpg\nOpenLORIS_images img4472.jpg\nOpenLORIS_images img1230.jpg\nOpenLORIS_images img4495.jpg\nOpenLORIS_images img3387.jpg\nOpenLORIS_images img1794.jpg\nOpenLORIS_images img2103.jpg\nOpenLORIS_images img2028.jpg\nOpenLORIS_images img593.jpg\nOpenLORIS_images img2705.jpg\nOpenLORIS_images img5235.jpg\nOpenLORIS_images img3159.jpg\nOpenLORIS_images img1322.jpg\nOpenLORIS_images img2390.jpg\nOpenLORIS_images img3975.jpg\nOpenLORIS_images img1002.jpg\nOpenLORIS_images img2837.jpg\nOpenLORIS_images img2352.jpg\nOpenLORIS_images img2618.jpg\nOpenLORIS_images img952.jpg\nOpenLORIS_images img2670.jpg\nOpenLORIS_images img1511.jpg\nOpenLORIS_images img1711.jpg\nOpenLORIS_images img1742.jpg\nOpenLORIS_images img4553.jpg\nOpenLORIS_images img4873.jpg\nOpenLORIS_images img3283.jpg\nOpenLORIS_images img1225.jpg\nOpenLORIS_images img3523.jpg\nOpenLORIS_images img471.jpg\nOpenLORIS_images img5089.jpg\nOpenLORIS_images img5215.jpg\nOpenLORIS_images img4276.jpg\nOpenLORIS_images img6108.jpg\nOpenLORIS_images img1179.jpg\nOpenLORIS_images img4838.jpg\nOpenLORIS_images img5489.jpg\nOpenLORIS_images img5366.jpg\nOpenLORIS_images img2933.jpg\nOpenLORIS_images img5139.jpg\nOpenLORIS_images img1709.jpg\nOpenLORIS_images img5581.jpg\nOpenLORIS_images img6092.jpg\nOpenLORIS_images img5583.jpg\nOpenLORIS_images img317.jpg\nOpenLORIS_images img5765.jpg\nOpenLORIS_images img414.jpg\nOpenLORIS_images img4615.jpg\nOpenLORIS_images img345.jpg\nOpenLORIS_images img385.jpg\nOpenLORIS_images img5431.jpg\nOpenLORIS_images img3698.jpg\nOpenLORIS_images img391.jpg\nOpenLORIS_images img3534.jpg\nOpenLORIS_images img4554.jpg\nOpenLORIS_images img2760.jpg\nOpenLORIS_images img2209.jpg\nOpenLORIS_images img3930.jpg\nOpenLORIS_images img1485.jpg\nOpenLORIS_images img2304.jpg\nOpenLORIS_images 
img5388.jpg\nOpenLORIS_images img4945.jpg\nOpenLORIS_images img133.jpg\nOpenLORIS_images img395.jpg\nOpenLORIS_images img1681.jpg\nOpenLORIS_images img4552.jpg\nOpenLORIS_images img4793.jpg\nOpenLORIS_images img2878.jpg\nOpenLORIS_images img5352.jpg\nOpenLORIS_images img349.jpg\nOpenLORIS_images img1541.jpg\nOpenLORIS_images img1107.jpg\nOpenLORIS_images img818.jpg\nOpenLORIS_images img4153.jpg\nOpenLORIS_images img1894.jpg\nOpenLORIS_images img3623.jpg\nOpenLORIS_images img3390.jpg\nOpenLORIS_images img4973.jpg\nOpenLORIS_images img1324.jpg\nOpenLORIS_images img2012.jpg\nOpenLORIS_images img5906.jpg\nOpenLORIS_images img3279.jpg\nOpenLORIS_images img282.jpg\nOpenLORIS_images img2442.jpg\nOpenLORIS_images img1078.jpg\nOpenLORIS_images img2736.jpg\nOpenLORIS_images img1927.jpg\nOpenLORIS_images img880.jpg\nOpenLORIS_images img290.jpg\nOpenLORIS_images img1808.jpg\nOpenLORIS_images img755.jpg\nOpenLORIS_images img5582.jpg\nOpenLORIS_images img4971.jpg\nOpenLORIS_images img1517.jpg\nOpenLORIS_images img3680.jpg\nOpenLORIS_images img424.jpg\nOpenLORIS_images img569.jpg\nOpenLORIS_images img3158.jpg\nOpenLORIS_images img2739.jpg\nOpenLORIS_images img1793.jpg\nOpenLORIS_images img3538.jpg\nOpenLORIS_images img4401.jpg\nOpenLORIS_images img3876.jpg\nOpenLORIS_images img668.jpg\nOpenLORIS_images img1554.jpg\nOpenLORIS_images img1839.jpg\nOpenLORIS_images img1680.jpg\nOpenLORIS_images img4140.jpg\nOpenLORIS_images img2010.jpg\nOpenLORIS_images img3551.jpg\nOpenLORIS_images img4595.jpg\nOpenLORIS_images img3601.jpg\nOpenLORIS_images img5389.jpg\nOpenLORIS_images img2868.jpg\nOpenLORIS_images img3922.jpg\nOpenLORIS_images img1133.jpg\nOpenLORIS_images img3906.jpg\nOpenLORIS_images img5474.jpg\nOpenLORIS_images img4978.jpg\nOpenLORIS_images img2162.jpg\nOpenLORIS_images img3967.jpg\nOpenLORIS_images img4943.jpg\nOpenLORIS_images img5458.jpg\nOpenLORIS_images img740.jpg\nOpenLORIS_images img5038.jpg\nOpenLORIS_images img2467.jpg\nOpenLORIS_images img4499.jpg\nOpenLORIS_images 
img3988.jpg\nOpenLORIS_images img4619.jpg\nOpenLORIS_images img4403.jpg\nOpenLORIS_images img5548.jpg\nOpenLORIS_images img729.jpg\nOpenLORIS_images img354.jpg\nOpenLORIS_images img5258.jpg\nOpenLORIS_images img1910.jpg\nOpenLORIS_images img2238.jpg\nOpenLORIS_images img330.jpg\nOpenLORIS_images img5842.jpg\nOpenLORIS_images img1957.jpg\nOpenLORIS_images img1608.jpg\nOpenLORIS_images img5799.jpg\nOpenLORIS_images img3075.jpg\nOpenLORIS_images img5966.jpg\nOpenLORIS_images img4518.jpg\nOpenLORIS_images img4467.jpg\nOpenLORIS_images img5450.jpg\nOpenLORIS_images img4976.jpg\nOpenLORIS_images img3400.jpg\nOpenLORIS_images img2521.jpg\nOpenLORIS_images img1822.jpg\nOpenLORIS_images img6059.jpg\nOpenLORIS_images img3092.jpg\nOpenLORIS_images img1261.jpg\nOpenLORIS_images img437.jpg\nOpenLORIS_images img4458.jpg\nOpenLORIS_images img454.jpg\nOpenLORIS_images img5669.jpg\nOpenLORIS_images img2826.jpg\nOpenLORIS_images img4102.jpg\nOpenLORIS_images img775.jpg\nOpenLORIS_images img4171.jpg\nOpenLORIS_images img5544.jpg\nOpenLORIS_images img2384.jpg\nOpenLORIS_images img2501.jpg\nOpenLORIS_images img3508.jpg\nOpenLORIS_images img639.jpg\nOpenLORIS_images img5635.jpg\nOpenLORIS_images img3331.jpg\nOpenLORIS_images img334.jpg\nOpenLORIS_images img4880.jpg\nOpenLORIS_images img6017.jpg\nOpenLORIS_images img1417.jpg\nOpenLORIS_images img1269.jpg\nOpenLORIS_images img356.jpg\nOpenLORIS_images img4051.jpg\nOpenLORIS_images img1191.jpg\nOpenLORIS_images img2689.jpg\nOpenLORIS_images img5572.jpg\nOpenLORIS_images img4803.jpg\nOpenLORIS_images img5526.jpg\nOpenLORIS_images img487.jpg\nOpenLORIS_images img5417.jpg\nOpenLORIS_images img5159.jpg\nOpenLORIS_images img213.jpg\nOpenLORIS_images img3446.jpg\nOpenLORIS_images img5386.jpg\nOpenLORIS_images img243.jpg\nOpenLORIS_images img2620.jpg\nOpenLORIS_images img270.jpg\nOpenLORIS_images img5922.jpg\nOpenLORIS_images img5027.jpg\nOpenLORIS_images img4349.jpg\nOpenLORIS_images img5336.jpg\nOpenLORIS_images img1533.jpg\nOpenLORIS_images 
img694.jpg\nOpenLORIS_images img2866.jpg\nOpenLORIS_images img2004.jpg\nOpenLORIS_images img3062.jpg\nOpenLORIS_images img3166.jpg\nOpenLORIS_images img2053.jpg\nOpenLORIS_images img1853.jpg\nOpenLORIS_images img3297.jpg\nOpenLORIS_images img5604.jpg\nOpenLORIS_images img2431.jpg\nOpenLORIS_images img2109.jpg\nOpenLORIS_images img2891.jpg\nOpenLORIS_images img6070.jpg\nOpenLORIS_images img64.jpg\nOpenLORIS_images img5970.jpg\nOpenLORIS_images img5684.jpg\nOpenLORIS_images img4908.jpg\nOpenLORIS_images img4963.jpg\nOpenLORIS_images img4420.jpg\nOpenLORIS_images img2619.jpg\nOpenLORIS_images img3131.jpg\nOpenLORIS_images img1260.jpg\nOpenLORIS_images img1731.jpg\nOpenLORIS_images img2426.jpg\nOpenLORIS_images img623.jpg\nOpenLORIS_images img1435.jpg\nOpenLORIS_images img1970.jpg\nOpenLORIS_images img1982.jpg\nOpenLORIS_images img4501.jpg\nOpenLORIS_images img4915.jpg\nOpenLORIS_images img2451.jpg\nOpenLORIS_images img3763.jpg\nOpenLORIS_images img1217.jpg\nOpenLORIS_images img6111.jpg\nOpenLORIS_images img843.jpg\nOpenLORIS_images img4433.jpg\nOpenLORIS_images img2350.jpg\nOpenLORIS_images img1747.jpg\nOpenLORIS_images img3210.jpg\nOpenLORIS_images img1347.jpg\nOpenLORIS_images img195.jpg\nOpenLORIS_images img3087.jpg\nOpenLORIS_images img1721.jpg\nOpenLORIS_images img2027.jpg\nOpenLORIS_images img5780.jpg\nOpenLORIS_images img4982.jpg\nOpenLORIS_images img2799.jpg\nOpenLORIS_images img1804.jpg\nOpenLORIS_images img4693.jpg\nOpenLORIS_images img5746.jpg\nOpenLORIS_images img1228.jpg\nOpenLORIS_images img1232.jpg\nOpenLORIS_images img4652.jpg\nOpenLORIS_images img4455.jpg\nOpenLORIS_images img3730.jpg\nOpenLORIS_images img2850.jpg\nOpenLORIS_images img5909.jpg\nOpenLORIS_images img1327.jpg\nOpenLORIS_images img562.jpg\nOpenLORIS_images img1038.jpg\nOpenLORIS_images img3530.jpg\nOpenLORIS_images img741.jpg\nOpenLORIS_images img5078.jpg\nOpenLORIS_images img3559.jpg\nOpenLORIS_images img2369.jpg\nOpenLORIS_images img488.jpg\nOpenLORIS_images 
img2296.jpg\nOpenLORIS_images img2914.jpg\nOpenLORIS_images img5924.jpg\nOpenLORIS_images img5760.jpg\nOpenLORIS_images img4817.jpg\nOpenLORIS_images img551.jpg\nOpenLORIS_images img1990.jpg\nOpenLORIS_images img3566.jpg\nOpenLORIS_images img1913.jpg\nOpenLORIS_images img4938.jpg\nOpenLORIS_images img3720.jpg\nOpenLORIS_images img3843.jpg\nOpenLORIS_images img1364.jpg\nOpenLORIS_images img5918.jpg\nOpenLORIS_images img81.jpg\nOpenLORIS_images img6003.jpg\nOpenLORIS_images img4575.jpg\nOpenLORIS_images img3202.jpg\nOpenLORIS_images img1168.jpg\nOpenLORIS_images img2695.jpg\nOpenLORIS_images img3901.jpg\nOpenLORIS_images img4119.jpg\nOpenLORIS_images img1032.jpg\nOpenLORIS_images img4933.jpg\nOpenLORIS_images img4883.jpg\nOpenLORIS_images img840.jpg\nOpenLORIS_images img3130.jpg\nOpenLORIS_images img4869.jpg\nOpenLORIS_images img2007.jpg\nOpenLORIS_images img3193.jpg\nOpenLORIS_images img2022.jpg\nOpenLORIS_images img5350.jpg\nOpenLORIS_images img3818.jpg\nOpenLORIS_images img2308.jpg\nOpenLORIS_images img1093.jpg\nOpenLORIS_images img3250.jpg\nOpenLORIS_images img1442.jpg\nOpenLORIS_images img4399.jpg\nOpenLORIS_images img2455.jpg\nOpenLORIS_images img3098.jpg\nOpenLORIS_images img3347.jpg\nOpenLORIS_images img5015.jpg\nOpenLORIS_images img2833.jpg\nOpenLORIS_images img247.jpg\nOpenLORIS_images img3222.jpg\nOpenLORIS_images img3184.jpg\nOpenLORIS_images img3979.jpg\nOpenLORIS_images img4273.jpg\nOpenLORIS_images img1388.jpg\nOpenLORIS_images img1172.jpg\nOpenLORIS_images img1089.jpg\nOpenLORIS_images img4837.jpg\nOpenLORIS_images img162.jpg\nOpenLORIS_images img4065.jpg\nOpenLORIS_images img3126.jpg\nOpenLORIS_images img1459.jpg\nOpenLORIS_images img2859.jpg\nOpenLORIS_images img4251.jpg\nOpenLORIS_images img6128.jpg\nOpenLORIS_images img5118.jpg\nOpenLORIS_images img5001.jpg\nOpenLORIS_images img3703.jpg\nOpenLORIS_images img5629.jpg\nOpenLORIS_images img1612.jpg\nOpenLORIS_images img4481.jpg\nOpenLORIS_images img1004.jpg\nOpenLORIS_images 
img1964.jpg\nOpenLORIS_images img73.jpg\nOpenLORIS_images img281.jpg\nOpenLORIS_images img3511.jpg\nOpenLORIS_images img4386.jpg\nOpenLORIS_images img5438.jpg\nOpenLORIS_images img2657.jpg\nOpenLORIS_images img2215.jpg\nOpenLORIS_images img1611.jpg\nOpenLORIS_images img1280.jpg\nOpenLORIS_images img1840.jpg\nOpenLORIS_images img2008.jpg\nOpenLORIS_images img778.jpg\nOpenLORIS_images img2835.jpg\nOpenLORIS_images img1135.jpg\nOpenLORIS_images img2754.jpg\nOpenLORIS_images img4668.jpg\nOpenLORIS_images img5445.jpg\nOpenLORIS_images img174.jpg\nOpenLORIS_images img4306.jpg\nOpenLORIS_images img5606.jpg\nOpenLORIS_images img515.jpg\nOpenLORIS_images img2243.jpg\nOpenLORIS_images img3321.jpg\nOpenLORIS_images img5914.jpg\nOpenLORIS_images img1335.jpg\nOpenLORIS_images img3668.jpg\nOpenLORIS_images img2804.jpg\nOpenLORIS_images img1701.jpg\nOpenLORIS_images img1744.jpg\nOpenLORIS_images img1509.jpg\nOpenLORIS_images img3588.jpg\nOpenLORIS_images img1750.jpg\nOpenLORIS_images img2669.jpg\nOpenLORIS_images img3937.jpg\nOpenLORIS_images img3028.jpg\nOpenLORIS_images img4426.jpg\nOpenLORIS_images img5002.jpg\nOpenLORIS_images img5605.jpg\nOpenLORIS_images img5195.jpg\nOpenLORIS_images img5150.jpg\nOpenLORIS_images img6035.jpg\nOpenLORIS_images img4931.jpg\nOpenLORIS_images img5850.jpg\nOpenLORIS_images img4610.jpg\nOpenLORIS_images img2905.jpg\nOpenLORIS_images img4556.jpg\nOpenLORIS_images img3178.jpg\nOpenLORIS_images img673.jpg\nOpenLORIS_images img1788.jpg\nOpenLORIS_images img4316.jpg\nOpenLORIS_images img1450.jpg\nOpenLORIS_images img2569.jpg\nOpenLORIS_images img2881.jpg\nOpenLORIS_images img2625.jpg\nOpenLORIS_images img847.jpg\nOpenLORIS_images img696.jpg\nOpenLORIS_images img5789.jpg\nOpenLORIS_images img3868.jpg\nOpenLORIS_images img4256.jpg\nOpenLORIS_images img134.jpg\nOpenLORIS_images img2627.jpg\nOpenLORIS_images img1202.jpg\nOpenLORIS_images img1465.jpg\nOpenLORIS_images img5726.jpg\nOpenLORIS_images img5275.jpg\nOpenLORIS_images img2347.jpg\nOpenLORIS_images 
img314.jpg\nOpenLORIS_images img3354.jpg\nOpenLORIS_images img6061.jpg\nOpenLORIS_images img6109.jpg\nOpenLORIS_images img583.jpg\nOpenLORIS_images img5239.jpg\nOpenLORIS_images img1607.jpg\nOpenLORIS_images img2124.jpg\nOpenLORIS_images img2719.jpg\nOpenLORIS_images img4618.jpg\nOpenLORIS_images img4190.jpg\nOpenLORIS_images img5204.jpg\nOpenLORIS_images img3484.jpg\nOpenLORIS_images img2541.jpg\nOpenLORIS_images img5598.jpg\nOpenLORIS_images img3498.jpg\nOpenLORIS_images img3125.jpg\nOpenLORIS_images img434.jpg\nOpenLORIS_images img2589.jpg\nOpenLORIS_images img5077.jpg\nOpenLORIS_images img1928.jpg\nOpenLORIS_images img5301.jpg\nOpenLORIS_images img1676.jpg\nOpenLORIS_images img5229.jpg\nOpenLORIS_images img3168.jpg\nOpenLORIS_images img4228.jpg\nOpenLORIS_images img3141.jpg\nOpenLORIS_images img3998.jpg\nOpenLORIS_images img6062.jpg\nOpenLORIS_images img5379.jpg\nOpenLORIS_images img3306.jpg\nOpenLORIS_images img2815.jpg\nOpenLORIS_images img202.jpg\nOpenLORIS_images img2910.jpg\nOpenLORIS_images img4059.jpg\nOpenLORIS_images img4343.jpg\nOpenLORIS_images img3088.jpg\nOpenLORIS_images img1710.jpg\nOpenLORIS_images img1126.jpg\nOpenLORIS_images img5436.jpg\nOpenLORIS_images img4585.jpg\nOpenLORIS_images img2482.jpg\nOpenLORIS_images img4893.jpg\nOpenLORIS_images img4109.jpg\nOpenLORIS_images img1386.jpg\nOpenLORIS_images img4302.jpg\nOpenLORIS_images img4836.jpg\nOpenLORIS_images img4850.jpg\nOpenLORIS_images img244.jpg\nOpenLORIS_images img767.jpg\nOpenLORIS_images img5024.jpg\nOpenLORIS_images img2139.jpg\nOpenLORIS_images img4443.jpg\nOpenLORIS_images img4167.jpg\nOpenLORIS_images img2013.jpg\nOpenLORIS_images img6040.jpg\nOpenLORIS_images img2738.jpg\nOpenLORIS_images img3156.jpg\nOpenLORIS_images img3146.jpg\nOpenLORIS_images img2527.jpg\nOpenLORIS_images img4527.jpg\nOpenLORIS_images img1891.jpg\nOpenLORIS_images img2775.jpg\nOpenLORIS_images img275.jpg\nOpenLORIS_images img4143.jpg\nOpenLORIS_images img3163.jpg\nOpenLORIS_images 
img5180.jpg\nOpenLORIS_images img3696.jpg\nOpenLORIS_images img5668.jpg\nOpenLORIS_images img1907.jpg\nOpenLORIS_images img5973.jpg\nOpenLORIS_images img2956.jpg\nOpenLORIS_images img900.jpg\nOpenLORIS_images img1321.jpg\nOpenLORIS_images img5515.jpg\nOpenLORIS_images img4212.jpg\nOpenLORIS_images img4506.jpg\nOpenLORIS_images img1441.jpg\nOpenLORIS_images img4351.jpg\nOpenLORIS_images img5936.jpg\nOpenLORIS_images img6124.jpg\nOpenLORIS_images img3209.jpg\nOpenLORIS_images img2481.jpg\nOpenLORIS_images img3270.jpg\nOpenLORIS_images img2263.jpg\nOpenLORIS_images img1583.jpg\nOpenLORIS_images img5621.jpg\nOpenLORIS_images img3713.jpg\nOpenLORIS_images img4498.jpg\nOpenLORIS_images img4924.jpg\nOpenLORIS_images img2542.jpg\nOpenLORIS_images img3312.jpg\nOpenLORIS_images img6075.jpg\nOpenLORIS_images img4243.jpg\nOpenLORIS_images img500.jpg\nOpenLORIS_images img4860.jpg\nOpenLORIS_images img1410.jpg\nOpenLORIS_images img2125.jpg\nOpenLORIS_images img4043.jpg\nOpenLORIS_images img3889.jpg\nOpenLORIS_images img2756.jpg\nOpenLORIS_images img1481.jpg\nOpenLORIS_images img5498.jpg\nOpenLORIS_images img3813.jpg\nOpenLORIS_images img3971.jpg\nOpenLORIS_images img3816.jpg\nOpenLORIS_images img3762.jpg\nOpenLORIS_images img127.jpg\nOpenLORIS_images img1077.jpg\nOpenLORIS_images img2276.jpg\nOpenLORIS_images img5270.jpg\nOpenLORIS_images img4369.jpg\nOpenLORIS_images img4511.jpg\nOpenLORIS_images img4298.jpg\nOpenLORIS_images img4656.jpg\nOpenLORIS_images img483.jpg\nOpenLORIS_images img2470.jpg\nOpenLORIS_images img868.jpg\nOpenLORIS_images img2573.jpg\nOpenLORIS_images img5951.jpg\nOpenLORIS_images img3896.jpg\nOpenLORIS_images img4996.jpg\nOpenLORIS_images img3617.jpg\nOpenLORIS_images img3372.jpg\nOpenLORIS_images img4284.jpg\nOpenLORIS_images img5254.jpg\nOpenLORIS_images img616.jpg\nOpenLORIS_images img3066.jpg\nOpenLORIS_images img1169.jpg\nOpenLORIS_images img3393.jpg\nOpenLORIS_images img6068.jpg\nOpenLORIS_images img1886.jpg\nOpenLORIS_images 
img4782.jpg\nOpenLORIS_images img5222.jpg\nOpenLORIS_images img6134.jpg\nOpenLORIS_images img708.jpg\nOpenLORIS_images img3690.jpg\nOpenLORIS_images img592.jpg\nOpenLORIS_images img5805.jpg\nOpenLORIS_images img814.jpg\nOpenLORIS_images img3913.jpg\nOpenLORIS_images img5045.jpg\nOpenLORIS_images img5617.jpg\nOpenLORIS_images img5217.jpg\nOpenLORIS_images img3403.jpg\nOpenLORIS_images img3246.jpg\nOpenLORIS_images img4391.jpg\nOpenLORIS_images img5683.jpg\nOpenLORIS_images img6083.jpg\nOpenLORIS_images img4769.jpg\nOpenLORIS_images img1618.jpg\nOpenLORIS_images img3491.jpg\nOpenLORIS_images img1542.jpg\nOpenLORIS_images img407.jpg\nOpenLORIS_images img2560.jpg\nOpenLORIS_images img2173.jpg\nOpenLORIS_images img3758.jpg\nOpenLORIS_images img3819.jpg\nOpenLORIS_images img5660.jpg\nOpenLORIS_images img3800.jpg\nOpenLORIS_images img6101.jpg\nOpenLORIS_images img1587.jpg\nOpenLORIS_images img651.jpg\nOpenLORIS_images img4097.jpg\nOpenLORIS_images img2748.jpg\nOpenLORIS_images img4337.jpg\nOpenLORIS_images img3117.jpg\nOpenLORIS_images img947.jpg\nOpenLORIS_images img3073.jpg\nOpenLORIS_images img3507.jpg\nOpenLORIS_images img4309.jpg\nOpenLORIS_images img546.jpg\nOpenLORIS_images img3068.jpg\nOpenLORIS_images img4597.jpg\nOpenLORIS_images img5424.jpg\nOpenLORIS_images img5730.jpg\nOpenLORIS_images img3035.jpg\nOpenLORIS_images img4773.jpg\nOpenLORIS_images img765.jpg\nOpenLORIS_images img3147.jpg\nOpenLORIS_images img1829.jpg\nOpenLORIS_images img2066.jpg\nOpenLORIS_images img311.jpg\nOpenLORIS_images img5781.jpg\nOpenLORIS_images img4882.jpg\nOpenLORIS_images img1240.jpg\nOpenLORIS_images img2622.jpg\nOpenLORIS_images img3920.jpg\nOpenLORIS_images img1702.jpg\nOpenLORIS_images img4012.jpg\nOpenLORIS_images img3545.jpg\nOpenLORIS_images img714.jpg\nOpenLORIS_images img2664.jpg\nOpenLORIS_images img3023.jpg\nOpenLORIS_images img4807.jpg\nOpenLORIS_images img1662.jpg\nOpenLORIS_images img1023.jpg\nOpenLORIS_images img3676.jpg\nOpenLORIS_images img4800.jpg\nOpenLORIS_images 
img4240.jpg\nOpenLORIS_images img234.jpg\nOpenLORIS_images img2222.jpg\nOpenLORIS_images img1246.jpg\nOpenLORIS_images img4267.jpg\nOpenLORIS_images img1601.jpg\nOpenLORIS_images img5300.jpg\nOpenLORIS_images img2558.jpg\nOpenLORIS_images img5578.jpg\nOpenLORIS_images img2179.jpg\nOpenLORIS_images img1185.jpg\nOpenLORIS_images img5178.jpg\nOpenLORIS_images img5627.jpg\nOpenLORIS_images img3.jpg\nOpenLORIS_images img5688.jpg\nOpenLORIS_images img5185.jpg\nOpenLORIS_images img2023.jpg\nOpenLORIS_images img4723.jpg\nOpenLORIS_images img2085.jpg\nOpenLORIS_images img533.jpg\nOpenLORIS_images img4930.jpg\nOpenLORIS_images img1867.jpg\nOpenLORIS_images img5421.jpg\nOpenLORIS_images img5467.jpg\nOpenLORIS_images img1489.jpg\nOpenLORIS_images img2320.jpg\nOpenLORIS_images img1774.jpg\nOpenLORIS_images img2824.jpg\nOpenLORIS_images img1256.jpg\nOpenLORIS_images img2148.jpg\nOpenLORIS_images img1998.jpg\nOpenLORIS_images img5187.jpg\nOpenLORIS_images img4544.jpg\nOpenLORIS_images img3634.jpg\nOpenLORIS_images img1443.jpg\nOpenLORIS_images img5648.jpg\nOpenLORIS_images img4353.jpg\nOpenLORIS_images img2782.jpg\nOpenLORIS_images img3315.jpg\nOpenLORIS_images img245.jpg\nOpenLORIS_images img438.jpg\nOpenLORIS_images img1490.jpg\nOpenLORIS_images img4903.jpg\nOpenLORIS_images img83.jpg\nOpenLORIS_images img5790.jpg\nOpenLORIS_images img4726.jpg\nOpenLORIS_images img1227.jpg\nOpenLORIS_images img4437.jpg\nOpenLORIS_images img2816.jpg\nOpenLORIS_images img3187.jpg\nOpenLORIS_images img1257.jpg\nOpenLORIS_images img6087.jpg\nOpenLORIS_images img1039.jpg\nOpenLORIS_images img1682.jpg\nOpenLORIS_images img2885.jpg\nOpenLORIS_images img684.jpg\nOpenLORIS_images img828.jpg\nOpenLORIS_images img6079.jpg\nOpenLORIS_images img862.jpg\nOpenLORIS_images img2166.jpg\nOpenLORIS_images img2533.jpg\nOpenLORIS_images img1834.jpg\nOpenLORIS_images img233.jpg\nOpenLORIS_images img3206.jpg\nOpenLORIS_images img2412.jpg\nOpenLORIS_images img432.jpg\nOpenLORIS_images img5607.jpg\nOpenLORIS_images 
img2456.jpg\nOpenLORIS_images img975.jpg\nOpenLORIS_images img5251.jpg\nOpenLORIS_images img6107.jpg\nOpenLORIS_images img5978.jpg\nOpenLORIS_images img3467.jpg\nOpenLORIS_images img4231.jpg\nOpenLORIS_images img2418.jpg\nOpenLORIS_images img605.jpg\nOpenLORIS_images img5161.jpg\nOpenLORIS_images img4141.jpg\nOpenLORIS_images img4929.jpg\nOpenLORIS_images img1067.jpg\nOpenLORIS_images img3418.jpg\nOpenLORIS_images img3994.jpg\nOpenLORIS_images img216.jpg\nOpenLORIS_images img4758.jpg\nOpenLORIS_images img4503.jpg\nOpenLORIS_images img5614.jpg\nOpenLORIS_images img903.jpg\nOpenLORIS_images img5163.jpg\nOpenLORIS_images img2380.jpg\nOpenLORIS_images img1959.jpg\nOpenLORIS_images img2448.jpg\nOpenLORIS_images img3080.jpg\nOpenLORIS_images img4431.jpg\nOpenLORIS_images img3533.jpg\nOpenLORIS_images img4717.jpg\nOpenLORIS_images img5360.jpg\nOpenLORIS_images img1041.jpg\nOpenLORIS_images img797.jpg\nOpenLORIS_images img5044.jpg\nOpenLORIS_images img2971.jpg\nOpenLORIS_images img6053.jpg\nOpenLORIS_images img2406.jpg\nOpenLORIS_images img4327.jpg\nOpenLORIS_images img1749.jpg\nOpenLORIS_images img3737.jpg\nOpenLORIS_images img2753.jpg\nOpenLORIS_images img2737.jpg\nOpenLORIS_images img4180.jpg\nOpenLORIS_images img3837.jpg\nOpenLORIS_images img2169.jpg\nOpenLORIS_images img4768.jpg\nOpenLORIS_images img1031.jpg\nOpenLORIS_images img2122.jpg\nOpenLORIS_images img646.jpg\nOpenLORIS_images img5959.jpg\nOpenLORIS_images img2893.jpg\nOpenLORIS_images img3479.jpg\nOpenLORIS_images img2655.jpg\nOpenLORIS_images img4427.jpg\nOpenLORIS_images img3241.jpg\nOpenLORIS_images img3592.jpg\nOpenLORIS_images img5462.jpg\nOpenLORIS_images img3836.jpg\nOpenLORIS_images img4766.jpg\nOpenLORIS_images img4998.jpg\nOpenLORIS_images img1434.jpg\nOpenLORIS_images img5941.jpg\nOpenLORIS_images img2780.jpg\nOpenLORIS_images img2969.jpg\nOpenLORIS_images img603.jpg\nOpenLORIS_images img1204.jpg\nOpenLORIS_images img2145.jpg\nOpenLORIS_images img5409.jpg\nOpenLORIS_images img7.jpg\nOpenLORIS_images 
img5246.jpg\nOpenLORIS_images img1578.jpg\nOpenLORIS_images img1193.jpg\nOpenLORIS_images img3002.jpg\nOpenLORIS_images img1362.jpg\nOpenLORIS_images img3596.jpg\nOpenLORIS_images img2035.jpg\nOpenLORIS_images img865.jpg\nOpenLORIS_images img44.jpg\nOpenLORIS_images img707.jpg\nOpenLORIS_images img3154.jpg\nOpenLORIS_images img876.jpg\nOpenLORIS_images img1251.jpg\nOpenLORIS_images img2407.jpg\nOpenLORIS_images img5782.jpg\nOpenLORIS_images img1136.jpg\nOpenLORIS_images img1951.jpg\nOpenLORIS_images img3893.jpg\nOpenLORIS_images img328.jpg\nOpenLORIS_images img5824.jpg\nOpenLORIS_images img1547.jpg\nOpenLORIS_images img1013.jpg\nOpenLORIS_images img1493.jpg\nOpenLORIS_images img5950.jpg\nOpenLORIS_images img5856.jpg\nOpenLORIS_images img429.jpg\nOpenLORIS_images img3924.jpg\nOpenLORIS_images img2928.jpg\nOpenLORIS_images img5874.jpg\nOpenLORIS_images img3642.jpg\nOpenLORIS_images img3951.jpg\nOpenLORIS_images img2117.jpg\nOpenLORIS_images img5406.jpg\nOpenLORIS_images img1917.jpg\nOpenLORIS_images img935.jpg\nOpenLORIS_images img4852.jpg\nOpenLORIS_images img3428.jpg\nOpenLORIS_images img645.jpg\nOpenLORIS_images img3774.jpg\nOpenLORIS_images img5277.jpg\nOpenLORIS_images img3683.jpg\nOpenLORIS_images img4802.jpg\nOpenLORIS_images img866.jpg\nOpenLORIS_images img3822.jpg\nOpenLORIS_images img3499.jpg\nOpenLORIS_images img930.jpg\nOpenLORIS_images img3269.jpg\nOpenLORIS_images img872.jpg\nOpenLORIS_images img3806.jpg\nOpenLORIS_images img2143.jpg\nOpenLORIS_images img521.jpg\nOpenLORIS_images img725.jpg\nOpenLORIS_images img851.jpg\nOpenLORIS_images img789.jpg\nOpenLORIS_images img1174.jpg\nOpenLORIS_images img1868.jpg\nOpenLORIS_images img5907.jpg\nOpenLORIS_images img4546.jpg\nOpenLORIS_images img1663.jpg\nOpenLORIS_images img1962.jpg\nOpenLORIS_images img2540.jpg\nOpenLORIS_images img5368.jpg\nOpenLORIS_images img5399.jpg\nOpenLORIS_images img4608.jpg\nOpenLORIS_images img999.jpg\nOpenLORIS_images img4524.jpg\nOpenLORIS_images img207.jpg\nOpenLORIS_images 
img2060.jpg\nOpenLORIS_images img3875.jpg\nOpenLORIS_images img3383.jpg\nOpenLORIS_images img2743.jpg\nOpenLORIS_images img1499.jpg\nOpenLORIS_images img5418.jpg\nOpenLORIS_images img5518.jpg\nOpenLORIS_images img5691.jpg\nOpenLORIS_images img893.jpg\nOpenLORIS_images img5398.jpg\nOpenLORIS_images img953.jpg\nOpenLORIS_images img5971.jpg\nOpenLORIS_images img1241.jpg\nOpenLORIS_images img2607.jpg\nOpenLORIS_images img2049.jpg\nOpenLORIS_images img5041.jpg\nOpenLORIS_images img1056.jpg\nOpenLORIS_images img5737.jpg\nOpenLORIS_images img4160.jpg\nOpenLORIS_images img2487.jpg\nOpenLORIS_images img4847.jpg\nOpenLORIS_images img106.jpg\nOpenLORIS_images img257.jpg\nOpenLORIS_images img1457.jpg\nOpenLORIS_images img724.jpg\nOpenLORIS_images img5333.jpg\nOpenLORIS_images img2915.jpg\nOpenLORIS_images img152.jpg\nOpenLORIS_images img2446.jpg\nOpenLORIS_images img2981.jpg\nOpenLORIS_images img3019.jpg\nOpenLORIS_images img4794.jpg\nOpenLORIS_images img3648.jpg\nOpenLORIS_images img2079.jpg\nOpenLORIS_images img3693.jpg\nOpenLORIS_images img3940.jpg\nOpenLORIS_images img2628.jpg\nOpenLORIS_images img2281.jpg\nOpenLORIS_images img3871.jpg\nOpenLORIS_images img3276.jpg\nOpenLORIS_images img2421.jpg\nOpenLORIS_images img5049.jpg\nOpenLORIS_images img196.jpg\nOpenLORIS_images img4053.jpg\nOpenLORIS_images img3789.jpg\nOpenLORIS_images img457.jpg\nOpenLORIS_images img4056.jpg\nOpenLORIS_images img3925.jpg\nOpenLORIS_images img2248.jpg\nOpenLORIS_images img265.jpg\nOpenLORIS_images img5845.jpg\nOpenLORIS_images img3197.jpg\nOpenLORIS_images img2564.jpg\nOpenLORIS_images img5751.jpg\nOpenLORIS_images img3817.jpg\nOpenLORIS_images img3519.jpg\nOpenLORIS_images img2623.jpg\nOpenLORIS_images img2764.jpg\nOpenLORIS_images img1291.jpg\nOpenLORIS_images img1265.jpg\nOpenLORIS_images img3646.jpg\nOpenLORIS_images img5963.jpg\nOpenLORIS_images img4047.jpg\nOpenLORIS_images img508.jpg\nOpenLORIS_images img2411.jpg\nOpenLORIS_images img12.jpg\nOpenLORIS_images img5879.jpg\nOpenLORIS_images 
img6014.jpg\nOpenLORIS_images img1345.jpg\nOpenLORIS_images img802.jpg\nOpenLORIS_images img4397.jpg\nOpenLORIS_images img2988.jpg\nOpenLORIS_images img4955.jpg\nOpenLORIS_images img5240.jpg\nOpenLORIS_images img3417.jpg\nOpenLORIS_images img5748.jpg\nOpenLORIS_images img571.jpg\nOpenLORIS_images img2365.jpg\nOpenLORIS_images img4250.jpg\nOpenLORIS_images img4514.jpg\nOpenLORIS_images img239.jpg\nOpenLORIS_images img5182.jpg\nOpenLORIS_images img226.jpg\nOpenLORIS_images img1523.jpg\nOpenLORIS_images img1809.jpg\nOpenLORIS_images img3381.jpg\nOpenLORIS_images img4122.jpg\nOpenLORIS_images img2791.jpg\nOpenLORIS_images img4940.jpg\nOpenLORIS_images img1399.jpg\nOpenLORIS_images img4191.jpg\nOpenLORIS_images img1200.jpg\nOpenLORIS_images img3672.jpg\nOpenLORIS_images img1968.jpg\nOpenLORIS_images img3650.jpg\nOpenLORIS_images img4372.jpg\nOpenLORIS_images img24.jpg\nOpenLORIS_images img1119.jpg\nOpenLORIS_images img5131.jpg\nOpenLORIS_images img4289.jpg\nOpenLORIS_images img1028.jpg\nOpenLORIS_images img3228.jpg\nOpenLORIS_images img5881.jpg\nOpenLORIS_images img4408.jpg\nOpenLORIS_images img4086.jpg\nOpenLORIS_images img2357.jpg\nOpenLORIS_images img3544.jpg\nOpenLORIS_images img5795.jpg\nOpenLORIS_images img39.jpg\nOpenLORIS_images img3180.jpg\nOpenLORIS_images img5403.jpg\nOpenLORIS_images img3077.jpg\nOpenLORIS_images img2829.jpg\nOpenLORIS_images img3189.jpg\nOpenLORIS_images img1548.jpg\nOpenLORIS_images img5278.jpg\nOpenLORIS_images img5952.jpg\nOpenLORIS_images img153.jpg\nOpenLORIS_images img5231.jpg\nOpenLORIS_images img5242.jpg\nOpenLORIS_images img2517.jpg\nOpenLORIS_images img3097.jpg\nOpenLORIS_images img5990.jpg\nOpenLORIS_images img2853.jpg\nOpenLORIS_images img4188.jpg\nOpenLORIS_images img4002.jpg\nOpenLORIS_images img3464.jpg\nOpenLORIS_images img3264.jpg\nOpenLORIS_images img4118.jpg\nOpenLORIS_images img1501.jpg\nOpenLORIS_images img3105.jpg\nOpenLORIS_images img887.jpg\nOpenLORIS_images img6080.jpg\nOpenLORIS_images img1863.jpg\nOpenLORIS_images 
img2204.jpg\nOpenLORIS_images img3880.jpg\nOpenLORIS_images img1495.jpg\nOpenLORIS_images img875.jpg\nOpenLORIS_images img229.jpg\nOpenLORIS_images img2475.jpg\nOpenLORIS_images img565.jpg\nOpenLORIS_images img1292.jpg\nOpenLORIS_images img5442.jpg\nOpenLORIS_images img436.jpg\nOpenLORIS_images img2091.jpg\nOpenLORIS_images img3662.jpg\nOpenLORIS_images img5640.jpg\nOpenLORIS_images img1131.jpg\nOpenLORIS_images img3685.jpg\nOpenLORIS_images img6066.jpg\nOpenLORIS_images img2345.jpg\nOpenLORIS_images img790.jpg\nOpenLORIS_images img3587.jpg\nOpenLORIS_images img210.jpg\nOpenLORIS_images img6050.jpg\nOpenLORIS_images img3284.jpg\nOpenLORIS_images img293.jpg\nOpenLORIS_images img874.jpg\nOpenLORIS_images img402.jpg\nOpenLORIS_images img235.jpg\nOpenLORIS_images img2938.jpg\nOpenLORIS_images img4347.jpg\nOpenLORIS_images img5298.jpg\nOpenLORIS_images img2252.jpg\nOpenLORIS_images img1140.jpg\nOpenLORIS_images img4904.jpg\nOpenLORIS_images img1100.jpg\nOpenLORIS_images img5615.jpg\nOpenLORIS_images img4388.jpg\nOpenLORIS_images img2207.jpg\nOpenLORIS_images img1598.jpg\nOpenLORIS_images img2672.jpg\nOpenLORIS_images img594.jpg\nOpenLORIS_images img4703.jpg\nOpenLORIS_images img3471.jpg\nOpenLORIS_images img15.jpg\nOpenLORIS_images img3441.jpg\nOpenLORIS_images img4277.jpg\nOpenLORIS_images img2255.jpg\nOpenLORIS_images img5432.jpg\nOpenLORIS_images img3771.jpg\nOpenLORIS_images img2864.jpg\nOpenLORIS_images img799.jpg\nOpenLORIS_images img538.jpg\nOpenLORIS_images img4379.jpg\nOpenLORIS_images img2982.jpg\nOpenLORIS_images img1275.jpg\nOpenLORIS_images img699.jpg\nOpenLORIS_images img5534.jpg\nOpenLORIS_images img5130.jpg\nOpenLORIS_images img115.jpg\nOpenLORIS_images img4246.jpg\nOpenLORIS_images img2534.jpg\nOpenLORIS_images img1728.jpg\nOpenLORIS_images img5207.jpg\nOpenLORIS_images img2246.jpg\nOpenLORIS_images img1117.jpg\nOpenLORIS_images img735.jpg\nOpenLORIS_images img2715.jpg\nOpenLORIS_images img5112.jpg\nOpenLORIS_images img1092.jpg\nOpenLORIS_images 
img1987.jpg\nOpenLORIS_images img3750.jpg\nOpenLORIS_images img1248.jpg\nOpenLORIS_images img5466.jpg\nOpenLORIS_images img1666.jpg\nOpenLORIS_images img5375.jpg\nOpenLORIS_images img1474.jpg\nOpenLORIS_images img890.jpg\nOpenLORIS_images img5889.jpg\nOpenLORIS_images img3010.jpg\nOpenLORIS_images img3049.jpg\nOpenLORIS_images img1672.jpg\nOpenLORIS_images img3460.jpg\nOpenLORIS_images img517.jpg\nOpenLORIS_images img4920.jpg\nOpenLORIS_images img528.jpg\nOpenLORIS_images img3096.jpg\nOpenLORIS_images img4705.jpg\nOpenLORIS_images img4960.jpg\nOpenLORIS_images img4791.jpg\nOpenLORIS_images img1046.jpg\nOpenLORIS_images img822.jpg\nOpenLORIS_images img4525.jpg\nOpenLORIS_images img151.jpg\nOpenLORIS_images img4934.jpg\nOpenLORIS_images img5214.jpg\nOpenLORIS_images img5082.jpg\nOpenLORIS_images img2523.jpg\nOpenLORIS_images img2468.jpg\nOpenLORIS_images img5074.jpg\nOpenLORIS_images img5888.jpg\nOpenLORIS_images img309.jpg\nOpenLORIS_images img2144.jpg\nOpenLORIS_images img2778.jpg\nOpenLORIS_images img4560.jpg\nOpenLORIS_images img3903.jpg\nOpenLORIS_images img3486.jpg\nOpenLORIS_images img988.jpg\nOpenLORIS_images img5451.jpg\nOpenLORIS_images img681.jpg\nOpenLORIS_images img5446.jpg\nOpenLORIS_images img4406.jpg\nOpenLORIS_images img806.jpg\nOpenLORIS_images img4921.jpg\nOpenLORIS_images img4889.jpg\nOpenLORIS_images img3056.jpg\nOpenLORIS_images img1323.jpg\nOpenLORIS_images img489.jpg\nOpenLORIS_images img2005.jpg\nOpenLORIS_images img5289.jpg\nOpenLORIS_images img4790.jpg\nOpenLORIS_images img2413.jpg\nOpenLORIS_images img1161.jpg\nOpenLORIS_images img2867.jpg\nOpenLORIS_images img981.jpg\nOpenLORIS_images img2454.jpg\nOpenLORIS_images img2249.jpg\nOpenLORIS_images img3729.jpg\nOpenLORIS_images img6096.jpg\nOpenLORIS_images img2445.jpg\nOpenLORIS_images img3456.jpg\nOpenLORIS_images img3811.jpg\nOpenLORIS_images img1231.jpg\nOpenLORIS_images img1753.jpg\nOpenLORIS_images img1624.jpg\nOpenLORIS_images img1.jpg\nOpenLORIS_images img1121.jpg\nOpenLORIS_images 
img5281.jpg\nOpenLORIS_images img3554.jpg\nOpenLORIS_images img4473.jpg\nOpenLORIS_images img3292.jpg\nOpenLORIS_images img766.jpg\nOpenLORIS_images img2301.jpg\nOpenLORIS_images img148.jpg\nOpenLORIS_images img2264.jpg\nOpenLORIS_images img4771.jpg\nOpenLORIS_images img2890.jpg\nOpenLORIS_images img3261.jpg\nOpenLORIS_images img3328.jpg\nOpenLORIS_images img2342.jpg\nOpenLORIS_images img1419.jpg\nOpenLORIS_images img4107.jpg\nOpenLORIS_images img5873.jpg\nOpenLORIS_images img5878.jpg\nOpenLORIS_images img6098.jpg\nOpenLORIS_images img2836.jpg\nOpenLORIS_images img5221.jpg\nOpenLORIS_images img3945.jpg\nOpenLORIS_images img2733.jpg\nOpenLORIS_images img4282.jpg\nOpenLORIS_images img2438.jpg\nOpenLORIS_images img4696.jpg\nOpenLORIS_images img1152.jpg\nOpenLORIS_images img2629.jpg\nOpenLORIS_images img2706.jpg\nOpenLORIS_images img5181.jpg\nOpenLORIS_images img905.jpg\nOpenLORIS_images img2275.jpg\nOpenLORIS_images img4997.jpg\nOpenLORIS_images img1963.jpg\nOpenLORIS_images img889.jpg\nOpenLORIS_images img772.jpg\nOpenLORIS_images img3514.jpg\nOpenLORIS_images img5034.jpg\nOpenLORIS_images img1588.jpg\nOpenLORIS_images img5917.jpg\nOpenLORIS_images img1690.jpg\nOpenLORIS_images img558.jpg\nOpenLORIS_images img4269.jpg\nOpenLORIS_images img1914.jpg\nOpenLORIS_images img5968.jpg\nOpenLORIS_images img3198.jpg\nOpenLORIS_images img5943.jpg\nOpenLORIS_images img4964.jpg\nOpenLORIS_images img128.jpg\nOpenLORIS_images img1567.jpg\nOpenLORIS_images img4131.jpg\nOpenLORIS_images img4689.jpg\nOpenLORIS_images img4279.jpg\nOpenLORIS_images img3917.jpg\nOpenLORIS_images img4877.jpg\nOpenLORIS_images img3104.jpg\nOpenLORIS_images img1431.jpg\nOpenLORIS_images img1355.jpg\nOpenLORIS_images img5169.jpg\nOpenLORIS_images img1584.jpg\nOpenLORIS_images img1278.jpg\nOpenLORIS_images img5274.jpg\nOpenLORIS_images img430.jpg\nOpenLORIS_images img5644.jpg\nOpenLORIS_images img1219.jpg\nOpenLORIS_images img1183.jpg\nOpenLORIS_images img5382.jpg\nOpenLORIS_images 
img1447.jpg\nOpenLORIS_images img3247.jpg\nOpenLORIS_images img3735.jpg\nOpenLORIS_images img5494.jpg\nOpenLORIS_images img5230.jpg\nOpenLORIS_images img5725.jpg\nOpenLORIS_images img3231.jpg\nOpenLORIS_images img4905.jpg\nOpenLORIS_images img4513.jpg\nOpenLORIS_images img3797.jpg\nOpenLORIS_images img1420.jpg\nOpenLORIS_images img3923.jpg\nOpenLORIS_images img320.jpg\nOpenLORIS_images img621.jpg\nOpenLORIS_images img3461.jpg\nOpenLORIS_images img6058.jpg\nOpenLORIS_images img5470.jpg\nOpenLORIS_images img2852.jpg\nOpenLORIS_images img5190.jpg\nOpenLORIS_images img1299.jpg\nOpenLORIS_images img2288.jpg\nOpenLORIS_images img2432.jpg\nOpenLORIS_images img771.jpg\nOpenLORIS_images img2424.jpg\nOpenLORIS_images img5206.jpg\nOpenLORIS_images img697.jpg\nOpenLORIS_images img1638.jpg\nOpenLORIS_images img5816.jpg\nOpenLORIS_images img214.jpg\nOpenLORIS_images img380.jpg\nOpenLORIS_images img689.jpg\nOpenLORIS_images img2088.jpg\nOpenLORIS_images img1783.jpg\nOpenLORIS_images img1095.jpg\nOpenLORIS_images img710.jpg\nOpenLORIS_images img2165.jpg\nOpenLORIS_images img5305.jpg\nOpenLORIS_images img5825.jpg\nOpenLORIS_images img4350.jpg\nOpenLORIS_images img3083.jpg\nOpenLORIS_images img1101.jpg\nOpenLORIS_images img3598.jpg\nOpenLORIS_images img2949.jpg\nOpenLORIS_images img1284.jpg\nOpenLORIS_images img1453.jpg\nOpenLORIS_images img5786.jpg\nOpenLORIS_images img3280.jpg\nOpenLORIS_images img3351.jpg\nOpenLORIS_images img5999.jpg\nOpenLORIS_images img426.jpg\nOpenLORIS_images img1050.jpg\nOpenLORIS_images img4100.jpg\nOpenLORIS_images img5826.jpg\nOpenLORIS_images img5156.jpg\nOpenLORIS_images img809.jpg\nOpenLORIS_images img2682.jpg\nOpenLORIS_images img1163.jpg\nOpenLORIS_images img5743.jpg\nOpenLORIS_images img4274.jpg\nOpenLORIS_images img1066.jpg\nOpenLORIS_images img2.jpg\nOpenLORIS_images img3179.jpg\nOpenLORIS_images img1591.jpg\nOpenLORIS_images img5871.jpg\nOpenLORIS_images img3878.jpg\nOpenLORIS_images img4571.jpg\nOpenLORIS_images img662.jpg\nOpenLORIS_images 
img6095.jpg\nOpenLORIS_images img2137.jpg\nOpenLORIS_images img2078.jpg\nOpenLORIS_images img4833.jpg\nOpenLORIS_images img3712.jpg\nOpenLORIS_images img2707.jpg\nOpenLORIS_images img3862.jpg\nOpenLORIS_images img3316.jpg\nOpenLORIS_images img165.jpg\nOpenLORIS_images img5005.jpg\nOpenLORIS_images img4366.jpg\nOpenLORIS_images img726.jpg\nOpenLORIS_images img5109.jpg\nOpenLORIS_images img695.jpg\nOpenLORIS_images img633.jpg\nOpenLORIS_images img3078.jpg\nOpenLORIS_images img4304.jpg\nOpenLORIS_images img3099.jpg\nOpenLORIS_images img5767.jpg\nOpenLORIS_images img961.jpg\nOpenLORIS_images img2537.jpg\nOpenLORIS_images img1930.jpg\nOpenLORIS_images img34.jpg\nOpenLORIS_images img5022.jpg\nOpenLORIS_images img1855.jpg\nOpenLORIS_images img4323.jpg\nOpenLORIS_images img2713.jpg\nOpenLORIS_images img332.jpg\nOpenLORIS_images img5444.jpg\nOpenLORIS_images img4319.jpg\nOpenLORIS_images img836.jpg\nOpenLORIS_images img4007.jpg\nOpenLORIS_images img5430.jpg\nOpenLORIS_images img2123.jpg\nOpenLORIS_images img4562.jpg\nOpenLORIS_images img4103.jpg\nOpenLORIS_images img470.jpg\nOpenLORIS_images img825.jpg\nOpenLORIS_images img5047.jpg\nOpenLORIS_images img5846.jpg\nOpenLORIS_images img1592.jpg\nOpenLORIS_images img5755.jpg\nOpenLORIS_images img2492.jpg\nOpenLORIS_images img4361.jpg\nOpenLORIS_images img3308.jpg\nOpenLORIS_images img2071.jpg\nOpenLORIS_images img4853.jpg\nOpenLORIS_images img4076.jpg\nOpenLORIS_images img1054.jpg\nOpenLORIS_images img3424.jpg\nOpenLORIS_images img4013.jpg\nOpenLORIS_images img4450.jpg\nOpenLORIS_images img5549.jpg\nOpenLORIS_images img158.jpg\nOpenLORIS_images img3205.jpg\nOpenLORIS_images img2205.jpg\nOpenLORIS_images img1367.jpg\nOpenLORIS_images img4974.jpg\nOpenLORIS_images img4312.jpg\nOpenLORIS_images img3639.jpg\nOpenLORIS_images img1487.jpg\nOpenLORIS_images img5060.jpg\nOpenLORIS_images img1182.jpg\nOpenLORIS_images img2279.jpg\nOpenLORIS_images img1670.jpg\nOpenLORIS_images img2300.jpg\nOpenLORIS_images img3434.jpg\nOpenLORIS_images 
img5717.jpg\nOpenLORIS_images img3199.jpg\nOpenLORIS_images img400.jpg\nOpenLORIS_images img393.jpg\nOpenLORIS_images img2267.jpg\nOpenLORIS_images img2174.jpg\nOpenLORIS_images img2392.jpg\nOpenLORIS_images img6051.jpg\nOpenLORIS_images img4235.jpg\nOpenLORIS_images img2543.jpg\nOpenLORIS_images img2216.jpg\nOpenLORIS_images img3143.jpg\nOpenLORIS_images img75.jpg\nOpenLORIS_images img5334.jpg\nOpenLORIS_images img3462.jpg\nOpenLORIS_images img4139.jpg\nOpenLORIS_images img5536.jpg\nOpenLORIS_images img3658.jpg\nOpenLORIS_images img1726.jpg\nOpenLORIS_images img107.jpg\nOpenLORIS_images img450.jpg\nOpenLORIS_images img5332.jpg\nOpenLORIS_images img108.jpg\nOpenLORIS_images img3850.jpg\nOpenLORIS_images img4543.jpg\nOpenLORIS_images img5839.jpg\nOpenLORIS_images img1994.jpg\nOpenLORIS_images img3556.jpg\nOpenLORIS_images img1115.jpg\nOpenLORIS_images img5713.jpg\nOpenLORIS_images img1807.jpg\nOpenLORIS_images img4845.jpg\nOpenLORIS_images img5405.jpg\nOpenLORIS_images img5768.jpg\nOpenLORIS_images img4785.jpg\nOpenLORIS_images img941.jpg\nOpenLORIS_images img4991.jpg\nOpenLORIS_images img390.jpg\nOpenLORIS_images img5797.jpg\nOpenLORIS_images img5455.jpg\nOpenLORIS_images img5793.jpg\nOpenLORIS_images img2318.jpg\nOpenLORIS_images img2461.jpg\nOpenLORIS_images img2024.jpg\nOpenLORIS_images img2135.jpg\nOpenLORIS_images img4208.jpg\nOpenLORIS_images img3243.jpg\nOpenLORIS_images img5441.jpg\nOpenLORIS_images img5709.jpg\nOpenLORIS_images img4788.jpg\nOpenLORIS_images img4055.jpg\nOpenLORIS_images img3835.jpg\nOpenLORIS_images img1862.jpg\nOpenLORIS_images img60.jpg\nOpenLORIS_images img637.jpg\nOpenLORIS_images img1916.jpg\nOpenLORIS_images img2728.jpg\nOpenLORIS_images img3795.jpg\nOpenLORIS_images img3355.jpg\nOpenLORIS_images img4642.jpg\nOpenLORIS_images img1931.jpg\nOpenLORIS_images img574.jpg\nOpenLORIS_images img4265.jpg\nOpenLORIS_images img3688.jpg\nOpenLORIS_images img1329.jpg\nOpenLORIS_images img381.jpg\nOpenLORIS_images img171.jpg\nOpenLORIS_images 
img70.jpg\nOpenLORIS_images img5674.jpg\nOpenLORIS_images img544.jpg\nOpenLORIS_images img1210.jpg\nOpenLORIS_images img355.jpg\nOpenLORIS_images img4685.jpg\nOpenLORIS_images img4783.jpg\nOpenLORIS_images img4054.jpg\nOpenLORIS_images img35.jpg\nOpenLORIS_images img269.jpg\nOpenLORIS_images img4205.jpg\nOpenLORIS_images img3005.jpg\nOpenLORIS_images img756.jpg\nOpenLORIS_images img5655.jpg\nOpenLORIS_images img5886.jpg\nOpenLORIS_images img3024.jpg\nOpenLORIS_images img459.jpg\nOpenLORIS_images img3535.jpg\nOpenLORIS_images img6081.jpg\nOpenLORIS_images img5506.jpg\nOpenLORIS_images img886.jpg\nOpenLORIS_images img3006.jpg\nOpenLORIS_images img1288.jpg\nOpenLORIS_images img5490.jpg\nOpenLORIS_images img4664.jpg\nOpenLORIS_images img4314.jpg\nOpenLORIS_images img2210.jpg\nOpenLORIS_images img2223.jpg\nOpenLORIS_images img6021.jpg\nOpenLORIS_images img970.jpg\nOpenLORIS_images img604.jpg\nOpenLORIS_images img1358.jpg\nOpenLORIS_images img5068.jpg\nOpenLORIS_images img6020.jpg\nOpenLORIS_images img47.jpg\nOpenLORIS_images img5859.jpg\nOpenLORIS_images img3655.jpg\nOpenLORIS_images img5682.jpg\nOpenLORIS_images img2039.jpg\nOpenLORIS_images img2202.jpg\nOpenLORIS_images img5996.jpg\nOpenLORIS_images img4417.jpg\nOpenLORIS_images img892.jpg\nOpenLORIS_images img3380.jpg\nOpenLORIS_images img4099.jpg\nOpenLORIS_images img5976.jpg\nOpenLORIS_images img591.jpg\nOpenLORIS_images img3211.jpg\nOpenLORIS_images img1628.jpg\nOpenLORIS_images img1341.jpg\nOpenLORIS_images img6074.jpg\nOpenLORIS_images img122.jpg\nOpenLORIS_images img2583.jpg\nOpenLORIS_images img4136.jpg\nOpenLORIS_images img4671.jpg\nOpenLORIS_images img3399.jpg\nOpenLORIS_images img5030.jpg\nOpenLORIS_images img6027.jpg\nOpenLORIS_images img1516.jpg\nOpenLORIS_images img200.jpg\nOpenLORIS_images img3258.jpg\nOpenLORIS_images img3932.jpg\nOpenLORIS_images img5071.jpg\nOpenLORIS_images img3320.jpg\nOpenLORIS_images img1392.jpg\nOpenLORIS_images img1353.jpg\nOpenLORIS_images img4846.jpg\nOpenLORIS_images 
img2694.jpg\nOpenLORIS_images img263.jpg\nOpenLORIS_images img2975.jpg\nOpenLORIS_images img4721.jpg\nOpenLORIS_images img2229.jpg\nOpenLORIS_images img1635.jpg\nOpenLORIS_images img2334.jpg\nOpenLORIS_images img2405.jpg\nOpenLORIS_images img5716.jpg\nOpenLORIS_images img3564.jpg\nOpenLORIS_images img5347.jpg\nOpenLORIS_images img4062.jpg\nOpenLORIS_images img4925.jpg\nOpenLORIS_images img2697.jpg\nOpenLORIS_images img117.jpg\nOpenLORIS_images img4733.jpg\nOpenLORIS_images img1339.jpg\nOpenLORIS_images img4797.jpg\nOpenLORIS_images img2880.jpg\nOpenLORIS_images img824.jpg\nOpenLORIS_images img4111.jpg\nOpenLORIS_images img5193.jpg\nOpenLORIS_images img4935.jpg\nOpenLORIS_images img2961.jpg\nOpenLORIS_images img4283.jpg\nOpenLORIS_images img752.jpg\nOpenLORIS_images img3753.jpg\nOpenLORIS_images img1298.jpg\nOpenLORIS_images img643.jpg\nOpenLORIS_images img5407.jpg\nOpenLORIS_images img2779.jpg\nOpenLORIS_images img4335.jpg\nOpenLORIS_images img1071.jpg\nOpenLORIS_images img3003.jpg\nOpenLORIS_images img5997.jpg\nOpenLORIS_images img3841.jpg\nOpenLORIS_images img4029.jpg\nOpenLORIS_images img2479.jpg\nOpenLORIS_images img4805.jpg\nOpenLORIS_images img3021.jpg\nOpenLORIS_images img4992.jpg\nOpenLORIS_images img2708.jpg\nOpenLORIS_images img4449.jpg\nOpenLORIS_images img1409.jpg\nOpenLORIS_images img1212.jpg\nOpenLORIS_images img303.jpg\nOpenLORIS_images img5480.jpg\nOpenLORIS_images img1220.jpg\nOpenLORIS_images img4297.jpg\nOpenLORIS_images img5245.jpg\nOpenLORIS_images img1274.jpg\nOpenLORIS_images img5318.jpg\nOpenLORIS_images img2634.jpg\nOpenLORIS_images img1896.jpg\nOpenLORIS_images img3629.jpg\nOpenLORIS_images img5504.jpg\nOpenLORIS_images img796.jpg\nOpenLORIS_images img1247.jpg\nOpenLORIS_images img368.jpg\nOpenLORIS_images img4064.jpg\nOpenLORIS_images img1259.jpg\nOpenLORIS_images img4959.jpg\nOpenLORIS_images img1330.jpg\nOpenLORIS_images img231.jpg\nOpenLORIS_images img3253.jpg\nOpenLORIS_images img4822.jpg\nOpenLORIS_images 
img5785.jpg\nOpenLORIS_images img1850.jpg\nOpenLORIS_images img3030.jpg\nOpenLORIS_images img2873.jpg\nOpenLORIS_images img2208.jpg\nOpenLORIS_images img4322.jpg\nOpenLORIS_images img3406.jpg\nOpenLORIS_images img820.jpg\nOpenLORIS_images img3013.jpg\nOpenLORIS_images img2510.jpg\nOpenLORIS_images img1876.jpg\nOpenLORIS_images img4405.jpg\nOpenLORIS_images img2186.jpg\nOpenLORIS_images img539.jpg\nOpenLORIS_images img2699.jpg\nOpenLORIS_images img484.jpg\nOpenLORIS_images img609.jpg\nOpenLORIS_images img6132.jpg\nOpenLORIS_images img5699.jpg\nOpenLORIS_images img5964.jpg\nOpenLORIS_images img5530.jpg\nOpenLORIS_images img2887.jpg\nOpenLORIS_images img5087.jpg\nOpenLORIS_images img3573.jpg\nOpenLORIS_images img3340.jpg\nOpenLORIS_images img2417.jpg\nOpenLORIS_images img4572.jpg\nOpenLORIS_images img4858.jpg\nOpenLORIS_images img5505.jpg\nOpenLORIS_images img4060.jpg\nOpenLORIS_images img335.jpg\nOpenLORIS_images img1461.jpg\nOpenLORIS_images img5523.jpg\nOpenLORIS_images img1064.jpg\nOpenLORIS_images img3102.jpg\nOpenLORIS_images img3705.jpg\nOpenLORIS_images img3366.jpg\nOpenLORIS_images img2712.jpg\nOpenLORIS_images img5665.jpg\nOpenLORIS_images img1105.jpg\nOpenLORIS_images img2414.jpg\nOpenLORIS_images img4739.jpg\nOpenLORIS_images img2849.jpg\nOpenLORIS_images img86.jpg\nOpenLORIS_images img3810.jpg\nOpenLORIS_images img3879.jpg\nOpenLORIS_images img927.jpg\nOpenLORIS_images img4747.jpg\nOpenLORIS_images img3869.jpg\nOpenLORIS_images img5680.jpg\nOpenLORIS_images img516.jpg\nOpenLORIS_images img23.jpg\nOpenLORIS_images img4025.jpg\nOpenLORIS_images img2346.jpg\nOpenLORIS_images img5809.jpg\nOpenLORIS_images img4970.jpg\nOpenLORIS_images img3697.jpg\nOpenLORIS_images img1768.jpg\nOpenLORIS_images img3227.jpg\nOpenLORIS_images img4003.jpg\nOpenLORIS_images img2931.jpg\nOpenLORIS_images img4686.jpg\nOpenLORIS_images img3825.jpg\nOpenLORIS_images img1411.jpg\nOpenLORIS_images img4916.jpg\nOpenLORIS_images img701.jpg\nOpenLORIS_images img5896.jpg\nOpenLORIS_images 
img1919.jpg\nOpenLORIS_images img5033.jpg\nOpenLORIS_images img478.jpg\nOpenLORIS_images img1189.jpg\nOpenLORIS_images img4074.jpg\nOpenLORIS_images img1668.jpg\nOpenLORIS_images img3036.jpg\nOpenLORIS_images img2830.jpg\nOpenLORIS_images img3225.jpg\nOpenLORIS_images img4178.jpg\nOpenLORIS_images img738.jpg\nOpenLORIS_images img71.jpg\nOpenLORIS_images img6022.jpg\nOpenLORIS_images img3052.jpg\nOpenLORIS_images img1312.jpg\nOpenLORIS_images img146.jpg\nOpenLORIS_images img2570.jpg\nOpenLORIS_images img747.jpg\nOpenLORIS_images img4879.jpg\nOpenLORIS_images img1948.jpg\nOpenLORIS_images img4346.jpg\nOpenLORIS_images img5707.jpg\nOpenLORIS_images img2353.jpg\nOpenLORIS_images img3282.jpg\nOpenLORIS_images img3614.jpg\nOpenLORIS_images img1684.jpg\nOpenLORIS_images img1976.jpg\nOpenLORIS_images img2509.jpg\nOpenLORIS_images img3039.jpg\nOpenLORIS_images img5808.jpg\nOpenLORIS_images img5095.jpg\nOpenLORIS_images img5213.jpg\nOpenLORIS_images img5543.jpg\nOpenLORIS_images img4165.jpg\nOpenLORIS_images img4117.jpg\nOpenLORIS_images img833.jpg\nOpenLORIS_images img5813.jpg\nOpenLORIS_images img3008.jpg\nOpenLORIS_images img3954.jpg\nOpenLORIS_images img5830.jpg\nOpenLORIS_images img742.jpg\nOpenLORIS_images img1908.jpg\nOpenLORIS_images img2213.jpg\nOpenLORIS_images img1564.jpg\nOpenLORIS_images img1258.jpg\nOpenLORIS_images img4364.jpg\nOpenLORIS_images img4700.jpg\nOpenLORIS_images img4603.jpg\nOpenLORIS_images img850.jpg\nOpenLORIS_images img215.jpg\nOpenLORIS_images img4303.jpg\nOpenLORIS_images img2314.jpg\nOpenLORIS_images img1164.jpg\nOpenLORIS_images img2067.jpg\nOpenLORIS_images img2138.jpg\nOpenLORIS_images img3872.jpg\nOpenLORIS_images img5926.jpg\nOpenLORIS_images img4014.jpg\nOpenLORIS_images img1717.jpg\nOpenLORIS_images img2034.jpg\nOpenLORIS_images img2077.jpg\nOpenLORIS_images img1955.jpg\nOpenLORIS_images img4868.jpg\nOpenLORIS_images img3756.jpg\nOpenLORIS_images img92.jpg\nOpenLORIS_images img2584.jpg\nOpenLORIS_images img1790.jpg\nOpenLORIS_images 
img227.jpg\nOpenLORIS_images img2386.jpg\nOpenLORIS_images img1068.jpg\nOpenLORIS_images img5657.jpg\nOpenLORIS_images img3899.jpg\nOpenLORIS_images img1553.jpg\nOpenLORIS_images img2499.jpg\nOpenLORIS_images img157.jpg\nOpenLORIS_images img5232.jpg\nOpenLORIS_images img1343.jpg\nOpenLORIS_images img2094.jpg\nOpenLORIS_images img2305.jpg\nOpenLORIS_images img4841.jpg\nOpenLORIS_images img3894.jpg\nOpenLORIS_images img2563.jpg\nOpenLORIS_images img1088.jpg\nOpenLORIS_images img4966.jpg\nOpenLORIS_images img2388.jpg\nOpenLORIS_images img1770.jpg\nOpenLORIS_images img4197.jpg\nOpenLORIS_images img4435.jpg\nOpenLORIS_images img839.jpg\nOpenLORIS_images img1860.jpg\nOpenLORIS_images img4536.jpg\nOpenLORIS_images img5854.jpg\nOpenLORIS_images img2242.jpg\nOpenLORIS_images img4077.jpg\nOpenLORIS_images img2387.jpg\nOpenLORIS_images img1144.jpg\nOpenLORIS_images img4844.jpg\nOpenLORIS_images img4711.jpg\nOpenLORIS_images img3727.jpg\nOpenLORIS_images img2313.jpg\nOpenLORIS_images img1573.jpg\nOpenLORIS_images img867.jpg\nOpenLORIS_images img1551.jpg\nOpenLORIS_images img1978.jpg\nOpenLORIS_images img4294.jpg\nOpenLORIS_images img1826.jpg\nOpenLORIS_images img5296.jpg\nOpenLORIS_images img5700.jpg\nOpenLORIS_images img6099.jpg\nOpenLORIS_images img3965.jpg\nOpenLORIS_images img4512.jpg\nOpenLORIS_images img3555.jpg\nOpenLORIS_images img2809.jpg\nOpenLORIS_images img4468.jpg\nOpenLORIS_images img3895.jpg\nOpenLORIS_images img27.jpg\nOpenLORIS_images img1898.jpg\nOpenLORIS_images img3230.jpg\nOpenLORIS_images img435.jpg\nOpenLORIS_images img350.jpg\nOpenLORIS_images img1879.jpg\nOpenLORIS_images img1888.jpg\nOpenLORIS_images img2020.jpg\nOpenLORIS_images img6090.jpg\nOpenLORIS_images img4194.jpg\nOpenLORIS_images img2701.jpg\nOpenLORIS_images img4509.jpg\nOpenLORIS_images img1869.jpg\nOpenLORIS_images img2127.jpg\nOpenLORIS_images img4161.jpg\nOpenLORIS_images img59.jpg\nOpenLORIS_images img5000.jpg\nOpenLORIS_images img5681.jpg\nOpenLORIS_images img3684.jpg\nOpenLORIS_images 
img5460.jpg\nOpenLORIS_images img1238.jpg\nOpenLORIS_images img111.jpg\nOpenLORIS_images img2041.jpg\nOpenLORIS_images img3272.jpg\nOpenLORIS_images img542.jpg\nOpenLORIS_images img5647.jpg\nOpenLORIS_images img2869.jpg\nOpenLORIS_images img5481.jpg\nOpenLORIS_images img3398.jpg\nOpenLORIS_images img5459.jpg\nOpenLORIS_images img2841.jpg\nOpenLORIS_images img5817.jpg\nOpenLORIS_images img2331.jpg\nOpenLORIS_images img137.jpg\nOpenLORIS_images img4042.jpg\nOpenLORIS_images img1799.jpg\nOpenLORIS_images img1754.jpg\nOpenLORIS_images img256.jpg\nOpenLORIS_images img522.jpg\nOpenLORIS_images img4412.jpg\nOpenLORIS_images img4816.jpg\nOpenLORIS_images img3289.jpg\nOpenLORIS_images img1118.jpg\nOpenLORIS_images img3497.jpg\nOpenLORIS_images img2722.jpg\nOpenLORIS_images img5429.jpg\nOpenLORIS_images img737.jpg\nOpenLORIS_images img1642.jpg\nOpenLORIS_images img5989.jpg\nOpenLORIS_images img2520.jpg\nOpenLORIS_images img3064.jpg\nOpenLORIS_images img136.jpg\nOpenLORIS_images img276.jpg\nOpenLORIS_images img2663.jpg\nOpenLORIS_images img308.jpg\nOpenLORIS_images img1689.jpg\nOpenLORIS_images img342.jpg\nOpenLORIS_images img3933.jpg\nOpenLORIS_images img4387.jpg\nOpenLORIS_images img5546.jpg\nOpenLORIS_images img545.jpg\nOpenLORIS_images img2056.jpg\nOpenLORIS_images img3176.jpg\nOpenLORIS_images img4968.jpg\nOpenLORIS_images img4300.jpg\nOpenLORIS_images img1950.jpg\nOpenLORIS_images img4382.jpg\nOpenLORIS_images img6115.jpg\nOpenLORIS_images img2546.jpg\nOpenLORIS_images img3874.jpg\nOpenLORIS_images img3360.jpg\nOpenLORIS_images img1273.jpg\nOpenLORIS_images img1599.jpg\nOpenLORIS_images img77.jpg\nOpenLORIS_images img65.jpg\nOpenLORIS_images img254.jpg\nOpenLORIS_images img6016.jpg\nOpenLORIS_images img2983.jpg\nOpenLORIS_images img2900.jpg\nOpenLORIS_images img1623.jpg\nOpenLORIS_images img1300.jpg\nOpenLORIS_images img2811.jpg\nOpenLORIS_images img3632.jpg\nOpenLORIS_images img1812.jpg\nOpenLORIS_images img3739.jpg\nOpenLORIS_images img5028.jpg\nOpenLORIS_images 
img3004.jpg\nOpenLORIS_images img2851.jpg\nOpenLORIS_images img5690.jpg\nOpenLORIS_images img2096.jpg\nOpenLORIS_images img155.jpg\nOpenLORIS_images img49.jpg\nOpenLORIS_images img1014.jpg\nOpenLORIS_images img5764.jpg\nOpenLORIS_images img5160.jpg\nOpenLORIS_images img5396.jpg\nOpenLORIS_images img5302.jpg\nOpenLORIS_images img2840.jpg\nOpenLORIS_images img5009.jpg\nOpenLORIS_images img3701.jpg\nOpenLORIS_images img5696.jpg\nOpenLORIS_images img3245.jpg\nOpenLORIS_images img3093.jpg\nOpenLORIS_images img1556.jpg\nOpenLORIS_images img1696.jpg\nOpenLORIS_images img3921.jpg\nOpenLORIS_images img2643.jpg\nOpenLORIS_images img759.jpg\nOpenLORIS_images img549.jpg\nOpenLORIS_images img3527.jpg\nOpenLORIS_images img4132.jpg\nOpenLORIS_images img315.jpg\nOpenLORIS_images img5863.jpg\nOpenLORIS_images img6069.jpg\nOpenLORIS_images img5704.jpg\nOpenLORIS_images img4032.jpg\nOpenLORIS_images img734.jpg\nOpenLORIS_images img2168.jpg\nOpenLORIS_images img1348.jpg\nOpenLORIS_images img1338.jpg\nOpenLORIS_images img1497.jpg\nOpenLORIS_images img387.jpg\nOpenLORIS_images img1293.jpg\nOpenLORIS_images img1988.jpg\nOpenLORIS_images img1403.jpg\nOpenLORIS_images img1785.jpg\nOpenLORIS_images img4162.jpg\nOpenLORIS_images img4480.jpg\nOpenLORIS_images img4752.jpg\nOpenLORIS_images img1557.jpg\nOpenLORIS_images img964.jpg\nOpenLORIS_images img2595.jpg\nOpenLORIS_images img2126.jpg\nOpenLORIS_images img2967.jpg\nOpenLORIS_images img3362.jpg\nOpenLORIS_images img172.jpg\nOpenLORIS_images img4181.jpg\nOpenLORIS_images img3046.jpg\nOpenLORIS_images img4211.jpg\nOpenLORIS_images img1069.jpg\nOpenLORIS_images img2494.jpg\nOpenLORIS_images img4069.jpg\nOpenLORIS_images img1566.jpg\nOpenLORIS_images img4606.jpg\nOpenLORIS_images img5473.jpg\nOpenLORIS_images img2423.jpg\nOpenLORIS_images img2702.jpg\nOpenLORIS_images img1049.jpg\nOpenLORIS_images img3509.jpg\nOpenLORIS_images img458.jpg\nOpenLORIS_images img5774.jpg\nOpenLORIS_images img5841.jpg\nOpenLORIS_images img2508.jpg\nOpenLORIS_images 
img378.jpg\nOpenLORIS_images img5733.jpg\nOpenLORIS_images img2188.jpg\nOpenLORIS_images img2217.jpg\nOpenLORIS_images img3757.jpg\nOpenLORIS_images img3570.jpg\nOpenLORIS_images img2193.jpg\nOpenLORIS_images img5667.jpg\nOpenLORIS_images img2200.jpg\nOpenLORIS_images img705.jpg\nOpenLORIS_images img5608.jpg\nOpenLORIS_images img1875.jpg\nOpenLORIS_images img3517.jpg\nOpenLORIS_images img4582.jpg\nOpenLORIS_images img1250.jpg\nOpenLORIS_images img6013.jpg\nOpenLORIS_images img3451.jpg\nOpenLORIS_images img1716.jpg\nOpenLORIS_images img5673.jpg\nOpenLORIS_images img255.jpg\nOpenLORIS_images img5422.jpg\nOpenLORIS_images img2581.jpg\nOpenLORIS_images img1290.jpg\nOpenLORIS_images img928.jpg\nOpenLORIS_images img5649.jpg\nOpenLORIS_images img84.jpg\nOpenLORIS_images img230.jpg\nOpenLORIS_images img2449.jpg\nOpenLORIS_images img5503.jpg\nOpenLORIS_images img5142.jpg\nOpenLORIS_images img4637.jpg\nOpenLORIS_images img4098.jpg\nOpenLORIS_images img3549.jpg\nOpenLORIS_images img4329.jpg\nOpenLORIS_images img5050.jpg\nOpenLORIS_images img2289.jpg\nOpenLORIS_images img1170.jpg\nOpenLORIS_images img3363.jpg\nOpenLORIS_images img5138.jpg\nOpenLORIS_images img5901.jpg\nOpenLORIS_images img3442.jpg\nOpenLORIS_images img2845.jpg\nOpenLORIS_images img3666.jpg\nOpenLORIS_images img2273.jpg\nOpenLORIS_images img1589.jpg\nOpenLORIS_images img1055.jpg\nOpenLORIS_images img4909.jpg\nOpenLORIS_images img906.jpg\nOpenLORIS_images img359.jpg\nOpenLORIS_images img4198.jpg\nOpenLORIS_images img3217.jpg\nOpenLORIS_images img804.jpg\nOpenLORIS_images img2604.jpg\nOpenLORIS_images img6110.jpg\nOpenLORIS_images img5689.jpg\nOpenLORIS_images img2230.jpg\nOpenLORIS_images img3144.jpg\nOpenLORIS_images img3500.jpg\nOpenLORIS_images img4567.jpg\nOpenLORIS_images img29.jpg\nOpenLORIS_images img3768.jpg\nOpenLORIS_images img5675.jpg\nOpenLORIS_images img4220.jpg\nOpenLORIS_images img3150.jpg\nOpenLORIS_images img3414.jpg\nOpenLORIS_images img555.jpg\nOpenLORIS_images img1540.jpg\nOpenLORIS_images 
img4336.jpg\nOpenLORIS_images img3619.jpg\nOpenLORIS_images img3234.jpg\nOpenLORIS_images img3621.jpg\nOpenLORIS_images img2608.jpg\nOpenLORIS_images img5750.jpg\nOpenLORIS_images img5069.jpg\nOpenLORIS_images img3722.jpg\nOpenLORIS_images img1318.jpg\nOpenLORIS_images img1707.jpg\nOpenLORIS_images img1030.jpg\nOpenLORIS_images img4680.jpg\nOpenLORIS_images img5894.jpg\nOpenLORIS_images img622.jpg\nOpenLORIS_images img2725.jpg\nOpenLORIS_images img1475.jpg\nOpenLORIS_images img2590.jpg\nOpenLORIS_images img4704.jpg\nOpenLORIS_images img4504.jpg\nOpenLORIS_images img3488.jpg\nOpenLORIS_images img1992.jpg\nOpenLORIS_images img3386.jpg\nOpenLORIS_images img312.jpg\nOpenLORIS_images img48.jpg\nOpenLORIS_images img357.jpg\nOpenLORIS_images img1305.jpg\nOpenLORIS_images img1492.jpg\nOpenLORIS_images img3268.jpg\nOpenLORIS_images img209.jpg\nOpenLORIS_images img5337.jpg\nOpenLORIS_images img337.jpg\nOpenLORIS_images img2730.jpg\nOpenLORIS_images img253.jpg\nOpenLORIS_images img4017.jpg\nOpenLORIS_images img5486.jpg\nOpenLORIS_images img2832.jpg\nOpenLORIS_images img2592.jpg\nOpenLORIS_images img3897.jpg\nOpenLORIS_images img2321.jpg\nOpenLORIS_images img4820.jpg\nOpenLORIS_images img5283.jpg\nOpenLORIS_images img5932.jpg\nOpenLORIS_images img2820.jpg\nOpenLORIS_images img5317.jpg\nOpenLORIS_images img1996.jpg\nOpenLORIS_images img4583.jpg\nOpenLORIS_images img2429.jpg\nOpenLORIS_images img3584.jpg\nOpenLORIS_images img4189.jpg\nOpenLORIS_images img4011.jpg\nOpenLORIS_images img2994.jpg\nOpenLORIS_images img753.jpg\nOpenLORIS_images img6010.jpg\nOpenLORIS_images img2206.jpg\nOpenLORIS_images img2371.jpg\nOpenLORIS_images img1942.jpg\nOpenLORIS_images img1477.jpg\nOpenLORIS_images img4355.jpg\nOpenLORIS_images img2278.jpg\nOpenLORIS_images img610.jpg\nOpenLORIS_images img548.jpg\nOpenLORIS_images img5392.jpg\nOpenLORIS_images img5993.jpg\nOpenLORIS_images img5011.jpg\nOpenLORIS_images img2466.jpg\nOpenLORIS_images img1190.jpg\nOpenLORIS_images img2596.jpg\nOpenLORIS_images 
img4381.jpg\nOpenLORIS_images img2704.jpg\nOpenLORIS_images img1856.jpg\nOpenLORIS_images img2362.jpg\nOpenLORIS_images img3794.jpg\nOpenLORIS_images img3647.jpg\nOpenLORIS_images img447.jpg\nOpenLORIS_images img5540.jpg\nOpenLORIS_images img4617.jpg\nOpenLORIS_images img1400.jpg\nOpenLORIS_images img4714.jpg\nOpenLORIS_images img2306.jpg\nOpenLORIS_images img5091.jpg\nOpenLORIS_images img3521.jpg\nOpenLORIS_images img4561.jpg\nOpenLORIS_images img2767.jpg\nOpenLORIS_images img5244.jpg\nOpenLORIS_images img2142.jpg\nOpenLORIS_images img1895.jpg\nOpenLORIS_images img2964.jpg\nOpenLORIS_images img3015.jpg\nOpenLORIS_images img1184.jpg\nOpenLORIS_images img1958.jpg\nOpenLORIS_images img3972.jpg\nOpenLORIS_images img455.jpg\nOpenLORIS_images img832.jpg\nOpenLORIS_images img2899.jpg\nOpenLORIS_images img3576.jpg\nOpenLORIS_images img4368.jpg\nOpenLORIS_images img179.jpg\nOpenLORIS_images img2906.jpg\nOpenLORIS_images img4416.jpg\nOpenLORIS_images img2846.jpg\nOpenLORIS_images img770.jpg\nOpenLORIS_images img421.jpg\nOpenLORIS_images img2911.jpg\nOpenLORIS_images img4225.jpg\nOpenLORIS_images img5821.jpg\nOpenLORIS_images img5357.jpg\nOpenLORIS_images img5531.jpg\nOpenLORIS_images img1075.jpg\nOpenLORIS_images img5319.jpg\nOpenLORIS_images img2789.jpg\nOpenLORIS_images img5600.jpg\nOpenLORIS_images img3801.jpg\nOpenLORIS_images img2863.jpg\nOpenLORIS_images img682.jpg\nOpenLORIS_images img4226.jpg\nOpenLORIS_images img2312.jpg\nOpenLORIS_images img3133.jpg\nOpenLORIS_images img2896.jpg\nOpenLORIS_images img501.jpg\nOpenLORIS_images img1378.jpg\nOpenLORIS_images img1595.jpg\nOpenLORIS_images img475.jpg\nOpenLORIS_images img3777.jpg\nOpenLORIS_images img5711.jpg\nOpenLORIS_images img5238.jpg\nOpenLORIS_images img5493.jpg\nOpenLORIS_images img6007.jpg\nOpenLORIS_images img3423.jpg\nOpenLORIS_images img994.jpg\nOpenLORIS_images img5076.jpg\nOpenLORIS_images img1593.jpg\nOpenLORIS_images img570.jpg\nOpenLORIS_images img5316.jpg\nOpenLORIS_images img990.jpg\nOpenLORIS_images 
img5509.jpg\nOpenLORIS_images img5551.jpg\nOpenLORIS_images img760.jpg\nOpenLORIS_images img238.jpg\nOpenLORIS_images img2550.jpg\nOpenLORIS_images img3422.jpg\nOpenLORIS_images img4048.jpg\nOpenLORIS_images img103.jpg\nOpenLORIS_images img5144.jpg\nOpenLORIS_images img79.jpg\nOpenLORIS_images img2231.jpg\nOpenLORIS_images img2810.jpg\nOpenLORIS_images img4874.jpg\nOpenLORIS_images img3792.jpg\nOpenLORIS_images img1819.jpg\nOpenLORIS_images img1083.jpg\nOpenLORIS_images img3300.jpg\nOpenLORIS_images img3888.jpg\nOpenLORIS_images img3281.jpg\nOpenLORIS_images img142.jpg\nOpenLORIS_images img5075.jpg\nOpenLORIS_images img3908.jpg\nOpenLORIS_images img3468.jpg\nOpenLORIS_images img4241.jpg\nOpenLORIS_images img717.jpg\nOpenLORIS_images img4044.jpg\nOpenLORIS_images img4239.jpg\nOpenLORIS_images img5611.jpg\nOpenLORIS_images img2795.jpg\nOpenLORIS_images img2292.jpg\nOpenLORIS_images img2447.jpg\nOpenLORIS_images img615.jpg\nOpenLORIS_images img5280.jpg\nOpenLORIS_images img2003.jpg\nOpenLORIS_images img996.jpg\nOpenLORIS_images img4419.jpg\nOpenLORIS_images img1167.jpg\nOpenLORIS_images img5701.jpg\nOpenLORIS_images img6032.jpg\nOpenLORIS_images img4292.jpg\nOpenLORIS_images img1470.jpg\nOpenLORIS_images img2998.jpg\nOpenLORIS_images img6034.jpg\nOpenLORIS_images img5723.jpg\nOpenLORIS_images img647.jpg\nOpenLORIS_images img2718.jpg\nOpenLORIS_images img1024.jpg\nOpenLORIS_images img5902.jpg\nOpenLORIS_images img1777.jpg\nOpenLORIS_images img30.jpg\nOpenLORIS_images img733.jpg\nOpenLORIS_images img5983.jpg\nOpenLORIS_images img653.jpg\nOpenLORIS_images img1286.jpg\nOpenLORIS_images img2164.jpg\nOpenLORIS_images img5434.jpg\nOpenLORIS_images img4321.jpg\nOpenLORIS_images img2958.jpg\nOpenLORIS_images img5860.jpg\nOpenLORIS_images img2904.jpg\nOpenLORIS_images img5987.jpg\nOpenLORIS_images img5046.jpg\nOpenLORIS_images img3865.jpg\nOpenLORIS_images img5900.jpg\nOpenLORIS_images img4199.jpg\nOpenLORIS_images img2984.jpg\nOpenLORIS_images img904.jpg\nOpenLORIS_images 
img4884.jpg\nOpenLORIS_images img206.jpg\nOpenLORIS_images img4371.jpg\nOpenLORIS_images img4478.jpg\nOpenLORIS_images img4823.jpg\nOpenLORIS_images img960.jpg\nOpenLORIS_images img3859.jpg\nOpenLORIS_images img844.jpg\nOpenLORIS_images img1641.jpg\nOpenLORIS_images img3539.jpg\nOpenLORIS_images img2992.jpg\nOpenLORIS_images img5146.jpg\nOpenLORIS_images img6029.jpg\nOpenLORIS_images img4682.jpg\nOpenLORIS_images img1332.jpg\nOpenLORIS_images img1727.jpg\nOpenLORIS_images img3330.jpg\nOpenLORIS_images img3009.jpg\nOpenLORIS_images img2017.jpg\nOpenLORIS_images img1877.jpg\nOpenLORIS_images img5194.jpg\nOpenLORIS_images img1099.jpg\nOpenLORIS_images img3952.jpg\nOpenLORIS_images img3240.jpg\nOpenLORIS_images img1149.jpg\nOpenLORIS_images img3950.jpg\nOpenLORIS_images img1381.jpg\nOpenLORIS_images img959.jpg\nOpenLORIS_images img3430.jpg\nOpenLORIS_images img446.jpg\nOpenLORIS_images img4932.jpg\nOpenLORIS_images img5158.jpg\nOpenLORIS_images img2284.jpg\nOpenLORIS_images img283.jpg\nOpenLORIS_images img983.jpg\nOpenLORIS_images img382.jpg\nOpenLORIS_images img4995.jpg\nOpenLORIS_images img674.jpg\nOpenLORIS_images img1391.jpg\nOpenLORIS_images img4301.jpg\nOpenLORIS_images img3079.jpg\nOpenLORIS_images img5694.jpg\nOpenLORIS_images img1724.jpg\nOpenLORIS_images img403.jpg\nOpenLORIS_images img3916.jpg\nOpenLORIS_images img4389.jpg\nOpenLORIS_images img2436.jpg\nOpenLORIS_images img5412.jpg\nOpenLORIS_images img1858.jpg\nOpenLORIS_images img5098.jpg\nOpenLORIS_images img9.jpg\nOpenLORIS_images img5866.jpg\nOpenLORIS_images img3294.jpg\nOpenLORIS_images img4052.jpg\nOpenLORIS_images img2328.jpg\nOpenLORIS_images img523.jpg\nOpenLORIS_images img5288.jpg\nOpenLORIS_images img5120.jpg\nOpenLORIS_images img541.jpg\nOpenLORIS_images img2987.jpg\nOpenLORIS_images img3014.jpg\nOpenLORIS_images img3938.jpg\nOpenLORIS_images img2180.jpg\nOpenLORIS_images img4134.jpg\nOpenLORIS_images img831.jpg\nOpenLORIS_images img6038.jpg\nOpenLORIS_images img4115.jpg\nOpenLORIS_images 
img1035.jpg\nOpenLORIS_images img5728.jpg\nOpenLORIS_images img2898.jpg\nOpenLORIS_images img302.jpg\nOpenLORIS_images img5687.jpg\nOpenLORIS_images img683.jpg\nOpenLORIS_images img5428.jpg\nOpenLORIS_images img3220.jpg\nOpenLORIS_images img2559.jpg\nOpenLORIS_images img5291.jpg\nOpenLORIS_images img3333.jpg\nOpenLORIS_images img2929.jpg\nOpenLORIS_images img2063.jpg\nOpenLORIS_images img1889.jpg\nOpenLORIS_images img4831.jpg\nOpenLORIS_images img4080.jpg\nOpenLORIS_images img1634.jpg\nOpenLORIS_images img1745.jpg\nOpenLORIS_images img6085.jpg\nOpenLORIS_images img1143.jpg\nOpenLORIS_images img3327.jpg\nOpenLORIS_images img723.jpg\nOpenLORIS_images img2776.jpg\nOpenLORIS_images img5698.jpg\nOpenLORIS_images img1357.jpg\nOpenLORIS_images img5284.jpg\nOpenLORIS_images img4360.jpg\nOpenLORIS_images img4185.jpg\nOpenLORIS_images img537.jpg\nOpenLORIS_images img1938.jpg\nOpenLORIS_images img4164.jpg\nOpenLORIS_images img1214.jpg\nOpenLORIS_images img2781.jpg\nOpenLORIS_images img114.jpg\nOpenLORIS_images img2665.jpg\nOpenLORIS_images img1486.jpg\nOpenLORIS_images img3444.jpg\nOpenLORIS_images img4628.jpg\nOpenLORIS_images img671.jpg\nOpenLORIS_images img5692.jpg\nOpenLORIS_images img1267.jpg\nOpenLORIS_images img4570.jpg\nOpenLORIS_images img4634.jpg\nOpenLORIS_images img629.jpg\nOpenLORIS_images img716.jpg\nOpenLORIS_images img3738.jpg\nOpenLORIS_images img5913.jpg\nOpenLORIS_images img1433.jpg\nOpenLORIS_images img1722.jpg\nOpenLORIS_images img4233.jpg\nOpenLORIS_images img967.jpg\nOpenLORIS_images img109.jpg\nOpenLORIS_images img2720.jpg\nOpenLORIS_images img4192.jpg\nOpenLORIS_images img3196.jpg\nOpenLORIS_images img89.jpg\nOpenLORIS_images img1712.jpg\nOpenLORIS_images img2218.jpg\nOpenLORIS_images img4876.jpg\nOpenLORIS_images img1949.jpg\nOpenLORIS_images img467.jpg\nOpenLORIS_images img572.jpg\nOpenLORIS_images img2025.jpg\nOpenLORIS_images img4677.jpg\nOpenLORIS_images img2471.jpg\nOpenLORIS_images img4878.jpg\nOpenLORIS_images img5088.jpg\nOpenLORIS_images 
img433.jpg\nOpenLORIS_images img3169.jpg\nOpenLORIS_images img5223.jpg\nOpenLORIS_images img3649.jpg\nOpenLORIS_images img2214.jpg\nOpenLORIS_images img3288.jpg\nOpenLORIS_images img552.jpg\nOpenLORIS_images img1454.jpg\nOpenLORIS_images img4540.jpg\nOpenLORIS_images img1019.jpg\nOpenLORIS_images img4150.jpg\nOpenLORIS_images img3195.jpg\nOpenLORIS_images img5927.jpg\nOpenLORIS_images img762.jpg\nOpenLORIS_images img5628.jpg\nOpenLORIS_images img2259.jpg\nOpenLORIS_images img1685.jpg\nOpenLORIS_images img3963.jpg\nOpenLORIS_images img280.jpg\nOpenLORIS_images img5759.jpg\nOpenLORIS_images img5671.jpg\nOpenLORIS_images img138.jpg\nOpenLORIS_images img2696.jpg\nOpenLORIS_images img4083.jpg\nOpenLORIS_images img1233.jpg\nOpenLORIS_images img2182.jpg\nOpenLORIS_images img2545.jpg\nOpenLORIS_images img1941.jpg\nOpenLORIS_images img456.jpg\nOpenLORIS_images img4357.jpg\nOpenLORIS_images img5260.jpg\nOpenLORIS_images img4659.jpg\nOpenLORIS_images img5732.jpg\nOpenLORIS_images img427.jpg\nOpenLORIS_images img5636.jpg\nOpenLORIS_images img4026.jpg\nOpenLORIS_images img3076.jpg\nOpenLORIS_images img5946.jpg\nOpenLORIS_images img4004.jpg\nOpenLORIS_images img2227.jpg\nOpenLORIS_images img2587.jpg\nOpenLORIS_images img1926.jpg\nOpenLORIS_images img3483.jpg\nOpenLORIS_images img3148.jpg\nOpenLORIS_images img4720.jpg\nOpenLORIS_images img100.jpg\nOpenLORIS_images img1802.jpg\nOpenLORIS_images img2383.jpg\nOpenLORIS_images img5380.jpg\nOpenLORIS_images img856.jpg\nOpenLORIS_images img1901.jpg\nOpenLORIS_images img2111.jpg\nOpenLORIS_images img6082.jpg\nOpenLORIS_images img4756.jpg\nOpenLORIS_images img614.jpg\nOpenLORIS_images img2477.jpg\nOpenLORIS_images img3625.jpg\nOpenLORIS_images img4912.jpg\nOpenLORIS_images img2299.jpg\nOpenLORIS_images img3116.jpg\nOpenLORIS_images img3090.jpg\nOpenLORIS_images img3435.jpg\nOpenLORIS_images img1102.jpg\nOpenLORIS_images img2857.jpg\nOpenLORIS_images img4646.jpg\nOpenLORIS_images img469.jpg\nOpenLORIS_images img2788.jpg\nOpenLORIS_images 
img5575.jpg\nOpenLORIS_images img5141.jpg\nOpenLORIS_images img1007.jpg\nOpenLORIS_images img1769.jpg\nOpenLORIS_images img1346.jpg\nOpenLORIS_images img4829.jpg\nOpenLORIS_images img5975.jpg\nOpenLORIS_images img3287.jpg\nOpenLORIS_images img3496.jpg\nOpenLORIS_images img4579.jpg\nOpenLORIS_images img1830.jpg\nOpenLORIS_images img485.jpg\nOpenLORIS_images img4809.jpg\nOpenLORIS_images img2650.jpg\nOpenLORIS_images img3970.jpg\nOpenLORIS_images img3985.jpg\nOpenLORIS_images img2224.jpg\nOpenLORIS_images img1455.jpg\nOpenLORIS_images img3602.jpg\nOpenLORIS_images img5847.jpg\nOpenLORIS_images img854.jpg\nOpenLORIS_images img556.jpg\nOpenLORIS_images img4724.jpg\nOpenLORIS_images img5513.jpg\nOpenLORIS_images img2187.jpg\nOpenLORIS_images img1197.jpg\nOpenLORIS_images img384.jpg\nOpenLORIS_images img5595.jpg\nOpenLORIS_images img4623.jpg\nOpenLORIS_images img5153.jpg\nOpenLORIS_images img118.jpg\nOpenLORIS_images img1375.jpg\nOpenLORIS_images img1974.jpg\nOpenLORIS_images img5055.jpg\nOpenLORIS_images img1425.jpg\nOpenLORIS_images img3378.jpg\nOpenLORIS_images img718.jpg\nOpenLORIS_images img4434.jpg\nOpenLORIS_images img1921.jpg\nOpenLORIS_images img5426.jpg\nOpenLORIS_images img3731.jpg\nOpenLORIS_images img4535.jpg\nOpenLORIS_images img1759.jpg\nOpenLORIS_images img1518.jpg\nOpenLORIS_images img534.jpg\nOpenLORIS_images img1849.jpg\nOpenLORIS_images img4058.jpg\nOpenLORIS_images img123.jpg\nOpenLORIS_images img5234.jpg\nOpenLORIS_images img4295.jpg\nOpenLORIS_images img497.jpg\nOpenLORIS_images img3215.jpg\nOpenLORIS_images img2485.jpg\nOpenLORIS_images img3796.jpg\nOpenLORIS_images img5862.jpg\nOpenLORIS_images img3926.jpg\nOpenLORIS_images img4545.jpg\nOpenLORIS_images img1132.jpg\nOpenLORIS_images img4742.jpg\nOpenLORIS_images img4104.jpg\nOpenLORIS_images img3256.jpg\nOpenLORIS_images img3931.jpg\nOpenLORIS_images img2631.jpg\nOpenLORIS_images img4280.jpg\nOpenLORIS_images img5693.jpg\nOpenLORIS_images img4892.jpg\nOpenLORIS_images img232.jpg\nOpenLORIS_images 
img5243.jpg\nOpenLORIS_images img5938.jpg\nOpenLORIS_images img2489.jpg\nOpenLORIS_images img748.jpg\nOpenLORIS_images img4036.jpg\nOpenLORIS_images img1652.jpg\nOpenLORIS_images img1319.jpg\nOpenLORIS_images img2962.jpg\nOpenLORIS_images img3388.jpg\nOpenLORIS_images img4815.jpg\nOpenLORIS_images img2769.jpg\nOpenLORIS_images img5998.jpg\nOpenLORIS_images img2681.jpg\nOpenLORIS_images img589.jpg\nOpenLORIS_images img3186.jpg\nOpenLORIS_images img1999.jpg\nOpenLORIS_images img908.jpg\nOpenLORIS_images img2042.jpg\nOpenLORIS_images img2946.jpg\nOpenLORIS_images img5738.jpg\nOpenLORIS_images img4072.jpg\nOpenLORIS_images img581.jpg\nOpenLORIS_images img1661.jpg\nOpenLORIS_images img1390.jpg\nOpenLORIS_images img2648.jpg\nOpenLORIS_images img3624.jpg\nOpenLORIS_images img4786.jpg\nOpenLORIS_images img563.jpg\nOpenLORIS_images img2504.jpg\nOpenLORIS_images img1725.jpg\nOpenLORIS_images img2617.jpg\nOpenLORIS_images img26.jpg\nOpenLORIS_images img5792.jpg\nOpenLORIS_images img1806.jpg\nOpenLORIS_images img2972.jpg\nOpenLORIS_images img780.jpg\nOpenLORIS_images img2668.jpg\nOpenLORIS_images img4580.jpg\nOpenLORIS_images img5374.jpg\nOpenLORIS_images img14.jpg\nOpenLORIS_images img1207.jpg\nOpenLORIS_images img2037.jpg\nOpenLORIS_images img2416.jpg\nOpenLORIS_images img5801.jpg\nOpenLORIS_images img3401.jpg\nOpenLORIS_images img1432.jpg\nOpenLORIS_images img1816.jpg\nOpenLORIS_images img1765.jpg\nOpenLORIS_images img3773.jpg\nOpenLORIS_images img121.jpg\nOpenLORIS_images img377.jpg\nOpenLORIS_images img3779.jpg\nOpenLORIS_images img284.jpg\nOpenLORIS_images img326.jpg\nOpenLORIS_images img1281.jpg\nOpenLORIS_images img5714.jpg\nOpenLORIS_images img4082.jpg\nOpenLORIS_images img2409.jpg\nOpenLORIS_images img5800.jpg\nOpenLORIS_images img3900.jpg\nOpenLORIS_images img2616.jpg\nOpenLORIS_images img3108.jpg\nOpenLORIS_images img1902.jpg\nOpenLORIS_images img1810.jpg\nOpenLORIS_images img3140.jpg\nOpenLORIS_images img4712.jpg\nOpenLORIS_images img5306.jpg\nOpenLORIS_images 
img1841.jpg\nOpenLORIS_images img221.jpg\nOpenLORIS_images img1249.jpg\nOpenLORIS_images img3020.jpg\nOpenLORIS_images img676.jpg\nOpenLORIS_images img1761.jpg\nOpenLORIS_images img3476.jpg\nOpenLORIS_images img3120.jpg\nOpenLORIS_images img5609.jpg\nOpenLORIS_images img2766.jpg\nOpenLORIS_images img1005.jpg\nOpenLORIS_images img2856.jpg\nOpenLORIS_images img4588.jpg\nOpenLORIS_images img5110.jpg\nOpenLORIS_images img5542.jpg\nOpenLORIS_images img4088.jpg\nOpenLORIS_images img5784.jpg\nOpenLORIS_images img5947.jpg\nOpenLORIS_images img1596.jpg\nOpenLORIS_images img3807.jpg\nOpenLORIS_images img1146.jpg\nOpenLORIS_images img585.jpg\nOpenLORIS_images img273.jpg\nOpenLORIS_images img3318.jpg\nOpenLORIS_images img3732.jpg\nOpenLORIS_images img5982.jpg\nOpenLORIS_images img5957.jpg\nOpenLORIS_images img2102.jpg\nOpenLORIS_images img1380.jpg\nOpenLORIS_images img837.jpg\nOpenLORIS_images img913.jpg\nOpenLORIS_images img116.jpg\nOpenLORIS_images img1825.jpg\nOpenLORIS_images img4096.jpg\nOpenLORIS_images img3708.jpg\nOpenLORIS_images img2045.jpg\nOpenLORIS_images img3863.jpg\nOpenLORIS_images img3326.jpg\nOpenLORIS_images img940.jpg\nOpenLORIS_images img3059.jpg\nOpenLORIS_images img267.jpg\nOpenLORIS_images img2554.jpg\nOpenLORIS_images img5257.jpg\nOpenLORIS_images img595.jpg\nOpenLORIS_images img507.jpg\nOpenLORIS_images img1124.jpg\nOpenLORIS_images img5419.jpg\nOpenLORIS_images img173.jpg\nOpenLORIS_images img5304.jpg\nOpenLORIS_images img2136.jpg\nOpenLORIS_images img5557.jpg\nOpenLORIS_images img3842.jpg\nOpenLORIS_images img3054.jpg\nOpenLORIS_images img902.jpg\nOpenLORIS_images img5051.jpg\nOpenLORIS_images img4990.jpg\nOpenLORIS_images img4209.jpg\nOpenLORIS_images img1488.jpg\nOpenLORIS_images img529.jpg\nOpenLORIS_images img4218.jpg\nOpenLORIS_images img1571.jpg\nOpenLORIS_images img2970.jpg\nOpenLORIS_images img2199.jpg\nOpenLORIS_images img665.jpg\nOpenLORIS_images img575.jpg\nOpenLORIS_images img5994.jpg\nOpenLORIS_images img6086.jpg\nOpenLORIS_images 
img5703.jpg\nOpenLORIS_images img4639.jpg\nOpenLORIS_images img3188.jpg\nOpenLORIS_images img298.jpg\nOpenLORIS_images img2069.jpg\nOpenLORIS_images img4202.jpg\nOpenLORIS_images img675.jpg\nOpenLORIS_images img5154.jpg\nOpenLORIS_images img670.jpg\nOpenLORIS_images img3301.jpg\nOpenLORIS_images img3482.jpg\nOpenLORIS_images img1017.jpg\nOpenLORIS_images img5128.jpg\nOpenLORIS_images img1279.jpg\nOpenLORIS_images img2677.jpg\nOpenLORIS_images img4331.jpg\nOpenLORIS_images img3307.jpg\nOpenLORIS_images img5910.jpg\nOpenLORIS_images img527.jpg\nOpenLORIS_images img6008.jpg\nOpenLORIS_images img2356.jpg\nOpenLORIS_images img3341.jpg\nOpenLORIS_images img2966.jpg\nOpenLORIS_images img773.jpg\nOpenLORIS_images img5390.jpg\nOpenLORIS_images img1009.jpg\nOpenLORIS_images img3911.jpg\nOpenLORIS_images img1430.jpg\nOpenLORIS_images img3928.jpg\nOpenLORIS_images img3487.jpg\nOpenLORIS_images img1741.jpg\nOpenLORIS_images img4116.jpg\nOpenLORIS_images img5577.jpg\nOpenLORIS_images img1216.jpg\nOpenLORIS_images img6071.jpg\nOpenLORIS_images img4598.jpg\nOpenLORIS_images img2639.jpg\nOpenLORIS_images img2528.jpg\nOpenLORIS_images img859.jpg\nOpenLORIS_images img4650.jpg\nOpenLORIS_images img4871.jpg\nOpenLORIS_images img3934.jpg\nOpenLORIS_images img4781.jpg\nOpenLORIS_images img186.jpg\nOpenLORIS_images img362.jpg\nOpenLORIS_images img405.jpg\nOpenLORIS_images img1874.jpg\nOpenLORIS_images img4683.jpg\nOpenLORIS_images img2389.jpg\nOpenLORIS_images img5651.jpg\nOpenLORIS_images img2322.jpg\nOpenLORIS_images img3745.jpg\nOpenLORIS_images img3439.jpg\nOpenLORIS_images img669.jpg\nOpenLORIS_images img3821.jpg\nOpenLORIS_images img4348.jpg\nOpenLORIS_images img2555.jpg\nOpenLORIS_images img5191.jpg\nOpenLORIS_images img1342.jpg\nOpenLORIS_images img2036.jpg\nOpenLORIS_images img3349.jpg\nOpenLORIS_images img3799.jpg\nOpenLORIS_images img731.jpg\nOpenLORIS_images img306.jpg\nOpenLORIS_images img3563.jpg\nOpenLORIS_images img2294.jpg\nOpenLORIS_images img611.jpg\nOpenLORIS_images 
img222.jpg\nOpenLORIS_images img6025.jpg\nOpenLORIS_images img1827.jpg\nOpenLORIS_images img2945.jpg\nOpenLORIS_images img5656.jpg\nOpenLORIS_images img2338.jpg\nOpenLORIS_images img1714.jpg\nOpenLORIS_images img2190.jpg\nOpenLORIS_images img2232.jpg\nOpenLORIS_images img1657.jpg\nOpenLORIS_images img3447.jpg\nOpenLORIS_images img2113.jpg\nOpenLORIS_images img5113.jpg\nOpenLORIS_images img3764.jpg\nOpenLORIS_images img3569.jpg\nOpenLORIS_images img5325.jpg\nOpenLORIS_images img6024.jpg\nOpenLORIS_images img3641.jpg\nOpenLORIS_images img559.jpg\nOpenLORIS_images img5949.jpg\nOpenLORIS_images img939.jpg\nOpenLORIS_images img2075.jpg\nOpenLORIS_images img1961.jpg\nOpenLORIS_images img2680.jpg\nOpenLORIS_images img352.jpg\nOpenLORIS_images img2019.jpg\nOpenLORIS_images img218.jpg\nOpenLORIS_images img338.jpg\nOpenLORIS_images img4826.jpg\nOpenLORIS_images img5212.jpg\nOpenLORIS_images img2339.jpg\nOpenLORIS_images img5209.jpg\nOpenLORIS_images img1073.jpg\nOpenLORIS_images img547.jpg\nOpenLORIS_images img894.jpg\nOpenLORIS_images img2661.jpg\nOpenLORIS_images img830.jpg\nOpenLORIS_images img5228.jpg\nOpenLORIS_images img2375.jpg\nOpenLORIS_images img375.jpg\nOpenLORIS_images img2838.jpg\nOpenLORIS_images img262.jpg\nOpenLORIS_images img1462.jpg\nOpenLORIS_images img4984.jpg\nOpenLORIS_images img3296.jpg\nOpenLORIS_images img4093.jpg\nOpenLORIS_images img2721.jpg\nOpenLORIS_images img560.jpg\nOpenLORIS_images img5241.jpg\nOpenLORIS_images img135.jpg\nOpenLORIS_images img3162.jpg\nOpenLORIS_images img4193.jpg\nOpenLORIS_images img440.jpg\nOpenLORIS_images img739.jpg\nOpenLORIS_images img4670.jpg\nOpenLORIS_images img2002.jpg\nOpenLORIS_images img474.jpg\nOpenLORIS_images img845.jpg\nOpenLORIS_images img4409.jpg\nOpenLORIS_images img1885.jpg\nOpenLORIS_images img3578.jpg\nOpenLORIS_images img1412.jpg\nOpenLORIS_images img2191.jpg\nOpenLORIS_images img1519.jpg\nOpenLORIS_images img4821.jpg\nOpenLORIS_images img43.jpg\nOpenLORIS_images img3704.jpg\nOpenLORIS_images 
img161.jpg\nOpenLORIS_images img1360.jpg\nOpenLORIS_images img5266.jpg\nOpenLORIS_images img2061.jpg\nOpenLORIS_images img2336.jpg\nOpenLORIS_images img3808.jpg\nOpenLORIS_images img5679.jpg\nOpenLORIS_images img6142.jpg\nOpenLORIS_images img2497.jpg\nOpenLORIS_images img1655.jpg\nOpenLORIS_images img3352.jpg\nOpenLORIS_images img4731.jpg\nOpenLORIS_images img2979.jpg\nOpenLORIS_images img1555.jpg\nOpenLORIS_images img3984.jpg\nOpenLORIS_images img3557.jpg\nOpenLORIS_images img1632.jpg\nOpenLORIS_images img590.jpg\nOpenLORIS_images img1397.jpg\nOpenLORIS_images img3018.jpg\nOpenLORIS_images img5107.jpg\nOpenLORIS_images img5488.jpg\nOpenLORIS_images img3881.jpg\nOpenLORIS_images img3051.jpg\nOpenLORIS_images img1211.jpg\nOpenLORIS_images img5553.jpg\nOpenLORIS_images img5383.jpg\nOpenLORIS_images img2959.jpg\nOpenLORIS_images img2703.jpg\nOpenLORIS_images img3132.jpg\nOpenLORIS_images img5035.jpg\nOpenLORIS_images img3396.jpg\nOpenLORIS_images img1289.jpg\nOpenLORIS_images img3536.jpg\nOpenLORIS_images img4311.jpg\nOpenLORIS_images img5189.jpg\nOpenLORIS_images img2513.jpg\nOpenLORIS_images img2794.jpg\nOpenLORIS_images img4469.jpg\nOpenLORIS_images img3007.jpg\nOpenLORIS_images img1568.jpg\nOpenLORIS_images img2478.jpg\nOpenLORIS_images img5954.jpg\nOpenLORIS_images img2116.jpg\nOpenLORIS_images img2522.jpg\nOpenLORIS_images img1151.jpg\nOpenLORIS_images img1892.jpg\nOpenLORIS_images img2908.jpg\nOpenLORIS_images img4376.jpg\nOpenLORIS_images img5092.jpg\nOpenLORIS_images img3702.jpg\nOpenLORIS_images img3526.jpg\nOpenLORIS_images img2355.jpg\nOpenLORIS_images img754.jpg\nOpenLORIS_images img3682.jpg\nOpenLORIS_images img5718.jpg\nOpenLORIS_images img3348.jpg\nOpenLORIS_images img5264.jpg\nOpenLORIS_images img4812.jpg\nOpenLORIS_images img1719.jpg\nOpenLORIS_images img4755.jpg\nOpenLORIS_images img5763.jpg\nOpenLORIS_images img2420.jpg\nOpenLORIS_images img5102.jpg\nOpenLORIS_images img2065.jpg\nOpenLORIS_images img2771.jpg\nOpenLORIS_images 
img5803.jpg\nOpenLORIS_images img3350.jpg\nOpenLORIS_images img4741.jpg\nOpenLORIS_images img292.jpg\nOpenLORIS_images img4911.jpg\nOpenLORIS_images img182.jpg\nOpenLORIS_images img5261.jpg\nOpenLORIS_images img4039.jpg\nOpenLORIS_images img2198.jpg\nOpenLORIS_images img5632.jpg\nOpenLORIS_images img1639.jpg\nOpenLORIS_images img5271.jpg\nOpenLORIS_images img5345.jpg\nOpenLORIS_images img921.jpg\nOpenLORIS_images img5253.jpg\nOpenLORIS_images img4776.jpg\nOpenLORIS_images img95.jpg\nOpenLORIS_images img612.jpg\nOpenLORIS_images img20.jpg\nOpenLORIS_images img3726.jpg\nOpenLORIS_images img3409.jpg\nOpenLORIS_images img3714.jpg\nOpenLORIS_images img2441.jpg\nOpenLORIS_images img4204.jpg\nOpenLORIS_images img2921.jpg\nOpenLORIS_images img909.jpg\nOpenLORIS_images img2630.jpg\nOpenLORIS_images img1610.jpg\nOpenLORIS_images img2813.jpg\nOpenLORIS_images img1052.jpg\nOpenLORIS_images img4210.jpg\nOpenLORIS_images img5294.jpg\nOpenLORIS_images img5695.jpg\nOpenLORIS_images img1797.jpg\nOpenLORIS_images img5329.jpg\nOpenLORIS_images img5262.jpg\nOpenLORIS_images img3356.jpg\nOpenLORIS_images img6123.jpg\nOpenLORIS_images img2469.jpg\nOpenLORIS_images img1473.jpg\nOpenLORIS_images img3338.jpg\nOpenLORIS_images img599.jpg\nOpenLORIS_images img4886.jpg\nOpenLORIS_images img169.jpg\nOpenLORIS_images img1033.jpg\nOpenLORIS_images img4063.jpg\nOpenLORIS_images img5591.jpg\nOpenLORIS_images img5535.jpg\nOpenLORIS_images img6011.jpg\nOpenLORIS_images img1148.jpg\nOpenLORIS_images img4574.jpg\nOpenLORIS_images img6030.jpg\nOpenLORIS_images img4906.jpg\nOpenLORIS_images img1404.jpg\nOpenLORIS_images img1615.jpg\nOpenLORIS_images img6131.jpg\nOpenLORIS_images img1059.jpg\nOpenLORIS_images img553.jpg\nOpenLORIS_images img4586.jpg\nOpenLORIS_images img3553.jpg\nOpenLORIS_images img1478.jpg\nOpenLORIS_images img6004.jpg\nOpenLORIS_images img4657.jpg\nOpenLORIS_images img2612.jpg\nOpenLORIS_images img1882.jpg\nOpenLORIS_images img5308.jpg\nOpenLORIS_images img3615.jpg\nOpenLORIS_images 
img2844.jpg\nOpenLORIS_images img6019.jpg\nOpenLORIS_images img1755.jpg\nOpenLORIS_images img644.jpg\nOpenLORIS_images img462.jpg\nOpenLORIS_images img4577.jpg\nOpenLORIS_images img2285.jpg\nOpenLORIS_images img3011.jpg\nOpenLORIS_images img21.jpg\nOpenLORIS_images img1954.jpg\nOpenLORIS_images img2032.jpg\nOpenLORIS_images img5948.jpg\nOpenLORIS_images img4457.jpg\nOpenLORIS_images img5440.jpg\nOpenLORIS_images img3809.jpg\nOpenLORIS_images img2858.jpg\nOpenLORIS_images img4939.jpg\nOpenLORIS_images img986.jpg\nOpenLORIS_images img0.jpg\nOpenLORIS_images img3232.jpg\nOpenLORIS_images img4456.jpg\nOpenLORIS_images img4182.jpg\nOpenLORIS_images img5471.jpg\nOpenLORIS_images img2600.jpg\nOpenLORIS_images img711.jpg\nOpenLORIS_images img3248.jpg\nOpenLORIS_images img4867.jpg\nOpenLORIS_images img1669.jpg\nOpenLORIS_images img713.jpg\nOpenLORIS_images img3528.jpg\nOpenLORIS_images img1738.jpg\nOpenLORIS_images img924.jpg\nOpenLORIS_images img4214.jpg\nOpenLORIS_images img5794.jpg\nOpenLORIS_images img2637.jpg\nOpenLORIS_images img5025.jpg\nOpenLORIS_images img1973.jpg\nOpenLORIS_images img5955.jpg\nOpenLORIS_images img1629.jpg\nOpenLORIS_images img4242.jpg\nOpenLORIS_images img2340.jpg\nOpenLORIS_images img2790.jpg\nOpenLORIS_images img251.jpg\nOpenLORIS_images img2772.jpg\nOpenLORIS_images img5295.jpg\nOpenLORIS_images img3636.jpg\nOpenLORIS_images img4493.jpg\nOpenLORIS_images img2081.jpg\nOpenLORIS_images img4522.jpg\nOpenLORIS_images img918.jpg\nOpenLORIS_images img3552.jpg\nOpenLORIS_images img295.jpg\nOpenLORIS_images img1600.jpg\nOpenLORIS_images img3089.jpg\nOpenLORIS_images img3037.jpg\nOpenLORIS_images img2097.jpg\nOpenLORIS_images img1469.jpg\nOpenLORIS_images img1953.jpg\nOpenLORIS_images img4612.jpg\nOpenLORIS_images img624.jpg\nOpenLORIS_images img3411.jpg\nOpenLORIS_images img2348.jpg\nOpenLORIS_images img2814.jpg\nOpenLORIS_images img5740.jpg\nOpenLORIS_images img1097.jpg\nOpenLORIS_images img5255.jpg\nOpenLORIS_images img4073.jpg\nOpenLORIS_images 
img944.jpg\nOpenLORIS_images img3635.jpg\nOpenLORIS_images img3515.jpg\nOpenLORIS_images img2642.jpg\nOpenLORIS_images img2194.jpg\nOpenLORIS_images img2337.jpg\nOpenLORIS_images img989.jpg\nOpenLORIS_images img2443.jpg\nOpenLORIS_images img3565.jpg\nOpenLORIS_images img4485.jpg\nOpenLORIS_images img397.jpg\nOpenLORIS_images img3353.jpg\nOpenLORIS_images img185.jpg\nOpenLORIS_images img5062.jpg\nOpenLORIS_images img6094.jpg\nOpenLORIS_images img2235.jpg\nOpenLORIS_images img1514.jpg\nOpenLORIS_images img4937.jpg\nOpenLORIS_images img4263.jpg\nOpenLORIS_images img1483.jpg\nOpenLORIS_images img1113.jpg\nOpenLORIS_images img3541.jpg\nOpenLORIS_images img1498.jpg\nOpenLORIS_images img3657.jpg\nOpenLORIS_images img1756.jpg\nOpenLORIS_images img2502.jpg\nOpenLORIS_images img5018.jpg\nOpenLORIS_images img3100.jpg\nOpenLORIS_images img2798.jpg\nOpenLORIS_images img3377.jpg\nOpenLORIS_images img4761.jpg\nOpenLORIS_images img5877.jpg\nOpenLORIS_images img5013.jpg\nOpenLORIS_images img4101.jpg\nOpenLORIS_images img4085.jpg\nOpenLORIS_images img346.jpg\nOpenLORIS_images img1372.jpg\nOpenLORIS_images img4914.jpg\nOpenLORIS_images img3947.jpg\nOpenLORIS_images img5303.jpg\nOpenLORIS_images img2787.jpg\nOpenLORIS_images img4967.jpg\nOpenLORIS_images img3989.jpg\nOpenLORIS_images img2594.jpg\nOpenLORIS_images img619.jpg\nOpenLORIS_images img4917.jpg\nOpenLORIS_images img3956.jpg\nOpenLORIS_images img2937.jpg\nOpenLORIS_images img1673.jpg\nOpenLORIS_images img2886.jpg\nOpenLORIS_images img1811.jpg\nOpenLORIS_images img5861.jpg\nOpenLORIS_images img2052.jpg\nOpenLORIS_images img1351.jpg\nOpenLORIS_images img577.jpg\nOpenLORIS_images img2932.jpg\nOpenLORIS_images img2724.jpg\nOpenLORIS_images img2258.jpg\nOpenLORIS_images img3118.jpg\nOpenLORIS_images img3200.jpg\nOpenLORIS_images img656.jpg\nOpenLORIS_images img4601.jpg\nOpenLORIS_images img3827.jpg\nOpenLORIS_images img3053.jpg\nOpenLORIS_images img3044.jpg\nOpenLORIS_images img1103.jpg\nOpenLORIS_images 
img4672.jpg\nOpenLORIS_images img4296.jpg\nOpenLORIS_images img2676.jpg\nOpenLORIS_images img2999.jpg\nOpenLORIS_images img1255.jpg\nOpenLORIS_images img1532.jpg\nOpenLORIS_images img2785.jpg\nOpenLORIS_images img3477.jpg\nOpenLORIS_images img2427.jpg\nOpenLORIS_images img2538.jpg\nOpenLORIS_images img4748.jpg\nOpenLORIS_images img3124.jpg\nOpenLORIS_images img5547.jpg\nOpenLORIS_images img1943.jpg\nOpenLORIS_images img2083.jpg\nOpenLORIS_images img3026.jpg\nOpenLORIS_images img3115.jpg\nOpenLORIS_images img3164.jpg\nOpenLORIS_images img588.jpg\nOpenLORIS_images img2940.jpg\nOpenLORIS_images img5355.jpg\nOpenLORIS_images img2678.jpg\nOpenLORIS_images img6063.jpg\nOpenLORIS_images img5084.jpg\nOpenLORIS_images img6000.jpg\nOpenLORIS_images img5457.jpg\nOpenLORIS_images img3128.jpg\nOpenLORIS_images img4215.jpg\nOpenLORIS_images img4749.jpg\nOpenLORIS_images img3139.jpg\nOpenLORIS_images img5833.jpg\nOpenLORIS_images img4641.jpg\nOpenLORIS_images img1502.jpg\nOpenLORIS_images img3437.jpg\nOpenLORIS_images img5545.jpg\nOpenLORIS_images img5541.jpg\nOpenLORIS_images img4486.jpg\nOpenLORIS_images img5104.jpg\nOpenLORIS_images img2015.jpg\nOpenLORIS_images img2912.jpg\nOpenLORIS_images img465.jpg\nOpenLORIS_images img112.jpg\nOpenLORIS_images img5081.jpg\nOpenLORIS_images img4255.jpg\nOpenLORIS_images img50.jpg\nOpenLORIS_images img2240.jpg\nOpenLORIS_images img5339.jpg\nOpenLORIS_images img4521.jpg\nOpenLORIS_images img4144.jpg\nOpenLORIS_images img846.jpg\nOpenLORIS_images img5117.jpg\nOpenLORIS_images img4613.jpg\nOpenLORIS_images img3824.jpg\nOpenLORIS_images img1082.jpg\nOpenLORIS_images img6113.jpg\nOpenLORIS_images img5712.jpg\nOpenLORIS_images img3784.jpg\nOpenLORIS_images img3177.jpg\nOpenLORIS_images img5132.jpg\nOpenLORIS_images img3304.jpg\nOpenLORIS_images img1090.jpg\nOpenLORIS_images img3575.jpg\nOpenLORIS_images img5670.jpg\nOpenLORIS_images img3626.jpg\nOpenLORIS_images img3660.jpg\nOpenLORIS_images img4778.jpg\nOpenLORIS_images 
img2638.jpg\nOpenLORIS_images img1552.jpg\nOpenLORIS_images img4281.jpg\nOpenLORIS_images img1537.jpg\nOpenLORIS_images img2746.jpg\nOpenLORIS_images img1625.jpg\nOpenLORIS_images img4410.jpg\nOpenLORIS_images img5250.jpg\nOpenLORIS_images img3678.jpg\nOpenLORIS_images img4203.jpg\nOpenLORIS_images img5610.jpg\nOpenLORIS_images img33.jpg\nOpenLORIS_images img5662.jpg\nOpenLORIS_images img4222.jpg\nOpenLORIS_images img5093.jpg\nOpenLORIS_images img2274.jpg\nOpenLORIS_images img2266.jpg\nOpenLORIS_images img1909.jpg\nOpenLORIS_images img2437.jpg\nOpenLORIS_images img2360.jpg\nOpenLORIS_images img4465.jpg\nOpenLORIS_images img5439.jpg\nOpenLORIS_images img1337.jpg\nOpenLORIS_images img4751.jpg\nOpenLORIS_images img428.jpg\nOpenLORIS_images img2601.jpg\nOpenLORIS_images img3663.jpg\nOpenLORIS_images img4367.jpg\nOpenLORIS_images img1022.jpg\nOpenLORIS_images img5908.jpg\nOpenLORIS_images img2740.jpg\nOpenLORIS_images img4763.jpg\nOpenLORIS_images img2636.jpg\nOpenLORIS_images img652.jpg\nOpenLORIS_images img4958.jpg\nOpenLORIS_images img3153.jpg\nOpenLORIS_images img1158.jpg\nOpenLORIS_images img6106.jpg\nOpenLORIS_images img1153.jpg\nOpenLORIS_images img175.jpg\nOpenLORIS_images img189.jpg\nOpenLORIS_images img5059.jpg\nOpenLORIS_images img5134.jpg\nOpenLORIS_images img3375.jpg\nOpenLORIS_images img511.jpg\nOpenLORIS_images img2686.jpg\nOpenLORIS_images img2344.jpg\nOpenLORIS_images img301.jpg\nOpenLORIS_images img2325.jpg\nOpenLORIS_images img3616.jpg\nOpenLORIS_images img3382.jpg\nOpenLORIS_images img3293.jpg\nOpenLORIS_images img3752.jpg\nOpenLORIS_images img5857.jpg\nOpenLORIS_images img5965.jpg\nOpenLORIS_images img5461.jpg\nOpenLORIS_images img937.jpg\nOpenLORIS_images img520.jpg\nOpenLORIS_images img278.jpg\nOpenLORIS_images img1581.jpg\nOpenLORIS_images img3961.jpg\nOpenLORIS_images img3219.jpg\nOpenLORIS_images img2774.jpg\nOpenLORIS_images img54.jpg\nOpenLORIS_images img4926.jpg\nOpenLORIS_images img1036.jpg\nOpenLORIS_images img5796.jpg\nOpenLORIS_images 
img2552.jpg\nOpenLORIS_images img4635.jpg\nOpenLORIS_images img3084.jpg\nOpenLORIS_images img4913.jpg\nOpenLORIS_images img4605.jpg\nOpenLORIS_images img4602.jpg\nOpenLORIS_images img630.jpg\nOpenLORIS_images img51.jpg\nOpenLORIS_images img1643.jpg\nOpenLORIS_images img1034.jpg\nOpenLORIS_images img4384.jpg\nOpenLORIS_images img4523.jpg\nOpenLORIS_images img4092.jpg\nOpenLORIS_images img3974.jpg\nOpenLORIS_images img5893.jpg\nOpenLORIS_images img728.jpg\nOpenLORIS_images img4526.jpg\nOpenLORIS_images img2444.jpg\nOpenLORIS_images img957.jpg\nOpenLORIS_images img4599.jpg\nOpenLORIS_images img5377.jpg\nOpenLORIS_images img1134.jpg\nOpenLORIS_images img441.jpg\nOpenLORIS_images img351.jpg\nOpenLORIS_images img6093.jpg\nOpenLORIS_images img2393.jpg\nOpenLORIS_images img1563.jpg\nOpenLORIS_images img4232.jpg\nOpenLORIS_images img3208.jpg\nOpenLORIS_images img3815.jpg\nOpenLORIS_images img5199.jpg\nOpenLORIS_images img2177.jpg\nOpenLORIS_images img2403.jpg\nOpenLORIS_images img5115.jpg\nOpenLORIS_images img3856.jpg\nOpenLORIS_images img5052.jpg\nOpenLORIS_images img3709.jpg\nOpenLORIS_images img4128.jpg\nOpenLORIS_images img4145.jpg\nOpenLORIS_images img686.jpg\nOpenLORIS_images img945.jpg\nOpenLORIS_images img3609.jpg\nOpenLORIS_images img858.jpg\nOpenLORIS_images img5921.jpg\nOpenLORIS_images img823.jpg\nOpenLORIS_images img3721.jpg\nOpenLORIS_images img5776.jpg\nOpenLORIS_images img4260.jpg\nOpenLORIS_images img2211.jpg\nOpenLORIS_images img1767.jpg\nOpenLORIS_images img3858.jpg\nOpenLORIS_images img5292.jpg\nOpenLORIS_images img1061.jpg\nOpenLORIS_images img2577.jpg\nOpenLORIS_images img4071.jpg\nOpenLORIS_images img2152.jpg\nOpenLORIS_images img3978.jpg\nOpenLORIS_images img5638.jpg\nOpenLORIS_images img1104.jpg\nOpenLORIS_images img1448.jpg\nOpenLORIS_images img3385.jpg\nOpenLORIS_images img2745.jpg\nOpenLORIS_images img966.jpg\nOpenLORIS_images img2076.jpg\nOpenLORIS_images img2140.jpg\nOpenLORIS_images img2895.jpg\nOpenLORIS_images img6002.jpg\nOpenLORIS_images 
img74.jpg\nOpenLORIS_images img2614.jpg\nOpenLORIS_images img5961.jpg\nOpenLORIS_images img5247.jpg\nOpenLORIS_images img6005.jpg\nOpenLORIS_images img5828.jpg\nOpenLORIS_images img3770.jpg\nOpenLORIS_images img3654.jpg\nOpenLORIS_images img5775.jpg\nOpenLORIS_images img3290.jpg\nOpenLORIS_images img1096.jpg\nOpenLORIS_images img5125.jpg\nOpenLORIS_images img4142.jpg\nOpenLORIS_images img783.jpg\nOpenLORIS_images img3379.jpg\nOpenLORIS_images img392.jpg\nOpenLORIS_images img2692.jpg\nOpenLORIS_images img4340.jpg\nOpenLORIS_images img5815.jpg\nOpenLORIS_images img5121.jpg\nOpenLORIS_images img289.jpg\nOpenLORIS_images img3728.jpg\nOpenLORIS_images img2262.jpg\nOpenLORIS_images img6103.jpg\nOpenLORIS_images img4648.jpg\nOpenLORIS_images img5017.jpg\nOpenLORIS_images img2317.jpg\nOpenLORIS_images img5641.jpg\nOpenLORIS_images img1027.jpg\nOpenLORIS_images img661.jpg\nOpenLORIS_images img5072.jpg\nOpenLORIS_images img1237.jpg\nOpenLORIS_images img1843.jpg\nOpenLORIS_images img5097.jpg\nOpenLORIS_images img2936.jpg\nOpenLORIS_images img2848.jpg\nOpenLORIS_images img779.jpg\nOpenLORIS_images img3652.jpg\nOpenLORIS_images img1366.jpg\nOpenLORIS_images img4075.jpg\nOpenLORIS_images img1972.jpg\nOpenLORIS_images img4358.jpg\nOpenLORIS_images img4390.jpg\nOpenLORIS_images img2659.jpg\nOpenLORIS_images img4725.jpg\nOpenLORIS_images img4022.jpg\nOpenLORIS_images img510.jpg\nOpenLORIS_images img3766.jpg\nOpenLORIS_images img855.jpg\nOpenLORIS_images img655.jpg\nOpenLORIS_images img4261.jpg\nOpenLORIS_images img3832.jpg\nOpenLORIS_images img997.jpg\nOpenLORIS_images img5324.jpg\nOpenLORIS_images img4722.jpg\nOpenLORIS_images img307.jpg\nOpenLORIS_images img3640.jpg\nOpenLORIS_images img536.jpg\nOpenLORIS_images img1040.jpg\nOpenLORIS_images img4589.jpg\nOpenLORIS_images img4594.jpg\nOpenLORIS_images img4035.jpg\nOpenLORIS_images img2363.jpg\nOpenLORIS_images img1058.jpg\nOpenLORIS_images img5048.jpg\nOpenLORIS_images img987.jpg\nOpenLORIS_images img4537.jpg\nOpenLORIS_images 
img3919.jpg\nOpenLORIS_images img1084.jpg\nOpenLORIS_images img5761.jpg\nOpenLORIS_images img259.jpg\nOpenLORIS_images img5758.jpg\nOpenLORIS_images img1861.jpg\nOpenLORIS_images img2562.jpg\nOpenLORIS_images img183.jpg\nOpenLORIS_images img3112.jpg\nOpenLORIS_images img3504.jpg\nOpenLORIS_images img3181.jpg\nOpenLORIS_images img5168.jpg\nOpenLORIS_images img1713.jpg\nOpenLORIS_images img4695.jpg\nOpenLORIS_images img3867.jpg\nOpenLORIS_images img4385.jpg\nOpenLORIS_images img1539.jpg\nOpenLORIS_images img5482.jpg\nOpenLORIS_images img3236.jpg\nOpenLORIS_images img2073.jpg\nOpenLORIS_images img3492.jpg\nOpenLORIS_images img1482.jpg\nOpenLORIS_images img2439.jpg\nOpenLORIS_images img1660.jpg\nOpenLORIS_images img5836.jpg\nOpenLORIS_images img2688.jpg\nOpenLORIS_images img1206.jpg\nOpenLORIS_images img519.jpg\nOpenLORIS_images img6137.jpg\nOpenLORIS_images img3226.jpg\nOpenLORIS_images img2561.jpg\nOpenLORIS_images img4328.jpg\nOpenLORIS_images img995.jpg\nOpenLORIS_images img4576.jpg\nOpenLORIS_images img4994.jpg\nOpenLORIS_images img3058.jpg\nOpenLORIS_images img1873.jpg\nOpenLORIS_images img2773.jpg\nOpenLORIS_images img914.jpg\nOpenLORIS_images img1108.jpg\nOpenLORIS_images img1209.jpg\nOpenLORIS_images img201.jpg\nOpenLORIS_images img5162.jpg\nOpenLORIS_images img3992.jpg\nOpenLORIS_images img1315.jpg\nOpenLORIS_images img2040.jpg\nOpenLORIS_images img3882.jpg\nOpenLORIS_images img1924.jpg\nOpenLORIS_images img1758.jpg\nOpenLORIS_images img2379.jpg\nOpenLORIS_images img4028.jpg\nOpenLORIS_images img5719.jpg\nOpenLORIS_images img3123.jpg\nOpenLORIS_images img5184.jpg\nOpenLORIS_images img1823.jpg\nOpenLORIS_images img5624.jpg\nOpenLORIS_images img5875.jpg\nOpenLORIS_images img2553.jpg\nOpenLORIS_images img3450.jpg\nOpenLORIS_images img110.jpg\nOpenLORIS_images img3844.jpg\nOpenLORIS_images img690.jpg\nOpenLORIS_images img531.jpg\nOpenLORIS_images img2842.jpg\nOpenLORIS_images img3976.jpg\nOpenLORIS_images img3436.jpg\nOpenLORIS_images img509.jpg\nOpenLORIS_images 
img5564.jpg\nOpenLORIS_images img5086.jpg\nOpenLORIS_images img1186.jpg\nOpenLORIS_images img261.jpg\nOpenLORIS_images img5501.jpg\nOpenLORIS_images img1156.jpg\nOpenLORIS_images img6135.jpg\nOpenLORIS_images img3175.jpg\nOpenLORIS_images img1656.jpg\nOpenLORIS_images img2943.jpg\nOpenLORIS_images img5404.jpg\nOpenLORIS_images img129.jpg\nOpenLORIS_images img2599.jpg\nOpenLORIS_images img3927.jpg\nOpenLORIS_images img398.jpg\nOpenLORIS_images img1006.jpg\nOpenLORIS_images img2154.jpg\nOpenLORIS_images img5472.jpg\nOpenLORIS_images img962.jpg\nOpenLORIS_images img2884.jpg\nOpenLORIS_images img3038.jpg\nOpenLORIS_images img4488.jpg\nOpenLORIS_images img3204.jpg\nOpenLORIS_images img1051.jpg\nOpenLORIS_images img1307.jpg\nOpenLORIS_images img2459.jpg\nOpenLORIS_images img1932.jpg\nOpenLORIS_images img3633.jpg\nOpenLORIS_images img5868.jpg\nOpenLORIS_images img5108.jpg\nOpenLORIS_images img4862.jpg\nOpenLORIS_images img2115.jpg\nOpenLORIS_images img4106.jpg\nOpenLORIS_images img2913.jpg\nOpenLORIS_images img3107.jpg\nOpenLORIS_images img1732.jpg\nOpenLORIS_images img5593.jpg\nOpenLORIS_images img4114.jpg\nOpenLORIS_images img2323.jpg\nOpenLORIS_images img1579.jpg\nOpenLORIS_images img2000.jpg\nOpenLORIS_images img5819.jpg\nOpenLORIS_images img1365.jpg\nOpenLORIS_images img2309.jpg\nOpenLORIS_images img4985.jpg\nOpenLORIS_images img6077.jpg\nOpenLORIS_images img4596.jpg\nOpenLORIS_images img6116.jpg\nOpenLORIS_images img5931.jpg\nOpenLORIS_images img495.jpg\nOpenLORIS_images img803.jpg\nOpenLORIS_images img3694.jpg\nOpenLORIS_images img4900.jpg\nOpenLORIS_images img3440.jpg\nOpenLORIS_images img1524.jpg\nOpenLORIS_images img3765.jpg\nOpenLORIS_images img4591.jpg\nOpenLORIS_images img4374.jpg\nOpenLORIS_images img425.jpg\nOpenLORIS_images img4519.jpg\nOpenLORIS_images img5249.jpg\nOpenLORIS_images img2723.jpg\nOpenLORIS_images img4424.jpg\nOpenLORIS_images img3723.jpg\nOpenLORIS_images img3063.jpg\nOpenLORIS_images img2571.jpg\nOpenLORIS_images 
img5411.jpg\nOpenLORIS_images img1123.jpg\nOpenLORIS_images img901.jpg\nOpenLORIS_images img415.jpg\nOpenLORIS_images img2710.jpg\nOpenLORIS_images img3191.jpg\nOpenLORIS_images img3119.jpg\nOpenLORIS_images img1266.jpg\nOpenLORIS_images img2031.jpg\nOpenLORIS_images img4854.jpg\nOpenLORIS_images img452.jpg\nOpenLORIS_images img1303.jpg\nOpenLORIS_images img3114.jpg\nOpenLORIS_images img4446.jpg\nOpenLORIS_images img1857.jpg\nOpenLORIS_images img3915.jpg\nOpenLORIS_images img687.jpg\nOpenLORIS_images img1205.jpg\nOpenLORIS_images img1215.jpg\nOpenLORIS_images img4627.jpg\nOpenLORIS_images img5677.jpg\nOpenLORIS_images img2093.jpg\nOpenLORIS_images img2953.jpg\nOpenLORIS_images img5152.jpg\nOpenLORIS_images img1952.jpg\nOpenLORIS_images img305.jpg\nOpenLORIS_images img1837.jpg\nOpenLORIS_images img1688.jpg\nOpenLORIS_images img5094.jpg\nOpenLORIS_images img2099.jpg\nOpenLORIS_images img2729.jpg\nOpenLORIS_images img3548.jpg\nOpenLORIS_images img5269.jpg\nOpenLORIS_images img2326.jpg\nOpenLORIS_images img1617.jpg\nOpenLORIS_images img4041.jpg\nOpenLORIS_images img5939.jpg\nOpenLORIS_images img2603.jpg\nOpenLORIS_images img4005.jpg\nOpenLORIS_images img4113.jpg\nOpenLORIS_images img2576.jpg\nOpenLORIS_images img2568.jpg\nOpenLORIS_images img1558.jpg\nOpenLORIS_images img2341.jpg\nOpenLORIS_images img4010.jpg\nOpenLORIS_images img1393.jpg\nOpenLORIS_images img1060.jpg\nOpenLORIS_images img3706.jpg\nOpenLORIS_images img272.jpg\nOpenLORIS_images img1213.jpg\nOpenLORIS_images img2566.jpg\nOpenLORIS_images img4183.jpg\nOpenLORIS_images img188.jpg\nOpenLORIS_images img5166.jpg\nOpenLORIS_images img388.jpg\nOpenLORIS_images img6045.jpg\nOpenLORIS_images img1911.jpg\nOpenLORIS_images img4814.jpg\nOpenLORIS_images img4326.jpg\nOpenLORIS_images img423.jpg\nOpenLORIS_images img5073.jpg\nOpenLORIS_images img2050.jpg\nOpenLORIS_images img5133.jpg\nOpenLORIS_images img3367.jpg\nOpenLORIS_images img3870.jpg\nOpenLORIS_images img4728.jpg\nOpenLORIS_images 
img5935.jpg\nOpenLORIS_images img5749.jpg\nOpenLORIS_images img1771.jpg\nOpenLORIS_images img932.jpg\nOpenLORIS_images img373.jpg\nOpenLORIS_images img1626.jpg\nOpenLORIS_images img4875.jpg\nOpenLORIS_images img2220.jpg\nOpenLORIS_images img4033.jpg\nOpenLORIS_images img3074.jpg\nOpenLORIS_images img1162.jpg\nOpenLORIS_images img667.jpg\nOpenLORIS_images img3055.jpg\nOpenLORIS_images img496.jpg\nOpenLORIS_images img5004.jpg\nOpenLORIS_images img1277.jpg\nOpenLORIS_images img4857.jpg\nOpenLORIS_images img3907.jpg\nOpenLORIS_images img1649.jpg\nOpenLORIS_images img3540.jpg\nOpenLORIS_images img5944.jpg\nOpenLORIS_images img1828.jpg\nOpenLORIS_images img5165.jpg\nOpenLORIS_images img2474.jpg\nOpenLORIS_images img1647.jpg\nOpenLORIS_images img1142.jpg\nOpenLORIS_images img4697.jpg\nOpenLORIS_images img5756.jpg\nOpenLORIS_images img3405.jpg\nOpenLORIS_images img4324.jpg\nOpenLORIS_images img1836.jpg\nOpenLORIS_images img4244.jpg\nOpenLORIS_images img2271.jpg\nOpenLORIS_images img2158.jpg\nOpenLORIS_images img4604.jpg\nOpenLORIS_images img955.jpg\nOpenLORIS_images img2709.jpg\nOpenLORIS_images img1648.jpg\nOpenLORIS_images img5273.jpg\nOpenLORIS_images img1510.jpg\nOpenLORIS_images img466.jpg\nOpenLORIS_images img5587.jpg\nOpenLORIS_images img3185.jpg\nOpenLORIS_images img1651.jpg\nOpenLORIS_images img5272.jpg\nOpenLORIS_images img1173.jpg\nOpenLORIS_images img125.jpg\nOpenLORIS_images img2261.jpg\nOpenLORIS_images img4380.jpg\nOpenLORIS_images img848.jpg\nOpenLORIS_images img2640.jpg\nOpenLORIS_images img492.jpg\nOpenLORIS_images img2871.jpg\nOpenLORIS_images img2457.jpg\nOpenLORIS_images img2440.jpg\nOpenLORIS_images img4843.jpg\nOpenLORIS_images img2675.jpg\nOpenLORIS_images img1609.jpg\nOpenLORIS_images img2157.jpg\nOpenLORIS_images img3173.jpg\nOpenLORIS_images img1983.jpg\nOpenLORIS_images img5903.jpg\nOpenLORIS_images img2532.jpg\nOpenLORIS_images img3939.jpg\nOpenLORIS_images img3707.jpg\nOpenLORIS_images img2935.jpg\nOpenLORIS_images img13.jpg\nOpenLORIS_images 
img193.jpg\n"
  },
  {
    "path": "plane_segmentation/requirements.txt",
    "content": "# This file may be used to create an environment using:\n# $ conda create --name <env> --file <this file>\n# platform: linux-64\n_libgcc_mutex=0.1=conda_forge\n_openmp_mutex=4.5=1_gnu\naddict=2.3.0=py36h9f0ad1d_2\nasync_generator=1.10=py_0\nattrs=20.3.0=pyhd3eb1b0_0\nbackcall=0.2.0=py_0\nblas=2.14=openblas\nbleach=1.5.0=py36_0\nbzip2=1.0.8=h7f98852_4\nc-ares=1.17.1=h36c2ea0_0\nca-certificates=2020.10.14=0\ncairo=1.16.0=h9f066cc_1006\ncertifi=2020.6.20=py36_0\ncycler=0.10.0=py_2\ncython=0.29.21=py36he6710b0_0\ndbus=1.13.6=hfdff14a_1\ndecorator=4.4.2=py_0\ndefusedxml=0.6.0=py_0\nentrypoints=0.3=py36_0\nexpat=2.2.9=he1b5a44_2\nffmpeg=4.3.1=h3215721_1\nfontconfig=2.13.1=h7e3eb15_1002\nfreetype=2.10.4=h7ca028e_0\ngettext=0.19.8.1=hf34092f_1004\nglib=2.66.3=h58526e2_0\ngmp=6.2.1=h58526e2_0\ngnutls=3.6.13=h85f3911_1\ngraphite2=1.3.13=h58526e2_1001\ngst-plugins-base=1.14.5=h0935bb2_2\ngstreamer=1.14.5=h36ae1b5_2\nharfbuzz=2.7.2=ha5b49bf_1\nhdf5=1.10.6=nompi_h3c11f04_101\nhtml5lib=0.9999999=py36_0\nicu=67.1=he1b5a44_0\nimportlib-metadata=3.3.0=py36h5fab9bb_2\nintel-openmp=2020.2=254\nipykernel=5.3.4=py36h5ca1d4c_0\nipython=7.16.1=py36h5ca1d4c_0\nipython_genutils=0.2.0=pyhd3eb1b0_1\nipywidgets=7.6.0=pyhd3eb1b0_1\njasper=1.900.1=h07fcdf6_1006\njedi=0.18.0=py36h06a4308_0\njinja2=2.11.2=py_0\njoblib=1.0.0=pyhd8ed1ab_0\njpeg=9d=h36c2ea0_0\njsonschema=3.0.2=py36_0\njupyter_client=6.1.7=py_0\njupyter_core=4.7.0=py36h06a4308_0\njupyterlab_pygments=0.1.2=py_0\nkiwisolver=1.3.1=py36h51d7077_0\nkrb5=1.17.2=h926e7f8_0\nlame=3.100=h14c3975_1001\nlcms2=2.11=hcbb858e_1\nld_impl_linux-64=2.33.1=h53a641e_7\nlibblas=3.8.0=14_openblas\nlibcblas=3.8.0=14_openblas\nlibclang=10.0.1=default_hde54327_1\nlibcurl=7.71.1=hcdd3856_8\nlibedit=3.1.20191231=h14c3975_1\nlibev=4.33=h516909a_1\nlibevent=2.1.10=hcdb4288_3\nlibffi=3.2.1=he1b5a44_1007\nlibgcc-ng=9.3.0=h5dbcf3e_17\nlibgfortran-ng=7.3.0=hdf63c60_0\nlibglib=2.66.3=hbe7bbb4_0\nlibgomp=9.3.0=h5dbcf3e_17\nlibiconv=1.16=h516909a_0\nlib
lapack=3.8.0=14_openblas\nliblapacke=3.8.0=14_openblas\nlibllvm10=10.0.1=he513fc3_3\nlibnghttp2=1.41.0=h8cfc5f6_2\nlibopenblas=0.3.7=h5ec1e0e_6\nlibopencv=4.5.0=py36_4\nlibpng=1.6.37=h21135ba_2\nlibpq=12.3=h255efa7_3\nlibprotobuf=3.13.0.1=h8b12597_0\nlibsodium=1.0.18=h7b6447c_0\nlibssh2=1.9.0=hab1572f_5\nlibstdcxx-ng=9.3.0=h2ae2ef3_17\nlibtiff=4.1.0=h4f3a223_6\nlibuuid=2.32.1=h7f98852_1000\nlibwebp-base=1.1.0=h36c2ea0_3\nlibxcb=1.13=h14c3975_1002\nlibxkbcommon=0.10.0=he1b5a44_0\nlibxml2=2.9.10=h68273f3_2\nllvmlite=0.34.0=py36h269e1b5_4\nlz4-c=1.9.2=he1b5a44_3\nmarkdown=3.3.3=pyh9f0ad1d_0\nmarkupsafe=1.1.1=py36h7b6447c_0\nmatplotlib=3.3.3=py36h5fab9bb_0\nmatplotlib-base=3.3.3=py36he12231b_0\nmistune=0.8.4=py36h7b6447c_0\nmkl=2019.4=243\nmkl-service=2.3.0=py36he8ac12f_0\nmkl_fft=1.1.0=py36hc1659b7_1\nmkl_random=1.1.0=py36hb3f55d8_0\nmysql-common=8.0.21=2\nmysql-libs=8.0.21=hf3661c5_2\nnbclient=0.5.1=py_0\nnbconvert=6.0.7=py36_0\nnbformat=5.0.8=py_0\nncurses=6.2=he6710b0_1\nnest-asyncio=1.4.3=pyhd3eb1b0_0\nnettle=3.6=he412f7d_0\nnotebook=6.0.3=py36_0\nnspr=4.29=he1b5a44_1\nnss=3.59=h2c00c37_0\nnumba=0.51.2=py36h0573a6f_1\nnumpy=1.17.0=py36h95a1406_0\nnumpy-base=1.18.5=py36h2f8d375_0\nolefile=0.46=pyh9f0ad1d_1\nopen3d=0.11.2=py36_0\nopencv=4.5.0=py36_4\nopenh264=2.1.1=h8b12597_0\nopenssl=1.1.1h=h7b6447c_0\npandas=1.1.3=py36he6710b0_0\npandoc=2.11=hb0f4dca_0\npandocfilters=1.4.3=py36h06a4308_1\nparso=0.7.0=py_0\npcre=8.44=he1b5a44_0\npexpect=4.8.0=pyhd3eb1b0_3\npickleshare=0.7.5=pyhd3eb1b0_1003\npillow=8.0.1=py36h10ecd5c_0\npip=20.3.3=py36h06a4308_0\npixman=0.40.0=h36c2ea0_0\nplyfile=0.7.2=pyh9f0ad1d_0\nprometheus_client=0.9.0=pyhd3eb1b0_0\nprompt-toolkit=3.0.8=py_0\nprotobuf=3.13.0.1=py36h201607c_1\npthread-stubs=0.4=h36c2ea0_1001\nptyprocess=0.6.0=pyhd3eb1b0_2\npy-opencv=4.5.0=py36he448a4c_4\npydensecrf=1.0rc2=pypi_0\npygments=2.7.3=pyhd3eb1b0_0\npyparsing=2.4.7=pyh9f0ad1d_0\npyqt=5.12.3=py36h5fab9bb_6\npyqt-impl=5.12.3=py36h7ec31b9_6\npyqt5-sip=4.19.18=py36hc4f0c31_6\
npyqtchart=5.12=py36h7ec31b9_6\npyqtwebengine=5.12.1=py36h7ec31b9_6\npyrsistent=0.17.3=py36h7b6447c_0\npython=3.6.11=h425cb1d_0_cpython\npython-dateutil=2.8.1=py_0\npython_abi=3.6=1_cp36m\npytz=2020.1=py_0\npyyaml=5.3.1=py36he6145b8_1\npyzmq=20.0.0=py36h2531618_1\nqt=5.12.9=h1f2b2cb_0\nreadline=8.0=h7b6447c_0\nscikit-learn=0.23.2=py36h0573a6f_0\nscipy=0.19.1=py36h9976243_3\nsend2trash=1.5.0=pyhd3eb1b0_1\nsetuptools=51.0.0=py36h06a4308_2\nsix=1.15.0=pyh9f0ad1d_0\nsqlite=3.33.0=h62c20be_0\ntensorflow=1.4.1=0\ntensorflow-base=1.4.1=py36hd00c003_2\ntensorflow-gpu=1.4.1=0\ntensorflow-gpu-base=1.4.1=py36h01caf0a_0\ntensorflow-tensorboard=1.5.1=py36hf484d3e_1\nterminado=0.9.1=py36_0\ntestpath=0.4.4=py_0\nthreadpoolctl=2.1.0=pyh5ca1d4c_0\ntk=8.6.10=hbc83047_0\ntornado=6.1=py36h1d69622_0\ntqdm=4.55.0=pyhd8ed1ab_0\ntraitlets=4.3.3=py36_0\ntyping_extensions=3.7.4.3=py_0\nwcwidth=0.2.5=py_0\nwebencodings=0.5.1=py_1\nwerkzeug=1.0.1=pyh9f0ad1d_0\nwheel=0.36.2=pyhd3eb1b0_0\nwidgetsnbextension=3.5.1=py36_0\nx264=1!152.20180806=h14c3975_0\nxorg-kbproto=1.0.7=h14c3975_1002\nxorg-libice=1.0.10=h516909a_0\nxorg-libsm=1.2.3=h84519dc_1000\nxorg-libx11=1.6.12=h516909a_0\nxorg-libxau=1.0.9=h14c3975_0\nxorg-libxdmcp=1.1.3=h516909a_0\nxorg-libxext=1.3.4=h516909a_0\nxorg-libxrender=0.9.10=h516909a_1002\nxorg-renderproto=0.11.1=h14c3975_1002\nxorg-xextproto=7.3.0=h14c3975_1002\nxorg-xproto=7.0.31=h7f98852_1007\nxz=5.2.5=h7b6447c_0\nyaml=0.2.5=h7b6447c_0\nzeromq=4.3.3=he6710b0_3\nzipp=3.4.0=py_0\nzlib=1.2.11=h7b6447c_3\nzstd=1.4.5=h6597ccf_2\n"
  },
  {
    "path": "plane_segmentation/train.py",
    "content": "from __future__ import division\nimport tensorflow as tf\nimport pprint\nimport random\nimport numpy as np\n\nfrom RecoverPlane_perpendicular import RecoverPlane\n\nimport os\n\nflags = tf.app.flags\nflags.DEFINE_string(\"dataset_dir\", \"\", \"Dataset directory\")\nflags.DEFINE_string(\"log_dir2\", \"\", \"log directory\")\nflags.DEFINE_string(\"init_checkpoint_file\",'', \"Specific checkpoint file to initialize from\")\nflags.DEFINE_float(\"learning_rate\", 0.0001, \"Learning rate of for adam\")\nflags.DEFINE_float(\"beta1\", 0.99, \"Momentum term of adam\")\nflags.DEFINE_float(\"beta2\", 0.9999, \"Momentum term of adam\")\nflags.DEFINE_float(\"plane_weight\",0.1, \"Weight for plane regularization\")\nflags.DEFINE_integer(\"batch_size\", 8, \"The size of of a sample batch\")\nflags.DEFINE_integer(\"img_height\", 192, \"Image height\")\nflags.DEFINE_integer(\"img_width\", 320, \"Image width\")\nflags.DEFINE_integer(\"max_steps\",800000 , \"Maximum number of training iterations\")\nflags.DEFINE_integer(\"summary_freq\", 1000, \"Logging every log_freq iterations\")\nflags.DEFINE_integer(\"num_plane\",3, \"The estimated number of planes in the scenario\")\nflags.DEFINE_integer(\"save_latest_freq\", 5000, \\\n    \"Save the latest model every save_latest_freq iterations (overwrites the previous latest model)\")\nflags.DEFINE_boolean(\"continue_train\", False, \"Continue training from previous checkpoint\")\nflags.DEFINE_boolean(\"debug\", True, \"debug mode?\")\nflags.DEFINE_string(\"gpu\", \"1\", \"GPU ID\")\n\n\ncheck_file = flags.FLAGS.log_dir2 + '/' + flags.FLAGS.dataset_dir.split(\"/\")[-1] +\\\n            '_lr=' + str(flags.FLAGS.learning_rate) +\\\n            '_b1=' + str(flags.FLAGS.beta1) + '_b2=' + str(flags.FLAGS.beta2) +\\\n             '_weight=' + str(flags.FLAGS.plane_weight)  +\\\n             '_n_plane=' + str(flags.FLAGS.num_plane)\n\n\nflags.DEFINE_string(\"checkpoint_dir\", check_file, \"Directory name to save the 
checkpoints\")\n\nFLAGS = flags.FLAGS  # this is used to transfer all the params need during the app.run\n\ndef main(_):\n    seed = 8964\n    tf.set_random_seed(seed)\n    np.random.seed(seed)\n    random.seed(seed)\n\n    pp = pprint.PrettyPrinter()\n    pp.pprint(flags.FLAGS.__flags) #change this to pp.pprint(tf.app.flags.FLAGS.flag_values_dict()) for tensorflow 1.5 or higher\n    \n    if not os.path.exists(FLAGS.checkpoint_dir):\n        os.mkdir(FLAGS.checkpoint_dir)\n        \n    planeRecover = RecoverPlane()\n    planeRecover.train(FLAGS)\n\nif __name__ == '__main__':\n    tf.app.run()\n\n"
  },
  {
    "path": "plane_segmentation/utils.py",
    "content": "from __future__ import division\nimport matplotlib.pyplot as plt\nimport numpy as np\nimport tensorflow as tf\nimport random\nimport colorsys\n\n\n# ****************************colorful mask part*************************************\n# Usage: apply different color to each plane\n#        the plane determination is based on the plane_threshold = 0.5 now\n#        and the area without additional color are belong to non-plane\n#\ndef random_colors(N, bright=True):\n    \"\"\"\n    Generate random colors.\n    To get visually distinct colors, generate them in HSV space then\n    convert to RGB.\n    \"\"\"\n    brightness = 1.0 if bright else 0.7\n    hsv = [(float(i) / N, 1, brightness) for i in range(N)]\n    colors = list(map(lambda c: colorsys.hsv_to_rgb(*c), hsv))\n    random.shuffle(colors)\n    return colors\n\ndef apply_mask(image, mask, max_mask, color, alpha=0.4):\n    \"\"\"Apply the given mask to the image.\n       alpha here means the alpha channel value\n    \"\"\"\n    b,h,w,_ = image.get_shape().as_list()\n    res_mask = tf.concat([mask, mask, mask], axis=-1)  # be consistent with color channel number\n    ref_max_mask = tf.concat([max_mask,max_mask,max_mask],axis=-1)\n    b_equal = tf.equal(res_mask, ref_max_mask)\n\n    alpha_img = tf.where(b_equal, tf.cast(tf.scalar_mul(1 - alpha, tf.cast(image, tf.float32)),\n                                            tf.uint8), image)\n\n    alpha_color = alpha * np.array(color)\n    ref_color = np.tile(alpha_color.astype(int), (b, h, w, 1))\n    ref_color = tf.constant(ref_color, dtype=tf.uint8)\n\n    res_image = tf.where(b_equal, ref_color + alpha_img, alpha_img)\n\n    return res_image\n\n\ndef color_mask(image, pred_mask_s, colors, alpha=0.4 ):\n    '''do iteration to assign the color to the corrosponding mask\n       Based on experiment, the first plane will be red, and the second one will be green-blue, when num=2\n       and drak blue ,red ,green for num=3\n    '''\n    N = len(colors)\n    
masked_image = image\n    for i in range(N):\n        color = colors[i]\n\n        mask = tf.expand_dims(pred_mask_s[:, :, :, i],axis=-1)\n        max_mask = tf.reduce_max(pred_mask_s, axis=-1,keep_dims=True)\n        masked_image = apply_mask(masked_image, mask, max_mask, color, alpha)\n\n    return masked_image\n\n#*********************************************************************************************\n\n\ndef meshgrid(batch, height, width, is_homogeneous=True):\n  \"\"\"Construct a 2D meshgrid.\n\n  Args:\n    batch: batch size\n    height: height of the grid\n    width: width of the grid\n    is_homogeneous: whether to return in homogeneous coordinates\n  Returns:\n    x,y grid coordinates [batch, 2 (3 if homogeneous), height, width]\n  \"\"\"\n  x_t = tf.matmul(tf.ones(shape=tf.stack([height, 1])),\n                  tf.transpose(tf.expand_dims(tf.linspace(-1.0, 1.0, width), 1), [1, 0])) #tf.linspace(start,stop,num)\n  y_t = tf.matmul(tf.expand_dims(tf.linspace(-1.0, 1.0, height), 1),\n                  tf.ones(shape=tf.stack([1, width])))\n  x_t = (x_t + 1.0) * 0.5 * tf.cast(width - 1, tf.float32)\n  y_t = (y_t + 1.0) * 0.5 * tf.cast(height - 1, tf.float32)\n  if is_homogeneous:\n    ones = tf.ones_like(x_t)\n    coords = tf.stack([x_t, y_t, ones], axis=0)\n  else:\n    coords = tf.stack([x_t, y_t], axis=0)\n  coords = tf.tile(tf.expand_dims(coords, 0), [batch, 1, 1, 1])\n  return coords\n\n\ndef compute_depth(img, pred_param, num_plane, intrinsics):\n    batch, height, width, _ = img.get_shape().as_list()\n    # Construct pixel grid coordinates\n    pixel_coords = meshgrid(batch, height, width)  # 3*128*416\n\n    cam_coords = tf.reshape(pixel_coords, [batch, 3, -1])\n\n    unscaled_ray_Q = tf.matmul(tf.matrix_inverse(intrinsics), cam_coords)\n\n    for k in range(num_plane):\n        n_div_d = tf.expand_dims(pred_param[:, k, :], axis=1)\n        scale =   tf.matmul(n_div_d, tf.matmul(tf.matrix_inverse(intrinsics), cam_coords))\n\n        plane_based_Q 
= scale * 1./ (unscaled_ray_Q + 1e-10)\n        plane_based_Q = tf.reshape(plane_based_Q, [batch, 3, height, width])\n        plane_based_Q = tf.transpose(plane_based_Q, perm=[0, 2, 3, 1])\n\n        if k == 0:\n            plane_inv_depth_stack =  plane_based_Q[:,:,:,-1:]\n        else:\n            plane_inv_depth_stack = tf.concat([plane_inv_depth_stack,\n                                         plane_based_Q[:, :,:, -1:]], axis=-1)\n\n    return plane_inv_depth_stack\n\ndef compute_unscaled_ray(img, intrinsics):\n    batch, height, width, _ = img.get_shape().as_list()\n    # Construct pixel grid coordinates\n    pixel_coords = meshgrid(batch, height, width)  # 3*128*416\n\n    cam_coords = tf.reshape(pixel_coords, [batch, 3, -1])\n\n    unscaled_ray_Q = tf.matmul(tf.matrix_inverse(intrinsics), cam_coords)\n\n    return unscaled_ray_Q\n\ndef compute_plane_equation(img, pred_param, ray, depth):\n\n    batch, height, width, _ = img.get_shape().as_list()\n    n_time_ray = tf.matmul(pred_param,ray)\n    n_time_ray = tf.reshape(n_time_ray, [batch, 1, height, width])\n    n_time_ray = tf.transpose(n_time_ray, perm=[0, 2, 3, 1])\n\n    left_eq = n_time_ray * depth\n\n    return  left_eq\n\ndef val2uint8(mat,maxVal):\n    maxVal_mat = tf.constant(maxVal,tf.float32,mat.get_shape())\n    mat_vis = tf.where(tf.greater(mat,maxVal_mat), maxVal_mat, mat)\n    return tf.cast(mat_vis * 255. / maxVal, tf.uint8)\n"
  },
  {
    "path": "rpvio.patch",
    "content": "From 3d295e1ff4c292ce203e636f363520f9b633908c Mon Sep 17 00:00:00 2001\nFrom: Karnik Ram <karnikram@gmail.com>\nDate: Thu, 17 Sep 2020 12:22:47 +0530\nSubject: [PATCH 01/42] Add code for homography-based initializatin\n\n---\n vins_estimator/src/estimator.cpp          | 14 ++--\n vins_estimator/src/initial/solve_5pts.cpp | 80 ++++++++++++++++++-----\n vins_estimator/src/initial/solve_5pts.h   |  6 +-\n 3 files changed, 73 insertions(+), 27 deletions(-)\n\ndiff --git a/vins_estimator/src/estimator.cpp b/vins_estimator/src/estimator.cpp\nindex cb7a4af..46c93ef 100644\n--- a/vins_estimator/src/estimator.cpp\n+++ b/vins_estimator/src/estimator.cpp\n@@ -491,14 +491,16 @@ bool Estimator::relativeHPose(Matrix3d &relative_R, Vector3d &relative_T, int &l\n             }\n             average_parallax = 1.0 * sum_parallax / int(corres.size());\n \n-            MatrixXd::Identity(4,4) TrIC; \n-            TrIC.block(0, 0, 3, 3) = RIC;\n-            TrIC.col(3) = TIC;\n+            Matrix4d TrIC = Matrix4d::Identity(); \n+            TrIC.block(0, 0, 3, 3) = RIC[0];\n+            TrIC.block(0,3,3,0) = TIC[0];\n \n-            //compute preintegrated rotation\n-            \n+            //compute corresponding preintegrated rotation\n+            Matrix3d R_imu = Matrix3d::Identity();\n+            for(int k = WINDOW_SIZE - 1; k > i; k--)\n+                R_imu = R_imu * (pre_integrations[k]->delta_q).toRotationMatrix();\n \n-            if(average_parallax * 660 > 30 && m_estimator.solveRelativeHRT(corres, relative_R, relative_T, R_imu, TrIC))\n+            if(average_parallax * 660 > 30 && m_estimator.solveRelativeHRT(corres, R_imu, TrIC, relative_R, relative_T))\n             {\n                 l = i;\n                 ROS_DEBUG(\"average_parallax %f choose l %d and newest frame to triangulate the whole structure\", average_parallax * 460, l);\ndiff --git a/vins_estimator/src/initial/solve_5pts.cpp b/vins_estimator/src/initial/solve_5pts.cpp\nindex 
a1c170d..61440da 100644\n--- a/vins_estimator/src/initial/solve_5pts.cpp\n+++ b/vins_estimator/src/initial/solve_5pts.cpp\n@@ -1,4 +1,5 @@\n #include \"solve_5pts.h\"\n+#include <algorithm>\n \n \n namespace cv {\n@@ -241,27 +242,70 @@ bool MotionEstimator::solveRelativeHRT(const vector<pair<Vector3d, Vector3d>> &c\n         cv::Mat H = cv::findHomography(ll, rr, cv::RANSAC, 0.3/665);\n         cv::Mat K = (cv::Mat_<double>(3, 3) << 1, 0, 0, 0, 1, 0, 0, 0, 1);\n \n-        decomposeH(H, K, R_imu, TIC);\n-\n-        Eigen::Matrix3d R;\n-        Eigen::Vector3d T;\n-        for (int i = 0; i < 3; i++)\n-        {   \n-            T(i) = trans.at<double>(i, 0);\n-            for (int j = 0; j < 3; j++)\n-                R(i, j) = rot.at<double>(i, j);\n-        }\n-\n-        Rotation = R.transpose();\n-        Translation = -R.transpose() * T;\n-        if(inlier_cnt > 12)\n-            return true;\n-        else\n-            return false;\n+        Eigen::Matrix4d Tr = decomposeH(H, K, R_imu, TrIC);\n+        Rotation = Tr.block(0,0,3,3);\n+        Translation = Tr.block(0,3,3,0);\n+        return true;\n     }\n     return false;\n }\n \n-void MotionEstimator::decomposeH(const cv::Mat &H, const cv::Mat &K, const Matrix3d &R_imu, const Matrix4d &TrIC)\n+Matrix4d MotionEstimator::decomposeH(const cv::Mat &H, const cv::Mat &K, const Matrix3d &R_imu, const Matrix4d &TrIC)\n {\n+    vector<cv::Mat> cv_Rs, cv_ts, cv_ns;\n+    int n_sols = cv::decomposeHomographyMat(H, K, cv_Rs, cv_ts, cv_ns);\n+\n+    vector<Matrix4d> positive_depth_transforms;\n+    if(n_sols > 1)\n+    {\n+        for(int i = 0; i < n_sols; i++)\n+        {\n+            Matrix4d Tr = Matrix4d::Identity();\n+            Matrix3d R;\n+            cv::cv2eigen(cv_Rs[i], R);\n+            Tr.block(0,0,3,3) = R;\n+\n+            Vector3d t;\n+            cv::cv2eigen(cv_ts[i], t);\n+            Tr.block(0,3,3,0) = t;\n+\n+            //Tr = TrIC * Tr * TrIC.inverse();\n+            Tr = 
Tr.inverse();\n+\n+            Vector3d e3(0, 0, 1);\n+            Vector3d n;\n+            cv::cv2eigen(cv_ns[i], n);\n+            if(n.dot(e3) > 0)\n+                positive_depth_transforms.push_back(Tr);\n+        }\n+\n+        vector<double> rot_diff;\n+        for(size_t i = 0; i < positive_depth_transforms.size(); i++)\n+        {\n+            Eigen::Matrix4d Tr = TrIC * positive_depth_transforms[i] * TrIC.inverse();\n+            Eigen::Matrix3d R = Tr.block(0,0,3,3);\n+            double f = (R.transpose()*R_imu - MatrixXd::Identity(3,3)).norm();\n+            rot_diff.push_back(f);\n+        }\n+\n+        int min_index = std::min_element(rot_diff.begin(), rot_diff.end()) - rot_diff.begin();\n+        Matrix4d Tr = positive_depth_transforms[min_index];\n+        return Tr;\n+    }\n+\n+    else\n+    {\n+        Matrix4d Tr = Matrix4d::Identity();\n+        Matrix3d R;\n+        cv::cv2eigen(cv_Rs[0], R);\n+        Tr.block(0,0,3,3) = R;\n+\n+        Vector3d t;\n+        cv::cv2eigen(cv_ts[0], t);\n+        Tr.block(0,3,3,0) = t;\n+\n+        //Tr = TrIC * Tr * TrIC.inverse();\n+        Tr = Tr.inverse();\n+        return Tr;\n+    }\n }\ndiff --git a/vins_estimator/src/initial/solve_5pts.h b/vins_estimator/src/initial/solve_5pts.h\nindex 3e95c9e..9fb118e 100644\n--- a/vins_estimator/src/initial/solve_5pts.h\n+++ b/vins_estimator/src/initial/solve_5pts.h\n@@ -4,8 +4,8 @@\n using namespace std;\n \n #include <opencv2/opencv.hpp>\n-//#include <opencv2/core/eigen.hpp>\n #include <eigen3/Eigen/Dense>\n+#include <opencv2/core/eigen.hpp>\n using namespace Eigen;\n \n #include <ros/console.h>\n@@ -15,8 +15,8 @@ class MotionEstimator\n   public:\n \n     bool solveRelativeRT(const vector<pair<Vector3d, Vector3d>> &corres, Matrix3d &R, Vector3d &T);\n-    bool solveRelativeHRT(const vector<pair<Vector3d, Vector3d>> &corres, const Matrix3d &R_imu, const Matrix4d &TrIC, const Matrix3d &R, Vector3d &T);\n-    void decomposeH(const cv::Mat &H, const cv::Mat &K, 
const Matrix3d &R_imu, const  Matrix4d &TrIC);\n+    bool solveRelativeHRT(const vector<pair<Vector3d, Vector3d>> &corres, const Matrix3d &R_imu, const Matrix4d &TrIC, Matrix3d &R, Vector3d &T);\n+    Matrix4d decomposeH(const cv::Mat &H, const cv::Mat &K, const Matrix3d &R_imu, const  Matrix4d &TrIC);\n \n   private:\n     double testTriangulation(const vector<cv::Point2f> &l,\n-- \n2.17.1\n\n\nFrom 56f9ef55ba6caf95001bba4e8cd34524fd79a089 Mon Sep 17 00:00:00 2001\nFrom: Karnik Ram <karnikram@gmail.com>\nDate: Thu, 17 Sep 2020 14:19:09 +0530\nSubject: [PATCH 02/42] Fix runtime errors\n\n---\n vins_estimator/src/estimator.cpp          |  4 ++--\n vins_estimator/src/initial/solve_5pts.cpp | 10 +++++-----\n 2 files changed, 7 insertions(+), 7 deletions(-)\n\ndiff --git a/vins_estimator/src/estimator.cpp b/vins_estimator/src/estimator.cpp\nindex 46c93ef..733a968 100644\n--- a/vins_estimator/src/estimator.cpp\n+++ b/vins_estimator/src/estimator.cpp\n@@ -492,8 +492,8 @@ bool Estimator::relativeHPose(Matrix3d &relative_R, Vector3d &relative_T, int &l\n             average_parallax = 1.0 * sum_parallax / int(corres.size());\n \n             Matrix4d TrIC = Matrix4d::Identity(); \n-            TrIC.block(0, 0, 3, 3) = RIC[0];\n-            TrIC.block(0,3,3,0) = TIC[0];\n+            TrIC.block(0,0,3,3) = ric[0];\n+            TrIC.block(0,3,3,1) = tic[0];\n \n             //compute corresponding preintegrated rotation\n             Matrix3d R_imu = Matrix3d::Identity();\ndiff --git a/vins_estimator/src/initial/solve_5pts.cpp b/vins_estimator/src/initial/solve_5pts.cpp\nindex 61440da..6be2bf1 100644\n--- a/vins_estimator/src/initial/solve_5pts.cpp\n+++ b/vins_estimator/src/initial/solve_5pts.cpp\n@@ -244,7 +244,7 @@ bool MotionEstimator::solveRelativeHRT(const vector<pair<Vector3d, Vector3d>> &c\n \n         Eigen::Matrix4d Tr = decomposeH(H, K, R_imu, TrIC);\n         Rotation = Tr.block(0,0,3,3);\n-        Translation = Tr.block(0,3,3,0);\n+        Translation = 
Tr.block(0,3,3,1);\n         return true;\n     }\n     return false;\n@@ -267,10 +267,10 @@ Matrix4d MotionEstimator::decomposeH(const cv::Mat &H, const cv::Mat &K, const M\n \n             Vector3d t;\n             cv::cv2eigen(cv_ts[i], t);\n-            Tr.block(0,3,3,0) = t;\n+            Tr.block(0,3,3,1) = t;\n \n             //Tr = TrIC * Tr * TrIC.inverse();\n-            Tr = Tr.inverse();\n+            Tr = Tr.inverse().eval();\n \n             Vector3d e3(0, 0, 1);\n             Vector3d n;\n@@ -302,10 +302,10 @@ Matrix4d MotionEstimator::decomposeH(const cv::Mat &H, const cv::Mat &K, const M\n \n         Vector3d t;\n         cv::cv2eigen(cv_ts[0], t);\n-        Tr.block(0,3,3,0) = t;\n+        Tr.block(0,3,3,1) = t;\n \n         //Tr = TrIC * Tr * TrIC.inverse();\n-        Tr = Tr.inverse();\n+        Tr = Tr.inverse().eval();\n         return Tr;\n     }\n }\n-- \n2.17.1\n\n\nFrom f9db6ec5f6d62bc608e5acd96c45414f384eadc4 Mon Sep 17 00:00:00 2001\nFrom: Karnik Ram <karnikram@gmail.com>\nDate: Sat, 19 Sep 2020 08:21:32 +0530\nSubject: [PATCH 03/42] Reject with H\n\n---\n feature_tracker/src/feature_tracker.cpp   | 3 ++-\n vins_estimator/src/initial/solve_5pts.cpp | 3 +++\n 2 files changed, 5 insertions(+), 1 deletion(-)\n\ndiff --git a/feature_tracker/src/feature_tracker.cpp b/feature_tracker/src/feature_tracker.cpp\nindex d33709c..41322b8 100644\n--- a/feature_tracker/src/feature_tracker.cpp\n+++ b/feature_tracker/src/feature_tracker.cpp\n@@ -188,7 +188,8 @@ void FeatureTracker::rejectWithF()\n         }\n \n         vector<uchar> status;\n-        cv::findFundamentalMat(un_cur_pts, un_forw_pts, cv::FM_RANSAC, F_THRESHOLD, 0.99, status);\n+        //cv::findFundamentalMat(un_cur_pts, un_forw_pts, cv::FM_RANSAC, F_THRESHOLD, 0.99, status);\n+        cv::findHomography(un_cur_pts, un_forw_pts, cv::RANSAC, 3, status);\n         int size_a = cur_pts.size();\n         reduceVector(prev_pts, status);\n         reduceVector(cur_pts, status);\ndiff --git 
a/vins_estimator/src/initial/solve_5pts.cpp b/vins_estimator/src/initial/solve_5pts.cpp\nindex 6be2bf1..50999c3 100644\n--- a/vins_estimator/src/initial/solve_5pts.cpp\n+++ b/vins_estimator/src/initial/solve_5pts.cpp\n@@ -227,6 +227,7 @@ bool MotionEstimator::solveRelativeRT(const vector<pair<Vector3d, Vector3d>> &co\n     return false;\n }\n \n+\n bool MotionEstimator::solveRelativeHRT(const vector<pair<Vector3d, Vector3d>> &corres, const Matrix3d &R_imu, const Matrix4d &TrIC, Matrix3d &Rotation, Vector3d &Translation)\n {\n     if (corres.size() >= 15)\n@@ -241,6 +242,7 @@ bool MotionEstimator::solveRelativeHRT(const vector<pair<Vector3d, Vector3d>> &c\n         //cv::Mat E = cv::findFundamentalMat(ll, rr, cv::FM_RANSAC, 0.3 / 460, 0.99, mask);\n         cv::Mat H = cv::findHomography(ll, rr, cv::RANSAC, 0.3/665);\n         cv::Mat K = (cv::Mat_<double>(3, 3) << 1, 0, 0, 0, 1, 0, 0, 0, 1);\n+        //cv::Mat K = (cv::Mat_<double>(3,3) << 665, 0, 511, 0, 665, 383, 0, 0, 1);\n \n         Eigen::Matrix4d Tr = decomposeH(H, K, R_imu, TrIC);\n         Rotation = Tr.block(0,0,3,3);\n@@ -250,6 +252,7 @@ bool MotionEstimator::solveRelativeHRT(const vector<pair<Vector3d, Vector3d>> &c\n     return false;\n }\n \n+\n Matrix4d MotionEstimator::decomposeH(const cv::Mat &H, const cv::Mat &K, const Matrix3d &R_imu, const Matrix4d &TrIC)\n {\n     vector<cv::Mat> cv_Rs, cv_ts, cv_ns;\n-- \n2.17.1\n\n\nFrom 64247beda4b05f162fbec72127dd1f79bade8709 Mon Sep 17 00:00:00 2001\nFrom: Karnik Ram <karnikram@gmail.com>\nDate: Tue, 22 Sep 2020 10:39:26 +0530\nSubject: [PATCH 04/42] Add code for homography-constrained visual BA for init\n\n---\n vins_estimator/src/estimator.cpp           |  11 +-\n vins_estimator/src/estimator.h             |   2 +-\n vins_estimator/src/initial/initial_sfm.cpp | 216 ++++++++++++++++++++-\n vins_estimator/src/initial/initial_sfm.h   |  40 +++-\n vins_estimator/src/initial/solve_5pts.cpp  |  27 ++-\n vins_estimator/src/initial/solve_5pts.h    |   4 +-\n 6 
files changed, 280 insertions(+), 20 deletions(-)\n\ndiff --git a/vins_estimator/src/estimator.cpp b/vins_estimator/src/estimator.cpp\nindex 733a968..c561fd6 100644\n--- a/vins_estimator/src/estimator.cpp\n+++ b/vins_estimator/src/estimator.cpp\n@@ -267,15 +267,16 @@ bool Estimator::initialStructure()\n     } \n     Matrix3d relative_R;\n     Vector3d relative_T;\n+    Vector3d n;\n     int l;\n-    if (!relativeHPose(relative_R, relative_T, l))\n+    if (!relativeHPose(relative_R, relative_T, n, l))\n     {\n         ROS_INFO(\"Not enough features or parallax; Move device around\");\n         return false;\n     }\n     GlobalSFM sfm;\n-    if(!sfm.construct(frame_count + 1, Q, T, l,\n-              relative_R, relative_T,\n+    if(!sfm.constructH(frame_count + 1, Q, T, l,\n+              relative_R, relative_T, n,\n               sfm_f, sfm_tracked_points))\n     {\n         ROS_DEBUG(\"global SFM failed!\");\n@@ -470,7 +471,7 @@ bool Estimator::relativePose(Matrix3d &relative_R, Vector3d &relative_T, int &l)\n     return false;\n }\n \n-bool Estimator::relativeHPose(Matrix3d &relative_R, Vector3d &relative_T, int &l)\n+bool Estimator::relativeHPose(Matrix3d &relative_R, Vector3d &relative_T, Vector3d &n, int &l)\n {\n     // find previous frame which contains enough correspondence and parallex with newest frame\n     for (int i = 0; i < WINDOW_SIZE; i++)\n@@ -500,7 +501,7 @@ bool Estimator::relativeHPose(Matrix3d &relative_R, Vector3d &relative_T, int &l\n             for(int k = WINDOW_SIZE - 1; k > i; k--)\n                 R_imu = R_imu * (pre_integrations[k]->delta_q).toRotationMatrix();\n \n-            if(average_parallax * 660 > 30 && m_estimator.solveRelativeHRT(corres, R_imu, TrIC, relative_R, relative_T))\n+            if(average_parallax * 660 > 30 && m_estimator.solveRelativeHRT(corres, R_imu, TrIC, relative_R, relative_T, n))\n             {\n                 l = i;\n                 ROS_DEBUG(\"average_parallax %f choose l %d and newest frame to 
triangulate the whole structure\", average_parallax * 460, l);\ndiff --git a/vins_estimator/src/estimator.h b/vins_estimator/src/estimator.h\nindex 7f912da..9b2a731 100644\n--- a/vins_estimator/src/estimator.h\n+++ b/vins_estimator/src/estimator.h\n@@ -40,7 +40,7 @@ class Estimator\n     bool initialStructure();\n     bool visualInitialAlign();\n     bool relativePose(Matrix3d &relative_R, Vector3d &relative_T, int &l);\n-    bool relativeHPose(Matrix3d &relative_R, Vector3d &relative_T, int &l);\n+    bool relativeHPose(Matrix3d &relative_R, Vector3d &relative_T, Vector3d &n, int &l);\n     void slideWindow();\n     void solveOdometry();\n     void slideWindowNew();\ndiff --git a/vins_estimator/src/initial/initial_sfm.cpp b/vins_estimator/src/initial/initial_sfm.cpp\nindex 6fbcc17..c7b1661 100644\n--- a/vins_estimator/src/initial/initial_sfm.cpp\n+++ b/vins_estimator/src/initial/initial_sfm.cpp\n@@ -280,11 +280,11 @@ bool GlobalSFM::construct(int frame_num, Quaterniond* q, Vector3d* T, int l,\n \t//std::cout << summary.BriefReport() << \"\\n\";\n \tif (summary.termination_type == ceres::CONVERGENCE || summary.final_cost < 5e-03)\n \t{\n-\t\t//cout << \"vision only BA converge\" << endl;\n+\t\tcout << \"vision only BA converge\" << endl;\n \t}\n \telse\n \t{\n-\t\t//cout << \"vision only BA not converge \" << endl;\n+\t\tcout << \"vision only BA not converge \" << endl;\n \t\treturn false;\n \t}\n \tfor (int i = 0; i < frame_num; i++)\n@@ -311,3 +311,215 @@ bool GlobalSFM::construct(int frame_num, Quaterniond* q, Vector3d* T, int l,\n \n }\n \n+\n+bool GlobalSFM::constructH(int frame_num, Quaterniond* q, Vector3d* T, int l,\n+\t\t\t  const Matrix3d relative_R, const Vector3d relative_T, const Vector3d n,\n+\t\t\t  vector<SFMFeature> &sfm_f, map<int, Vector3d> &sfm_tracked_points)\n+{\n+\tfeature_num = sfm_f.size();\n+\t//cout << \"set 0 and \" << l << \" as known \" << endl;\n+\t// have relative_r relative_t\n+\t// intial two view\n+\tq[l].w() = 1;\n+\tq[l].x() = 
0;\n+\tq[l].y() = 0;\n+\tq[l].z() = 0;\n+\tT[l].setZero();\n+\tq[frame_num - 1] = q[l] * Quaterniond(relative_R);\n+\tT[frame_num - 1] = relative_T;\n+\t//cout << \"init q_l \" << q[l].w() << \" \" << q[l].vec().transpose() << endl;\n+\t//cout << \"init t_l \" << T[l].transpose() << endl;\n+\n+\t//rotate to cam frame\n+\tMatrix3d c_Rotation[frame_num];\n+\tVector3d c_Translation[frame_num];\n+\tQuaterniond c_Quat[frame_num];\n+\tdouble c_rotation[frame_num][4];\n+\tdouble c_translation[frame_num][3];\n+    double c_normal[3];\n+\tEigen::Matrix<double, 3, 4> Pose[frame_num];\n+\n+\tc_Quat[l] = q[l].inverse();\n+\tc_Rotation[l] = c_Quat[l].toRotationMatrix();\n+\tc_Translation[l] = -1 * (c_Rotation[l] * T[l]);\n+\tPose[l].block<3, 3>(0, 0) = c_Rotation[l];\n+\tPose[l].block<3, 1>(0, 3) = c_Translation[l];\n+\n+\tc_Quat[frame_num - 1] = q[frame_num - 1].inverse();\n+\tc_Rotation[frame_num - 1] = c_Quat[frame_num - 1].toRotationMatrix();\n+\tc_Translation[frame_num - 1] = -1 * (c_Rotation[frame_num - 1] * T[frame_num - 1]);\n+\tPose[frame_num - 1].block<3, 3>(0, 0) = c_Rotation[frame_num - 1];\n+\tPose[frame_num - 1].block<3, 1>(0, 3) = c_Translation[frame_num - 1];\n+\n+\t//1: trangulate between l ----- frame_num - 1\n+\t//2: solve pnp l + 1; trangulate l + 1 ------- frame_num - 1; \n+\tfor (int i = l; i < frame_num - 1 ; i++)\n+\t{\n+\t\t// solve pnp\n+\t\tif (i > l)\n+\t\t{\n+\t\t\tMatrix3d R_initial = c_Rotation[i - 1];\n+\t\t\tVector3d P_initial = c_Translation[i - 1];\n+\t\t\tif(!solveFrameByPnP(R_initial, P_initial, i, sfm_f))\n+\t\t\t\treturn false;\n+\t\t\tc_Rotation[i] = R_initial;\n+\t\t\tc_Translation[i] = P_initial;\n+\t\t\tc_Quat[i] = c_Rotation[i];\n+\t\t\tPose[i].block<3, 3>(0, 0) = c_Rotation[i];\n+\t\t\tPose[i].block<3, 1>(0, 3) = c_Translation[i];\n+\t\t}\n+\n+\t\t// triangulate point based on the solve pnp result\n+\t\ttriangulateTwoFrames(i, Pose[i], frame_num - 1, Pose[frame_num - 1], sfm_f);\n+\t}\n+\t//3: triangulate l-----l+1 l+2 ... 
frame_num -2\n+\tfor (int i = l + 1; i < frame_num - 1; i++)\n+\t\ttriangulateTwoFrames(l, Pose[l], i, Pose[i], sfm_f);\n+\t//4: solve pnp l-1; triangulate l-1 ----- l\n+\t//             l-2              l-2 ----- l\n+\tfor (int i = l - 1; i >= 0; i--)\n+\t{\n+\t\t//solve pnp\n+\t\tMatrix3d R_initial = c_Rotation[i + 1];\n+\t\tVector3d P_initial = c_Translation[i + 1];\n+\t\tif(!solveFrameByPnP(R_initial, P_initial, i, sfm_f))\n+\t\t\treturn false;\n+\t\tc_Rotation[i] = R_initial;\n+\t\tc_Translation[i] = P_initial;\n+\t\tc_Quat[i] = c_Rotation[i];\n+\t\tPose[i].block<3, 3>(0, 0) = c_Rotation[i];\n+\t\tPose[i].block<3, 1>(0, 3) = c_Translation[i];\n+\t\t//triangulate\n+\t\ttriangulateTwoFrames(i, Pose[i], l, Pose[l], sfm_f);\n+\t}\n+\t//5: triangulate all other points\n+\tfor (int j = 0; j < feature_num; j++)\n+\t{\n+\t\tif (sfm_f[j].state == true)\n+\t\t\tcontinue;\n+\t\tif ((int)sfm_f[j].observation.size() >= 2)\n+\t\t{\n+\t\t\tVector2d point0, point1;\n+\t\t\tint frame_0 = sfm_f[j].observation[0].first;\n+\t\t\tpoint0 = sfm_f[j].observation[0].second;\n+\t\t\tint frame_1 = sfm_f[j].observation.back().first;\n+\t\t\tpoint1 = sfm_f[j].observation.back().second;\n+\t\t\tVector3d point_3d;\n+\t\t\ttriangulatePoint(Pose[frame_0], Pose[frame_1], point0, point1, point_3d);\n+\t\t\tsfm_f[j].state = true;\n+\t\t\tsfm_f[j].position[0] = point_3d(0);\n+\t\t\tsfm_f[j].position[1] = point_3d(1);\n+\t\t\tsfm_f[j].position[2] = point_3d(2);\n+\t\t\t//cout << \"trangulated : \" << frame_0 << \" \" << frame_1 << \"  3d point : \"  << j << \"  \" << point_3d.transpose() << endl;\n+\t\t}\t\t\n+\t}\n+\n+/*\n+\tfor (int i = 0; i < frame_num; i++)\n+\t{\n+\t\tq[i] = c_Rotation[i].transpose(); \n+\t\tcout << \"solvePnP  q\" << \" i \" << i <<\"  \" <<q[i].w() << \"  \" << q[i].vec().transpose() << endl;\n+\t}\n+\tfor (int i = 0; i < frame_num; i++)\n+\t{\n+\t\tVector3d t_tmp;\n+\t\tt_tmp = -1 * (q[i] * c_Translation[i]);\n+\t\tcout << \"solvePnP  t\" << \" i \" << i <<\"  \" << 
t_tmp.x() <<\"  \"<< t_tmp.y() <<\"  \"<< t_tmp.z() << endl;\n+\t}\n+*/\n+\t//full BA\n+\tceres::Problem problem;\n+\tceres::LocalParameterization* local_parameterization = new ceres::QuaternionParameterization();\n+\t//cout << \" begin full BA \" << endl;\n+\tfor (int i = 0; i < frame_num; i++)\n+\t{\n+\t\t//double array for ceres\n+\t\tc_translation[i][0] = c_Translation[i].x();\n+\t\tc_translation[i][1] = c_Translation[i].y();\n+\t\tc_translation[i][2] = c_Translation[i].z();\n+\t\tc_rotation[i][0] = c_Quat[i].w();\n+\t\tc_rotation[i][1] = c_Quat[i].x();\n+\t\tc_rotation[i][2] = c_Quat[i].y();\n+\t\tc_rotation[i][3] = c_Quat[i].z();\n+\t\tproblem.AddParameterBlock(c_rotation[i], 4, local_parameterization);\n+\t\tproblem.AddParameterBlock(c_translation[i], 3);\n+\t\tif (i == l)\n+\t\t{\n+\t\t\tproblem.SetParameterBlockConstant(c_rotation[i]);\n+\t\t}\n+\t\tif (i == l || i == frame_num - 1)\n+\t\t{\n+\t\t\tproblem.SetParameterBlockConstant(c_translation[i]);\n+\t\t}\n+\t}\n+\n+    c_normal[0] = n[0];\n+    c_normal[1] = n[1];\n+    c_normal[2] = n[2];\n+    problem.AddParameterBlock(c_normal, 3);\n+\n+\tfor (int i = 0; i < feature_num; i++)\n+\t{\n+\t\tif (sfm_f[i].state != true)\n+\t\t\tcontinue;\n+\t\tfor (int j = 0; j < int(sfm_f[i].observation.size()); j++)\n+\t\t{\n+\t\t\tint l = sfm_f[i].observation[j].first;\n+            \n+\t\t\tceres::CostFunction* b_cost_function = ReprojectionError3D::Create(\n+\t\t\t\t\t\t\t\t\t\t\t\tsfm_f[i].observation[j].second.x(),\n+\t\t\t\t\t\t\t\t\t\t\t\tsfm_f[i].observation[j].second.y());\n+\n+    \t\tproblem.AddResidualBlock(b_cost_function, NULL, c_rotation[l], c_translation[l],\n+    \t\t\t\t\t\t\t\tsfm_f[i].position);\t \n+\n+\t\t\tceres::CostFunction* h_cost_function = ReprojectionErrorH::Create(\n+\t\t\t\t\t\t\t\t\t\t\t\tsfm_f[i].observation[j].second.x(),\n+\t\t\t\t\t\t\t\t\t\t\t\tsfm_f[i].observation[j].second.y());\n+\n+    \t\tproblem.AddResidualBlock(h_cost_function, NULL, c_rotation[l], c_translation[l],c_normal, 
\n+    \t\t\t\t\t\t\t\tsfm_f[i].position);\t \n+\t\t}\n+\n+\t}\n+\tceres::Solver::Options options;\n+\toptions.linear_solver_type = ceres::DENSE_SCHUR;\n+\t//options.minimizer_progress_to_stdout = true;\n+\toptions.max_solver_time_in_seconds = 0.2;\n+\tceres::Solver::Summary summary;\n+\tceres::Solve(options, &problem, &summary);\n+\t//std::cout << summary.BriefReport() << \"\\n\";\n+\tif (summary.termination_type == ceres::CONVERGENCE || summary.final_cost < 5e-03)\n+\t{\n+\t\tcout << \"vision only BA converge\" << endl;\n+\t}\n+\telse\n+\t{\n+\t\tcout << \"vision only BA not converge \" << endl;\n+\t\treturn false;\n+\t}\n+\tfor (int i = 0; i < frame_num; i++)\n+\t{\n+\t\tq[i].w() = c_rotation[i][0]; \n+\t\tq[i].x() = c_rotation[i][1]; \n+\t\tq[i].y() = c_rotation[i][2]; \n+\t\tq[i].z() = c_rotation[i][3]; \n+\t\tq[i] = q[i].inverse();\n+\t\t//cout << \"final  q\" << \" i \" << i <<\"  \" <<q[i].w() << \"  \" << q[i].vec().transpose() << endl;\n+\t}\n+\tfor (int i = 0; i < frame_num; i++)\n+\t{\n+\n+\t\tT[i] = -1 * (q[i] * Vector3d(c_translation[i][0], c_translation[i][1], c_translation[i][2]));\n+\t\t//cout << \"final  t\" << \" i \" << i <<\"  \" << T[i](0) <<\"  \"<< T[i](1) <<\"  \"<< T[i](2) << endl;\n+\t}\n+\tfor (int i = 0; i < (int)sfm_f.size(); i++)\n+\t{\n+\t\tif(sfm_f[i].state)\n+\t\t\tsfm_tracked_points[sfm_f[i].id] = Vector3d(sfm_f[i].position[0], sfm_f[i].position[1], sfm_f[i].position[2]);\n+\t}\n+\treturn true;\n+\n+}\n+\n+\ndiff --git a/vins_estimator/src/initial/initial_sfm.h b/vins_estimator/src/initial/initial_sfm.h\nindex f70432d..a1e6854 100644\n--- a/vins_estimator/src/initial/initial_sfm.h\n+++ b/vins_estimator/src/initial/initial_sfm.h\n@@ -53,6 +53,40 @@ struct ReprojectionError3D\n \tdouble observed_v;\n };\n \n+\n+struct ReprojectionErrorH\n+{\n+\tReprojectionErrorH(double observed_u, double observed_v)\n+\t\t:observed_u(observed_u), observed_v(observed_v)\n+\t\t{}\n+\n+\ttemplate <typename T>\n+\tbool operator()(const T* const R, 
const T* const t, const T* const n, const T* point, T* residuals) const\n+\t{\n+\t\tT p[3];\n+        ceres::QuaternionRotatePoint(R, point, p);\n+        T np = n[0] * point[0] + n[1] * point[1] + n[2] * point[2];\n+        p[0] += t[0]*np; p[1] += t[1]*np; p[2] += t[2]*np;\n+\t\tT xp = p[0] / p[2];\n+    \tT yp = p[1] / p[2];\n+    \tresiduals[0] = xp - T(observed_u);\n+    \tresiduals[1] = yp - T(observed_v);\n+    \treturn true;\n+\t}\n+\n+\tstatic ceres::CostFunction* Create(const double observed_x,\n+\t                                   const double observed_y) \n+\t{\n+\t  return (new ceres::AutoDiffCostFunction<\n+\t          ReprojectionErrorH, 2, 4, 3, 3, 3>(\n+\t          \tnew ReprojectionErrorH(observed_x,observed_y)));\n+\t}\n+\n+\tdouble observed_u;\n+\tdouble observed_v;\n+};\n+\n+\n class GlobalSFM\n {\n public:\n@@ -61,6 +95,10 @@ public:\n \t\t\t  const Matrix3d relative_R, const Vector3d relative_T,\n \t\t\t  vector<SFMFeature> &sfm_f, map<int, Vector3d> &sfm_tracked_points);\n \n+    bool constructH(int frame_num, Quaterniond* q, Vector3d* T, int l,\n+\t\t\t  const Matrix3d relative_R, const Vector3d relative_T, const Vector3d n,\n+\t\t\t  vector<SFMFeature> &sfm_f, map<int, Vector3d> &sfm_tracked_points);\n+\n private:\n \tbool solveFrameByPnP(Matrix3d &R_initial, Vector3d &P_initial, int i, vector<SFMFeature> &sfm_f);\n \n@@ -71,4 +109,4 @@ private:\n \t\t\t\t\t\t\t  vector<SFMFeature> &sfm_f);\n \n \tint feature_num;\n-};\n\\ No newline at end of file\n+};\ndiff --git a/vins_estimator/src/initial/solve_5pts.cpp b/vins_estimator/src/initial/solve_5pts.cpp\nindex 50999c3..bdc87a3 100644\n--- a/vins_estimator/src/initial/solve_5pts.cpp\n+++ b/vins_estimator/src/initial/solve_5pts.cpp\n@@ -228,7 +228,7 @@ bool MotionEstimator::solveRelativeRT(const vector<pair<Vector3d, Vector3d>> &co\n }\n \n \n-bool MotionEstimator::solveRelativeHRT(const vector<pair<Vector3d, Vector3d>> &corres, const Matrix3d &R_imu, const Matrix4d &TrIC, Matrix3d &Rotation, 
Vector3d &Translation)\n+bool MotionEstimator::solveRelativeHRT(const vector<pair<Vector3d, Vector3d>> &corres, const Matrix3d &R_imu, const Matrix4d &TrIC, Matrix3d &Rotation, Vector3d &Translation, Vector3d &n)\n {\n     if (corres.size() >= 15)\n     {\n@@ -244,21 +244,26 @@ bool MotionEstimator::solveRelativeHRT(const vector<pair<Vector3d, Vector3d>> &c\n         cv::Mat K = (cv::Mat_<double>(3, 3) << 1, 0, 0, 0, 1, 0, 0, 0, 1);\n         //cv::Mat K = (cv::Mat_<double>(3,3) << 665, 0, 511, 0, 665, 383, 0, 0, 1);\n \n-        Eigen::Matrix4d Tr = decomposeH(H, K, R_imu, TrIC);\n-        Rotation = Tr.block(0,0,3,3);\n-        Translation = Tr.block(0,3,3,1);\n+        Eigen::Matrix4d est_Tr;\n+        Eigen::Vector3d est_n;\n+        decomposeH(H, K, R_imu, TrIC, est_Tr, est_n);\n+        Rotation = est_Tr.block(0,0,3,3);\n+        Translation = est_Tr.block(0,3,3,1);\n+        n = est_n;\n         return true;\n     }\n     return false;\n }\n \n \n-Matrix4d MotionEstimator::decomposeH(const cv::Mat &H, const cv::Mat &K, const Matrix3d &R_imu, const Matrix4d &TrIC)\n+void MotionEstimator::decomposeH(const cv::Mat &H, const cv::Mat &K, const Matrix3d &R_imu, const Matrix4d &TrIC, Matrix4d &est_Tr, Vector3d &est_n)\n {\n     vector<cv::Mat> cv_Rs, cv_ts, cv_ns;\n     int n_sols = cv::decomposeHomographyMat(H, K, cv_Rs, cv_ts, cv_ns);\n \n     vector<Matrix4d> positive_depth_transforms;\n+    vector<Vector3d> positive_depth_normals;\n+\n     if(n_sols > 1)\n     {\n         for(int i = 0; i < n_sols; i++)\n@@ -280,6 +285,7 @@ Matrix4d MotionEstimator::decomposeH(const cv::Mat &H, const cv::Mat &K, const M\n             cv::cv2eigen(cv_ns[i], n);\n             if(n.dot(e3) > 0)\n                 positive_depth_transforms.push_back(Tr);\n+                positive_depth_normals.push_back(n);\n         }\n \n         vector<double> rot_diff;\n@@ -292,8 +298,8 @@ Matrix4d MotionEstimator::decomposeH(const cv::Mat &H, const cv::Mat &K, const M\n         }\n \n         
int min_index = std::min_element(rot_diff.begin(), rot_diff.end()) - rot_diff.begin();\n-        Matrix4d Tr = positive_depth_transforms[min_index];\n-        return Tr;\n+        est_Tr = positive_depth_transforms[min_index];\n+        est_n = positive_depth_normals[min_index];\n     }\n \n     else\n@@ -307,8 +313,11 @@ Matrix4d MotionEstimator::decomposeH(const cv::Mat &H, const cv::Mat &K, const M\n         cv::cv2eigen(cv_ts[0], t);\n         Tr.block(0,3,3,1) = t;\n \n+        Vector3d n;\n+        cv::cv2eigen(cv_ns[0], n);\n+\n         //Tr = TrIC * Tr * TrIC.inverse();\n-        Tr = Tr.inverse().eval();\n-        return Tr;\n+        est_Tr = Tr.inverse().eval();\n+        est_n = n;\n     }\n }\ndiff --git a/vins_estimator/src/initial/solve_5pts.h b/vins_estimator/src/initial/solve_5pts.h\nindex 9fb118e..f090370 100644\n--- a/vins_estimator/src/initial/solve_5pts.h\n+++ b/vins_estimator/src/initial/solve_5pts.h\n@@ -15,8 +15,8 @@ class MotionEstimator\n   public:\n \n     bool solveRelativeRT(const vector<pair<Vector3d, Vector3d>> &corres, Matrix3d &R, Vector3d &T);\n-    bool solveRelativeHRT(const vector<pair<Vector3d, Vector3d>> &corres, const Matrix3d &R_imu, const Matrix4d &TrIC, Matrix3d &R, Vector3d &T);\n-    Matrix4d decomposeH(const cv::Mat &H, const cv::Mat &K, const Matrix3d &R_imu, const  Matrix4d &TrIC);\n+    bool solveRelativeHRT(const vector<pair<Vector3d, Vector3d>> &corres, const Matrix3d &R_imu, const Matrix4d &TrIC, Matrix3d &R, Vector3d &T, Vector3d &n);\n+    void decomposeH(const cv::Mat &H, const cv::Mat &K, const Matrix3d &R_imu, const  Matrix4d &TrIC, Matrix4d &est_Tr, Vector3d &est_n);\n \n   private:\n     double testTriangulation(const vector<cv::Point2f> &l,\n-- \n2.17.1\n\n\nFrom fb6f3cf6cf15d5119c170b55e86ca5784773f970 Mon Sep 17 00:00:00 2001\nFrom: Karnik Ram <karnikram@gmail.com>\nDate: Wed, 23 Sep 2020 08:12:13 +0530\nSubject: [PATCH 05/42] Fix parameter bugs\n\n---\n vins_estimator/src/initial/initial_sfm.cpp | 15 
+++++++++++----\n vins_estimator/src/initial/initial_sfm.h   |  1 +\n 2 files changed, 12 insertions(+), 4 deletions(-)\n\ndiff --git a/vins_estimator/src/initial/initial_sfm.cpp b/vins_estimator/src/initial/initial_sfm.cpp\nindex c7b1661..05917a0 100644\n--- a/vins_estimator/src/initial/initial_sfm.cpp\n+++ b/vins_estimator/src/initial/initial_sfm.cpp\n@@ -104,6 +104,9 @@ void GlobalSFM::triangulateTwoFrames(int frame0, Eigen::Matrix<double, 3, 4> &Po\n \t\t\tsfm_f[j].position[0] = point_3d(0);\n \t\t\tsfm_f[j].position[1] = point_3d(1);\n \t\t\tsfm_f[j].position[2] = point_3d(2);\n+            sfm_f[j].image[0] = point1[0];\n+            sfm_f[j].image[1] = point1[1];\n+            sfm_f[j].image[2] = 1;\n \t\t\t//cout << \"trangulated : \" << frame1 << \"  3d point : \"  << j << \"  \" << point_3d.transpose() << endl;\n \t\t}\t\t\t\t\t\t\t  \n \t}\n@@ -212,6 +215,9 @@ bool GlobalSFM::construct(int frame_num, Quaterniond* q, Vector3d* T, int l,\n \t\t\tsfm_f[j].position[0] = point_3d(0);\n \t\t\tsfm_f[j].position[1] = point_3d(1);\n \t\t\tsfm_f[j].position[2] = point_3d(2);\n+            sfm_f[j].image[0] = point1[0];\n+            sfm_f[j].image[1] = point1[1];\n+            sfm_f[j].image[2] = 1;\n \t\t\t//cout << \"trangulated : \" << frame_0 << \" \" << frame_1 << \"  3d point : \"  << j << \"  \" << point_3d.transpose() << endl;\n \t\t}\t\t\n \t}\n@@ -410,6 +416,9 @@ bool GlobalSFM::constructH(int frame_num, Quaterniond* q, Vector3d* T, int l,\n \t\t\tsfm_f[j].position[0] = point_3d(0);\n \t\t\tsfm_f[j].position[1] = point_3d(1);\n \t\t\tsfm_f[j].position[2] = point_3d(2);\n+            sfm_f[j].image[0] = point1[0];\n+            sfm_f[j].image[1] = point1[1];\n+            sfm_f[j].image[2] = 1;\n \t\t\t//cout << \"trangulated : \" << frame_0 << \" \" << frame_1 << \"  3d point : \"  << j << \"  \" << point_3d.transpose() << endl;\n \t\t}\t\t\n \t}\n@@ -471,16 +480,14 @@ bool GlobalSFM::constructH(int frame_num, Quaterniond* q, Vector3d* T, int l,\n 
\t\t\t\t\t\t\t\t\t\t\t\tsfm_f[i].observation[j].second.y());\n \n     \t\tproblem.AddResidualBlock(b_cost_function, NULL, c_rotation[l], c_translation[l],\n-    \t\t\t\t\t\t\t\tsfm_f[i].position);\t \n+    \t\t\t\t\t\t\t\tsfm_f[i].position); \n \n \t\t\tceres::CostFunction* h_cost_function = ReprojectionErrorH::Create(\n \t\t\t\t\t\t\t\t\t\t\t\tsfm_f[i].observation[j].second.x(),\n \t\t\t\t\t\t\t\t\t\t\t\tsfm_f[i].observation[j].second.y());\n \n-    \t\tproblem.AddResidualBlock(h_cost_function, NULL, c_rotation[l], c_translation[l],c_normal, \n-    \t\t\t\t\t\t\t\tsfm_f[i].position);\t \n+    \t\tproblem.AddResidualBlock(h_cost_function, NULL, c_rotation[l], c_translation[l], c_normal, sfm_f[i].image);\n \t\t}\n-\n \t}\n \tceres::Solver::Options options;\n \toptions.linear_solver_type = ceres::DENSE_SCHUR;\ndiff --git a/vins_estimator/src/initial/initial_sfm.h b/vins_estimator/src/initial/initial_sfm.h\nindex a1e6854..7780732 100644\n--- a/vins_estimator/src/initial/initial_sfm.h\n+++ b/vins_estimator/src/initial/initial_sfm.h\n@@ -19,6 +19,7 @@ struct SFMFeature\n     int id;\n     vector<pair<int,Vector2d>> observation;\n     double position[3];\n+    double image[3]; // u v 1\n     double depth;\n };\n \n-- \n2.17.1\n\n\nFrom d33f9314e26d538f07fdb97cc6278b5a4f449b41 Mon Sep 17 00:00:00 2001\nFrom: Karnik Ram <karnikram@gmail.com>\nDate: Thu, 1 Oct 2020 11:11:06 +0530\nSubject: [PATCH 06/42] Add homography cost functor into estimator\n\n---\n vins_estimator/src/estimator.cpp              |  5 +-\n vins_estimator/src/estimator.h                |  2 +\n vins_estimator/src/factor/homography_factor.h | 47 +++++++++++++++++++\n vins_estimator/src/initial/initial_sfm.cpp    |  7 ++-\n vins_estimator/src/initial/initial_sfm.h      |  4 +-\n 5 files changed, 61 insertions(+), 4 deletions(-)\n create mode 100644 vins_estimator/src/factor/homography_factor.h\n\ndiff --git a/vins_estimator/src/estimator.cpp b/vins_estimator/src/estimator.cpp\nindex c561fd6..97a3821 
100644\n--- a/vins_estimator/src/estimator.cpp\n+++ b/vins_estimator/src/estimator.cpp\n@@ -435,7 +435,7 @@ bool Estimator::visualInitialAlign()\n         Vs[i] = rot_diff * Vs[i];\n     }\n     ROS_DEBUG_STREAM(\"g0     \" << g.transpose());\n-    ROS_DEBUG_STREAM(\"my R0  \" << Utility::R2ypr(Rs[0]).transpose()); \n+    ROS_DEBUG_STREAM(\"my R0  \" << Utility::R2ypr(Rs[0]).transpose());\n \n     return true;\n }\n@@ -800,6 +800,9 @@ void Estimator::optimization()\n             {\n                 ProjectionFactor *f = new ProjectionFactor(pts_i, pts_j);\n                 problem.AddResidualBlock(f, loss_function, para_Pose[imu_i], para_Pose[imu_j], para_Ex_Pose[0], para_Feature[feature_index]);\n+\n+                ceres::CostFunction *h_cost_function = HomographyFactor::Create(pts_i, pts_j);\n+                problem.AddResidualBlock(h_cost_function, loss_function, para_Pose[imu_i], para_Pose[imu_j], para_N, para_Ex_Pose[0]);\n             }\n             f_m_cnt++;\n         }\ndiff --git a/vins_estimator/src/estimator.h b/vins_estimator/src/estimator.h\nindex 9b2a731..f3feedf 100644\n--- a/vins_estimator/src/estimator.h\n+++ b/vins_estimator/src/estimator.h\n@@ -17,6 +17,7 @@\n #include \"factor/projection_factor.h\"\n #include \"factor/projection_td_factor.h\"\n #include \"factor/marginalization_factor.h\"\n+#include \"factor/homography_factor.h\"\n \n #include <unordered_map>\n #include <queue>\n@@ -114,6 +115,7 @@ class Estimator\n     double para_Retrive_Pose[SIZE_POSE];\n     double para_Td[1][1];\n     double para_Tr[1][1];\n+    double para_N[3];\n \n     int loop_window_index;\n \ndiff --git a/vins_estimator/src/factor/homography_factor.h b/vins_estimator/src/factor/homography_factor.h\nnew file mode 100644\nindex 0000000..86b03ff\n--- /dev/null\n+++ b/vins_estimator/src/factor/homography_factor.h\n@@ -0,0 +1,47 @@\n+#pragma once\n+#include <ceres/ceres.h>\n+#include <eigen3/Eigen/Dense>\n+\n+struct HomographyFactor\n+{\n+    HomographyFactor(const 
Eigen::Vector3d &_pts_i, const Eigen::Vector3d &_pts_j) : pts_i(_pts_i), pts_j(_pts_j) {}\n+\n+    template <typename T>\n+        bool operator()(const T* const pose_i, const T* const pose_j, const T* const para_n, const T* const ex_pose, \n+        T* residuals) const\n+        {\n+            Eigen::Map<const Eigen::Matrix<T, 3, 1>> pi(pose_i);\n+            Eigen::Quaternion<T> qi;\n+            qi.coeffs() << pose_i[3], pose_i[4], pose_i[5], pose_i[6];\n+\n+            Eigen::Map<const Eigen::Matrix<T, 3, 1>> pj(pose_j);\n+            Eigen::Quaternion<T> qj;\n+            qj.coeffs() << pose_j[3], pose_j[4], pose_j[5], pose_j[6];\n+\n+            Eigen::Map<const Eigen::Matrix<T, 3, 1>> tic(ex_pose);\n+            Eigen::Quaternion<T> qic;\n+            qic.coeffs() << ex_pose[3], ex_pose[4], ex_pose[5], ex_pose[6];\n+\n+            Eigen::Map<const Eigen::Matrix<T, 3, 1>> n(para_n);\n+\n+            Eigen::Quaternion<T> qji = qj.inverse() * qi;\n+            Eigen::Matrix<T, 3, 1> tji = qj.inverse().toRotationMatrix() * (pi - pj);\n+            Eigen::Matrix<T, 3, 1> pts_imu_i = qic * pts_i.cast<T>() + tic;\n+            Eigen::Matrix<T, 3, 1> pts_imu_j = qji * pts_imu_i + tji * n.transpose() * pts_imu_i;\n+            Eigen::Matrix<T, 3, 1> pts_cam_j = qic.inverse() * (pts_imu_j - tic);\n+\n+            pts_cam_j = (pts_cam_j / pts_cam_j[2]);\n+            residuals[0] = pts_cam_j[0] - pts_j[0];\n+            residuals[1] = pts_cam_j[1] - pts_j[1];\n+\n+            return true;\n+        }\n+\n+    static ceres::CostFunction* Create(const Eigen::Vector3d &_pts_i, const Eigen::Vector3d &_pts_j)\n+    {\n+        return (new ceres::AutoDiffCostFunction<HomographyFactor, 2, 7, 7, 3, 7>\n+                (new HomographyFactor(_pts_i, _pts_j)));\n+    }\n+\n+    Eigen::Vector3d pts_i, pts_j;\n+};\ndiff --git a/vins_estimator/src/initial/initial_sfm.cpp b/vins_estimator/src/initial/initial_sfm.cpp\nindex 05917a0..76a4f88 100644\n--- 
a/vins_estimator/src/initial/initial_sfm.cpp\n+++ b/vins_estimator/src/initial/initial_sfm.cpp\n@@ -319,7 +319,7 @@ bool GlobalSFM::construct(int frame_num, Quaterniond* q, Vector3d* T, int l,\n \n \n bool GlobalSFM::constructH(int frame_num, Quaterniond* q, Vector3d* T, int l,\n-\t\t\t  const Matrix3d relative_R, const Vector3d relative_T, const Vector3d n,\n+\t\t\t  const Matrix3d relative_R, const Vector3d relative_T, Vector3d &n,\n \t\t\t  vector<SFMFeature> &sfm_f, map<int, Vector3d> &sfm_tracked_points)\n {\n \tfeature_num = sfm_f.size();\n@@ -525,6 +525,11 @@ bool GlobalSFM::constructH(int frame_num, Quaterniond* q, Vector3d* T, int l,\n \t\tif(sfm_f[i].state)\n \t\t\tsfm_tracked_points[sfm_f[i].id] = Vector3d(sfm_f[i].position[0], sfm_f[i].position[1], sfm_f[i].position[2]);\n \t}\n+\n+    n[0] = c_normal[0];\n+    n[1] = c_normal[1];\n+    n[2] = c_normal[2];\n+\n \treturn true;\n \n }\ndiff --git a/vins_estimator/src/initial/initial_sfm.h b/vins_estimator/src/initial/initial_sfm.h\nindex 7780732..e79dcea 100644\n--- a/vins_estimator/src/initial/initial_sfm.h\n+++ b/vins_estimator/src/initial/initial_sfm.h\n@@ -1,4 +1,4 @@\n-#pragma once \n+#pragma once\n #include <ceres/ceres.h>\n #include <ceres/rotation.h>\n #include <eigen3/Eigen/Dense>\n@@ -97,7 +97,7 @@ public:\n \t\t\t  vector<SFMFeature> &sfm_f, map<int, Vector3d> &sfm_tracked_points);\n \n     bool constructH(int frame_num, Quaterniond* q, Vector3d* T, int l,\n-\t\t\t  const Matrix3d relative_R, const Vector3d relative_T, const Vector3d n,\n+\t\t\t  const Matrix3d relative_R, const Vector3d relative_T, Vector3d &n,\n \t\t\t  vector<SFMFeature> &sfm_f, map<int, Vector3d> &sfm_tracked_points);\n \n private:\n-- \n2.17.1\n\n\nFrom d3d1faeaa37f6b413622e6b3373c80cffc4182f7 Mon Sep 17 00:00:00 2001\nFrom: Karnik Ram <karnikram@gmail.com>\nDate: Thu, 1 Oct 2020 11:57:44 +0530\nSubject: [PATCH 07/42] Add code for transforming normal vector\n\n---\n vins_estimator/src/estimator.cpp              | 3 +++\n 
vins_estimator/src/factor/homography_factor.h | 5 ++++-\n 2 files changed, 7 insertions(+), 1 deletion(-)\n\ndiff --git a/vins_estimator/src/estimator.cpp b/vins_estimator/src/estimator.cpp\nindex 97a3821..0112bcd 100644\n--- a/vins_estimator/src/estimator.cpp\n+++ b/vins_estimator/src/estimator.cpp\n@@ -283,6 +283,9 @@ bool Estimator::initialStructure()\n         marginalization_flag = MARGIN_OLD;\n         return false;\n     }\n+    para_N[0] = n[0];\n+    para_N[1] = n[1];\n+    para_N[2] = n[2];\n \n     //solve pnp for all frame\n     map<double, ImageFrame>::iterator frame_it;\ndiff --git a/vins_estimator/src/factor/homography_factor.h b/vins_estimator/src/factor/homography_factor.h\nindex 86b03ff..8473930 100644\n--- a/vins_estimator/src/factor/homography_factor.h\n+++ b/vins_estimator/src/factor/homography_factor.h\n@@ -23,11 +23,14 @@ struct HomographyFactor\n             qic.coeffs() << ex_pose[3], ex_pose[4], ex_pose[5], ex_pose[6];\n \n             Eigen::Map<const Eigen::Matrix<T, 3, 1>> n(para_n);\n+            Eigen::Matrix<T, 3, 1> n_imu_1 = qic*(n.normalized()) + tic;\n+            Eigen::Matrix<T, 3, 1> n_imu_i = qi.inverse()*n_imu_1 - qi.inverse()*pi;\n+            Eigen::Matrix<T, 3, 1> n_cam_i = qic.inverse()*n_imu_i - qic.inverse()*tic;\n \n             Eigen::Quaternion<T> qji = qj.inverse() * qi;\n             Eigen::Matrix<T, 3, 1> tji = qj.inverse().toRotationMatrix() * (pi - pj);\n             Eigen::Matrix<T, 3, 1> pts_imu_i = qic * pts_i.cast<T>() + tic;\n-            Eigen::Matrix<T, 3, 1> pts_imu_j = qji * pts_imu_i + tji * n.transpose() * pts_imu_i;\n+            Eigen::Matrix<T, 3, 1> pts_imu_j = qji * pts_imu_i + tji * n_cam_i.transpose() * pts_imu_i;\n             Eigen::Matrix<T, 3, 1> pts_cam_j = qic.inverse() * (pts_imu_j - tic);\n \n             pts_cam_j = (pts_cam_j / pts_cam_j[2]);\n-- \n2.17.1\n\n\nFrom dbbe91b1f26b80fe235e3f7dfee9a55c98f108ad Mon Sep 17 00:00:00 2001\nFrom: Karnik Ram <karnikram@gmail.com>\nDate: Fri, 2 
Oct 2020 10:27:49 +0530\nSubject: [PATCH 08/42] Change to pixel coordinates in h cost functor\n\n---\n vins_estimator/src/estimator.cpp | 6 +++++-\n 1 file changed, 5 insertions(+), 1 deletion(-)\n\ndiff --git a/vins_estimator/src/estimator.cpp b/vins_estimator/src/estimator.cpp\nindex 0112bcd..6f31fb5 100644\n--- a/vins_estimator/src/estimator.cpp\n+++ b/vins_estimator/src/estimator.cpp\n@@ -774,6 +774,7 @@ void Estimator::optimization()\n         int imu_i = it_per_id.start_frame, imu_j = imu_i - 1;\n         \n         Vector3d pts_i = it_per_id.feature_per_frame[0].point;\n+        Vector3d img_pts_i(it_per_id.feature_per_frame[0].uv.x(), it_per_id.feature_per_frame[1].uv.y(), 1);\n \n         for (auto &it_per_frame : it_per_id.feature_per_frame)\n         {\n@@ -782,7 +783,10 @@ void Estimator::optimization()\n             {\n                 continue;\n             }\n+\n             Vector3d pts_j = it_per_frame.point;\n+            Vector3d img_pts_j(it_per_frame.uv.x(), it_per_frame.uv.y(), 1);\n+\n             if (ESTIMATE_TD)\n             {\n                     ProjectionTdFactor *f_td = new ProjectionTdFactor(pts_i, pts_j, it_per_id.feature_per_frame[0].velocity, it_per_frame.velocity,\n@@ -804,7 +808,7 @@ void Estimator::optimization()\n                 ProjectionFactor *f = new ProjectionFactor(pts_i, pts_j);\n                 problem.AddResidualBlock(f, loss_function, para_Pose[imu_i], para_Pose[imu_j], para_Ex_Pose[0], para_Feature[feature_index]);\n \n-                ceres::CostFunction *h_cost_function = HomographyFactor::Create(pts_i, pts_j);\n+                ceres::CostFunction *h_cost_function = HomographyFactor::Create(img_pts_i, img_pts_j);\n                 problem.AddResidualBlock(h_cost_function, loss_function, para_Pose[imu_i], para_Pose[imu_j], para_N, para_Ex_Pose[0]);\n             }\n             f_m_cnt++;\n-- \n2.17.1\n\n\nFrom 9cfa806aac4f59176a59e778e3570c4e69ebb1d2 Mon Sep 17 00:00:00 2001\nFrom: Karnik Ram 
<karnikram@gmail.com>\nDate: Fri, 2 Oct 2020 12:21:06 +0530\nSubject: [PATCH 09/42] Fix reference frame bug\n\n---\n vins_estimator/src/factor/homography_factor.h | 5 ++---\n 1 file changed, 2 insertions(+), 3 deletions(-)\n\ndiff --git a/vins_estimator/src/factor/homography_factor.h b/vins_estimator/src/factor/homography_factor.h\nindex 8473930..2a35d40 100644\n--- a/vins_estimator/src/factor/homography_factor.h\n+++ b/vins_estimator/src/factor/homography_factor.h\n@@ -25,12 +25,11 @@ struct HomographyFactor\n             Eigen::Map<const Eigen::Matrix<T, 3, 1>> n(para_n);\n             Eigen::Matrix<T, 3, 1> n_imu_1 = qic*(n.normalized()) + tic;\n             Eigen::Matrix<T, 3, 1> n_imu_i = qi.inverse()*n_imu_1 - qi.inverse()*pi;\n-            Eigen::Matrix<T, 3, 1> n_cam_i = qic.inverse()*n_imu_i - qic.inverse()*tic;\n \n             Eigen::Quaternion<T> qji = qj.inverse() * qi;\n-            Eigen::Matrix<T, 3, 1> tji = qj.inverse().toRotationMatrix() * (pi - pj);\n+            Eigen::Matrix<T, 3, 1> tji = qj.inverse() * (pi - pj);\n             Eigen::Matrix<T, 3, 1> pts_imu_i = qic * pts_i.cast<T>() + tic;\n-            Eigen::Matrix<T, 3, 1> pts_imu_j = qji * pts_imu_i + tji * n_cam_i.transpose() * pts_imu_i;\n+            Eigen::Matrix<T, 3, 1> pts_imu_j = qji * pts_imu_i + tji * n_imu_i.transpose() * pts_imu_i;\n             Eigen::Matrix<T, 3, 1> pts_cam_j = qic.inverse() * (pts_imu_j - tic);\n \n             pts_cam_j = (pts_cam_j / pts_cam_j[2]);\n-- \n2.17.1\n\n\nFrom 80c091ccbffd376f8bef7f19c1129f47d321e8c2 Mon Sep 17 00:00:00 2001\nFrom: Karnik Ram <karnikram@gmail.com>\nDate: Fri, 2 Oct 2020 12:39:00 +0530\nSubject: [PATCH 10/42] Fix indexing bug\n\n---\n vins_estimator/src/estimator.cpp | 2 +-\n 1 file changed, 1 insertion(+), 1 deletion(-)\n\ndiff --git a/vins_estimator/src/estimator.cpp b/vins_estimator/src/estimator.cpp\nindex 6f31fb5..141ba1c 100644\n--- a/vins_estimator/src/estimator.cpp\n+++ b/vins_estimator/src/estimator.cpp\n@@ -774,7 
+774,7 @@ void Estimator::optimization()\n         int imu_i = it_per_id.start_frame, imu_j = imu_i - 1;\n         \n         Vector3d pts_i = it_per_id.feature_per_frame[0].point;\n-        Vector3d img_pts_i(it_per_id.feature_per_frame[0].uv.x(), it_per_id.feature_per_frame[1].uv.y(), 1);\n+        Vector3d img_pts_i(it_per_id.feature_per_frame[0].uv.x(), it_per_id.feature_per_frame[0].uv.y(), 1);\n \n         for (auto &it_per_frame : it_per_id.feature_per_frame)\n         {\n-- \n2.17.1\n\n\nFrom 608edba769955088a250ac4ac416a5f8bb766588 Mon Sep 17 00:00:00 2001\nFrom: Karnik Ram <karnikram@gmail.com>\nDate: Sat, 3 Oct 2020 11:21:59 +0530\nSubject: [PATCH 11/42] Change to point values in cost functor\n\n---\n vins_estimator/src/estimator.cpp | 2 +-\n 1 file changed, 1 insertion(+), 1 deletion(-)\n\ndiff --git a/vins_estimator/src/estimator.cpp b/vins_estimator/src/estimator.cpp\nindex 141ba1c..096ef10 100644\n--- a/vins_estimator/src/estimator.cpp\n+++ b/vins_estimator/src/estimator.cpp\n@@ -808,7 +808,7 @@ void Estimator::optimization()\n                 ProjectionFactor *f = new ProjectionFactor(pts_i, pts_j);\n                 problem.AddResidualBlock(f, loss_function, para_Pose[imu_i], para_Pose[imu_j], para_Ex_Pose[0], para_Feature[feature_index]);\n \n-                ceres::CostFunction *h_cost_function = HomographyFactor::Create(img_pts_i, img_pts_j);\n+                ceres::CostFunction *h_cost_function = HomographyFactor::Create(pts_i, pts_j);\n                 problem.AddResidualBlock(h_cost_function, loss_function, para_Pose[imu_i], para_Pose[imu_j], para_N, para_Ex_Pose[0]);\n             }\n             f_m_cnt++;\n-- \n2.17.1\n\n\nFrom 55e116d9bfa5350e2e717a758fcd9dd1dfa7d0f9 Mon Sep 17 00:00:00 2001\nFrom: Karnik Ram <karnikram@gmail.com>\nDate: Mon, 19 Oct 2020 10:40:18 +0530\nSubject: [PATCH 12/42] Apply mask before feature extraction\n\n---\n config/vins_rviz_config.rviz                 | 192 +++++++++---------\n 
feature_tracker/CMakeLists.txt               |   1 +\n feature_tracker/src/feature_tracker.cpp      |   9 +-\n feature_tracker/src/feature_tracker.h        |   4 +-\n feature_tracker/src/feature_tracker_node.cpp | 195 ++++++++++++++++++-\n feature_tracker/src/parameters.cpp           |   2 +\n feature_tracker/src/parameters.h             |   1 +\n 7 files changed, 302 insertions(+), 102 deletions(-)\n\ndiff --git a/config/vins_rviz_config.rviz b/config/vins_rviz_config.rviz\nindex 6ed63a9..92ff49d 100644\n--- a/config/vins_rviz_config.rviz\n+++ b/config/vins_rviz_config.rviz\n@@ -4,8 +4,8 @@ Panels:\n     Name: Displays\n     Property Tree Widget:\n       Expanded: ~\n-      Splitter Ratio: 0.465115994\n-    Tree Height: 221\n+      Splitter Ratio: 0.4651159942150116\n+    Tree Height: 151\n   - Class: rviz/Selection\n     Name: Selection\n   - Class: rviz/Tool Properties\n@@ -14,7 +14,7 @@ Panels:\n       - /2D Nav Goal1\n       - /Publish Point1\n     Name: Tool Properties\n-    Splitter Ratio: 0.588679016\n+    Splitter Ratio: 0.5886790156364441\n   - Class: rviz/Views\n     Expanded:\n       - /Current View1\n@@ -24,14 +24,18 @@ Panels:\n     Experimental: false\n     Name: Time\n     SyncMode: 0\n-    SyncSource: tracked image\n+    SyncSource: \"\"\n   - Class: rviz/Displays\n     Help Height: 78\n     Name: Displays\n     Property Tree Widget:\n       Expanded: ~\n       Splitter Ratio: 0.5\n-    Tree Height: 359\n+    Tree Height: 363\n+Preferences:\n+  PromptSaveOnExit: true\n+Toolbars:\n+  toolButtonStyle: 2\n Visualization Manager:\n   Class: \"\"\n   Displays:\n@@ -41,7 +45,7 @@ Visualization Manager:\n       Color: 130; 130; 130\n       Enabled: true\n       Line Style:\n-        Line Width: 0.0299999993\n+        Line Width: 0.029999999329447746\n         Value: Lines\n       Name: Grid\n       Normal Cell Count: 0\n@@ -57,7 +61,7 @@ Visualization Manager:\n       Enabled: true\n       Length: 1\n       Name: Axes\n-      Radius: 0.100000001\n+      
Radius: 0.10000000149011612\n       Reference Frame: <Fixed Frame>\n       Value: true\n     - Alpha: 1\n@@ -65,9 +69,9 @@ Visualization Manager:\n       Class: rviz/Path\n       Color: 255; 0; 0\n       Enabled: true\n-      Head Diameter: 0.300000012\n-      Head Length: 0.200000003\n-      Length: 0.300000012\n+      Head Diameter: 0.30000001192092896\n+      Head Length: 0.20000000298023224\n+      Length: 0.30000001192092896\n       Line Style: Lines\n       Line Width: 0.5\n       Name: ground_truth_path\n@@ -77,9 +81,9 @@ Visualization Manager:\n         Z: 0\n       Pose Color: 255; 85; 255\n       Pose Style: None\n-      Radius: 0.0299999993\n-      Shaft Diameter: 0.100000001\n-      Shaft Length: 0.100000001\n+      Radius: 0.029999999329447746\n+      Shaft Diameter: 0.10000000149011612\n+      Shaft Length: 0.10000000149011612\n       Topic: /benchmark_publisher/path\n       Unreliable: false\n       Value: true\n@@ -114,11 +118,11 @@ Visualization Manager:\n           Class: rviz/Path\n           Color: 0; 255; 0\n           Enabled: true\n-          Head Diameter: 0.300000012\n-          Head Length: 0.200000003\n-          Length: 0.300000012\n+          Head Diameter: 0.30000001192092896\n+          Head Length: 0.20000000298023224\n+          Length: 0.30000001192092896\n           Line Style: Lines\n-          Line Width: 0.0299999993\n+          Line Width: 0.029999999329447746\n           Name: Path\n           Offset:\n             X: 0\n@@ -126,9 +130,9 @@ Visualization Manager:\n             Z: 0\n           Pose Color: 255; 85; 255\n           Pose Style: None\n-          Radius: 0.0299999993\n-          Shaft Diameter: 0.100000001\n-          Shaft Length: 0.100000001\n+          Radius: 0.029999999329447746\n+          Shaft Diameter: 0.10000000149011612\n+          Shaft Length: 0.10000000149011612\n           Topic: /vins_estimator/path\n           Unreliable: false\n           Value: true\n@@ -163,7 +167,7 @@ Visualization Manager:\n  
         Queue Size: 10\n           Selectable: true\n           Size (Pixels): 1\n-          Size (m): 0.00999999978\n+          Size (m): 0.009999999776482582\n           Style: Points\n           Topic: /vins_estimator/point_cloud\n           Unreliable: false\n@@ -214,7 +218,7 @@ Visualization Manager:\n             {}\n           Update Interval: 0\n           Value: true\n-      Enabled: false\n+      Enabled: true\n       Name: VIO\n     - Class: rviz/Group\n       Displays:\n@@ -223,11 +227,11 @@ Visualization Manager:\n           Class: rviz/Path\n           Color: 0; 255; 0\n           Enabled: true\n-          Head Diameter: 0.300000012\n-          Head Length: 0.200000003\n-          Length: 0.300000012\n+          Head Diameter: 0.30000001192092896\n+          Head Length: 0.20000000298023224\n+          Length: 0.30000001192092896\n           Line Style: Lines\n-          Line Width: 0.0299999993\n+          Line Width: 0.029999999329447746\n           Name: pose_graph_path\n           Offset:\n             X: 0\n@@ -235,9 +239,9 @@ Visualization Manager:\n             Z: 0\n           Pose Color: 255; 85; 255\n           Pose Style: None\n-          Radius: 0.0299999993\n-          Shaft Diameter: 0.100000001\n-          Shaft Length: 0.100000001\n+          Radius: 0.029999999329447746\n+          Shaft Diameter: 0.10000000149011612\n+          Shaft Length: 0.10000000149011612\n           Topic: /pose_graph/pose_graph_path\n           Unreliable: false\n           Value: true\n@@ -246,11 +250,11 @@ Visualization Manager:\n           Class: rviz/Path\n           Color: 255; 170; 0\n           Enabled: true\n-          Head Diameter: 0.300000012\n-          Head Length: 0.200000003\n-          Length: 0.300000012\n+          Head Diameter: 0.30000001192092896\n+          Head Length: 0.20000000298023224\n+          Length: 0.30000001192092896\n           Line Style: Lines\n-          Line Width: 0.0299999993\n+          Line Width: 
0.029999999329447746\n           Name: base_path\n           Offset:\n             X: 0\n@@ -258,9 +262,9 @@ Visualization Manager:\n             Z: 0\n           Pose Color: 255; 85; 255\n           Pose Style: None\n-          Radius: 0.0299999993\n-          Shaft Diameter: 0.100000001\n-          Shaft Length: 0.100000001\n+          Radius: 0.029999999329447746\n+          Shaft Diameter: 0.10000000149011612\n+          Shaft Length: 0.10000000149011612\n           Topic: /pose_graph/base_path\n           Unreliable: false\n           Value: true\n@@ -305,11 +309,11 @@ Visualization Manager:\n           Class: rviz/Path\n           Color: 25; 255; 0\n           Enabled: true\n-          Head Diameter: 0.300000012\n-          Head Length: 0.200000003\n-          Length: 0.300000012\n+          Head Diameter: 0.30000001192092896\n+          Head Length: 0.20000000298023224\n+          Length: 0.30000001192092896\n           Line Style: Lines\n-          Line Width: 0.300000012\n+          Line Width: 0.30000001192092896\n           Name: Sequence1\n           Offset:\n             X: 0\n@@ -317,9 +321,9 @@ Visualization Manager:\n             Z: 0\n           Pose Color: 255; 85; 255\n           Pose Style: None\n-          Radius: 0.0299999993\n-          Shaft Diameter: 0.100000001\n-          Shaft Length: 0.100000001\n+          Radius: 0.029999999329447746\n+          Shaft Diameter: 0.10000000149011612\n+          Shaft Length: 0.10000000149011612\n           Topic: /pose_graph/path_1\n           Unreliable: false\n           Value: true\n@@ -328,11 +332,11 @@ Visualization Manager:\n           Class: rviz/Path\n           Color: 255; 0; 0\n           Enabled: true\n-          Head Diameter: 0.300000012\n-          Head Length: 0.200000003\n-          Length: 0.300000012\n+          Head Diameter: 0.30000001192092896\n+          Head Length: 0.20000000298023224\n+          Length: 0.30000001192092896\n           Line Style: Lines\n-          Line Width: 
0.300000012\n+          Line Width: 0.30000001192092896\n           Name: Sequence2\n           Offset:\n             X: 0\n@@ -340,9 +344,9 @@ Visualization Manager:\n             Z: 0\n           Pose Color: 255; 85; 255\n           Pose Style: None\n-          Radius: 0.0299999993\n-          Shaft Diameter: 0.100000001\n-          Shaft Length: 0.100000001\n+          Radius: 0.029999999329447746\n+          Shaft Diameter: 0.10000000149011612\n+          Shaft Length: 0.10000000149011612\n           Topic: /pose_graph/path_2\n           Unreliable: false\n           Value: true\n@@ -351,11 +355,11 @@ Visualization Manager:\n           Class: rviz/Path\n           Color: 0; 85; 255\n           Enabled: true\n-          Head Diameter: 0.300000012\n-          Head Length: 0.200000003\n-          Length: 0.300000012\n+          Head Diameter: 0.30000001192092896\n+          Head Length: 0.20000000298023224\n+          Length: 0.30000001192092896\n           Line Style: Lines\n-          Line Width: 0.300000012\n+          Line Width: 0.30000001192092896\n           Name: Sequence3\n           Offset:\n             X: 0\n@@ -363,9 +367,9 @@ Visualization Manager:\n             Z: 0\n           Pose Color: 255; 85; 255\n           Pose Style: None\n-          Radius: 0.0299999993\n-          Shaft Diameter: 0.100000001\n-          Shaft Length: 0.100000001\n+          Radius: 0.029999999329447746\n+          Shaft Diameter: 0.10000000149011612\n+          Shaft Length: 0.10000000149011612\n           Topic: /pose_graph/path_3\n           Unreliable: false\n           Value: true\n@@ -374,11 +378,11 @@ Visualization Manager:\n           Class: rviz/Path\n           Color: 255; 170; 0\n           Enabled: true\n-          Head Diameter: 0.300000012\n-          Head Length: 0.200000003\n-          Length: 0.300000012\n+          Head Diameter: 0.30000001192092896\n+          Head Length: 0.20000000298023224\n+          Length: 0.30000001192092896\n           Line 
Style: Lines\n-          Line Width: 0.300000012\n+          Line Width: 0.30000001192092896\n           Name: Sequence4\n           Offset:\n             X: 0\n@@ -386,9 +390,9 @@ Visualization Manager:\n             Z: 0\n           Pose Color: 255; 85; 255\n           Pose Style: None\n-          Radius: 0.0299999993\n-          Shaft Diameter: 0.100000001\n-          Shaft Length: 0.100000001\n+          Radius: 0.029999999329447746\n+          Shaft Diameter: 0.10000000149011612\n+          Shaft Length: 0.10000000149011612\n           Topic: /pose_graph/path_4\n           Unreliable: false\n           Value: true\n@@ -397,11 +401,11 @@ Visualization Manager:\n           Class: rviz/Path\n           Color: 255; 170; 255\n           Enabled: true\n-          Head Diameter: 0.300000012\n-          Head Length: 0.200000003\n-          Length: 0.300000012\n+          Head Diameter: 0.30000001192092896\n+          Head Length: 0.20000000298023224\n+          Length: 0.30000001192092896\n           Line Style: Lines\n-          Line Width: 0.0299999993\n+          Line Width: 0.029999999329447746\n           Name: Sequence5\n           Offset:\n             X: 0\n@@ -409,9 +413,9 @@ Visualization Manager:\n             Z: 0\n           Pose Color: 255; 85; 255\n           Pose Style: None\n-          Radius: 0.0299999993\n-          Shaft Diameter: 0.100000001\n-          Shaft Length: 0.100000001\n+          Radius: 0.029999999329447746\n+          Shaft Diameter: 0.10000000149011612\n+          Shaft Length: 0.10000000149011612\n           Topic: /pose_graph/path_5\n           Unreliable: false\n           Value: true\n@@ -420,11 +424,11 @@ Visualization Manager:\n           Class: rviz/Path\n           Color: 25; 255; 0\n           Enabled: true\n-          Head Diameter: 0.300000012\n-          Head Length: 0.200000003\n-          Length: 0.300000012\n+          Head Diameter: 0.30000001192092896\n+          Head Length: 0.20000000298023224\n+          Length: 
0.30000001192092896\n           Line Style: Lines\n-          Line Width: 0.0299999993\n+          Line Width: 0.029999999329447746\n           Name: no_loop_path\n           Offset:\n             X: 0\n@@ -432,9 +436,9 @@ Visualization Manager:\n             Z: 0\n           Pose Color: 255; 85; 255\n           Pose Style: None\n-          Radius: 0.0299999993\n-          Shaft Diameter: 0.100000001\n-          Shaft Length: 0.100000001\n+          Radius: 0.029999999329447746\n+          Shaft Diameter: 0.10000000149011612\n+          Shaft Length: 0.10000000149011612\n           Topic: /pose_graph/no_loop_path\n           Unreliable: false\n           Value: true\n@@ -443,6 +447,7 @@ Visualization Manager:\n   Enabled: true\n   Global Options:\n     Background Color: 0; 0; 0\n+    Default Light: true\n     Fixed Frame: world\n     Frame Rate: 30\n   Name: root\n@@ -452,7 +457,10 @@ Visualization Manager:\n     - Class: rviz/FocusCamera\n     - Class: rviz/Measure\n     - Class: rviz/SetInitialPose\n+      Theta std deviation: 0.2617993950843811\n       Topic: /initialpose\n+      X std deviation: 0.5\n+      Y std deviation: 0.5\n     - Class: rviz/SetGoal\n       Topic: /move_base_simple/goal\n     - Class: rviz/PublishPoint\n@@ -462,33 +470,33 @@ Visualization Manager:\n   Views:\n     Current:\n       Class: rviz/XYOrbit\n-      Distance: 20.9684067\n+      Distance: 20.968406677246094\n       Enable Stereo Rendering:\n-        Stereo Eye Separation: 0.0599999987\n+        Stereo Eye Separation: 0.05999999865889549\n         Stereo Focal Distance: 1\n         Swap Stereo Eyes: false\n         Value: false\n       Focal Point:\n-        X: -2.7069304\n-        Y: -1.26974416\n-        Z: 2.1410624e-05\n+        X: -2.70693039894104\n+        Y: -1.2697441577911377\n+        Z: 2.141062395821791e-5\n       Focal Shape Fixed Size: true\n-      Focal Shape Size: 0.0500000007\n+      Focal Shape Size: 0.05000000074505806\n       Invert Z Axis: false\n       Name: 
Current View\n-      Near Clip Distance: 0.00999999978\n-      Pitch: 1.0797962\n+      Near Clip Distance: 0.009999999776482582\n+      Pitch: 1.0797961950302124\n       Target Frame: <Fixed Frame>\n       Value: XYOrbit (rviz)\n-      Yaw: 3.08722663\n+      Yaw: 3.087226629257202\n     Saved: ~\n Window Geometry:\n   Displays:\n     collapsed: false\n-  Height: 912\n+  Height: 734\n   Hide Left Dock: false\n   Hide Right Dock: true\n-  QMainWindow State: 000000ff00000000fd0000000400000000000001df00000309fc0200000010fb0000000a0049006d00610067006501000000280000013d0000000000000000fb0000001200530065006c0065006300740069006f006e00000001e10000009b0000006400fffffffb0000001e0054006f006f006c002000500072006f007000650072007400690065007302000001ed000001df00000185000000a3fb000000120056006900650077007300200054006f006f02000001df000002110000018500000122fb000000200054006f006f006c002000500072006f0070006500720074006900650073003203000002880000011d000002210000017afb0000002000730065006c0065006300740069006f006e00200062007500660066006500720200000138000000aa0000023a00000294fb00000014005700690064006500530074006500720065006f02000000e6000000d2000003ee0000030bfb0000000c004b0069006e0065006300740200000186000001060000030c00000261fb0000001200720061007700200049006d0061006700650000000028000000300000000000000000fb00000012007200610077005f0069006d0061006700650000000028000000f90000001600fffffffb0000001a0074007200610063006b0065006400200069006d00610067006501000000280000012c0000001600fffffffb00000020006c006f006f0070005f006d0061007400630068005f0069006d006100670065010000015a000000b30000001600fffffffb000000100044006900730070006c00610079007301000002130000011e000000dd00fffffffc000000280000011e0000000000fffffffa000000000100000002fb0000001200720061007700200049006d0061006700650000000000ffffffff0000000000000000fb0000001a0074007200610063006b0065006400200069006d0061006700650100000000000002370000000000000000fb0000001000410052005f0069006d0061006700650100000373000000160000000000000000fb0000001200720061007700200069006d
006100670065010000038f000000160000000000000000000000010000020800000399fc0200000003fb0000001e0054006f006f006c002000500072006f00700065007200740069006500730100000041000000780000000000000000fc00000028000003990000000000fffffffaffffffff0100000002fb000000100044006900730070006c0061007900730000000000ffffffff0000016a00fffffffb0000000a00560069006500770073000000023f0000016a0000010f00fffffffb0000001200530065006c0065006300740069006f006e010000025a000000b200000000000000000000000200000490000000a9fc0100000001fb0000000a00560069006500770073030000004e00000080000002e100000197000000030000056e0000003bfc0100000002fb0000000800540069006d006501000000000000056e0000030000fffffffb0000000800540069006d00650100000000000004500000000000000000000003890000030900000004000000040000000800000008fc0000000100000002000000010000000a0054006f006f006c00730100000000ffffffff0000000000000000\n+  QMainWindow State: 000000ff00000000fd00000004000000000000015600000243fc0200000010fb0000000a0049006d00610067006501000000280000013d0000000000000000fb0000001200530065006c0065006300740069006f006e00000001e10000009b0000005c00fffffffb0000001e0054006f006f006c002000500072006f007000650072007400690065007302000001ed000001df00000185000000a3fb000000120056006900650077007300200054006f006f02000001df000002110000018500000122fb000000200054006f006f006c002000500072006f0070006500720074006900650073003203000002880000011d000002210000017afb0000002000730065006c0065006300740069006f006e00200062007500660066006500720200000138000000aa0000023a00000294fb00000014005700690064006500530074006500720065006f02000000e6000000d2000003ee0000030bfb0000000c004b0069006e0065006300740200000186000001060000030c00000261fb0000001200720061007700200049006d0061006700650000000028000000300000000000000000fb00000012007200610077005f0069006d0061006700650000000028000000f90000001600fffffffb0000001a0074007200610063006b0065006400200069006d006100670065010000003d000000de0000001600fffffffb00000020006c006f006f0070005f006d0061007400630068005f0069006d0061006700650100000121000000850000001600fffffffb
000000100044006900730070006c00610079007301000001ac000000d4000000c900fffffffc000000280000011e0000000000fffffffa000000000100000002fb0000001200720061007700200049006d0061006700650000000000ffffffff0000000000000000fb0000001a0074007200610063006b0065006400200069006d0061006700650100000000000002370000000000000000fb0000001000410052005f0069006d0061006700650100000373000000160000000000000000fb0000001200720061007700200069006d006100670065010000038f000000160000000000000000000000010000020800000399fc0200000003fb0000001e0054006f006f006c002000500072006f00700065007200740069006500730100000041000000780000000000000000fc00000028000003990000000000fffffffaffffffff0100000002fb000000100044006900730070006c0061007900730000000000ffffffff0000015600fffffffb0000000a00560069006500770073000000023f0000016a0000010000fffffffb0000001200530065006c0065006300740069006f006e010000025a000000b200000000000000000000000200000490000000a9fc0100000001fb0000000a00560069006500770073030000004e00000080000002e10000019700000003000002a30000003bfc0100000002fb0000000800540069006d00650100000000000002a3000002eb00fffffffb0000000800540069006d00650100000000000004500000000000000000000001470000024300000004000000040000000800000008fc0000000100000002000000010000000a0054006f006f006c00730100000000ffffffff0000000000000000\n   Selection:\n     collapsed: false\n   Time:\n@@ -497,9 +505,9 @@ Window Geometry:\n     collapsed: false\n   Views:\n     collapsed: true\n-  Width: 1390\n-  X: 2127\n-  Y: 109\n+  Width: 675\n+  X: 685\n+  Y: 4\n   loop_match_image:\n     collapsed: false\n   raw_image:\ndiff --git a/feature_tracker/CMakeLists.txt b/feature_tracker/CMakeLists.txt\nindex bc3fb44..439f706 100644\n--- a/feature_tracker/CMakeLists.txt\n+++ b/feature_tracker/CMakeLists.txt\n@@ -11,6 +11,7 @@ find_package(catkin REQUIRED COMPONENTS\n     sensor_msgs\n     cv_bridge\n     camera_model\n+    message_filters\n     )\n \n find_package(OpenCV REQUIRED)\ndiff --git a/feature_tracker/src/feature_tracker.cpp 
b/feature_tracker/src/feature_tracker.cpp\nindex 41322b8..9d5de16 100644\n--- a/feature_tracker/src/feature_tracker.cpp\n+++ b/feature_tracker/src/feature_tracker.cpp\n@@ -33,12 +33,13 @@ FeatureTracker::FeatureTracker()\n {\n }\n \n-void FeatureTracker::setMask()\n+void FeatureTracker::setMask(const cv::Mat &_mask)\n {\n     if(FISHEYE)\n         mask = fisheye_mask.clone();\n     else\n-        mask = cv::Mat(ROW, COL, CV_8UC1, cv::Scalar(255));\n+        //mask = cv::Mat(ROW, COL, CV_8UC1, cv::Scalar(255));\n+        mask = _mask.clone();\n     \n \n     // prefer to keep features that are tracked for long time\n@@ -78,7 +79,7 @@ void FeatureTracker::addPoints()\n     }\n }\n \n-void FeatureTracker::readImage(const cv::Mat &_img, double _cur_time)\n+void FeatureTracker::readImage(const cv::Mat &_img, const cv::Mat &_mask, double _cur_time)\n {\n     cv::Mat img;\n     TicToc t_r;\n@@ -132,7 +133,7 @@ void FeatureTracker::readImage(const cv::Mat &_img, double _cur_time)\n         rejectWithF();\n         ROS_DEBUG(\"set mask begins\");\n         TicToc t_m;\n-        setMask();\n+        setMask(_mask);\n         ROS_DEBUG(\"set mask costs %fms\", t_m.toc());\n \n         ROS_DEBUG(\"detect feature begins\");\ndiff --git a/feature_tracker/src/feature_tracker.h b/feature_tracker/src/feature_tracker.h\nindex 9862ec1..cb376db 100644\n--- a/feature_tracker/src/feature_tracker.h\n+++ b/feature_tracker/src/feature_tracker.h\n@@ -30,9 +30,9 @@ class FeatureTracker\n   public:\n     FeatureTracker();\n \n-    void readImage(const cv::Mat &_img,double _cur_time);\n+    void readImage(const cv::Mat &_img, const cv::Mat &_mask, double _cur_time);\n \n-    void setMask();\n+    void setMask(const cv::Mat &_mask);\n \n     void addPoints();\n \ndiff --git a/feature_tracker/src/feature_tracker_node.cpp b/feature_tracker/src/feature_tracker_node.cpp\nindex 5a5261c..a0dbd25 100644\n--- a/feature_tracker/src/feature_tracker_node.cpp\n+++ 
b/feature_tracker/src/feature_tracker_node.cpp\n@@ -6,6 +6,7 @@\n #include <std_msgs/Bool.h>\n #include <cv_bridge/cv_bridge.h>\n #include <message_filters/subscriber.h>\n+#include <message_filters/time_synchronizer.h>\n \n #include \"feature_tracker.h\"\n \n@@ -25,7 +26,7 @@ bool first_image_flag = true;\n double last_image_time = 0;\n bool init_pub = 0;\n \n-void img_callback(const sensor_msgs::ImageConstPtr &img_msg)\n+void callback(const sensor_msgs::ImageConstPtr &img_msg, const sensor_msgs::ImageConstPtr &mask_msg)\n {\n     if(first_image_flag)\n     {\n@@ -61,7 +62,7 @@ void img_callback(const sensor_msgs::ImageConstPtr &img_msg)\n     else\n         PUB_THIS_FRAME = false;\n \n-    cv_bridge::CvImageConstPtr ptr;\n+    cv_bridge::CvImageConstPtr ptr, mask_ptr;\n     if (img_msg->encoding == \"8UC1\")\n     {\n         sensor_msgs::Image img;\n@@ -77,13 +78,15 @@ void img_callback(const sensor_msgs::ImageConstPtr &img_msg)\n     else\n         ptr = cv_bridge::toCvCopy(img_msg, sensor_msgs::image_encodings::MONO8);\n \n+    mask_ptr = cv_bridge::toCvCopy(mask_msg, sensor_msgs::image_encodings::MONO8);\n     cv::Mat show_img = ptr->image;\n+\n     TicToc t_r;\n     for (int i = 0; i < NUM_OF_CAM; i++)\n     {\n         ROS_DEBUG(\"processing camera %d\", i);\n         if (i != 1 || !STEREO_TRACK)\n-            trackerData[i].readImage(ptr->image.rowRange(ROW * i, ROW * (i + 1)), img_msg->header.stamp.toSec());\n+            trackerData[i].readImage(ptr->image.rowRange(ROW * i, ROW * (i + 1)), mask_ptr->image, img_msg->header.stamp.toSec());\n         else\n         {\n             if (EQUALIZE)\n@@ -201,8 +204,187 @@ void img_callback(const sensor_msgs::ImageConstPtr &img_msg)\n         }\n     }\n     ROS_INFO(\"whole feature tracker processing costs: %f\", t_r.toc());\n+\n }\n \n+//void img_callback(const sensor_msgs::ImageConstPtr &img_msg)\n+//{\n+//    if(first_image_flag)\n+//    {\n+//        first_image_flag = false;\n+//        first_image_time = 
img_msg->header.stamp.toSec();\n+//        last_image_time = img_msg->header.stamp.toSec();\n+//        return;\n+//    }\n+//    // detect unstable camera stream\n+//    if (img_msg->header.stamp.toSec() - last_image_time > 1.0 || img_msg->header.stamp.toSec() < last_image_time)\n+//    {\n+//        ROS_WARN(\"image discontinue! reset the feature tracker!\");\n+//        first_image_flag = true; \n+//        last_image_time = 0;\n+//        pub_count = 1;\n+//        std_msgs::Bool restart_flag;\n+//        restart_flag.data = true;\n+//        pub_restart.publish(restart_flag);\n+//        return;\n+//    }\n+//    last_image_time = img_msg->header.stamp.toSec();\n+//    // frequency control\n+//    if (round(1.0 * pub_count / (img_msg->header.stamp.toSec() - first_image_time)) <= FREQ)\n+//    {\n+//        PUB_THIS_FRAME = true;\n+//        // reset the frequency control\n+//        if (abs(1.0 * pub_count / (img_msg->header.stamp.toSec() - first_image_time) - FREQ) < 0.01 * FREQ)\n+//        {\n+//            first_image_time = img_msg->header.stamp.toSec();\n+//            pub_count = 0;\n+//        }\n+//    }\n+//    else\n+//        PUB_THIS_FRAME = false;\n+//\n+//    cv_bridge::CvImageConstPtr ptr;\n+//    if (img_msg->encoding == \"8UC1\")\n+//    {\n+//        sensor_msgs::Image img;\n+//        img.header = img_msg->header;\n+//        img.height = img_msg->height;\n+//        img.width = img_msg->width;\n+//        img.is_bigendian = img_msg->is_bigendian;\n+//        img.step = img_msg->step;\n+//        img.data = img_msg->data;\n+//        img.encoding = \"mono8\";\n+//        ptr = cv_bridge::toCvCopy(img, sensor_msgs::image_encodings::MONO8);\n+//    }\n+//    else\n+//        ptr = cv_bridge::toCvCopy(img_msg, sensor_msgs::image_encodings::MONO8);\n+//\n+//    cv::Mat show_img = ptr->image;\n+//    TicToc t_r;\n+//    for (int i = 0; i < NUM_OF_CAM; i++)\n+//    {\n+//        ROS_DEBUG(\"processing camera %d\", i);\n+//        if (i != 1 || 
!STEREO_TRACK)\n+//            trackerData[i].readImage(ptr->image.rowRange(ROW * i, ROW * (i + 1)), img_msg->header.stamp.toSec());\n+//        else\n+//        {\n+//            if (EQUALIZE)\n+//            {\n+//                cv::Ptr<cv::CLAHE> clahe = cv::createCLAHE();\n+//                clahe->apply(ptr->image.rowRange(ROW * i, ROW * (i + 1)), trackerData[i].cur_img);\n+//            }\n+//            else\n+//                trackerData[i].cur_img = ptr->image.rowRange(ROW * i, ROW * (i + 1));\n+//        }\n+//\n+//#if SHOW_UNDISTORTION\n+//        trackerData[i].showUndistortion(\"undistrotion_\" + std::to_string(i));\n+//#endif\n+//    }\n+//\n+//    for (unsigned int i = 0;; i++)\n+//    {\n+//        bool completed = false;\n+//        for (int j = 0; j < NUM_OF_CAM; j++)\n+//            if (j != 1 || !STEREO_TRACK)\n+//                completed |= trackerData[j].updateID(i);\n+//        if (!completed)\n+//            break;\n+//    }\n+//\n+//   if (PUB_THIS_FRAME)\n+//   {\n+//        pub_count++;\n+//        sensor_msgs::PointCloudPtr feature_points(new sensor_msgs::PointCloud);\n+//        sensor_msgs::ChannelFloat32 id_of_point;\n+//        sensor_msgs::ChannelFloat32 u_of_point;\n+//        sensor_msgs::ChannelFloat32 v_of_point;\n+//        sensor_msgs::ChannelFloat32 velocity_x_of_point;\n+//        sensor_msgs::ChannelFloat32 velocity_y_of_point;\n+//\n+//        feature_points->header = img_msg->header;\n+//        feature_points->header.frame_id = \"world\";\n+//\n+//        vector<set<int>> hash_ids(NUM_OF_CAM);\n+//        for (int i = 0; i < NUM_OF_CAM; i++)\n+//        {\n+//            auto &un_pts = trackerData[i].cur_un_pts;\n+//            auto &cur_pts = trackerData[i].cur_pts;\n+//            auto &ids = trackerData[i].ids;\n+//            auto &pts_velocity = trackerData[i].pts_velocity;\n+//            for (unsigned int j = 0; j < ids.size(); j++)\n+//            {\n+//                if (trackerData[i].track_cnt[j] > 1)\n+// 
               {\n+//                    int p_id = ids[j];\n+//                    hash_ids[i].insert(p_id);\n+//                    geometry_msgs::Point32 p;\n+//                    p.x = un_pts[j].x;\n+//                    p.y = un_pts[j].y;\n+//                    p.z = 1;\n+//\n+//                    feature_points->points.push_back(p);\n+//                    id_of_point.values.push_back(p_id * NUM_OF_CAM + i);\n+//                    u_of_point.values.push_back(cur_pts[j].x);\n+//                    v_of_point.values.push_back(cur_pts[j].y);\n+//                    velocity_x_of_point.values.push_back(pts_velocity[j].x);\n+//                    velocity_y_of_point.values.push_back(pts_velocity[j].y);\n+//                }\n+//            }\n+//        }\n+//        feature_points->channels.push_back(id_of_point);\n+//        feature_points->channels.push_back(u_of_point);\n+//        feature_points->channels.push_back(v_of_point);\n+//        feature_points->channels.push_back(velocity_x_of_point);\n+//        feature_points->channels.push_back(velocity_y_of_point);\n+//        ROS_DEBUG(\"publish %f, at %f\", feature_points->header.stamp.toSec(), ros::Time::now().toSec());\n+//        // skip the first image; since no optical speed on frist image\n+//        if (!init_pub)\n+//        {\n+//            init_pub = 1;\n+//        }\n+//        else\n+//            pub_img.publish(feature_points);\n+//\n+//        if (SHOW_TRACK)\n+//        {\n+//            ptr = cv_bridge::cvtColor(ptr, sensor_msgs::image_encodings::BGR8);\n+//            //cv::Mat stereo_img(ROW * NUM_OF_CAM, COL, CV_8UC3);\n+//            cv::Mat stereo_img = ptr->image;\n+//\n+//            for (int i = 0; i < NUM_OF_CAM; i++)\n+//            {\n+//                cv::Mat tmp_img = stereo_img.rowRange(i * ROW, (i + 1) * ROW);\n+//                cv::cvtColor(show_img, tmp_img, CV_GRAY2RGB);\n+//\n+//                for (unsigned int j = 0; j < trackerData[i].cur_pts.size(); j++)\n+//    
            {\n+//                    double len = std::min(1.0, 1.0 * trackerData[i].track_cnt[j] / WINDOW_SIZE);\n+//                    cv::circle(tmp_img, trackerData[i].cur_pts[j], 2, cv::Scalar(255 * (1 - len), 0, 255 * len), 2);\n+//                    //draw speed line\n+//                    /*\n+//                    Vector2d tmp_cur_un_pts (trackerData[i].cur_un_pts[j].x, trackerData[i].cur_un_pts[j].y);\n+//                    Vector2d tmp_pts_velocity (trackerData[i].pts_velocity[j].x, trackerData[i].pts_velocity[j].y);\n+//                    Vector3d tmp_prev_un_pts;\n+//                    tmp_prev_un_pts.head(2) = tmp_cur_un_pts - 0.10 * tmp_pts_velocity;\n+//                    tmp_prev_un_pts.z() = 1;\n+//                    Vector2d tmp_prev_uv;\n+//                    trackerData[i].m_camera->spaceToPlane(tmp_prev_un_pts, tmp_prev_uv);\n+//                    cv::line(tmp_img, trackerData[i].cur_pts[j], cv::Point2f(tmp_prev_uv.x(), tmp_prev_uv.y()), cv::Scalar(255 , 0, 0), 1 , 8, 0);\n+//                    */\n+//                    //char name[10];\n+//                    //sprintf(name, \"%d\", trackerData[i].ids[j]);\n+//                    //cv::putText(tmp_img, name, trackerData[i].cur_pts[j], cv::FONT_HERSHEY_SIMPLEX, 0.5, cv::Scalar(0, 0, 0));\n+//                }\n+//            }\n+//            //cv::imshow(\"vis\", stereo_img);\n+//            //cv::waitKey(5);\n+//            pub_match.publish(ptr->toImageMsg());\n+//        }\n+//    }\n+//    ROS_INFO(\"whole feature tracker processing costs: %f\", t_r.toc());\n+//}\n+//\n int main(int argc, char **argv)\n {\n     ros::init(argc, argv, \"feature_tracker\");\n@@ -228,7 +410,12 @@ int main(int argc, char **argv)\n         }\n     }\n \n-    ros::Subscriber sub_img = n.subscribe(IMAGE_TOPIC, 100, img_callback);\n+    //ros::Subscriber sub_img = n.subscribe(IMAGE_TOPIC, 100, img_callback);\n+    message_filters::Subscriber<sensor_msgs::Image> image_sub(n, IMAGE_TOPIC, 100);\n+    
message_filters::Subscriber<sensor_msgs::Image> mask_image_sub(n, MASK_TOPIC, 100);\n+\n+    message_filters::TimeSynchronizer<sensor_msgs::Image, sensor_msgs::Image> sync(image_sub, mask_image_sub, 10);\n+    sync.registerCallback(boost::bind(&callback, _1, _2));\n \n     pub_img = n.advertise<sensor_msgs::PointCloud>(\"feature\", 1000);\n     pub_match = n.advertise<sensor_msgs::Image>(\"feature_img\",1000);\ndiff --git a/feature_tracker/src/parameters.cpp b/feature_tracker/src/parameters.cpp\nindex 23c35c8..417efb2 100644\n--- a/feature_tracker/src/parameters.cpp\n+++ b/feature_tracker/src/parameters.cpp\n@@ -1,6 +1,7 @@\n #include \"parameters.h\"\n \n std::string IMAGE_TOPIC;\n+std::string MASK_TOPIC;\n std::string IMU_TOPIC;\n std::vector<std::string> CAM_NAMES;\n std::string FISHEYE_MASK;\n@@ -46,6 +47,7 @@ void readParameters(ros::NodeHandle &n)\n     std::string VINS_FOLDER_PATH = readParam<std::string>(n, \"vins_folder\");\n \n     fsSettings[\"image_topic\"] >> IMAGE_TOPIC;\n+    fsSettings[\"mask_topic\"] >> MASK_TOPIC;\n     fsSettings[\"imu_topic\"] >> IMU_TOPIC;\n     MAX_CNT = fsSettings[\"max_cnt\"];\n     MIN_DIST = fsSettings[\"min_dist\"];\ndiff --git a/feature_tracker/src/parameters.h b/feature_tracker/src/parameters.h\nindex 1bb578b..44e5f9c 100644\n--- a/feature_tracker/src/parameters.h\n+++ b/feature_tracker/src/parameters.h\n@@ -9,6 +9,7 @@ const int NUM_OF_CAM = 1;\n \n \n extern std::string IMAGE_TOPIC;\n+extern std::string MASK_TOPIC;\n extern std::string IMU_TOPIC;\n extern std::string FISHEYE_MASK;\n extern std::vector<std::string> CAM_NAMES;\n-- \n2.17.1\n\n\nFrom 8948e61ce641d532d902f6adacd97a890c7456dd Mon Sep 17 00:00:00 2001\nFrom: Karnik Ram <karnikram@gmail.com>\nDate: Mon, 19 Oct 2020 11:11:28 +0530\nSubject: [PATCH 13/42] Apply mask after extraction and with erosion\n\n---\n feature_tracker/src/feature_tracker.cpp | 22 +++++++++++++---------\n feature_tracker/src/feature_tracker.h   |  4 ++--\n 2 files changed, 15 
insertions(+), 11 deletions(-)\n\ndiff --git a/feature_tracker/src/feature_tracker.cpp b/feature_tracker/src/feature_tracker.cpp\nindex 9d5de16..4384928 100644\n--- a/feature_tracker/src/feature_tracker.cpp\n+++ b/feature_tracker/src/feature_tracker.cpp\n@@ -33,13 +33,12 @@ FeatureTracker::FeatureTracker()\n {\n }\n \n-void FeatureTracker::setMask(const cv::Mat &_mask)\n+void FeatureTracker::setMask()\n {\n     if(FISHEYE)\n         mask = fisheye_mask.clone();\n     else\n-        //mask = cv::Mat(ROW, COL, CV_8UC1, cv::Scalar(255));\n-        mask = _mask.clone();\n+        mask = cv::Mat(ROW, COL, CV_8UC1, cv::Scalar(255));\n     \n \n     // prefer to keep features that are tracked for long time\n@@ -69,13 +68,18 @@ void FeatureTracker::setMask(const cv::Mat &_mask)\n     }\n }\n \n-void FeatureTracker::addPoints()\n+void FeatureTracker::addPoints(const cv::Mat &_mask)\n {\n+    cv::Mat dynamic_mask;\n+    cv::erode(_mask, dynamic_mask, cv::Mat(), cv::Point(-1, -1), 2, 1, 1);\n     for (auto &p : n_pts)\n     {\n-        forw_pts.push_back(p);\n-        ids.push_back(-1);\n-        track_cnt.push_back(1);\n+        if(dynamic_mask.at<uchar>(p) == 255)\n+        {\n+            forw_pts.push_back(p);\n+            ids.push_back(-1);\n+            track_cnt.push_back(1);\n+        }\n     }\n }\n \n@@ -133,7 +137,7 @@ void FeatureTracker::readImage(const cv::Mat &_img, const cv::Mat &_mask, double\n         rejectWithF();\n         ROS_DEBUG(\"set mask begins\");\n         TicToc t_m;\n-        setMask(_mask);\n+        setMask();\n         ROS_DEBUG(\"set mask costs %fms\", t_m.toc());\n \n         ROS_DEBUG(\"detect feature begins\");\n@@ -155,7 +159,7 @@ void FeatureTracker::readImage(const cv::Mat &_img, const cv::Mat &_mask, double\n \n         ROS_DEBUG(\"add feature begins\");\n         TicToc t_a;\n-        addPoints();\n+        addPoints(_mask);\n         ROS_DEBUG(\"selectFeature costs: %fms\", t_a.toc());\n     }\n     prev_img = cur_img;\ndiff --git 
a/feature_tracker/src/feature_tracker.h b/feature_tracker/src/feature_tracker.h\nindex cb376db..c98c7ac 100644\n--- a/feature_tracker/src/feature_tracker.h\n+++ b/feature_tracker/src/feature_tracker.h\n@@ -32,9 +32,9 @@ class FeatureTracker\n \n     void readImage(const cv::Mat &_img, const cv::Mat &_mask, double _cur_time);\n \n-    void setMask(const cv::Mat &_mask);\n+    void setMask();\n \n-    void addPoints();\n+    void addPoints(const cv::Mat &_mask);\n \n     bool updateID(unsigned int i);\n \n-- \n2.17.1\n\n\nFrom cac43fd693103c98cbf95fa628b5f953681b3384 Mon Sep 17 00:00:00 2001\nFrom: Karnik Ram <karnikram@gmail.com>\nDate: Sun, 29 Nov 2020 11:29:04 +0530\nSubject: [PATCH 14/42] Changes for debugging\n\n---\n vins_estimator/src/estimator.cpp          | 17 ++++++++++-------\n vins_estimator/src/initial/solve_5pts.cpp |  7 ++++---\n 2 files changed, 14 insertions(+), 10 deletions(-)\n\ndiff --git a/vins_estimator/src/estimator.cpp b/vins_estimator/src/estimator.cpp\nindex 096ef10..5b302ea 100644\n--- a/vins_estimator/src/estimator.cpp\n+++ b/vins_estimator/src/estimator.cpp\n@@ -275,9 +275,12 @@ bool Estimator::initialStructure()\n         return false;\n     }\n     GlobalSFM sfm;\n-    if(!sfm.constructH(frame_count + 1, Q, T, l,\n-              relative_R, relative_T, n,\n-              sfm_f, sfm_tracked_points))\n+    //if(!sfm.constructH(frame_count + 1, Q, T, l,\n+    //          relative_R, relative_T, n,\n+    //          sfm_f, sfm_tracked_points))\n+    if(!sfm.construct(frame_count + 1, Q, T, l,\n+                relative_R, relative_T,\n+                sfm_f, sfm_tracked_points))\n     {\n         ROS_DEBUG(\"global SFM failed!\");\n         marginalization_flag = MARGIN_OLD;\n@@ -808,8 +811,8 @@ void Estimator::optimization()\n                 ProjectionFactor *f = new ProjectionFactor(pts_i, pts_j);\n                 problem.AddResidualBlock(f, loss_function, para_Pose[imu_i], para_Pose[imu_j], para_Ex_Pose[0], 
para_Feature[feature_index]);\n \n-                ceres::CostFunction *h_cost_function = HomographyFactor::Create(pts_i, pts_j);\n-                problem.AddResidualBlock(h_cost_function, loss_function, para_Pose[imu_i], para_Pose[imu_j], para_N, para_Ex_Pose[0]);\n+                //ceres::CostFunction *h_cost_function = HomographyFactor::Create(pts_i, pts_j);\n+                //problem.AddResidualBlock(h_cost_function, loss_function, para_Pose[imu_i], para_Pose[imu_j], para_N, para_Ex_Pose[0]);\n             }\n             f_m_cnt++;\n         }\n@@ -846,7 +849,7 @@ void Estimator::optimization()\n                     ProjectionFactor *f = new ProjectionFactor(pts_i, pts_j);\n                     problem.AddResidualBlock(f, loss_function, para_Pose[start], relo_Pose, para_Ex_Pose[0], para_Feature[feature_index]);\n                     retrive_feature_index++;\n-                }     \n+                }\n             }\n         }\n \n@@ -882,7 +885,7 @@ void Estimator::optimization()\n \n         if (last_marginalization_info)\n         {\n-            vector<int> drop_set;\n+            vector<int> drop_set; \n             for (int i = 0; i < static_cast<int>(last_marginalization_parameter_blocks.size()); i++)\n             {\n                 if (last_marginalization_parameter_blocks[i] == para_Pose[0] ||\ndiff --git a/vins_estimator/src/initial/solve_5pts.cpp b/vins_estimator/src/initial/solve_5pts.cpp\nindex bdc87a3..57485af 100644\n--- a/vins_estimator/src/initial/solve_5pts.cpp\n+++ b/vins_estimator/src/initial/solve_5pts.cpp\n@@ -275,7 +275,7 @@ void MotionEstimator::decomposeH(const cv::Mat &H, const cv::Mat &K, const Matri\n \n             Vector3d t;\n             cv::cv2eigen(cv_ts[i], t);\n-            Tr.block(0,3,3,1) = t;\n+            Tr.block(0,3,3,1) = t.normalized();\n \n             //Tr = TrIC * Tr * TrIC.inverse();\n             Tr = Tr.inverse().eval();\n@@ -283,6 +283,7 @@ void MotionEstimator::decomposeH(const cv::Mat &H, const 
cv::Mat &K, const Matri\n             Vector3d e3(0, 0, 1);\n             Vector3d n;\n             cv::cv2eigen(cv_ns[i], n);\n+            n.normalize();\n             if(n.dot(e3) > 0)\n                 positive_depth_transforms.push_back(Tr);\n                 positive_depth_normals.push_back(n);\n@@ -311,13 +312,13 @@ void MotionEstimator::decomposeH(const cv::Mat &H, const cv::Mat &K, const Matri\n \n         Vector3d t;\n         cv::cv2eigen(cv_ts[0], t);\n-        Tr.block(0,3,3,1) = t;\n+        Tr.block(0,3,3,1) = t.normalized();\n \n         Vector3d n;\n         cv::cv2eigen(cv_ns[0], n);\n \n         //Tr = TrIC * Tr * TrIC.inverse();\n         est_Tr = Tr.inverse().eval();\n-        est_n = n;\n+        est_n = n.normalized();\n     }\n }\n-- \n2.17.1\n\n\nFrom 592df9a9008d83b575ec808197e53cfdf0084dff Mon Sep 17 00:00:00 2001\nFrom: Karnik Ram <karnikram@gmail.com>\nDate: Tue, 1 Dec 2020 11:28:04 +0530\nSubject: [PATCH 15/42] Change K, avg parallax, and decomposition parameters\n\n---\n feature_tracker/src/feature_tracker.cpp   |  2 +-\n vins_estimator/src/estimator.cpp          | 15 ++++++---------\n vins_estimator/src/initial/solve_5pts.cpp | 11 +++++++----\n 3 files changed, 14 insertions(+), 14 deletions(-)\n\ndiff --git a/feature_tracker/src/feature_tracker.cpp b/feature_tracker/src/feature_tracker.cpp\nindex 4384928..50251ca 100644\n--- a/feature_tracker/src/feature_tracker.cpp\n+++ b/feature_tracker/src/feature_tracker.cpp\n@@ -173,7 +173,7 @@ void FeatureTracker::readImage(const cv::Mat &_img, const cv::Mat &_mask, double\n \n void FeatureTracker::rejectWithF()\n {\n-    if (forw_pts.size() >= 8)\n+    if (forw_pts.size() >= 4)\n     {\n         ROS_DEBUG(\"FM ransac begins\");\n         TicToc t_f;\ndiff --git a/vins_estimator/src/estimator.cpp b/vins_estimator/src/estimator.cpp\nindex 5b302ea..d493036 100644\n--- a/vins_estimator/src/estimator.cpp\n+++ b/vins_estimator/src/estimator.cpp\n@@ -275,12 +275,9 @@ bool 
Estimator::initialStructure()\n         return false;\n     }\n     GlobalSFM sfm;\n-    //if(!sfm.constructH(frame_count + 1, Q, T, l,\n-    //          relative_R, relative_T, n,\n-    //          sfm_f, sfm_tracked_points))\n-    if(!sfm.construct(frame_count + 1, Q, T, l,\n-                relative_R, relative_T,\n-                sfm_f, sfm_tracked_points))\n+    if(!sfm.constructH(frame_count + 1, Q, T, l,\n+              relative_R, relative_T, n,\n+              sfm_f, sfm_tracked_points))\n     {\n         ROS_DEBUG(\"global SFM failed!\");\n         marginalization_flag = MARGIN_OLD;\n@@ -507,7 +504,7 @@ bool Estimator::relativeHPose(Matrix3d &relative_R, Vector3d &relative_T, Vector\n             for(int k = WINDOW_SIZE - 1; k > i; k--)\n                 R_imu = R_imu * (pre_integrations[k]->delta_q).toRotationMatrix();\n \n-            if(average_parallax * 660 > 30 && m_estimator.solveRelativeHRT(corres, R_imu, TrIC, relative_R, relative_T, n))\n+            if(average_parallax * 480 > 30 && m_estimator.solveRelativeHRT(corres, R_imu, TrIC, relative_R, relative_T, n))\n             {\n                 l = i;\n                 ROS_DEBUG(\"average_parallax %f choose l %d and newest frame to triangulate the whole structure\", average_parallax * 460, l);\n@@ -811,8 +808,8 @@ void Estimator::optimization()\n                 ProjectionFactor *f = new ProjectionFactor(pts_i, pts_j);\n                 problem.AddResidualBlock(f, loss_function, para_Pose[imu_i], para_Pose[imu_j], para_Ex_Pose[0], para_Feature[feature_index]);\n \n-                //ceres::CostFunction *h_cost_function = HomographyFactor::Create(pts_i, pts_j);\n-                //problem.AddResidualBlock(h_cost_function, loss_function, para_Pose[imu_i], para_Pose[imu_j], para_N, para_Ex_Pose[0]);\n+                ceres::CostFunction *h_cost_function = HomographyFactor::Create(pts_i, pts_j);\n+                problem.AddResidualBlock(h_cost_function, loss_function, para_Pose[imu_i], 
para_Pose[imu_j], para_N, para_Ex_Pose[0]);\n             }\n             f_m_cnt++;\n         }\ndiff --git a/vins_estimator/src/initial/solve_5pts.cpp b/vins_estimator/src/initial/solve_5pts.cpp\nindex 57485af..94a2a04 100644\n--- a/vins_estimator/src/initial/solve_5pts.cpp\n+++ b/vins_estimator/src/initial/solve_5pts.cpp\n@@ -240,7 +240,8 @@ bool MotionEstimator::solveRelativeHRT(const vector<pair<Vector3d, Vector3d>> &c\n         }\n         cv::Mat mask;\n         //cv::Mat E = cv::findFundamentalMat(ll, rr, cv::FM_RANSAC, 0.3 / 460, 0.99, mask);\n-        cv::Mat H = cv::findHomography(ll, rr, cv::RANSAC, 0.3/665);\n+        cv::Mat H = cv::findHomography(ll, rr, cv::RANSAC, 0.3/480);\n+        //cv::Mat K = (cv::Mat_<double>(3, 3) << 320, 0, 320, 0, 320, 240, 0, 0, 1);\n         cv::Mat K = (cv::Mat_<double>(3, 3) << 1, 0, 0, 0, 1, 0, 0, 0, 1);\n         //cv::Mat K = (cv::Mat_<double>(3,3) << 665, 0, 511, 0, 665, 383, 0, 0, 1);\n \n@@ -275,7 +276,7 @@ void MotionEstimator::decomposeH(const cv::Mat &H, const cv::Mat &K, const Matri\n \n             Vector3d t;\n             cv::cv2eigen(cv_ts[i], t);\n-            Tr.block(0,3,3,1) = t.normalized();\n+            Tr.block(0,3,3,1) = t;\n \n             //Tr = TrIC * Tr * TrIC.inverse();\n             Tr = Tr.inverse().eval();\n@@ -285,8 +286,10 @@ void MotionEstimator::decomposeH(const cv::Mat &H, const cv::Mat &K, const Matri\n             cv::cv2eigen(cv_ns[i], n);\n             n.normalize();\n             if(n.dot(e3) > 0)\n+            {\n                 positive_depth_transforms.push_back(Tr);\n                 positive_depth_normals.push_back(n);\n+            }\n         }\n \n         vector<double> rot_diff;\n@@ -300,7 +303,7 @@ void MotionEstimator::decomposeH(const cv::Mat &H, const cv::Mat &K, const Matri\n \n         int min_index = std::min_element(rot_diff.begin(), rot_diff.end()) - rot_diff.begin();\n         est_Tr = positive_depth_transforms[min_index];\n-        est_n = 
positive_depth_normals[min_index];\n+        est_n = positive_depth_normals[min_index].normalized();\n     }\n \n     else\n@@ -312,7 +315,7 @@ void MotionEstimator::decomposeH(const cv::Mat &H, const cv::Mat &K, const Matri\n \n         Vector3d t;\n         cv::cv2eigen(cv_ts[0], t);\n-        Tr.block(0,3,3,1) = t.normalized();\n+        Tr.block(0,3,3,1) = t;\n \n         Vector3d n;\n         cv::cv2eigen(cv_ns[0], n);\n-- \n2.17.1\n\n\nFrom 0cf646f9323df0d94551e541e1221febd4ad3e7a Mon Sep 17 00:00:00 2001\nFrom: Karnik Ram <karnikram@gmail.com>\nDate: Wed, 2 Dec 2020 08:17:02 +0530\nSubject: [PATCH 16/42] Use h only for two-frame init\n\n---\n config/vins_rviz_config.rviz     | 28 +++++++++++++++++++---------\n vins_estimator/src/estimator.cpp |  6 +++---\n 2 files changed, 22 insertions(+), 12 deletions(-)\n\ndiff --git a/config/vins_rviz_config.rviz b/config/vins_rviz_config.rviz\nindex 92ff49d..55ed131 100644\n--- a/config/vins_rviz_config.rviz\n+++ b/config/vins_rviz_config.rviz\n@@ -3,7 +3,8 @@ Panels:\n     Help Height: 0\n     Name: Displays\n     Property Tree Widget:\n-      Expanded: ~\n+      Expanded:\n+        - /VIO1\n       Splitter Ratio: 0.4651159942150116\n     Tree Height: 151\n   - Class: rviz/Selection\n@@ -24,7 +25,7 @@ Panels:\n     Experimental: false\n     Name: Time\n     SyncMode: 0\n-    SyncSource: \"\"\n+    SyncSource: tracked image\n   - Class: rviz/Displays\n     Help Height: 78\n     Name: Displays\n@@ -141,7 +142,7 @@ Visualization Manager:\n           Marker Topic: /vins_estimator/camera_pose_visual\n           Name: camera_visual\n           Namespaces:\n-            {}\n+            CameraPoseVisualization: true\n           Queue Size: 100\n           Value: true\n         - Alpha: 1\n@@ -186,7 +187,7 @@ Visualization Manager:\n           Color: 255; 255; 255\n           Color Transformer: FlatColor\n           Decay Time: 3000\n-          Enabled: false\n+          Enabled: true\n           Invert Rainbow: true\n         
  Max Color: 0; 0; 0\n           Max Intensity: 4096\n@@ -203,19 +204,28 @@ Visualization Manager:\n           Unreliable: false\n           Use Fixed Frame: true\n           Use rainbow: false\n-          Value: false\n+          Value: true\n         - Class: rviz/TF\n           Enabled: true\n           Frame Timeout: 15\n           Frames:\n             All Enabled: true\n+            body:\n+              Value: true\n+            camera:\n+              Value: true\n+            world:\n+              Value: true\n           Marker Scale: 1\n           Name: TF\n           Show Arrows: true\n           Show Axes: true\n           Show Names: true\n           Tree:\n-            {}\n+            world:\n+              body:\n+                camera:\n+                  {}\n           Update Interval: 0\n           Value: true\n       Enabled: true\n@@ -470,7 +480,7 @@ Visualization Manager:\n   Views:\n     Current:\n       Class: rviz/XYOrbit\n-      Distance: 20.968406677246094\n+      Distance: 117.1939697265625\n       Enable Stereo Rendering:\n         Stereo Eye Separation: 0.05999999865889549\n         Stereo Focal Distance: 1\n@@ -485,10 +495,10 @@ Visualization Manager:\n       Invert Z Axis: false\n       Name: Current View\n       Near Clip Distance: 0.009999999776482582\n-      Pitch: 1.0797961950302124\n+      Pitch: 0.01979677751660347\n       Target Frame: <Fixed Frame>\n       Value: XYOrbit (rviz)\n-      Yaw: 3.087226629257202\n+      Yaw: 3.0572268962860107\n     Saved: ~\n Window Geometry:\n   Displays:\ndiff --git a/vins_estimator/src/estimator.cpp b/vins_estimator/src/estimator.cpp\nindex d493036..84fa09e 100644\n--- a/vins_estimator/src/estimator.cpp\n+++ b/vins_estimator/src/estimator.cpp\n@@ -275,8 +275,8 @@ bool Estimator::initialStructure()\n         return false;\n     }\n     GlobalSFM sfm;\n-    if(!sfm.constructH(frame_count + 1, Q, T, l,\n-              relative_R, relative_T, n,\n+    if(!sfm.construct(frame_count + 1, Q, T, 
l,\n+              relative_R, relative_T,\n               sfm_f, sfm_tracked_points))\n     {\n         ROS_DEBUG(\"global SFM failed!\");\n@@ -809,7 +809,7 @@ void Estimator::optimization()\n                 problem.AddResidualBlock(f, loss_function, para_Pose[imu_i], para_Pose[imu_j], para_Ex_Pose[0], para_Feature[feature_index]);\n \n                 ceres::CostFunction *h_cost_function = HomographyFactor::Create(pts_i, pts_j);\n-                problem.AddResidualBlock(h_cost_function, loss_function, para_Pose[imu_i], para_Pose[imu_j], para_N, para_Ex_Pose[0]);\n+                //problem.AddResidualBlock(h_cost_function, loss_function, para_Pose[imu_i], para_Pose[imu_j], para_N, para_Ex_Pose[0]);\n             }\n             f_m_cnt++;\n         }\n-- \n2.17.1\n\n\nFrom d625b4dc0982f33cdd17d64620d40de34244f78c Mon Sep 17 00:00:00 2001\nFrom: Karnik Ram <karnikram@gmail.com>\nDate: Thu, 17 Dec 2020 13:42:40 +0530\nSubject: [PATCH 17/42] Change to qn parameterization, with plane depth\n\n---\n vins_estimator/src/estimator.cpp              | 22 +++++++++++-------\n vins_estimator/src/estimator.h                |  4 +++-\n vins_estimator/src/factor/homography_factor.h | 12 ++++++----\n vins_estimator/src/initial/initial_sfm.cpp    | 23 ++++++++++---------\n vins_estimator/src/initial/initial_sfm.h      |  4 ++--\n vins_estimator/src/initial/solve_5pts.cpp     |  4 +---\n 6 files changed, 40 insertions(+), 29 deletions(-)\n\ndiff --git a/vins_estimator/src/estimator.cpp b/vins_estimator/src/estimator.cpp\nindex 84fa09e..7a5245c 100644\n--- a/vins_estimator/src/estimator.cpp\n+++ b/vins_estimator/src/estimator.cpp\n@@ -275,17 +275,18 @@ bool Estimator::initialStructure()\n         return false;\n     }\n     GlobalSFM sfm;\n-    if(!sfm.construct(frame_count + 1, Q, T, l,\n-              relative_R, relative_T,\n+    if(!sfm.constructH(frame_count + 1, Q, T, l,\n+              relative_R, relative_T, n,\n               sfm_f, sfm_tracked_points))\n     {\n         
ROS_DEBUG(\"global SFM failed!\");\n         marginalization_flag = MARGIN_OLD;\n         return false;\n     }\n-    para_N[0] = n[0];\n-    para_N[1] = n[1];\n-    para_N[2] = n[2];\n+    para_N[0] = 0;\n+    para_N[1] = n[0];\n+    para_N[2] = n[1];\n+    para_N[3] = n[2];\n \n     //solve pnp for all frame\n     map<double, ImageFrame>::iterator frame_it;\n@@ -400,7 +401,8 @@ bool Estimator::visualInitialAlign()\n     f_manager.setRic(ric);\n     f_manager.triangulate(Ps, &(TIC_TMP[0]), &(RIC[0]));\n \n-    double s = (x.tail<1>())(0);\n+    s = (x.tail<1>())(0);\n+    para_s[0] = s;\n     for (int i = 0; i <= WINDOW_SIZE; i++)\n     {\n         pre_integrations[i]->repropagate(Vector3d::Zero(), Bgs[i]);\n@@ -504,7 +506,7 @@ bool Estimator::relativeHPose(Matrix3d &relative_R, Vector3d &relative_T, Vector\n             for(int k = WINDOW_SIZE - 1; k > i; k--)\n                 R_imu = R_imu * (pre_integrations[k]->delta_q).toRotationMatrix();\n \n-            if(average_parallax * 480 > 30 && m_estimator.solveRelativeHRT(corres, R_imu, TrIC, relative_R, relative_T, n))\n+            if(average_parallax * 460 > 30 && m_estimator.solveRelativeHRT(corres, R_imu, TrIC, relative_R, relative_T, n))\n             {\n                 l = i;\n                 ROS_DEBUG(\"average_parallax %f choose l %d and newest frame to triangulate the whole structure\", average_parallax * 460, l);\n@@ -724,6 +726,10 @@ void Estimator::optimization()\n         problem.AddParameterBlock(para_Pose[i], SIZE_POSE, local_parameterization);\n         problem.AddParameterBlock(para_SpeedBias[i], SIZE_SPEEDBIAS);\n     }\n+\n+    ceres::LocalParameterization *local_n_parameterization = new ceres::QuaternionParameterization();\n+    problem.AddParameterBlock(para_N, 4, local_n_parameterization);\n+\n     for (int i = 0; i < NUM_OF_CAM; i++)\n     {\n         ceres::LocalParameterization *local_parameterization = new PoseLocalParameterization();\n@@ -809,7 +815,7 @@ void 
Estimator::optimization()\n                 problem.AddResidualBlock(f, loss_function, para_Pose[imu_i], para_Pose[imu_j], para_Ex_Pose[0], para_Feature[feature_index]);\n \n                 ceres::CostFunction *h_cost_function = HomographyFactor::Create(pts_i, pts_j);\n-                //problem.AddResidualBlock(h_cost_function, loss_function, para_Pose[imu_i], para_Pose[imu_j], para_N, para_Ex_Pose[0]);\n+                problem.AddResidualBlock(h_cost_function, loss_function, para_Pose[imu_i], para_Pose[imu_j], para_N, para_Feature[feature_index], para_Ex_Pose[0]);\n             }\n             f_m_cnt++;\n         }\ndiff --git a/vins_estimator/src/estimator.h b/vins_estimator/src/estimator.h\nindex f3feedf..ba73f41 100644\n--- a/vins_estimator/src/estimator.h\n+++ b/vins_estimator/src/estimator.h\n@@ -79,6 +79,7 @@ class Estimator\n     Vector3d Bas[(WINDOW_SIZE + 1)];\n     Vector3d Bgs[(WINDOW_SIZE + 1)];\n     double td;\n+    double s;\n \n     Matrix3d back_R0, last_R, last_R0;\n     Vector3d back_P0, last_P, last_P0;\n@@ -115,7 +116,8 @@ class Estimator\n     double para_Retrive_Pose[SIZE_POSE];\n     double para_Td[1][1];\n     double para_Tr[1][1];\n-    double para_N[3];\n+    double para_N[4];\n+    double para_s[1];\n \n     int loop_window_index;\n \ndiff --git a/vins_estimator/src/factor/homography_factor.h b/vins_estimator/src/factor/homography_factor.h\nindex 2a35d40..2fdccb5 100644\n--- a/vins_estimator/src/factor/homography_factor.h\n+++ b/vins_estimator/src/factor/homography_factor.h\n@@ -7,7 +7,7 @@ struct HomographyFactor\n     HomographyFactor(const Eigen::Vector3d &_pts_i, const Eigen::Vector3d &_pts_j) : pts_i(_pts_i), pts_j(_pts_j) {}\n \n     template <typename T>\n-        bool operator()(const T* const pose_i, const T* const pose_j, const T* const para_n, const T* const ex_pose, \n+        bool operator()(const T* const pose_i, const T* const pose_j, const T* const para_n, const T* const para_inv_depth, const T* const ex_pose, \n     
    T* residuals) const\n         {\n             Eigen::Map<const Eigen::Matrix<T, 3, 1>> pi(pose_i);\n@@ -22,14 +22,18 @@ struct HomographyFactor\n             Eigen::Quaternion<T> qic;\n             qic.coeffs() << ex_pose[3], ex_pose[4], ex_pose[5], ex_pose[6];\n \n-            Eigen::Map<const Eigen::Matrix<T, 3, 1>> n(para_n);\n+            Eigen::Map<const Eigen::Matrix<T, 4, 1>> q_n(para_n);\n+            Eigen::Matrix<T, 3, 1> n;\n+            n << q_n[1], q_n[2], q_n[3];\n             Eigen::Matrix<T, 3, 1> n_imu_1 = qic*(n.normalized()) + tic;\n             Eigen::Matrix<T, 3, 1> n_imu_i = qi.inverse()*n_imu_1 - qi.inverse()*pi;\n \n+            Eigen::Map<const Eigen::Matrix<T, 1, 1>> inv_depth(para_inv_depth);\n+\n             Eigen::Quaternion<T> qji = qj.inverse() * qi;\n             Eigen::Matrix<T, 3, 1> tji = qj.inverse() * (pi - pj);\n             Eigen::Matrix<T, 3, 1> pts_imu_i = qic * pts_i.cast<T>() + tic;\n-            Eigen::Matrix<T, 3, 1> pts_imu_j = qji * pts_imu_i + tji * n_imu_i.transpose() * pts_imu_i;\n+            Eigen::Matrix<T, 3, 1> pts_imu_j = qji * pts_imu_i + ((tji*inv_depth[0]) * n_imu_i.transpose()) * pts_imu_i;\n             Eigen::Matrix<T, 3, 1> pts_cam_j = qic.inverse() * (pts_imu_j - tic);\n \n             pts_cam_j = (pts_cam_j / pts_cam_j[2]);\n@@ -41,7 +45,7 @@ struct HomographyFactor\n \n     static ceres::CostFunction* Create(const Eigen::Vector3d &_pts_i, const Eigen::Vector3d &_pts_j)\n     {\n-        return (new ceres::AutoDiffCostFunction<HomographyFactor, 2, 7, 7, 3, 7>\n+        return (new ceres::AutoDiffCostFunction<HomographyFactor, 2, 7, 7, 4, 1, 7>\n                 (new HomographyFactor(_pts_i, _pts_j)));\n     }\n \ndiff --git a/vins_estimator/src/initial/initial_sfm.cpp b/vins_estimator/src/initial/initial_sfm.cpp\nindex 76a4f88..2877082 100644\n--- a/vins_estimator/src/initial/initial_sfm.cpp\n+++ b/vins_estimator/src/initial/initial_sfm.cpp\n@@ -342,7 +342,7 @@ bool GlobalSFM::constructH(int 
frame_num, Quaterniond* q, Vector3d* T, int l,\n \tQuaterniond c_Quat[frame_num];\n \tdouble c_rotation[frame_num][4];\n \tdouble c_translation[frame_num][3];\n-    double c_normal[3];\n+    double c_normal[4];\n \tEigen::Matrix<double, 3, 4> Pose[frame_num];\n \n \tc_Quat[l] = q[l].inverse();\n@@ -462,10 +462,11 @@ bool GlobalSFM::constructH(int frame_num, Quaterniond* q, Vector3d* T, int l,\n \t\t}\n \t}\n \n-    c_normal[0] = n[0];\n-    c_normal[1] = n[1];\n-    c_normal[2] = n[2];\n-    problem.AddParameterBlock(c_normal, 3);\n+\tc_normal[0] = 0;\n+    c_normal[1] = n[0];\n+    c_normal[2] = n[1];\n+    c_normal[3] = n[2];\n+    problem.AddParameterBlock(c_normal, 4, local_parameterization);\n \n \tfor (int i = 0; i < feature_num; i++)\n \t{\n@@ -479,14 +480,14 @@ bool GlobalSFM::constructH(int frame_num, Quaterniond* q, Vector3d* T, int l,\n \t\t\t\t\t\t\t\t\t\t\t\tsfm_f[i].observation[j].second.x(),\n \t\t\t\t\t\t\t\t\t\t\t\tsfm_f[i].observation[j].second.y());\n \n-    \t\tproblem.AddResidualBlock(b_cost_function, NULL, c_rotation[l], c_translation[l],\n-    \t\t\t\t\t\t\t\tsfm_f[i].position); \n+    \t\t//problem.AddResidualBlock(b_cost_function, NULL, c_rotation[l], c_translation[l],\n+    \t\t//\t\t\t\t\t\tsfm_f[i].position); \n \n \t\t\tceres::CostFunction* h_cost_function = ReprojectionErrorH::Create(\n \t\t\t\t\t\t\t\t\t\t\t\tsfm_f[i].observation[j].second.x(),\n \t\t\t\t\t\t\t\t\t\t\t\tsfm_f[i].observation[j].second.y());\n \n-    \t\tproblem.AddResidualBlock(h_cost_function, NULL, c_rotation[l], c_translation[l], c_normal, sfm_f[i].image);\n+    \t\tproblem.AddResidualBlock(h_cost_function, NULL, c_rotation[l], c_translation[l], c_normal, sfm_f[i].position);\n \t\t}\n \t}\n \tceres::Solver::Options options;\n@@ -526,9 +527,9 @@ bool GlobalSFM::constructH(int frame_num, Quaterniond* q, Vector3d* T, int l,\n \t\t\tsfm_tracked_points[sfm_f[i].id] = Vector3d(sfm_f[i].position[0], sfm_f[i].position[1], sfm_f[i].position[2]);\n \t}\n \n-    n[0] = 
c_normal[0];\n-    n[1] = c_normal[1];\n-    n[2] = c_normal[2];\n+    n[0] = c_normal[1];\n+    n[1] = c_normal[2];\n+    n[2] = c_normal[3];\n \n \treturn true;\n \ndiff --git a/vins_estimator/src/initial/initial_sfm.h b/vins_estimator/src/initial/initial_sfm.h\nindex e79dcea..f191434 100644\n--- a/vins_estimator/src/initial/initial_sfm.h\n+++ b/vins_estimator/src/initial/initial_sfm.h\n@@ -66,7 +66,7 @@ struct ReprojectionErrorH\n \t{\n \t\tT p[3];\n         ceres::QuaternionRotatePoint(R, point, p);\n-        T np = n[0] * point[0] + n[1] * point[1] + n[2] * point[2];\n+        T np = n[1] * point[0] + n[2] * point[1] + n[3] * point[2];\n         p[0] += t[0]*np; p[1] += t[1]*np; p[2] += t[2]*np;\n \t\tT xp = p[0] / p[2];\n     \tT yp = p[1] / p[2];\n@@ -79,7 +79,7 @@ struct ReprojectionErrorH\n \t                                   const double observed_y) \n \t{\n \t  return (new ceres::AutoDiffCostFunction<\n-\t          ReprojectionErrorH, 2, 4, 3, 3, 3>(\n+\t          ReprojectionErrorH, 2, 4, 3, 4, 3>(\n \t          \tnew ReprojectionErrorH(observed_x,observed_y)));\n \t}\n \ndiff --git a/vins_estimator/src/initial/solve_5pts.cpp b/vins_estimator/src/initial/solve_5pts.cpp\nindex 94a2a04..d05faa5 100644\n--- a/vins_estimator/src/initial/solve_5pts.cpp\n+++ b/vins_estimator/src/initial/solve_5pts.cpp\n@@ -241,9 +241,7 @@ bool MotionEstimator::solveRelativeHRT(const vector<pair<Vector3d, Vector3d>> &c\n         cv::Mat mask;\n         //cv::Mat E = cv::findFundamentalMat(ll, rr, cv::FM_RANSAC, 0.3 / 460, 0.99, mask);\n         cv::Mat H = cv::findHomography(ll, rr, cv::RANSAC, 0.3/480);\n-        //cv::Mat K = (cv::Mat_<double>(3, 3) << 320, 0, 320, 0, 320, 240, 0, 0, 1);\n         cv::Mat K = (cv::Mat_<double>(3, 3) << 1, 0, 0, 0, 1, 0, 0, 0, 1);\n-        //cv::Mat K = (cv::Mat_<double>(3,3) << 665, 0, 511, 0, 665, 383, 0, 0, 1);\n \n         Eigen::Matrix4d est_Tr;\n         Eigen::Vector3d est_n;\n@@ -303,7 +301,7 @@ void 
MotionEstimator::decomposeH(const cv::Mat &H, const cv::Mat &K, const Matri\n \n         int min_index = std::min_element(rot_diff.begin(), rot_diff.end()) - rot_diff.begin();\n         est_Tr = positive_depth_transforms[min_index];\n-        est_n = positive_depth_normals[min_index].normalized();\n+        est_n = positive_depth_normals[min_index];\n     }\n \n     else\n-- \n2.17.1\n\n\nFrom 208ef88ed7f9fc41793852797b86c3d411f1c1e4 Mon Sep 17 00:00:00 2001\nFrom: Karnik Ram <karnikram@gmail.com>\nDate: Fri, 1 Jan 2021 12:53:32 +0530\nSubject: [PATCH 18/42] Use mean point vector to disambiguate H, change\n thresholds\n\n---\n feature_tracker/src/feature_tracker.cpp   |  2 +-\n vins_estimator/src/initial/solve_5pts.cpp | 20 ++++++++++++++++----\n vins_estimator/src/initial/solve_5pts.h   |  2 +-\n 3 files changed, 18 insertions(+), 6 deletions(-)\n\ndiff --git a/feature_tracker/src/feature_tracker.cpp b/feature_tracker/src/feature_tracker.cpp\nindex 50251ca..afc174e 100644\n--- a/feature_tracker/src/feature_tracker.cpp\n+++ b/feature_tracker/src/feature_tracker.cpp\n@@ -194,7 +194,7 @@ void FeatureTracker::rejectWithF()\n \n         vector<uchar> status;\n         //cv::findFundamentalMat(un_cur_pts, un_forw_pts, cv::FM_RANSAC, F_THRESHOLD, 0.99, status);\n-        cv::findHomography(un_cur_pts, un_forw_pts, cv::RANSAC, 3, status);\n+        cv::findHomography(un_cur_pts, un_forw_pts, cv::RANSAC, 0.99, status);\n         int size_a = cur_pts.size();\n         reduceVector(prev_pts, status);\n         reduceVector(cur_pts, status);\ndiff --git a/vins_estimator/src/initial/solve_5pts.cpp b/vins_estimator/src/initial/solve_5pts.cpp\nindex d05faa5..f86c2bf 100644\n--- a/vins_estimator/src/initial/solve_5pts.cpp\n+++ b/vins_estimator/src/initial/solve_5pts.cpp\n@@ -240,12 +240,23 @@ bool MotionEstimator::solveRelativeHRT(const vector<pair<Vector3d, Vector3d>> &c\n         }\n         cv::Mat mask;\n         //cv::Mat E = cv::findFundamentalMat(ll, rr, cv::FM_RANSAC, 0.3 
/ 460, 0.99, mask);\n-        cv::Mat H = cv::findHomography(ll, rr, cv::RANSAC, 0.3/480);\n+        cv::Mat H = cv::findHomography(ll, rr, cv::RANSAC, 0.3/460);\n         cv::Mat K = (cv::Mat_<double>(3, 3) << 1, 0, 0, 0, 1, 0, 0, 0, 1);\n \n+        // Compute mean point vector\n+        Eigen::Vector3d mean_l(0, 0, 1);\n+        for(cv::Point2f &point : ll)\n+        {\n+            mean_l(0) += point.x;\n+            mean_l(1) += point.y;\n+        }\n+\n+        mean_l(0) /= int(ll.size());\n+        mean_l(1) /= int(ll.size());\n+\n         Eigen::Matrix4d est_Tr;\n         Eigen::Vector3d est_n;\n-        decomposeH(H, K, R_imu, TrIC, est_Tr, est_n);\n+        decomposeH(H, K, R_imu, TrIC, mean_l, est_Tr, est_n);\n         Rotation = est_Tr.block(0,0,3,3);\n         Translation = est_Tr.block(0,3,3,1);\n         n = est_n;\n@@ -255,7 +266,8 @@ bool MotionEstimator::solveRelativeHRT(const vector<pair<Vector3d, Vector3d>> &c\n }\n \n \n-void MotionEstimator::decomposeH(const cv::Mat &H, const cv::Mat &K, const Matrix3d &R_imu, const Matrix4d &TrIC, Matrix4d &est_Tr, Vector3d &est_n)\n+void MotionEstimator::decomposeH(const cv::Mat &H, const cv::Mat &K, const Matrix3d &R_imu, \n+    const Matrix4d &TrIC, const Vector3d &mean_l, Matrix4d &est_Tr, Vector3d &est_n)\n {\n     vector<cv::Mat> cv_Rs, cv_ts, cv_ns;\n     int n_sols = cv::decomposeHomographyMat(H, K, cv_Rs, cv_ts, cv_ns);\n@@ -283,7 +295,7 @@ void MotionEstimator::decomposeH(const cv::Mat &H, const cv::Mat &K, const Matri\n             Vector3d n;\n             cv::cv2eigen(cv_ns[i], n);\n             n.normalize();\n-            if(n.dot(e3) > 0)\n+            if(n.dot(mean_l) > 0)\n             {\n                 positive_depth_transforms.push_back(Tr);\n                 positive_depth_normals.push_back(n);\ndiff --git a/vins_estimator/src/initial/solve_5pts.h b/vins_estimator/src/initial/solve_5pts.h\nindex f090370..0da7ba2 100644\n--- a/vins_estimator/src/initial/solve_5pts.h\n+++ 
b/vins_estimator/src/initial/solve_5pts.h\n@@ -16,7 +16,7 @@ class MotionEstimator\n \n     bool solveRelativeRT(const vector<pair<Vector3d, Vector3d>> &corres, Matrix3d &R, Vector3d &T);\n     bool solveRelativeHRT(const vector<pair<Vector3d, Vector3d>> &corres, const Matrix3d &R_imu, const Matrix4d &TrIC, Matrix3d &R, Vector3d &T, Vector3d &n);\n-    void decomposeH(const cv::Mat &H, const cv::Mat &K, const Matrix3d &R_imu, const  Matrix4d &TrIC, Matrix4d &est_Tr, Vector3d &est_n);\n+    void decomposeH(const cv::Mat &H, const cv::Mat &K, const Matrix3d &R_imu, const  Matrix4d &TrIC, const Vector3d &mean_l, Matrix4d &est_Tr, Vector3d &est_n);\n \n   private:\n     double testTriangulation(const vector<cv::Point2f> &l,\n-- \n2.17.1\n\n\nFrom c3c81a017e2a2ecf8d54ce1efacc0e602a8bea3a Mon Sep 17 00:00:00 2001\nFrom: Karnik Ram <karnikram@gmail.com>\nDate: Fri, 1 Jan 2021 12:56:14 +0530\nSubject: [PATCH 19/42] Propagate plane-d from initial window, instead of using\n per-point depth\n\n---\n vins_estimator/src/estimator.cpp              | 5 ++---\n vins_estimator/src/estimator.h                | 2 +-\n vins_estimator/src/factor/homography_factor.h | 4 +++-\n 3 files changed, 6 insertions(+), 5 deletions(-)\n\ndiff --git a/vins_estimator/src/estimator.cpp b/vins_estimator/src/estimator.cpp\nindex 7a5245c..7105514 100644\n--- a/vins_estimator/src/estimator.cpp\n+++ b/vins_estimator/src/estimator.cpp\n@@ -402,7 +402,7 @@ bool Estimator::visualInitialAlign()\n     f_manager.triangulate(Ps, &(TIC_TMP[0]), &(RIC[0]));\n \n     s = (x.tail<1>())(0);\n-    para_s[0] = s;\n+    para_d[0] = s;\n     for (int i = 0; i <= WINDOW_SIZE; i++)\n     {\n         pre_integrations[i]->repropagate(Vector3d::Zero(), Bgs[i]);\n@@ -462,7 +462,6 @@ bool Estimator::relativePose(Matrix3d &relative_R, Vector3d &relative_T, int &l)\n                 Vector2d pts_1(corres[j].second(0), corres[j].second(1));\n                 double parallax = (pts_0 - pts_1).norm();\n                 sum_parallax 
= sum_parallax + parallax;\n-\n             }\n             average_parallax = 1.0 * sum_parallax / int(corres.size());\n             if(average_parallax * 460 > 30 && m_estimator.solveRelativeRT(corres, relative_R, relative_T))\n@@ -815,7 +814,7 @@ void Estimator::optimization()\n                 problem.AddResidualBlock(f, loss_function, para_Pose[imu_i], para_Pose[imu_j], para_Ex_Pose[0], para_Feature[feature_index]);\n \n                 ceres::CostFunction *h_cost_function = HomographyFactor::Create(pts_i, pts_j);\n-                problem.AddResidualBlock(h_cost_function, loss_function, para_Pose[imu_i], para_Pose[imu_j], para_N, para_Feature[feature_index], para_Ex_Pose[0]);\n+                problem.AddResidualBlock(h_cost_function, loss_function, para_Pose[imu_i], para_Pose[imu_j], para_N, para_d, para_Ex_Pose[0]);\n             }\n             f_m_cnt++;\n         }\ndiff --git a/vins_estimator/src/estimator.h b/vins_estimator/src/estimator.h\nindex ba73f41..adad3cd 100644\n--- a/vins_estimator/src/estimator.h\n+++ b/vins_estimator/src/estimator.h\n@@ -117,7 +117,7 @@ class Estimator\n     double para_Td[1][1];\n     double para_Tr[1][1];\n     double para_N[4];\n-    double para_s[1];\n+    double para_d[1];\n \n     int loop_window_index;\n \ndiff --git a/vins_estimator/src/factor/homography_factor.h b/vins_estimator/src/factor/homography_factor.h\nindex 2fdccb5..ce6612b 100644\n--- a/vins_estimator/src/factor/homography_factor.h\n+++ b/vins_estimator/src/factor/homography_factor.h\n@@ -32,8 +32,10 @@ struct HomographyFactor\n \n             Eigen::Quaternion<T> qji = qj.inverse() * qi;\n             Eigen::Matrix<T, 3, 1> tji = qj.inverse() * (pi - pj);\n+            Eigen::Matrix<T, 1, 1> di;\n+            di(0,0) = inv_depth(0,0) - pi.dot(n_imu_i);\n             Eigen::Matrix<T, 3, 1> pts_imu_i = qic * pts_i.cast<T>() + tic;\n-            Eigen::Matrix<T, 3, 1> pts_imu_j = qji * pts_imu_i + ((tji*inv_depth[0]) * n_imu_i.transpose()) * pts_imu_i;\n+   
         Eigen::Matrix<T, 3, 1> pts_imu_j = qji * pts_imu_i + (tji*(1.0/di(0,0)) * n_imu_i.transpose()) * pts_imu_i;\n             Eigen::Matrix<T, 3, 1> pts_cam_j = qic.inverse() * (pts_imu_j - tic);\n \n             pts_cam_j = (pts_cam_j / pts_cam_j[2]);\n-- \n2.17.1\n\n\nFrom c9f6349946574ca616acb11987d76d40055e59cf Mon Sep 17 00:00:00 2001\nFrom: Karnik Ram <karnikram@gmail.com>\nDate: Thu, 7 Jan 2021 11:05:10 +0530\nSubject: [PATCH 20/42] Revert to previous thresholds\n\n---\n feature_tracker/src/feature_tracker.cpp    |  2 +-\n vins_estimator/src/estimator.cpp           |  2 --\n vins_estimator/src/initial/initial_sfm.cpp |  4 ++--\n vins_estimator/src/initial/initial_sfm.h   | 10 +++++-----\n vins_estimator/src/initial/solve_5pts.cpp  |  2 +-\n 5 files changed, 9 insertions(+), 11 deletions(-)\n\ndiff --git a/feature_tracker/src/feature_tracker.cpp b/feature_tracker/src/feature_tracker.cpp\nindex afc174e..50251ca 100644\n--- a/feature_tracker/src/feature_tracker.cpp\n+++ b/feature_tracker/src/feature_tracker.cpp\n@@ -194,7 +194,7 @@ void FeatureTracker::rejectWithF()\n \n         vector<uchar> status;\n         //cv::findFundamentalMat(un_cur_pts, un_forw_pts, cv::FM_RANSAC, F_THRESHOLD, 0.99, status);\n-        cv::findHomography(un_cur_pts, un_forw_pts, cv::RANSAC, 0.99, status);\n+        cv::findHomography(un_cur_pts, un_forw_pts, cv::RANSAC, 3, status);\n         int size_a = cur_pts.size();\n         reduceVector(prev_pts, status);\n         reduceVector(cur_pts, status);\ndiff --git a/vins_estimator/src/estimator.cpp b/vins_estimator/src/estimator.cpp\nindex 7105514..bfa5c1e 100644\n--- a/vins_estimator/src/estimator.cpp\n+++ b/vins_estimator/src/estimator.cpp\n@@ -779,7 +779,6 @@ void Estimator::optimization()\n         int imu_i = it_per_id.start_frame, imu_j = imu_i - 1;\n         \n         Vector3d pts_i = it_per_id.feature_per_frame[0].point;\n-        Vector3d img_pts_i(it_per_id.feature_per_frame[0].uv.x(), 
it_per_id.feature_per_frame[0].uv.y(), 1);\n \n         for (auto &it_per_frame : it_per_id.feature_per_frame)\n         {\n@@ -790,7 +789,6 @@ void Estimator::optimization()\n             }\n \n             Vector3d pts_j = it_per_frame.point;\n-            Vector3d img_pts_j(it_per_frame.uv.x(), it_per_frame.uv.y(), 1);\n \n             if (ESTIMATE_TD)\n             {\ndiff --git a/vins_estimator/src/initial/initial_sfm.cpp b/vins_estimator/src/initial/initial_sfm.cpp\nindex 2877082..ad9f729 100644\n--- a/vins_estimator/src/initial/initial_sfm.cpp\n+++ b/vins_estimator/src/initial/initial_sfm.cpp\n@@ -480,8 +480,8 @@ bool GlobalSFM::constructH(int frame_num, Quaterniond* q, Vector3d* T, int l,\n \t\t\t\t\t\t\t\t\t\t\t\tsfm_f[i].observation[j].second.x(),\n \t\t\t\t\t\t\t\t\t\t\t\tsfm_f[i].observation[j].second.y());\n \n-    \t\t//problem.AddResidualBlock(b_cost_function, NULL, c_rotation[l], c_translation[l],\n-    \t\t//\t\t\t\t\t\tsfm_f[i].position); \n+    \t\tproblem.AddResidualBlock(b_cost_function, NULL, c_rotation[l], c_translation[l],\n+    \t\t\t\t\t\t\t\tsfm_f[i].position); \n \n \t\t\tceres::CostFunction* h_cost_function = ReprojectionErrorH::Create(\n \t\t\t\t\t\t\t\t\t\t\t\tsfm_f[i].observation[j].second.x(),\ndiff --git a/vins_estimator/src/initial/initial_sfm.h b/vins_estimator/src/initial/initial_sfm.h\nindex f191434..a070742 100644\n--- a/vins_estimator/src/initial/initial_sfm.h\n+++ b/vins_estimator/src/initial/initial_sfm.h\n@@ -64,12 +64,12 @@ struct ReprojectionErrorH\n \ttemplate <typename T>\n \tbool operator()(const T* const R, const T* const t, const T* const n, const T* point, T* residuals) const\n \t{\n-\t\tT p[3];\n-        ceres::QuaternionRotatePoint(R, point, p);\n+\t\tT rp[3];\n+        ceres::QuaternionRotatePoint(R, point, rp);\n         T np = n[1] * point[0] + n[2] * point[1] + n[3] * point[2];\n-        p[0] += t[0]*np; p[1] += t[1]*np; p[2] += t[2]*np;\n-\t\tT xp = p[0] / p[2];\n-    \tT yp = p[1] / p[2];\n+        rp[0] += 
t[0]*np; rp[1] += t[1]*np; rp[2] += t[2]*np;\n+\t\tT xp = rp[0] / rp[2];\n+    \tT yp = rp[1] / rp[2];\n     \tresiduals[0] = xp - T(observed_u);\n     \tresiduals[1] = yp - T(observed_v);\n     \treturn true;\ndiff --git a/vins_estimator/src/initial/solve_5pts.cpp b/vins_estimator/src/initial/solve_5pts.cpp\nindex f86c2bf..a14ec1c 100644\n--- a/vins_estimator/src/initial/solve_5pts.cpp\n+++ b/vins_estimator/src/initial/solve_5pts.cpp\n@@ -240,7 +240,7 @@ bool MotionEstimator::solveRelativeHRT(const vector<pair<Vector3d, Vector3d>> &c\n         }\n         cv::Mat mask;\n         //cv::Mat E = cv::findFundamentalMat(ll, rr, cv::FM_RANSAC, 0.3 / 460, 0.99, mask);\n-        cv::Mat H = cv::findHomography(ll, rr, cv::RANSAC, 0.3/460);\n+        cv::Mat H = cv::findHomography(ll, rr, cv::RANSAC, 0.3/480);\n         cv::Mat K = (cv::Mat_<double>(3, 3) << 1, 0, 0, 0, 1, 0, 0, 0, 1);\n \n         // Compute mean point vector\n-- \n2.17.1\n\n\nFrom 2c6255f97f3f02da71082d34e861befc5e9accec Mon Sep 17 00:00:00 2001\nFrom: Karnik Ram <karnikram@gmail.com>\nDate: Fri, 8 Jan 2021 11:40:55 +0530\nSubject: [PATCH 21/42] Add code for generic mask to single mask conversion\n\n---\n feature_tracker/src/feature_tracker.cpp      | 40 ++++++++++++++++++--\n feature_tracker/src/feature_tracker_node.cpp |  2 +-\n 2 files changed, 37 insertions(+), 5 deletions(-)\n\ndiff --git a/feature_tracker/src/feature_tracker.cpp b/feature_tracker/src/feature_tracker.cpp\nindex 50251ca..178a7a7 100644\n--- a/feature_tracker/src/feature_tracker.cpp\n+++ b/feature_tracker/src/feature_tracker.cpp\n@@ -28,7 +28,6 @@ void reduceVector(vector<int> &v, vector<uchar> status)\n     v.resize(j);\n }\n \n-\n FeatureTracker::FeatureTracker()\n {\n }\n@@ -70,11 +69,44 @@ void FeatureTracker::setMask()\n \n void FeatureTracker::addPoints(const cv::Mat &_mask)\n {\n-    cv::Mat dynamic_mask;\n-    cv::erode(_mask, dynamic_mask, cv::Mat(), cv::Point(-1, -1), 2, 1, 1);\n+    // Find unique labels\n+    cv::Mat 
mask_copy = _mask.clone();\n+    std::sort(mask_copy.begin<uchar>(), mask_copy.end<uchar>());\n+    auto last = std::unique(mask_copy.begin<uchar>(), mask_copy.end<uchar>());\n+    std::vector<uchar> unique_labels(mask_copy.begin<uchar>(), last);\n+\n+    // Create mask for each label and erode, find largest mask\n+    std::map<uchar, cv::Mat> unique_masks;\n+    for(auto &label : unique_labels)\n+    {\n+        unique_masks.emplace(label, cv::Mat::zeros(_mask.size(), CV_8UC1));\n+        std::cout << int(label) << std::endl;\n+    }\n+\n+    for(int i = 0; i < _mask.rows; i++)\n+        for(int j = 0; j < _mask.cols; j++)\n+            unique_masks[_mask.at<uchar>(i,j)].at<uchar>(i,j) = 255;\n+\n+    uchar largest_label = 0;\n+    int largest_label_count = 0;\n+    for(auto &mask : unique_masks)\n+    {   \n+        if(mask.first == 0)\n+            continue;\n+        cv::erode(mask.second, mask.second, cv::Mat(), cv::Point(-1, -1), 2, 1, 1);\n+        mask.second);\n+        int label_count = cv::countNonZero(mask.second);\n+        if(label_count > largest_label_count)\n+        {\n+            largest_label_count = label_count;\n+            largest_label = mask.first;\n+        }\n+    }\n+\n+    // Track features that only belong to largest mask\n     for (auto &p : n_pts)\n     {\n-        if(dynamic_mask.at<uchar>(p) == 255)\n+        if(unique_masks[largest_label].at<uchar>(p) == 255)\n         {\n             forw_pts.push_back(p);\n             ids.push_back(-1);\ndiff --git a/feature_tracker/src/feature_tracker_node.cpp b/feature_tracker/src/feature_tracker_node.cpp\nindex a0dbd25..d1a9432 100644\n--- a/feature_tracker/src/feature_tracker_node.cpp\n+++ b/feature_tracker/src/feature_tracker_node.cpp\n@@ -86,7 +86,7 @@ void callback(const sensor_msgs::ImageConstPtr &img_msg, const sensor_msgs::Imag\n     {\n         ROS_DEBUG(\"processing camera %d\", i);\n         if (i != 1 || !STEREO_TRACK)\n-            
trackerData[i].readImage(ptr->image.rowRange(ROW * i, ROW * (i + 1)), mask_ptr->image, img_msg->header.stamp.toSec());\n+            trackerData[i].readImage(ptr->image.rowRange(ROW * i, ROW * (i + 1)), mask_ptr->image.rowRange(ROW*i,ROW*(i+1)), img_msg->header.stamp.toSec());\n         else\n         {\n             if (EQUALIZE)\n-- \n2.17.1\n\n\nFrom cb655fa6061ceb11c7ae7fbddd11af779d738430 Mon Sep 17 00:00:00 2001\nFrom: Karnik Ram <karnikram@gmail.com>\nDate: Wed, 13 Jan 2021 11:48:28 +0530\nSubject: [PATCH 22/42] Publish plane id for each feature\n\n---\n feature_tracker/src/feature_tracker.cpp      | 58 ++++++++++----------\n feature_tracker/src/feature_tracker.h        |  1 +\n feature_tracker/src/feature_tracker_node.cpp | 10 +++-\n 3 files changed, 37 insertions(+), 32 deletions(-)\n\ndiff --git a/feature_tracker/src/feature_tracker.cpp b/feature_tracker/src/feature_tracker.cpp\nindex 178a7a7..0de4262 100644\n--- a/feature_tracker/src/feature_tracker.cpp\n+++ b/feature_tracker/src/feature_tracker.cpp\n@@ -41,26 +41,31 @@ void FeatureTracker::setMask()\n     \n \n     // prefer to keep features that are tracked for long time\n-    vector<pair<int, pair<cv::Point2f, int>>> cnt_pts_id;\n+    vector<pair<int, pair<cv::Point2f, pair<int,int>>>> cnt_pts_id_pid;\n \n     for (unsigned int i = 0; i < forw_pts.size(); i++)\n-        cnt_pts_id.push_back(make_pair(track_cnt[i], make_pair(forw_pts[i], ids[i])));\n+        cnt_pts_id_pid.push_back(make_pair(track_cnt[i], \n+            make_pair(forw_pts[i], make_pair(ids[i], plane_ids[i]))));\n \n-    sort(cnt_pts_id.begin(), cnt_pts_id.end(), [](const pair<int, pair<cv::Point2f, int>> &a, const pair<int, pair<cv::Point2f, int>> &b)\n+    sort(cnt_pts_id_pid.begin(), cnt_pts_id_pid.end(), \n+        [](const pair<int, pair<cv::Point2f, pair<int,int>>> &a, \n+            const pair<int, pair<cv::Point2f, pair<int,int>>> &b)\n          {\n             return a.first > b.first;\n          });\n \n     
forw_pts.clear();\n     ids.clear();\n+    plane_ids.clear();\n     track_cnt.clear();\n \n-    for (auto &it : cnt_pts_id)\n+    for (auto &it : cnt_pts_id_pid)\n     {\n         if (mask.at<uchar>(it.second.first) == 255)\n         {\n             forw_pts.push_back(it.second.first);\n-            ids.push_back(it.second.second);\n+            ids.push_back(it.second.second.first);\n+            plane_ids.push_back(it.second.second.second);\n             track_cnt.push_back(it.first);\n             cv::circle(mask, it.second.first, MIN_DIST, 0, -1);\n         }\n@@ -75,42 +80,35 @@ void FeatureTracker::addPoints(const cv::Mat &_mask)\n     auto last = std::unique(mask_copy.begin<uchar>(), mask_copy.end<uchar>());\n     std::vector<uchar> unique_labels(mask_copy.begin<uchar>(), last);\n \n-    // Create mask for each label and erode, find largest mask\n+    // Create mask for each label\n     std::map<uchar, cv::Mat> unique_masks;\n     for(auto &label : unique_labels)\n-    {\n         unique_masks.emplace(label, cv::Mat::zeros(_mask.size(), CV_8UC1));\n-        std::cout << int(label) << std::endl;\n-    }\n \n     for(int i = 0; i < _mask.rows; i++)\n         for(int j = 0; j < _mask.cols; j++)\n             unique_masks[_mask.at<uchar>(i,j)].at<uchar>(i,j) = 255;\n \n-    uchar largest_label = 0;\n-    int largest_label_count = 0;\n-    for(auto &mask : unique_masks)\n-    {   \n-        if(mask.first == 0)\n-            continue;\n-        cv::erode(mask.second, mask.second, cv::Mat(), cv::Point(-1, -1), 2, 1, 1);\n-        mask.second);\n-        int label_count = cv::countNonZero(mask.second);\n-        if(label_count > largest_label_count)\n-        {\n-            largest_label_count = label_count;\n-            largest_label = mask.first;\n-        }\n-    }\n+    // Erode each mask to avoid dynamic edges\n+    for(auto &unique_mask : unique_masks)\n+        cv::erode(unique_mask.second, unique_mask.second, cv::Mat(), cv::Point(-1, -1), 2, 1, 1);\n \n-   
 // Track features that only belong to largest mask\n+    // Store plane id for each feature\n     for (auto &p : n_pts)\n     {\n-        if(unique_masks[largest_label].at<uchar>(p) == 255)\n+        for(auto &unique_mask : unique_masks)\n         {\n-            forw_pts.push_back(p);\n-            ids.push_back(-1);\n-            track_cnt.push_back(1);\n+            if(unique_mask.first == 0)\n+                continue;\n+                \n+            if(unique_mask.second.at<uchar>(p) == 255)\n+            {\n+                forw_pts.push_back(p);\n+                ids.push_back(-1);\n+                track_cnt.push_back(1);\n+                plane_ids.push_back(unique_mask.first);\n+                break;\n+            }\n         }\n     }\n }\n@@ -156,6 +154,7 @@ void FeatureTracker::readImage(const cv::Mat &_img, const cv::Mat &_mask, double\n         reduceVector(cur_pts, status);\n         reduceVector(forw_pts, status);\n         reduceVector(ids, status);\n+        reduceVector(plane_ids, status);\n         reduceVector(cur_un_pts, status);\n         reduceVector(track_cnt, status);\n         ROS_DEBUG(\"temporal optical flow costs: %fms\", t_o.toc());\n@@ -233,6 +232,7 @@ void FeatureTracker::rejectWithF()\n         reduceVector(forw_pts, status);\n         reduceVector(cur_un_pts, status);\n         reduceVector(ids, status);\n+        reduceVector(plane_ids, status);\n         reduceVector(track_cnt, status);\n         ROS_DEBUG(\"FM ransac: %d -> %lu: %f\", size_a, forw_pts.size(), 1.0 * forw_pts.size() / size_a);\n         ROS_DEBUG(\"FM ransac costs: %fms\", t_f.toc());\ndiff --git a/feature_tracker/src/feature_tracker.h b/feature_tracker/src/feature_tracker.h\nindex c98c7ac..5db3810 100644\n--- a/feature_tracker/src/feature_tracker.h\n+++ b/feature_tracker/src/feature_tracker.h\n@@ -54,6 +54,7 @@ class FeatureTracker\n     vector<cv::Point2f> prev_un_pts, cur_un_pts;\n     vector<cv::Point2f> pts_velocity;\n     vector<int> ids;\n+    
vector<int> plane_ids;\n     vector<int> track_cnt;\n     map<int, cv::Point2f> cur_un_pts_map;\n     map<int, cv::Point2f> prev_un_pts_map;\ndiff --git a/feature_tracker/src/feature_tracker_node.cpp b/feature_tracker/src/feature_tracker_node.cpp\nindex d1a9432..b9de94a 100644\n--- a/feature_tracker/src/feature_tracker_node.cpp\n+++ b/feature_tracker/src/feature_tracker_node.cpp\n@@ -118,6 +118,7 @@ void callback(const sensor_msgs::ImageConstPtr &img_msg, const sensor_msgs::Imag\n         pub_count++;\n         sensor_msgs::PointCloudPtr feature_points(new sensor_msgs::PointCloud);\n         sensor_msgs::ChannelFloat32 id_of_point;\n+        sensor_msgs::ChannelFloat32 plane_id_of_point; \n         sensor_msgs::ChannelFloat32 u_of_point;\n         sensor_msgs::ChannelFloat32 v_of_point;\n         sensor_msgs::ChannelFloat32 velocity_x_of_point;\n@@ -132,6 +133,7 @@ void callback(const sensor_msgs::ImageConstPtr &img_msg, const sensor_msgs::Imag\n             auto &un_pts = trackerData[i].cur_un_pts;\n             auto &cur_pts = trackerData[i].cur_pts;\n             auto &ids = trackerData[i].ids;\n+            auto &plane_ids = trackerData[i].plane_ids;\n             auto &pts_velocity = trackerData[i].pts_velocity;\n             for (unsigned int j = 0; j < ids.size(); j++)\n             {\n@@ -146,6 +148,7 @@ void callback(const sensor_msgs::ImageConstPtr &img_msg, const sensor_msgs::Imag\n \n                     feature_points->points.push_back(p);\n                     id_of_point.values.push_back(p_id * NUM_OF_CAM + i);\n+                    plane_id_of_point.values.push_back(plane_ids[j]);\n                     u_of_point.values.push_back(cur_pts[j].x);\n                     v_of_point.values.push_back(cur_pts[j].y);\n                     velocity_x_of_point.values.push_back(pts_velocity[j].x);\n@@ -154,6 +157,7 @@ void callback(const sensor_msgs::ImageConstPtr &img_msg, const sensor_msgs::Imag\n             }\n         }\n         
feature_points->channels.push_back(id_of_point);\n+        feature_points->channels.push_back(plane_id_of_point);\n         feature_points->channels.push_back(u_of_point);\n         feature_points->channels.push_back(v_of_point);\n         feature_points->channels.push_back(velocity_x_of_point);\n@@ -193,9 +197,9 @@ void callback(const sensor_msgs::ImageConstPtr &img_msg, const sensor_msgs::Imag\n                     trackerData[i].m_camera->spaceToPlane(tmp_prev_un_pts, tmp_prev_uv);\n                     cv::line(tmp_img, trackerData[i].cur_pts[j], cv::Point2f(tmp_prev_uv.x(), tmp_prev_uv.y()), cv::Scalar(255 , 0, 0), 1 , 8, 0);\n                     */\n-                    //char name[10];\n-                    //sprintf(name, \"%d\", trackerData[i].ids[j]);\n-                    //cv::putText(tmp_img, name, trackerData[i].cur_pts[j], cv::FONT_HERSHEY_SIMPLEX, 0.5, cv::Scalar(0, 0, 0));\n+                    char name[10];\n+                    sprintf(name, \"%d\", trackerData[i].plane_ids[j]);\n+                    cv::putText(tmp_img, name, trackerData[i].cur_pts[j], cv::FONT_HERSHEY_SIMPLEX, 0.5, cv::Scalar(0, 0, 0));\n                 }\n             }\n             //cv::imshow(\"vis\", stereo_img);\n-- \n2.17.1\n\n\nFrom b54e7f50b03e22ccf7bf9acd91a75f7486d828c6 Mon Sep 17 00:00:00 2001\nFrom: Karnik Ram <karnikram@gmail.com>\nDate: Mon, 18 Jan 2021 18:34:33 +0530\nSubject: [PATCH 23/42] Process plane id up to homography computation\n\n---\n vins_estimator/src/estimator.cpp              | 121 ++++++++++--------\n vins_estimator/src/estimator.h                |   2 +-\n vins_estimator/src/estimator_node.cpp         |  17 +--\n vins_estimator/src/feature_manager.cpp        |  11 +-\n vins_estimator/src/feature_manager.h          |  12 +-\n .../src/initial/initial_alignment.h           |   4 +-\n vins_estimator/src/initial/initial_sfm.h      |   1 +\n 7 files changed, 91 insertions(+), 77 deletions(-)\n\ndiff --git a/vins_estimator/src/estimator.cpp 
b/vins_estimator/src/estimator.cpp\nindex bfa5c1e..d4b94b3 100644\n--- a/vins_estimator/src/estimator.cpp\n+++ b/vins_estimator/src/estimator.cpp\n@@ -117,7 +117,7 @@ void Estimator::processIMU(double dt, const Vector3d &linear_acceleration, const\n     gyr_0 = angular_velocity;\n }\n \n-void Estimator::processImage(const map<int, vector<pair<int, Eigen::Matrix<double, 7, 1>>>> &image, const std_msgs::Header &header)\n+void Estimator::processImage(const map<int, vector<pair<int, Eigen::Matrix<double, 8, 1>>>> &image, const std_msgs::Header &header)\n {\n     ROS_DEBUG(\"new image coming ------------------------------------------\");\n     ROS_DEBUG(\"Adding feature points %lu\", image.size());\n@@ -137,23 +137,23 @@ void Estimator::processImage(const map<int, vector<pair<int, Eigen::Matrix<doubl\n     all_image_frame.insert(make_pair(header.stamp.toSec(), imageframe));\n     tmp_pre_integration = new IntegrationBase{acc_0, gyr_0, Bas[frame_count], Bgs[frame_count]};\n \n-    if(ESTIMATE_EXTRINSIC == 2)\n-    {\n-        ROS_INFO(\"calibrating extrinsic param, rotation movement is needed\");\n-        if (frame_count != 0)\n-        {\n-            vector<pair<Vector3d, Vector3d>> corres = f_manager.getCorresponding(frame_count - 1, frame_count);\n-            Matrix3d calib_ric;\n-            if (initial_ex_rotation.CalibrationExRotation(corres, pre_integrations[frame_count]->delta_q, calib_ric))\n-            {\n-                ROS_WARN(\"initial extrinsic rotation calib success\");\n-                ROS_WARN_STREAM(\"initial extrinsic rotation: \" << endl << calib_ric);\n-                ric[0] = calib_ric;\n-                RIC[0] = calib_ric;\n-                ESTIMATE_EXTRINSIC = 1;\n-            }\n-        }\n-    }\n+    //if(ESTIMATE_EXTRINSIC == 2)\n+    //{\n+    //    ROS_INFO(\"calibrating extrinsic param, rotation movement is needed\");\n+    //    if (frame_count != 0)\n+    //    {\n+    //        vector<pair<Vector3d, Vector3d>> corres = 
f_manager.getCorresponding(frame_count - 1, frame_count);\n+    //        Matrix3d calib_ric;\n+    //        if (initial_ex_rotation.CalibrationExRotation(corres, pre_integrations[frame_count]->delta_q, calib_ric))\n+    //        {\n+    //            ROS_WARN(\"initial extrinsic rotation calib success\");\n+    //            ROS_WARN_STREAM(\"initial extrinsic rotation: \" << endl << calib_ric);\n+    //            ric[0] = calib_ric;\n+    //            RIC[0] = calib_ric;\n+    //            ESTIMATE_EXTRINSIC = 1;\n+    //        }\n+    //    }\n+    //}\n \n     if (solver_flag == INITIAL)\n     {\n@@ -257,6 +257,7 @@ bool Estimator::initialStructure()\n         SFMFeature tmp_feature;\n         tmp_feature.state = false;\n         tmp_feature.id = it_per_id.feature_id;\n+        tmp_feature.plane_id = it_per_id.plane_id;\n         for (auto &it_per_frame : it_per_id.feature_per_frame)\n         {\n             imu_j++;\n@@ -445,56 +446,66 @@ bool Estimator::visualInitialAlign()\n     return true;\n }\n \n-bool Estimator::relativePose(Matrix3d &relative_R, Vector3d &relative_T, int &l)\n+//bool Estimator::relativePose(Matrix3d &relative_R, Vector3d &relative_T, int &l)\n+//{\n+//    // find previous frame which contians enough correspondance and parallex with newest frame\n+//    for (int i = 0; i < WINDOW_SIZE; i++)\n+//    {\n+//        vector<pair<Vector3d, Vector3d>> corres;\n+//        corres = f_manager.getCorresponding(i, WINDOW_SIZE);\n+//        if (corres.size() > 20)\n+//        {\n+//            double sum_parallax = 0;\n+//            double average_parallax;\n+//            for (int j = 0; j < int(corres.size()); j++)\n+//            {\n+//                Vector2d pts_0(corres[j].first(0), corres[j].first(1));\n+//                Vector2d pts_1(corres[j].second(0), corres[j].second(1));\n+//                double parallax = (pts_0 - pts_1).norm();\n+//                sum_parallax = sum_parallax + parallax;\n+//            }\n+//            
average_parallax = 1.0 * sum_parallax / int(corres.size());\n+//            if(average_parallax * 460 > 30 && m_estimator.solveRelativeRT(corres, relative_R, relative_T))\n+//            {\n+//                l = i;\n+//                ROS_DEBUG(\"average_parallax %f choose l %d and newest frame to triangulate the whole structure\", average_parallax * 460, l);\n+//                return true;\n+//            }\n+//        }\n+//    }\n+//    return false;\n+//}\n+\n+bool Estimator::relativeHPose(Matrix3d &relative_R, Vector3d &relative_T, Vector3d &n, int &l)\n {\n-    // find previous frame which contians enough correspondance and parallex with newest frame\n+    // find previous frame which contains enough correspondence and parallex with newest frame\n     for (int i = 0; i < WINDOW_SIZE; i++)\n     {\n-        vector<pair<Vector3d, Vector3d>> corres;\n-        corres = f_manager.getCorresponding(i, WINDOW_SIZE);\n-        if (corres.size() > 20)\n+        int lplane_id, lplane_size = 0;\n+        vector<pair<Vector3d, Vector3d>> lplane_corres;\n+        map<int, vector<pair<Vector3d, Vector3d>>> plane_corres = f_manager.getCorresponding(i, WINDOW_SIZE);\n+        for(auto &corres : plane_corres)\n         {\n-            double sum_parallax = 0;\n-            double average_parallax;\n-            for (int j = 0; j < int(corres.size()); j++)\n+            if(corres.second.size() > lplane_size)\n             {\n-                Vector2d pts_0(corres[j].first(0), corres[j].first(1));\n-                Vector2d pts_1(corres[j].second(0), corres[j].second(1));\n-                double parallax = (pts_0 - pts_1).norm();\n-                sum_parallax = sum_parallax + parallax;\n-            }\n-            average_parallax = 1.0 * sum_parallax / int(corres.size());\n-            if(average_parallax * 460 > 30 && m_estimator.solveRelativeRT(corres, relative_R, relative_T))\n-            {\n-                l = i;\n-                ROS_DEBUG(\"average_parallax %f 
choose l %d and newest frame to triangulate the whole structure\", average_parallax * 460, l);\n-                return true;\n+                lplane_id = corres.first;\n+                lplane_size = corres.second.size();\n+                lplane_corres = corres.second;\n             }\n         }\n-    }\n-    return false;\n-}\n-\n-bool Estimator::relativeHPose(Matrix3d &relative_R, Vector3d &relative_T, Vector3d &n, int &l)\n-{\n-    // find previous frame which contains enough correspondence and parallex with newest frame\n-    for (int i = 0; i < WINDOW_SIZE; i++)\n-    {\n-        vector<pair<Vector3d, Vector3d>> corres;\n-        corres = f_manager.getCorresponding(i, WINDOW_SIZE);\n-        if (corres.size() > 20)\n+        if (lplane_size > 20)\n         {\n             double sum_parallax = 0;\n             double average_parallax;\n-            for (int j = 0; j < int(corres.size()); j++)\n+            for (int j = 0; j < int(lplane_corres.size()); j++)\n             {\n-                Vector2d pts_0(corres[j].first(0), corres[j].first(1));\n-                Vector2d pts_1(corres[j].second(0), corres[j].second(1));\n+                Vector2d pts_0(lplane_corres[j].first(0), lplane_corres[j].first(1));\n+                Vector2d pts_1(lplane_corres[j].second(0), lplane_corres[j].second(1));\n                 double parallax = (pts_0 - pts_1).norm();\n                 sum_parallax = sum_parallax + parallax;\n-\n             }\n-            average_parallax = 1.0 * sum_parallax / int(corres.size());\n+\n+            average_parallax = 1.0 * sum_parallax / int(lplane_corres.size());\n \n             Matrix4d TrIC = Matrix4d::Identity(); \n             TrIC.block(0,0,3,3) = ric[0];\n@@ -505,7 +516,7 @@ bool Estimator::relativeHPose(Matrix3d &relative_R, Vector3d &relative_T, Vector\n             for(int k = WINDOW_SIZE - 1; k > i; k--)\n                 R_imu = R_imu * (pre_integrations[k]->delta_q).toRotationMatrix();\n \n-            if(average_parallax 
* 460 > 30 && m_estimator.solveRelativeHRT(corres, R_imu, TrIC, relative_R, relative_T, n))\n+            if(average_parallax * 460 > 30 && m_estimator.solveRelativeHRT(lplane_corres, R_imu, TrIC, relative_R, relative_T, n))\n             {\n                 l = i;\n                 ROS_DEBUG(\"average_parallax %f choose l %d and newest frame to triangulate the whole structure\", average_parallax * 460, l);\ndiff --git a/vins_estimator/src/estimator.h b/vins_estimator/src/estimator.h\nindex adad3cd..e109500 100644\n--- a/vins_estimator/src/estimator.h\n+++ b/vins_estimator/src/estimator.h\n@@ -33,7 +33,7 @@ class Estimator\n \n     // interface\n     void processIMU(double t, const Vector3d &linear_acceleration, const Vector3d &angular_velocity);\n-    void processImage(const map<int, vector<pair<int, Eigen::Matrix<double, 7, 1>>>> &image, const std_msgs::Header &header);\n+    void processImage(const map<int, vector<pair<int, Eigen::Matrix<double, 8, 1>>>> &image, const std_msgs::Header &header);\n     void setReloFrame(double _frame_stamp, int _frame_index, vector<Vector3d> &_match_points, Vector3d _relo_t, Matrix3d _relo_r);\n \n     // internal\ndiff --git a/vins_estimator/src/estimator_node.cpp b/vins_estimator/src/estimator_node.cpp\nindex 1297936..d20f9eb 100644\n--- a/vins_estimator/src/estimator_node.cpp\n+++ b/vins_estimator/src/estimator_node.cpp\n@@ -293,23 +293,24 @@ void process()\n             ROS_DEBUG(\"processing vision data with stamp %f \\n\", img_msg->header.stamp.toSec());\n \n             TicToc t_s;\n-            map<int, vector<pair<int, Eigen::Matrix<double, 7, 1>>>> image;\n+            map<int, vector<pair<int, Eigen::Matrix<double, 8, 1>>>> image;\n             for (unsigned int i = 0; i < img_msg->points.size(); i++)\n             {\n                 int v = img_msg->channels[0].values[i] + 0.5;\n                 int feature_id = v / NUM_OF_CAM;\n                 int camera_id = v % NUM_OF_CAM;\n+                int plane_id = 
img_msg->channels[1].values[i];\n                 double x = img_msg->points[i].x;\n                 double y = img_msg->points[i].y;\n                 double z = img_msg->points[i].z;\n-                double p_u = img_msg->channels[1].values[i];\n-                double p_v = img_msg->channels[2].values[i];\n-                double velocity_x = img_msg->channels[3].values[i];\n-                double velocity_y = img_msg->channels[4].values[i];\n+                double p_u = img_msg->channels[2].values[i];\n+                double p_v = img_msg->channels[3].values[i];\n+                double velocity_x = img_msg->channels[4].values[i];\n+                double velocity_y = img_msg->channels[5].values[i];\n                 ROS_ASSERT(z == 1);\n-                Eigen::Matrix<double, 7, 1> xyz_uv_velocity;\n-                xyz_uv_velocity << x, y, z, p_u, p_v, velocity_x, velocity_y;\n-                image[feature_id].emplace_back(camera_id,  xyz_uv_velocity);\n+                Eigen::Matrix<double, 8, 1> xyz_uv_velocity_pid;\n+                xyz_uv_velocity_pid << x, y, z, p_u, p_v, velocity_x, velocity_y, plane_id;\n+                image[feature_id].emplace_back(camera_id,  xyz_uv_velocity_pid);\n             }\n             estimator.processImage(image, img_msg->header);\n \ndiff --git a/vins_estimator/src/feature_manager.cpp b/vins_estimator/src/feature_manager.cpp\nindex 7d5aed9..17f75b9 100644\n--- a/vins_estimator/src/feature_manager.cpp\n+++ b/vins_estimator/src/feature_manager.cpp\n@@ -42,7 +42,7 @@ int FeatureManager::getFeatureCount()\n }\n \n \n-bool FeatureManager::addFeatureCheckParallax(int frame_count, const map<int, vector<pair<int, Eigen::Matrix<double, 7, 1>>>> &image, double td)\n+bool FeatureManager::addFeatureCheckParallax(int frame_count, const map<int, vector<pair<int, Eigen::Matrix<double, 8, 1>>>> &image, double td)\n {\n     ROS_DEBUG(\"input feature: %d\", (int)image.size());\n     ROS_DEBUG(\"num of feature: %d\", 
getFeatureCount());\n@@ -61,7 +61,8 @@ bool FeatureManager::addFeatureCheckParallax(int frame_count, const map<int, vec\n \n         if (it == feature.end())\n         {\n-            feature.push_back(FeaturePerId(feature_id, frame_count));\n+            int plane_id = id_pts.second[0].second(7);\n+            feature.push_back(FeaturePerId(feature_id, plane_id, frame_count));\n             feature.back().feature_per_frame.push_back(f_per_fra);\n         }\n         else if (it->feature_id == feature_id)\n@@ -117,9 +118,9 @@ void FeatureManager::debugShow()\n     }\n }\n \n-vector<pair<Vector3d, Vector3d>> FeatureManager::getCorresponding(int frame_count_l, int frame_count_r)\n+map<int, vector<pair<Vector3d, Vector3d>>> FeatureManager::getCorresponding(int frame_count_l, int frame_count_r)\n {\n-    vector<pair<Vector3d, Vector3d>> corres;\n+    map<int, vector<pair<Vector3d, Vector3d>>> corres;\n     for (auto &it : feature)\n     {\n         if (it.start_frame <= frame_count_l && it.endFrame() >= frame_count_r)\n@@ -132,7 +133,7 @@ vector<pair<Vector3d, Vector3d>> FeatureManager::getCorresponding(int frame_coun\n \n             b = it.feature_per_frame[idx_r].point;\n             \n-            corres.push_back(make_pair(a, b));\n+            corres[it.plane_id].push_back(make_pair(a, b));\n         }\n     }\n     return corres;\ndiff --git a/vins_estimator/src/feature_manager.h b/vins_estimator/src/feature_manager.h\nindex 4e5d3ce..ca94634 100644\n--- a/vins_estimator/src/feature_manager.h\n+++ b/vins_estimator/src/feature_manager.h\n@@ -18,7 +18,7 @@ using namespace Eigen;\n class FeaturePerFrame\n {\n   public:\n-    FeaturePerFrame(const Eigen::Matrix<double, 7, 1> &_point, double td)\n+    FeaturePerFrame(const Eigen::Matrix<double, 8, 1> &_point, double td)\n     {\n         point.x() = _point(0);\n         point.y() = _point(1);\n@@ -44,7 +44,7 @@ class FeaturePerFrame\n class FeaturePerId\n {\n   public:\n-    const int feature_id;\n+    const int 
feature_id, plane_id;\n     int start_frame;\n     vector<FeaturePerFrame> feature_per_frame;\n \n@@ -56,8 +56,8 @@ class FeaturePerId\n \n     Vector3d gt_p;\n \n-    FeaturePerId(int _feature_id, int _start_frame)\n-        : feature_id(_feature_id), start_frame(_start_frame),\n+    FeaturePerId(int _feature_id, int _plane_id, int _start_frame)\n+        : feature_id(_feature_id), plane_id(_plane_id), start_frame(_start_frame),\n           used_num(0), estimated_depth(-1.0), solve_flag(0)\n     {\n     }\n@@ -76,9 +76,9 @@ class FeatureManager\n \n     int getFeatureCount();\n \n-    bool addFeatureCheckParallax(int frame_count, const map<int, vector<pair<int, Eigen::Matrix<double, 7, 1>>>> &image, double td);\n+    bool addFeatureCheckParallax(int frame_count, const map<int, vector<pair<int, Eigen::Matrix<double, 8, 1>>>> &image, double td);\n     void debugShow();\n-    vector<pair<Vector3d, Vector3d>> getCorresponding(int frame_count_l, int frame_count_r);\n+    map<int, vector<pair<Vector3d, Vector3d>>> getCorresponding(int frame_count_l, int frame_count_r);\n \n     //void updateDepth(const VectorXd &x);\n     void setDepth(const VectorXd &x);\ndiff --git a/vins_estimator/src/initial/initial_alignment.h b/vins_estimator/src/initial/initial_alignment.h\nindex 49bc466..392337e 100644\n--- a/vins_estimator/src/initial/initial_alignment.h\n+++ b/vins_estimator/src/initial/initial_alignment.h\n@@ -14,11 +14,11 @@ class ImageFrame\n {\n     public:\n         ImageFrame(){};\n-        ImageFrame(const map<int, vector<pair<int, Eigen::Matrix<double, 7, 1>>>>& _points, double _t):t{_t},is_key_frame{false}\n+        ImageFrame(const map<int, vector<pair<int, Eigen::Matrix<double, 8, 1>>>>& _points, double _t):t{_t},is_key_frame{false}\n         {\n             points = _points;\n         };\n-        map<int, vector<pair<int, Eigen::Matrix<double, 7, 1>> > > points;\n+        map<int, vector<pair<int, Eigen::Matrix<double, 8, 1>> > > points;\n         double t;\n      
   Matrix3d R;\n         Vector3d T;\ndiff --git a/vins_estimator/src/initial/initial_sfm.h b/vins_estimator/src/initial/initial_sfm.h\nindex a070742..3c51e44 100644\n--- a/vins_estimator/src/initial/initial_sfm.h\n+++ b/vins_estimator/src/initial/initial_sfm.h\n@@ -17,6 +17,7 @@ struct SFMFeature\n {\n     bool state;\n     int id;\n+\tint plane_id;\n     vector<pair<int,Vector2d>> observation;\n     double position[3];\n     double image[3]; // u v 1\n-- \n2.17.1\n\n\nFrom f953992bf0a51d466f28bcd0e78e155e8063560d Mon Sep 17 00:00:00 2001\nFrom: Karnik Ram <karnikram@gmail.com>\nDate: Wed, 20 Jan 2021 14:53:49 +0530\nSubject: [PATCH 24/42] Compute homographies for each plane\n\n---\n vins_estimator/src/estimator.cpp | 61 +++++++++++++++++++++++---------\n vins_estimator/src/estimator.h   |  2 +-\n 2 files changed, 45 insertions(+), 18 deletions(-)\n\ndiff --git a/vins_estimator/src/estimator.cpp b/vins_estimator/src/estimator.cpp\nindex d4b94b3..be5bc42 100644\n--- a/vins_estimator/src/estimator.cpp\n+++ b/vins_estimator/src/estimator.cpp\n@@ -246,6 +246,19 @@ bool Estimator::initialStructure()\n             //return false;\n         }\n     }\n+    \n+    // Bootstrap SfM\n+    map<int, Matrix3d> relative_R;\n+    map<int, Vector3d> relative_T;\n+    map<int, Vector3d> n;\n+    int l, lplane_id;\n+\n+    if (!relativeHPose(relative_R, relative_T, n, l, lplane_id))\n+    {\n+        ROS_INFO(\"Not enough features or parallax; Move device around\");\n+        return false;\n+    }\n+\n     // global sfm\n     Quaterniond Q[frame_count + 1];\n     Vector3d T[frame_count + 1];\n@@ -253,6 +266,10 @@ bool Estimator::initialStructure()\n     vector<SFMFeature> sfm_f;\n     for (auto &it_per_id : f_manager.feature)\n     {\n+        // Consider only features from the largest plane\n+        if(it_per_id.plane_id != lplane_id)\n+            continue;\n+\n         int imu_j = it_per_id.start_frame - 1;\n         SFMFeature tmp_feature;\n         tmp_feature.state = false;\n@@ 
-265,19 +282,11 @@ bool Estimator::initialStructure()\n             tmp_feature.observation.push_back(make_pair(imu_j, Eigen::Vector2d{pts_j.x(), pts_j.y()}));\n         }\n         sfm_f.push_back(tmp_feature);\n-    } \n-    Matrix3d relative_R;\n-    Vector3d relative_T;\n-    Vector3d n;\n-    int l;\n-    if (!relativeHPose(relative_R, relative_T, n, l))\n-    {\n-        ROS_INFO(\"Not enough features or parallax; Move device around\");\n-        return false;\n     }\n+\n     GlobalSFM sfm;\n     if(!sfm.constructH(frame_count + 1, Q, T, l,\n-              relative_R, relative_T, n,\n+              relative_R[lplane_id], relative_T[lplane_id], n[lplane_id],\n               sfm_f, sfm_tracked_points))\n     {\n         ROS_DEBUG(\"global SFM failed!\");\n@@ -285,9 +294,9 @@ bool Estimator::initialStructure()\n         return false;\n     }\n     para_N[0] = 0;\n-    para_N[1] = n[0];\n-    para_N[2] = n[1];\n-    para_N[3] = n[2];\n+    para_N[1] = n[lplane_id][0];\n+    para_N[2] = n[lplane_id][1];\n+    para_N[3] = n[lplane_id][2];\n \n     //solve pnp for all frame\n     map<double, ImageFrame>::iterator frame_it;\n@@ -476,12 +485,12 @@ bool Estimator::visualInitialAlign()\n //    return false;\n //}\n \n-bool Estimator::relativeHPose(Matrix3d &relative_R, Vector3d &relative_T, Vector3d &n, int &l)\n+bool Estimator::relativeHPose(map<int, Matrix3d> &relative_R, map<int, Vector3d> &relative_t, map<int, Vector3d> &n, int &l, int &lplane_id)\n {\n     // find previous frame which contains enough correspondence and parallex with newest frame\n     for (int i = 0; i < WINDOW_SIZE; i++)\n     {\n-        int lplane_id, lplane_size = 0;\n+        int lplane_size = 0;\n         vector<pair<Vector3d, Vector3d>> lplane_corres;\n         map<int, vector<pair<Vector3d, Vector3d>>> plane_corres = f_manager.getCorresponding(i, WINDOW_SIZE);\n         for(auto &corres : plane_corres)\n@@ -507,7 +516,7 @@ bool Estimator::relativeHPose(Matrix3d &relative_R, Vector3d 
&relative_T, Vector\n \n             average_parallax = 1.0 * sum_parallax / int(lplane_corres.size());\n \n-            Matrix4d TrIC = Matrix4d::Identity(); \n+            Matrix4d TrIC = Matrix4d::Identity();\n             TrIC.block(0,0,3,3) = ric[0];\n             TrIC.block(0,3,3,1) = tic[0];\n \n@@ -516,10 +525,28 @@ bool Estimator::relativeHPose(Matrix3d &relative_R, Vector3d &relative_T, Vector\n             for(int k = WINDOW_SIZE - 1; k > i; k--)\n                 R_imu = R_imu * (pre_integrations[k]->delta_q).toRotationMatrix();\n \n-            if(average_parallax * 460 > 30 && m_estimator.solveRelativeHRT(lplane_corres, R_imu, TrIC, relative_R, relative_T, n))\n+            Eigen::Matrix3d est_R;\n+            Eigen::Vector3d est_t, est_n;\n+            if(average_parallax * 460 > 30 && m_estimator.solveRelativeHRT(lplane_corres, R_imu, TrIC, est_R, est_t, est_n))\n             {\n                 l = i;\n                 ROS_DEBUG(\"average_parallax %f choose l %d and newest frame to triangulate the whole structure\", average_parallax * 460, l);\n+                relative_R[lplane_id] = est_R;\n+                relative_t[lplane_id] = est_t;\n+                n[lplane_id] = est_n;\n+\n+                for(auto &corres : plane_corres)\n+                {\n+                    if(corres.first == lplane_id)\n+                        continue;\n+                    \n+                    if(m_estimator.solveRelativeHRT(corres.second, R_imu, TrIC, est_R, est_t, est_n))\n+                    {\n+                        relative_R[corres.first] = est_R;\n+                        relative_t[corres.first] = est_t;\n+                        n[corres.first] = est_n;\n+                    }\n+                }\n                 return true;\n             }\n         }\ndiff --git a/vins_estimator/src/estimator.h b/vins_estimator/src/estimator.h\nindex e109500..8efe106 100644\n--- a/vins_estimator/src/estimator.h\n+++ b/vins_estimator/src/estimator.h\n@@ -41,7 
+41,7 @@ class Estimator\n     bool initialStructure();\n     bool visualInitialAlign();\n     bool relativePose(Matrix3d &relative_R, Vector3d &relative_T, int &l);\n-    bool relativeHPose(Matrix3d &relative_R, Vector3d &relative_T, Vector3d &n, int &l);\n+    bool relativeHPose(map<int, Matrix3d> &relative_R, map<int, Vector3d> &relative_T, map<int, Vector3d> &n, int &l, int &lplane_id);\n     void slideWindow();\n     void solveOdometry();\n     void slideWindowNew();\n-- \n2.17.1\n\n\nFrom 1ae6a50d3a5dbd1b9d193a7dfe00be2adadf987c Mon Sep 17 00:00:00 2001\nFrom: Karnik Ram <karnikram@gmail.com>\nDate: Thu, 21 Jan 2021 19:28:53 +0530\nSubject: [PATCH 25/42] Use individual planes inside estimator\n\n---\n vins_estimator/src/estimator.cpp             | 46 ++++++++++++++------\n vins_estimator/src/estimator.h               |  8 ++--\n vins_estimator/src/feature_manager.cpp       |  1 +\n vins_estimator/src/utility/visualization.cpp |  1 +\n 4 files changed, 38 insertions(+), 18 deletions(-)\n\ndiff --git a/vins_estimator/src/estimator.cpp b/vins_estimator/src/estimator.cpp\nindex be5bc42..74261f9 100644\n--- a/vins_estimator/src/estimator.cpp\n+++ b/vins_estimator/src/estimator.cpp\n@@ -176,7 +176,6 @@ void Estimator::processImage(const map<int, vector<pair<int, Eigen::Matrix<doubl\n                 last_P = Ps[WINDOW_SIZE];\n                 last_R0 = Rs[0];\n                 last_P0 = Ps[0];\n-                \n             }\n             else\n                 slideWindow();\n@@ -215,6 +214,7 @@ void Estimator::processImage(const map<int, vector<pair<int, Eigen::Matrix<doubl\n         last_P0 = Ps[0];\n     }\n }\n+\n bool Estimator::initialStructure()\n {\n     TicToc t_sfm;\n@@ -251,7 +251,7 @@ bool Estimator::initialStructure()\n     map<int, Matrix3d> relative_R;\n     map<int, Vector3d> relative_T;\n     map<int, Vector3d> n;\n-    int l, lplane_id;\n+    int l;\n \n     if (!relativeHPose(relative_R, relative_T, n, l, lplane_id))\n     {\n@@ -293,10 
+293,12 @@ bool Estimator::initialStructure()\n         marginalization_flag = MARGIN_OLD;\n         return false;\n     }\n-    para_N[0] = 0;\n-    para_N[1] = n[lplane_id][0];\n-    para_N[2] = n[lplane_id][1];\n-    para_N[3] = n[lplane_id][2];\n+\n+    // Save estimated normal variables\n+    for(auto &est_n : n)\n+    {\n+        para_N[est_n.first] = {0, est_n.second(0), est_n.second(1), est_n.second(2)};\n+    }\n \n     //solve pnp for all frame\n     map<double, ImageFrame>::iterator frame_it;\n@@ -366,17 +368,16 @@ bool Estimator::initialStructure()\n         frame_it->second.R = R_pnp * RIC[0].transpose();\n         frame_it->second.T = T_pnp;\n     }\n-    if (visualInitialAlign())\n+    if (visualInitialAlign(relative_T, T[frame_count]))\n         return true;\n     else\n     {\n         ROS_INFO(\"misalign visual structure with IMU\");\n         return false;\n     }\n-\n }\n \n-bool Estimator::visualInitialAlign()\n+bool Estimator::visualInitialAlign(map<int, Eigen::Vector3d> &relative_T, Eigen::Vector3d &lt)\n {\n     TicToc t_g;\n     VectorXd x;\n@@ -412,7 +413,7 @@ bool Estimator::visualInitialAlign()\n     f_manager.triangulate(Ps, &(TIC_TMP[0]), &(RIC[0]));\n \n     s = (x.tail<1>())(0);\n-    para_d[0] = s;\n+\n     for (int i = 0; i <= WINDOW_SIZE; i++)\n     {\n         pre_integrations[i]->repropagate(Vector3d::Zero(), Bgs[i]);\n@@ -434,9 +435,19 @@ bool Estimator::visualInitialAlign()\n         it_per_id.used_num = it_per_id.feature_per_frame.size();\n         if (!(it_per_id.used_num >= 2 && it_per_id.start_frame < WINDOW_SIZE - 2))\n             continue;\n+\n         it_per_id.estimated_depth *= s;\n     }\n \n+    //estimate plane d variables\n+    for(auto &t : relative_T)\n+    {\n+        if(t.first == lplane_id)\n+            para_d[lplane_id] = {s};\n+        else\n+           para_d[t.first] = {s*lt.norm()/t.second.norm()}; \n+    } \n+\n     Matrix3d R0 = Utility::g2R(g);\n     double yaw = Utility::R2ypr(R0 * Rs[0]).x();\n    
 R0 = Utility::ypr2R(Eigen::Vector3d{-yaw, 0, 0}) * R0;\n@@ -764,8 +775,11 @@ void Estimator::optimization()\n         problem.AddParameterBlock(para_SpeedBias[i], SIZE_SPEEDBIAS);\n     }\n \n-    ceres::LocalParameterization *local_n_parameterization = new ceres::QuaternionParameterization();\n-    problem.AddParameterBlock(para_N, 4, local_n_parameterization);\n+    for(auto &N : para_N)\n+    {\n+        ceres::LocalParameterization *local_n_parameterization = new ceres::QuaternionParameterization();\n+        problem.AddParameterBlock(N.second.data(), 4, local_n_parameterization);\n+    }\n \n     for (int i = 0; i < NUM_OF_CAM; i++)\n     {\n@@ -811,12 +825,16 @@ void Estimator::optimization()\n         it_per_id.used_num = it_per_id.feature_per_frame.size();\n         if (!(it_per_id.used_num >= 2 && it_per_id.start_frame < WINDOW_SIZE - 2))\n             continue;\n- \n+\n         ++feature_index;\n \n         int imu_i = it_per_id.start_frame, imu_j = imu_i - 1;\n         \n         Vector3d pts_i = it_per_id.feature_per_frame[0].point;\n+        int pid = it_per_id.plane_id;\n+\n+        if(!(para_N.count(pid) > 0 && para_d.count(pid) > 0))\n+            continue;\n \n         for (auto &it_per_frame : it_per_id.feature_per_frame)\n         {\n@@ -850,7 +868,7 @@ void Estimator::optimization()\n                 problem.AddResidualBlock(f, loss_function, para_Pose[imu_i], para_Pose[imu_j], para_Ex_Pose[0], para_Feature[feature_index]);\n \n                 ceres::CostFunction *h_cost_function = HomographyFactor::Create(pts_i, pts_j);\n-                problem.AddResidualBlock(h_cost_function, loss_function, para_Pose[imu_i], para_Pose[imu_j], para_N, para_d, para_Ex_Pose[0]);\n+                problem.AddResidualBlock(h_cost_function, loss_function, para_Pose[imu_i], para_Pose[imu_j], para_N[pid].data(), para_d[pid].data(), para_Ex_Pose[0]);\n             }\n             f_m_cnt++;\n         }\ndiff --git a/vins_estimator/src/estimator.h 
b/vins_estimator/src/estimator.h\nindex 8efe106..b7c32db 100644\n--- a/vins_estimator/src/estimator.h\n+++ b/vins_estimator/src/estimator.h\n@@ -39,7 +39,7 @@ class Estimator\n     // internal\n     void clearState();\n     bool initialStructure();\n-    bool visualInitialAlign();\n+    bool visualInitialAlign(map<int,Eigen::Vector3d> &relative_T, Eigen::Vector3d &lt);\n     bool relativePose(Matrix3d &relative_R, Vector3d &relative_T, int &l);\n     bool relativeHPose(map<int, Matrix3d> &relative_R, map<int, Vector3d> &relative_T, map<int, Vector3d> &n, int &l, int &lplane_id);\n     void slideWindow();\n@@ -80,6 +80,7 @@ class Estimator\n     Vector3d Bgs[(WINDOW_SIZE + 1)];\n     double td;\n     double s;\n+    int lplane_id;\n \n     Matrix3d back_R0, last_R, last_R0;\n     Vector3d back_P0, last_P, last_P0;\n@@ -108,7 +109,6 @@ class Estimator\n     vector<Vector3d> key_poses;\n     double initial_timestamp;\n \n-\n     double para_Pose[WINDOW_SIZE + 1][SIZE_POSE];\n     double para_SpeedBias[WINDOW_SIZE + 1][SIZE_SPEEDBIAS];\n     double para_Feature[NUM_OF_F][SIZE_FEATURE];\n@@ -116,8 +116,8 @@ class Estimator\n     double para_Retrive_Pose[SIZE_POSE];\n     double para_Td[1][1];\n     double para_Tr[1][1];\n-    double para_N[4];\n-    double para_d[1];\n+    map<int, array<double,4>> para_N;\n+    map<int, array<double,1>> para_d;\n \n     int loop_window_index;\n \ndiff --git a/vins_estimator/src/feature_manager.cpp b/vins_estimator/src/feature_manager.cpp\nindex 17f75b9..45f14b8 100644\n--- a/vins_estimator/src/feature_manager.cpp\n+++ b/vins_estimator/src/feature_manager.cpp\n@@ -210,6 +210,7 @@ void FeatureManager::triangulate(Vector3d Ps[], Vector3d tic[], Matrix3d ric[])\n \n         if (it_per_id.estimated_depth > 0)\n             continue;\n+\n         int imu_i = it_per_id.start_frame, imu_j = imu_i - 1;\n \n         ROS_ASSERT(NUM_OF_CAM == 1);\ndiff --git a/vins_estimator/src/utility/visualization.cpp 
b/vins_estimator/src/utility/visualization.cpp\nindex 167e913..eb87e13 100644\n--- a/vins_estimator/src/utility/visualization.cpp\n+++ b/vins_estimator/src/utility/visualization.cpp\n@@ -252,6 +252,7 @@ void pubPointCloud(const Estimator &estimator, const std_msgs::Header &header)\n             continue;\n         if (it_per_id.start_frame > WINDOW_SIZE * 3.0 / 4.0 || it_per_id.solve_flag != 1)\n             continue;\n+\n         int imu_i = it_per_id.start_frame;\n         Vector3d pts_i = it_per_id.feature_per_frame[0].point * it_per_id.estimated_depth;\n         Vector3d w_pts_i = estimator.Rs[imu_i] * (estimator.ric[0] * pts_i + estimator.tic[0]) + estimator.Ps[imu_i];\n-- \n2.17.1\n\n\nFrom 29ecb88038821e1d73c60136485cb1435ac38e06 Mon Sep 17 00:00:00 2001\nFrom: Karnik Ram <karnikram@gmail.com>\nDate: Fri, 22 Jan 2021 13:03:28 +0530\nSubject: [PATCH 26/42] Initialize new planes\n\n---\n vins_estimator/src/estimator.cpp       | 61 +++++++++++++++++++++++++-\n vins_estimator/src/estimator.h         |  3 +-\n vins_estimator/src/feature_manager.cpp | 25 +++++++++++\n vins_estimator/src/feature_manager.h   |  2 +\n 4 files changed, 89 insertions(+), 2 deletions(-)\n\ndiff --git a/vins_estimator/src/estimator.cpp b/vins_estimator/src/estimator.cpp\nindex 74261f9..eee27b5 100644\n--- a/vins_estimator/src/estimator.cpp\n+++ b/vins_estimator/src/estimator.cpp\n@@ -186,6 +186,7 @@ void Estimator::processImage(const map<int, vector<pair<int, Eigen::Matrix<doubl\n     else\n     {\n         TicToc t_solve;\n+        initializePlanes();\n         solveOdometry();\n         ROS_DEBUG(\"solver costs: %fms\", t_solve.toc());\n \n@@ -513,6 +514,7 @@ bool Estimator::relativeHPose(map<int, Matrix3d> &relative_R, map<int, Vector3d>\n                 lplane_corres = corres.second;\n             }\n         }\n+\n         if (lplane_size > 20)\n         {\n             double sum_parallax = 0;\n@@ -550,12 +552,13 @@ bool Estimator::relativeHPose(map<int, Matrix3d> &relative_R, 
map<int, Vector3d>\n                 {\n                     if(corres.first == lplane_id)\n                         continue;\n-                    \n+\n                     if(m_estimator.solveRelativeHRT(corres.second, R_imu, TrIC, est_R, est_t, est_n))\n                     {\n                         relative_R[corres.first] = est_R;\n                         relative_t[corres.first] = est_t;\n                         n[corres.first] = est_n;\n+                        ROS_INFO(\"Initialized plane %d features\", corres.first);\n                     }\n                 }\n                 return true;\n@@ -565,6 +568,62 @@ bool Estimator::relativeHPose(map<int, Matrix3d> &relative_R, map<int, Vector3d>\n     return false;\n }\n \n+void Estimator::initializePlanes()\n+{\n+    // check for new plane ids\n+    for(auto &plane_id : f_manager.plane_ids)\n+    {\n+        if(para_N.count(plane_id) > 0)\n+            continue;\n+        else\n+        {\n+            for(int i = 0; i < WINDOW_SIZE - 1; i++)\n+            {\n+                vector<pair<Vector3d, Vector3d>> corres = f_manager.getCorrespondingByPlane(\n+                    i, WINDOW_SIZE-1, plane_id);\n+\n+                if(corres.size() > 20)\n+                {\n+                    double sum_parallax = 0;\n+                    double average_parallax;\n+                    for (int j = 0; j < int(corres.size()); j++)\n+                    {\n+                        Vector2d pts_0(corres[j].first(0), corres[j].first(1));\n+                        Vector2d pts_1(corres[j].second(0), corres[j].second(1));\n+                        double parallax = (pts_0 - pts_1).norm();\n+                        sum_parallax = sum_parallax + parallax;\n+                    }\n+\n+                    average_parallax = 1.0 * sum_parallax / int(corres.size());\n+\n+                    Matrix4d TrIC = Matrix4d::Identity();\n+                    TrIC.block(0,0,3,3) = ric[0];\n+                    TrIC.block(0,3,3,1) = 
tic[0];\n+\n+                    //compute corresponding preintegrated rotation\n+                    Matrix3d R_imu = Matrix3d::Identity();\n+                    for(int k = WINDOW_SIZE - 2; k > i; k--)\n+                        R_imu = R_imu * (pre_integrations[k]->delta_q).toRotationMatrix();\n+\n+                    Eigen::Matrix3d est_R;\n+                    Eigen::Vector3d est_t, est_n, vi_t;\n+                    if(average_parallax * 460 > 30 && m_estimator.solveRelativeHRT(corres, R_imu, TrIC, est_R, est_t, est_n))\n+                    {\n+                        // Transform est_n to global frame\n+                        est_n = Rs[i] * est_n;\n+                        para_N[plane_id] = {0, est_n(0), est_n(1), est_n(2)};\n+\n+                        // Estimate plane d using known metric t from vio \n+                        vi_t = Rs[i].transpose() * (Ps[WINDOW_SIZE - 1] - Ps[i]);\n+                        para_d[plane_id] = {(1.0*vi_t.norm()/est_t.norm())};\n+                        ROS_INFO(\"Initialized plane %d features\", plane_id);\n+                    }\n+                }\n+            }\n+        }\n+    }\n+}\n+\n void Estimator::solveOdometry()\n {\n     if (frame_count < WINDOW_SIZE)\ndiff --git a/vins_estimator/src/estimator.h b/vins_estimator/src/estimator.h\nindex b7c32db..7c54363 100644\n--- a/vins_estimator/src/estimator.h\n+++ b/vins_estimator/src/estimator.h\n@@ -42,6 +42,7 @@ class Estimator\n     bool visualInitialAlign(map<int,Eigen::Vector3d> &relative_T, Eigen::Vector3d &lt);\n     bool relativePose(Matrix3d &relative_R, Vector3d &relative_T, int &l);\n     bool relativeHPose(map<int, Matrix3d> &relative_R, map<int, Vector3d> &relative_T, map<int, Vector3d> &n, int &l, int &lplane_id);\n+    void initializePlanes();\n     void slideWindow();\n     void solveOdometry();\n     void slideWindowNew();\n@@ -127,7 +128,7 @@ class Estimator\n     map<double, ImageFrame> all_image_frame;\n     IntegrationBase *tmp_pre_integration;\n 
\n-    //relocalization variable\n+    //relocalization variables\n     bool relocalization_info;\n     double relo_frame_stamp;\n     double relo_frame_index;\ndiff --git a/vins_estimator/src/feature_manager.cpp b/vins_estimator/src/feature_manager.cpp\nindex 45f14b8..64828ee 100644\n--- a/vins_estimator/src/feature_manager.cpp\n+++ b/vins_estimator/src/feature_manager.cpp\n@@ -62,6 +62,7 @@ bool FeatureManager::addFeatureCheckParallax(int frame_count, const map<int, vec\n         if (it == feature.end())\n         {\n             int plane_id = id_pts.second[0].second(7);\n+            plane_ids.insert(plane_id);\n             feature.push_back(FeaturePerId(feature_id, plane_id, frame_count));\n             feature.back().feature_per_frame.push_back(f_per_fra);\n         }\n@@ -139,6 +140,30 @@ map<int, vector<pair<Vector3d, Vector3d>>> FeatureManager::getCorresponding(int\n     return corres;\n }\n \n+vector<pair<Vector3d, Vector3d>> FeatureManager::getCorrespondingByPlane(int frame_count_l, int frame_count_r, int plane_id)\n+{\n+    vector<pair<Vector3d, Vector3d>> corres;\n+    for (auto &it : feature)\n+    {\n+        if(it.plane_id != plane_id)\n+            continue;\n+\n+        if (it.start_frame <= frame_count_l && it.endFrame() >= frame_count_r)\n+        {\n+            Vector3d a = Vector3d::Zero(), b = Vector3d::Zero();\n+            int idx_l = frame_count_l - it.start_frame;\n+            int idx_r = frame_count_r - it.start_frame;\n+\n+            a = it.feature_per_frame[idx_l].point;\n+\n+            b = it.feature_per_frame[idx_r].point;\n+            \n+            corres.push_back(make_pair(a, b));\n+        }\n+    }\n+    return corres;\n+}\n+\n void FeatureManager::setDepth(const VectorXd &x)\n {\n     int feature_index = -1;\ndiff --git a/vins_estimator/src/feature_manager.h b/vins_estimator/src/feature_manager.h\nindex ca94634..a7f46fc 100644\n--- a/vins_estimator/src/feature_manager.h\n+++ b/vins_estimator/src/feature_manager.h\n@@ 
-79,6 +79,8 @@ class FeatureManager\n     bool addFeatureCheckParallax(int frame_count, const map<int, vector<pair<int, Eigen::Matrix<double, 8, 1>>>> &image, double td);\n     void debugShow();\n     map<int, vector<pair<Vector3d, Vector3d>>> getCorresponding(int frame_count_l, int frame_count_r);\n+    vector<pair<Vector3d, Vector3d>> getCorrespondingByPlane(int frame_count_l, int frame_count_r, int plane_id);\n+    set<int> plane_ids;\n \n     //void updateDepth(const VectorXd &x);\n     void setDepth(const VectorXd &x);\n-- \n2.17.1\n\n\nFrom 41b63dbda66a5cef24d407c3ae2952ff8d876bca Mon Sep 17 00:00:00 2001\nFrom: Karnik Ram <karnikram@gmail.com>\nDate: Fri, 22 Jan 2021 19:33:33 +0530\nSubject: [PATCH 27/42] Call add points only when there are new points\n\n---\n feature_tracker/src/feature_tracker.cpp      | 10 +++++-----\n feature_tracker/src/feature_tracker_node.cpp |  2 +-\n 2 files changed, 6 insertions(+), 6 deletions(-)\n\ndiff --git a/feature_tracker/src/feature_tracker.cpp b/feature_tracker/src/feature_tracker.cpp\nindex 0de4262..c42426c 100644\n--- a/feature_tracker/src/feature_tracker.cpp\n+++ b/feature_tracker/src/feature_tracker.cpp\n@@ -183,15 +183,15 @@ void FeatureTracker::readImage(const cv::Mat &_img, const cv::Mat &_mask, double\n             if (mask.size() != forw_img.size())\n                 cout << \"wrong size \" << endl;\n             cv::goodFeaturesToTrack(forw_img, n_pts, MAX_CNT - forw_pts.size(), 0.01, MIN_DIST, mask);\n+            ROS_DEBUG(\"add feature begins\");\n+            TicToc t_a;\n+            addPoints(_mask);\n+            ROS_DEBUG(\"selectFeature costs: %fms\", t_a.toc());\n         }\n+\n         else\n             n_pts.clear();\n         ROS_DEBUG(\"detect feature costs: %fms\", t_t.toc());\n-\n-        ROS_DEBUG(\"add feature begins\");\n-        TicToc t_a;\n-        addPoints(_mask);\n-        ROS_DEBUG(\"selectFeature costs: %fms\", t_a.toc());\n     }\n     prev_img = cur_img;\n     prev_pts = 
cur_pts;\ndiff --git a/feature_tracker/src/feature_tracker_node.cpp b/feature_tracker/src/feature_tracker_node.cpp\nindex b9de94a..ba64fed 100644\n--- a/feature_tracker/src/feature_tracker_node.cpp\n+++ b/feature_tracker/src/feature_tracker_node.cpp\n@@ -86,7 +86,7 @@ void callback(const sensor_msgs::ImageConstPtr &img_msg, const sensor_msgs::Imag\n     {\n         ROS_DEBUG(\"processing camera %d\", i);\n         if (i != 1 || !STEREO_TRACK)\n-            trackerData[i].readImage(ptr->image.rowRange(ROW * i, ROW * (i + 1)), mask_ptr->image.rowRange(ROW*i,ROW*(i+1)), img_msg->header.stamp.toSec());\n+            trackerData[i].readImage(ptr->image.rowRange(ROW * i, ROW * (i + 1)), mask_ptr->image.rowRange(ROW * i,ROW * (i + 1)), img_msg->header.stamp.toSec());\n         else\n         {\n             if (EQUALIZE)\n-- \n2.17.1\n\n\nFrom b8a80ed0a1ce5187c6eaafc514f0f75b6a724d11 Mon Sep 17 00:00:00 2001\nFrom: Karnik Ram <karnikram@gmail.com>\nDate: Fri, 22 Jan 2021 20:35:06 +0530\nSubject: [PATCH 28/42] Revert to F rejection\n\n---\n feature_tracker/src/feature_tracker.cpp | 6 +++---\n 1 file changed, 3 insertions(+), 3 deletions(-)\n\ndiff --git a/feature_tracker/src/feature_tracker.cpp b/feature_tracker/src/feature_tracker.cpp\nindex c42426c..6548c36 100644\n--- a/feature_tracker/src/feature_tracker.cpp\n+++ b/feature_tracker/src/feature_tracker.cpp\n@@ -204,7 +204,7 @@ void FeatureTracker::readImage(const cv::Mat &_img, const cv::Mat &_mask, double\n \n void FeatureTracker::rejectWithF()\n {\n-    if (forw_pts.size() >= 4)\n+    if (forw_pts.size() >= 8)\n     {\n         ROS_DEBUG(\"FM ransac begins\");\n         TicToc t_f;\n@@ -224,8 +224,8 @@ void FeatureTracker::rejectWithF()\n         }\n \n         vector<uchar> status;\n-        //cv::findFundamentalMat(un_cur_pts, un_forw_pts, cv::FM_RANSAC, F_THRESHOLD, 0.99, status);\n-        cv::findHomography(un_cur_pts, un_forw_pts, cv::RANSAC, 3, status);\n+        cv::findFundamentalMat(un_cur_pts, un_forw_pts, 
cv::FM_RANSAC, F_THRESHOLD, 0.99, status);\n+        //cv::findHomography(un_cur_pts, un_forw_pts, cv::RANSAC, 3, status);\n         int size_a = cur_pts.size();\n         reduceVector(prev_pts, status);\n         reduceVector(cur_pts, status);\n-- \n2.17.1\n\n\nFrom 9756346e6b551d799f7610884250094d83da9488 Mon Sep 17 00:00:00 2001\nFrom: Karnik Ram <karnikram@gmail.com>\nDate: Sun, 24 Jan 2021 09:26:00 +0530\nSubject: [PATCH 29/42] Clear plane parameters upon failure, fix multiple\n initialization bug\n\n---\n vins_estimator/src/estimator.cpp       | 13 ++++++++++---\n vins_estimator/src/estimator.h         |  2 +-\n vins_estimator/src/feature_manager.cpp |  1 +\n 3 files changed, 12 insertions(+), 4 deletions(-)\n\ndiff --git a/vins_estimator/src/estimator.cpp b/vins_estimator/src/estimator.cpp\nindex eee27b5..88e0942 100644\n--- a/vins_estimator/src/estimator.cpp\n+++ b/vins_estimator/src/estimator.cpp\n@@ -52,6 +52,9 @@ void Estimator::clearState()\n         }\n     }\n \n+    para_N.clear();\n+    para_d.clear();\n+\n     solver_flag = INITIAL;\n     first_imu = false,\n     sum_of_back = 0;\n@@ -186,7 +189,7 @@ void Estimator::processImage(const map<int, vector<pair<int, Eigen::Matrix<doubl\n     else\n     {\n         TicToc t_solve;\n-        initializePlanes();\n+        initializeNewPlanes();\n         solveOdometry();\n         ROS_DEBUG(\"solver costs: %fms\", t_solve.toc());\n \n@@ -373,6 +376,8 @@ bool Estimator::initialStructure()\n         return true;\n     else\n     {\n+        para_N.clear();\n+        para_d.clear();\n         ROS_INFO(\"misalign visual structure with IMU\");\n         return false;\n     }\n@@ -447,6 +452,8 @@ bool Estimator::visualInitialAlign(map<int, Eigen::Vector3d> &relative_T, Eigen:\n             para_d[lplane_id] = {s};\n         else\n            para_d[t.first] = {s*lt.norm()/t.second.norm()}; \n+\n+        ROS_INFO(\"Initialized plane %d features\", t.first);\n     } \n \n     Matrix3d R0 = Utility::g2R(g);\n@@ 
-558,7 +565,6 @@ bool Estimator::relativeHPose(map<int, Matrix3d> &relative_R, map<int, Vector3d>\n                         relative_R[corres.first] = est_R;\n                         relative_t[corres.first] = est_t;\n                         n[corres.first] = est_n;\n-                        ROS_INFO(\"Initialized plane %d features\", corres.first);\n                     }\n                 }\n                 return true;\n@@ -568,7 +574,7 @@ bool Estimator::relativeHPose(map<int, Matrix3d> &relative_R, map<int, Vector3d>\n     return false;\n }\n \n-void Estimator::initializePlanes()\n+void Estimator::initializeNewPlanes()\n {\n     // check for new plane ids\n     for(auto &plane_id : f_manager.plane_ids)\n@@ -617,6 +623,7 @@ void Estimator::initializePlanes()\n                         vi_t = Rs[i].transpose() * (Ps[WINDOW_SIZE - 1] - Ps[i]);\n                         para_d[plane_id] = {(1.0*vi_t.norm()/est_t.norm())};\n                         ROS_INFO(\"Initialized plane %d features\", plane_id);\n+                        break;\n                     }\n                 }\n             }\ndiff --git a/vins_estimator/src/estimator.h b/vins_estimator/src/estimator.h\nindex 7c54363..35aaff8 100644\n--- a/vins_estimator/src/estimator.h\n+++ b/vins_estimator/src/estimator.h\n@@ -42,7 +42,7 @@ class Estimator\n     bool visualInitialAlign(map<int,Eigen::Vector3d> &relative_T, Eigen::Vector3d &lt);\n     bool relativePose(Matrix3d &relative_R, Vector3d &relative_T, int &l);\n     bool relativeHPose(map<int, Matrix3d> &relative_R, map<int, Vector3d> &relative_T, map<int, Vector3d> &n, int &l, int &lplane_id);\n-    void initializePlanes();\n+    void initializeNewPlanes();\n     void slideWindow();\n     void solveOdometry();\n     void slideWindowNew();\ndiff --git a/vins_estimator/src/feature_manager.cpp b/vins_estimator/src/feature_manager.cpp\nindex 64828ee..e82c694 100644\n--- a/vins_estimator/src/feature_manager.cpp\n+++ 
b/vins_estimator/src/feature_manager.cpp\n@@ -23,6 +23,7 @@ void FeatureManager::setRic(Matrix3d _ric[])\n void FeatureManager::clearState()\n {\n     feature.clear();\n+    plane_ids.clear();\n }\n \n int FeatureManager::getFeatureCount()\n-- \n2.17.1\n\n\nFrom 6ae97fe2b8e72f6e48bd9e857e6ee3e6d21bce44 Mon Sep 17 00:00:00 2001\nFrom: Karnik Ram <karnikram@gmail.com>\nDate: Sun, 24 Jan 2021 09:35:02 +0530\nSubject: [PATCH 30/42] Add airsim config, launch files\n\n---\n config/airsim_config.yaml                     | 83 +++++++++++++++++++\n .../launch/playback_vins_airsim.launch        | 29 +++++++\n 2 files changed, 112 insertions(+)\n create mode 100644 config/airsim_config.yaml\n create mode 100644 vins_estimator/launch/playback_vins_airsim.launch\n\ndiff --git a/config/airsim_config.yaml b/config/airsim_config.yaml\nnew file mode 100644\nindex 0000000..f7dda5c\n--- /dev/null\n+++ b/config/airsim_config.yaml\n@@ -0,0 +1,83 @@\n+%YAML:1.0\n+\n+#common parameters\n+imu_topic: \"/imu\"\n+image_topic: \"/image\"\n+mask_topic: \"/mask\"\n+output_path: \"/home/karnik/output/\"\n+\n+#camera calibration \n+model_type: PINHOLE\n+camera_name: camera\n+image_width: 640\n+image_height: 480\n+distortion_parameters:\n+   k1: 0\n+   k2: 0\n+   p1: 0\n+   p2: 0\n+projection_parameters:\n+   fx: 320\n+   fy: 320\n+   cx: 320\n+   cy: 240\n+\n+# Extrinsic parameter between IMU and Camera.\n+estimate_extrinsic: 0   # 0  Have an accurate extrinsic parameters. We will trust the following imu^R_cam, imu^T_cam, don't change it.\n+                        # 1  Have an initial guess about extrinsic parameters. We will optimize around your initial guess.\n+                        # 2  Don't know anything about extrinsic parameters. You don't need to give R,T. We will try to calibrate it. Do some rotation movement at beginning.                        
\n+#If you choose 0 or 1, you should write down the following matrix.\n+\n+#Rotation from camera frame to imu frame, imu^R_cam\n+extrinsicRotation: !!opencv-matrix\n+   rows: 3\n+   cols: 3\n+   dt: d\n+   data: [-0.00051157,-0.00020788,0.99999985,0.99999972,-0.00054877,0.00051145,0.00054867,0.99999983,0.00020816] #[0,0,1,1,0,0,0,1,0] \n+\n+#Translation from camera frame to imu frame, imu^T_cam\n+extrinsicTranslation: !!opencv-matrix\n+   rows: 3\n+   cols: 1\n+   dt: d\n+   data: [0.53937284, 0.02825352, 0.02064487] #[0.50,0,0]\n+\n+#feature traker parameters\n+max_cnt: 300          # max feature number in feature tracking\n+min_dist: 30            # min distance between two features \n+freq: 10                # frequence (Hz) of publish tracking result. At least 10Hz for good estimation. If set 0, the frequence will be same as raw image \n+F_threshold: 1.0        # ransac threshold (pixel)\n+show_track: 1           # publish tracking image as topic\n+equalize: 1             # if image is too dark or light, trun on equalize to find enough features\n+fisheye: 0              # if using fisheye, trun on it. A circle mask will be loaded to remove edge noisy points\n+\n+#optimization parameters\n+max_solver_time: 0.04  # max solver itration time (ms), to guarantee real time\n+max_num_iterations: 8   # max solver itrations, to guarantee real time\n+keyframe_parallax: 10.0 # keyframe selection threshold (pixel)\n+\n+#imu parameters       The more accurate parameters you provide, the better performance\n+acc_n: 0.1          # accelerometer measurement noise standard deviation. #0.2   0.04\n+gyr_n: 0.001         # gyroscope measurement noise standard deviation.     #0.05  0.004\n+acc_w: 0.0002         # accelerometer bias random work noise standard deviation.  #0.02\n+gyr_w: 2.0e-5       # gyroscope bias random work noise standard deviation.     
#4.0e-5\n+g_norm: 9.8067     # gravity magnitude\n+\n+#loop closure parameters\n+loop_closure: 1                    # start loop closure\n+load_previous_pose_graph: 0        # load and reuse previous pose graph; load from 'pose_graph_save_path'\n+fast_relocalization: 0             # useful in real-time and large project\n+pose_graph_save_path: \"/home/karnik/\" # save and load path\n+\n+#unsynchronization parameters\n+estimate_td: 0                      # online estimate time offset between camera and imu\n+td: -0.051134656900876006 # -0.04109                          # initial value of time offset. unit: s. readed image clock + td = real image clock (IMU clock)\n+\n+#rolling shutter parameters\n+rolling_shutter: 0                  # 0: global shutter camera, 1: rolling shutter camera\n+rolling_shutter_tr: 0               # unit: s. rolling shutter read out time per frame (from data sheet). \n+\n+#visualization parameters\n+save_image: 1                   # save image in pose graph for visualization prupose; you can close this function by setting 0 \n+visualize_imu_forward: 0        # output imu forward propogation to achieve low latency and high frequence results\n+visualize_camera_size: 0.4      # size of camera marker in RVIZ\ndiff --git a/vins_estimator/launch/playback_vins_airsim.launch b/vins_estimator/launch/playback_vins_airsim.launch\nnew file mode 100644\nindex 0000000..82ed3a8\n--- /dev/null\n+++ b/vins_estimator/launch/playback_vins_airsim.launch\n@@ -0,0 +1,29 @@\n+<launch>\n+\n+  <arg name=\"playback_rate\" default=\"1.0\" />\n+  <arg name=\"start_from\" default=\"20\" />\n+  <arg name=\"bagfile_path\" default=\"/home/karnik/dataset/airsim/CustomWarehouse/capoeira_new_1/sequence.bag\"/>\n+\n+  <arg name=\"config_path\" default = \"/home/karnik/catkin_ws/src/VINS-Mono/config/airsim_config.yaml\" />\n+  <arg name=\"vins_path\" default = \"$(find feature_tracker)/../config/../\" />\n+\n+  <!-- Use sim time -->\n+  <param name =\"/use_sim_time\" 
value=\"true\"/>\n+  \n+  <!-- Bag playback -->\n+  <node name=\"rosbag_playback\" pkg=\"rosbag\" type=\"play\" required=\"true\" args=\" --delay=10 --clock --queue=1000 -r $(arg playback_rate) -s $(arg start_from) $(arg bagfile_path)\"/>\n+\n+  <!-- VINS-Mono nodes -->\n+  <node name=\"feature_tracker\" pkg=\"feature_tracker\" type=\"feature_tracker\" output=\"log\">\n+    <param name=\"config_file\" type=\"string\" value=\"$(arg config_path)\" />\n+    <param name=\"vins_folder\" type=\"string\" value=\"$(arg vins_path)\" />\n+  </node>\n+\n+  <node name=\"vins_estimator\" pkg=\"vins_estimator\" type=\"vins_estimator\" output=\"screen\">\n+    <param name=\"config_file\" type=\"string\" value=\"$(arg config_path)\" />\n+    <param name=\"vins_folder\" type=\"string\" value=\"$(arg vins_path)\" />\n+  </node>\n+\n+  <node name=\"rvizvisualisation\" pkg=\"rviz\" type=\"rviz\" output=\"log\" args=\"-d $(find vins_estimator)/../config/vins_rviz_config.rviz\" />\n+\n+</launch>\n-- \n2.17.1\n\n\nFrom 3e110dc2125a47fd1f0bdad80065a0a2f1d0cf4f Mon Sep 17 00:00:00 2001\nFrom: Karnik Ram <karnikram@gmail.com>\nDate: Tue, 2 Feb 2021 09:19:45 +0530\nSubject: [PATCH 31/42] Use only largest plane inside estimator\n\n---\n config/airsim_config.yaml                     |  2 +-\n .../launch/playback_vins_airsim.launch        |  4 ++--\n vins_estimator/src/estimator.cpp              |  8 +++++--\n vins_estimator/src/estimator.h                |  2 ++\n vins_estimator/src/feature_manager.cpp        | 24 +++++++++++++++++++\n vins_estimator/src/feature_manager.h          |  1 +\n 6 files changed, 36 insertions(+), 5 deletions(-)\n\ndiff --git a/config/airsim_config.yaml b/config/airsim_config.yaml\nindex f7dda5c..ed9c012 100644\n--- a/config/airsim_config.yaml\n+++ b/config/airsim_config.yaml\n@@ -43,7 +43,7 @@ extrinsicTranslation: !!opencv-matrix\n    data: [0.53937284, 0.02825352, 0.02064487] #[0.50,0,0]\n \n #feature traker parameters\n-max_cnt: 300          # max feature number 
in feature tracking\n+max_cnt: 250          # max feature number in feature tracking\n min_dist: 30            # min distance between two features \n freq: 10                # frequence (Hz) of publish tracking result. At least 10Hz for good estimation. If set 0, the frequence will be same as raw image \n F_threshold: 1.0        # ransac threshold (pixel)\ndiff --git a/vins_estimator/launch/playback_vins_airsim.launch b/vins_estimator/launch/playback_vins_airsim.launch\nindex 82ed3a8..e86876a 100644\n--- a/vins_estimator/launch/playback_vins_airsim.launch\n+++ b/vins_estimator/launch/playback_vins_airsim.launch\n@@ -1,8 +1,8 @@\n <launch>\n \n   <arg name=\"playback_rate\" default=\"1.0\" />\n-  <arg name=\"start_from\" default=\"20\" />\n-  <arg name=\"bagfile_path\" default=\"/home/karnik/dataset/airsim/CustomWarehouse/capoeira_new_1/sequence.bag\"/>\n+  <arg name=\"start_from\" default=\"0\" />\n+  <arg name=\"bagfile_path\" default=\"/home/karnik/dataset/airsim/CustomWarehouse/static_new_15_1/sequence.bag\"/>\n \n   <arg name=\"config_path\" default = \"/home/karnik/catkin_ws/src/VINS-Mono/config/airsim_config.yaml\" />\n   <arg name=\"vins_path\" default = \"$(find feature_tracker)/../config/../\" />\ndiff --git a/vins_estimator/src/estimator.cpp b/vins_estimator/src/estimator.cpp\nindex 88e0942..89d5b15 100644\n--- a/vins_estimator/src/estimator.cpp\n+++ b/vins_estimator/src/estimator.cpp\n@@ -54,6 +54,7 @@ void Estimator::clearState()\n \n     para_N.clear();\n     para_d.clear();\n+    init_pids.clear();\n \n     solver_flag = INITIAL;\n     first_imu = false,\n@@ -453,6 +454,7 @@ bool Estimator::visualInitialAlign(map<int, Eigen::Vector3d> &relative_T, Eigen:\n         else\n            para_d[t.first] = {s*lt.norm()/t.second.norm()}; \n \n+        init_pids.push_back(t.first);\n         ROS_INFO(\"Initialized plane %d features\", t.first);\n     } \n \n@@ -622,6 +624,7 @@ void Estimator::initializeNewPlanes()\n                         // Estimate plane d 
using known metric t from vio \n                         vi_t = Rs[i].transpose() * (Ps[WINDOW_SIZE - 1] - Ps[i]);\n                         para_d[plane_id] = {(1.0*vi_t.norm()/est_t.norm())};\n+                        init_pids.push_back(plane_id);\n                         ROS_INFO(\"Initialized plane %d features\", plane_id);\n                         break;\n                     }\n@@ -886,6 +889,8 @@ void Estimator::optimization()\n     }\n     int f_m_cnt = 0;\n     int feature_index = -1;\n+    int largest_pid = f_manager.getLargestPlaneId(init_pids);\n+    ROS_INFO(\"Using features from plane %d\", largest_pid);\n     for (auto &it_per_id : f_manager.feature)\n     {\n         it_per_id.used_num = it_per_id.feature_per_frame.size();\n@@ -899,7 +904,7 @@ void Estimator::optimization()\n         Vector3d pts_i = it_per_id.feature_per_frame[0].point;\n         int pid = it_per_id.plane_id;\n \n-        if(!(para_N.count(pid) > 0 && para_d.count(pid) > 0))\n+        if(pid != largest_pid)\n             continue;\n \n         for (auto &it_per_frame : it_per_id.feature_per_frame)\n@@ -974,7 +979,6 @@ void Estimator::optimization()\n                 }\n             }\n         }\n-\n     }\n \n     ceres::Solver::Options options;\ndiff --git a/vins_estimator/src/estimator.h b/vins_estimator/src/estimator.h\nindex 35aaff8..49cd656 100644\n--- a/vins_estimator/src/estimator.h\n+++ b/vins_estimator/src/estimator.h\n@@ -120,6 +120,8 @@ class Estimator\n     map<int, array<double,4>> para_N;\n     map<int, array<double,1>> para_d;\n \n+    vector<int> init_pids;\n+\n     int loop_window_index;\n \n     MarginalizationInfo *last_marginalization_info;\ndiff --git a/vins_estimator/src/feature_manager.cpp b/vins_estimator/src/feature_manager.cpp\nindex e82c694..16cf5e5 100644\n--- a/vins_estimator/src/feature_manager.cpp\n+++ b/vins_estimator/src/feature_manager.cpp\n@@ -42,6 +42,30 @@ int FeatureManager::getFeatureCount()\n     return cnt;\n }\n \n+int 
FeatureManager::getLargestPlaneId(std::vector<int> &init_pids)\n+{\n+    std::map<int, int> plane_counts;\n+    for(auto &pid : init_pids)\n+        plane_counts[pid] = 0;\n+\n+    for(auto &it : feature)\n+    {\n+        if(plane_counts.count(it.plane_id) > 0)\n+            plane_counts[it.plane_id]++;\n+    }\n+\n+    int largest_count = 0, largest_pid;\n+    for(auto &plane_count : plane_counts)\n+    {   \n+        if(plane_count.second > largest_count)\n+        {\n+            largest_count = plane_count.second;\n+            largest_pid = plane_count.first;\n+        }\n+    }\n+    \n+    return largest_pid;\n+}\n \n bool FeatureManager::addFeatureCheckParallax(int frame_count, const map<int, vector<pair<int, Eigen::Matrix<double, 8, 1>>>> &image, double td)\n {\ndiff --git a/vins_estimator/src/feature_manager.h b/vins_estimator/src/feature_manager.h\nindex a7f46fc..1080e7f 100644\n--- a/vins_estimator/src/feature_manager.h\n+++ b/vins_estimator/src/feature_manager.h\n@@ -75,6 +75,7 @@ class FeatureManager\n     void clearState();\n \n     int getFeatureCount();\n+    int getLargestPlaneId(std::vector<int> &init_pids);\n \n     bool addFeatureCheckParallax(int frame_count, const map<int, vector<pair<int, Eigen::Matrix<double, 8, 1>>>> &image, double td);\n     void debugShow();\n-- \n2.17.1\n\n\nFrom 20033975cd339f20def81dab5904dd2843d692cc Mon Sep 17 00:00:00 2001\nFrom: Karnik Ram <karnikram@gmail.com>\nDate: Thu, 4 Feb 2021 09:46:20 +0530\nSubject: [PATCH 32/42] Enforce unit N using homo parameterization\n\n---\n .../launch/playback_vins_airsim.launch        |  3 ++-\n vins_estimator/src/estimator.cpp              |  8 ++++----\n vins_estimator/src/estimator.h                |  2 +-\n vins_estimator/src/factor/homography_factor.h |  7 +++----\n vins_estimator/src/initial/initial_sfm.cpp    | 20 +++++++++----------\n vins_estimator/src/initial/initial_sfm.h      |  4 ++--\n 6 files changed, 22 insertions(+), 22 deletions(-)\n\ndiff --git 
a/vins_estimator/launch/playback_vins_airsim.launch b/vins_estimator/launch/playback_vins_airsim.launch\nindex e86876a..61d8cac 100644\n--- a/vins_estimator/launch/playback_vins_airsim.launch\n+++ b/vins_estimator/launch/playback_vins_airsim.launch\n@@ -2,7 +2,7 @@\n \n   <arg name=\"playback_rate\" default=\"1.0\" />\n   <arg name=\"start_from\" default=\"0\" />\n-  <arg name=\"bagfile_path\" default=\"/home/karnik/dataset/airsim/CustomWarehouse/static_new_15_1/sequence.bag\"/>\n+  <arg name=\"bagfile_path\" default=\"/home/karnik/dataset/airsim/CustomWarehouse/final/static/sequence.bag\"/>\n \n   <arg name=\"config_path\" default = \"/home/karnik/catkin_ws/src/VINS-Mono/config/airsim_config.yaml\" />\n   <arg name=\"vins_path\" default = \"$(find feature_tracker)/../config/../\" />\n@@ -26,4 +26,5 @@\n \n   <node name=\"rvizvisualisation\" pkg=\"rviz\" type=\"rviz\" output=\"log\" args=\"-d $(find vins_estimator)/../config/vins_rviz_config.rviz\" />\n \n+ <node name=\"throttle\" pkg=\"topic_tools\" type=\"throttle\" args=\"messages /imu 200 /imu_throttled\"/>\n </launch>\ndiff --git a/vins_estimator/src/estimator.cpp b/vins_estimator/src/estimator.cpp\nindex 89d5b15..d4ec043 100644\n--- a/vins_estimator/src/estimator.cpp\n+++ b/vins_estimator/src/estimator.cpp\n@@ -302,7 +302,7 @@ bool Estimator::initialStructure()\n     // Save estimated normal variables\n     for(auto &est_n : n)\n     {\n-        para_N[est_n.first] = {0, est_n.second(0), est_n.second(1), est_n.second(2)};\n+        para_N[est_n.first] = {est_n.second(0), est_n.second(1), est_n.second(2)};\n     }\n \n     //solve pnp for all frame\n@@ -619,7 +619,7 @@ void Estimator::initializeNewPlanes()\n                     {\n                         // Transform est_n to global frame\n                         est_n = Rs[i] * est_n;\n-                        para_N[plane_id] = {0, est_n(0), est_n(1), est_n(2)};\n+                        para_N[plane_id] = {est_n(0), est_n(1), est_n(2)};\n \n               
                        // Estimate plane d using known metric t from vio 
                        vi_t = Rs[i].transpose() * (Ps[WINDOW_SIZE - 1] - Ps[i]);
@@ -846,8 +846,8 @@ void Estimator::optimization()
 
     for(auto &N : para_N)
     {
-        ceres::LocalParameterization *local_n_parameterization = new ceres::QuaternionParameterization();
-        problem.AddParameterBlock(N.second.data(), 4, local_n_parameterization);
+        ceres::LocalParameterization *local_n_parameterization = new ceres::HomogeneousVectorParameterization(3);
+        problem.AddParameterBlock(N.second.data(), 3, local_n_parameterization);
     }
 
     for (int i = 0; i < NUM_OF_CAM; i++)
diff --git a/vins_estimator/src/estimator.h b/vins_estimator/src/estimator.h
index 49cd656..e794866 100644
--- a/vins_estimator/src/estimator.h
+++ b/vins_estimator/src/estimator.h
@@ -117,7 +117,7 @@ class Estimator
     double para_Retrive_Pose[SIZE_POSE];
     double para_Td[1][1];
     double para_Tr[1][1];
-    map<int, array<double,4>> para_N;
+    map<int, array<double,3>> para_N;
     map<int, array<double,1>> para_d;
 
     vector<int> init_pids;
diff --git a/vins_estimator/src/factor/homography_factor.h b/vins_estimator/src/factor/homography_factor.h
index ce6612b..adb6397 100644
--- a/vins_estimator/src/factor/homography_factor.h
+++ b/vins_estimator/src/factor/homography_factor.h
@@ -22,10 +22,9 @@ struct HomographyFactor
             Eigen::Quaternion<T> qic;
             qic.coeffs() << ex_pose[3], ex_pose[4], ex_pose[5], ex_pose[6];
 
-            Eigen::Map<const Eigen::Matrix<T, 4, 1>> q_n(para_n);
             Eigen::Matrix<T, 3, 1> n;
-            n << q_n[1], q_n[2], q_n[3];
-            Eigen::Matrix<T, 3, 1> n_imu_1 = qic*(n.normalized()) + tic;
+            n << para_n[0], para_n[1], para_n[2];
+            Eigen::Matrix<T, 3, 1> n_imu_1 = qic*n + tic;
             Eigen::Matrix<T, 3, 1> n_imu_i = qi.inverse()*n_imu_1 - qi.inverse()*pi;
 
             Eigen::Map<const Eigen::Matrix<T, 1, 1>> inv_depth(para_inv_depth);
@@ -47,7 +46,7 @@ struct HomographyFactor
 
     static ceres::CostFunction* Create(const Eigen::Vector3d &_pts_i, const Eigen::Vector3d &_pts_j)
    {
-        return (new ceres::AutoDiffCostFunction<HomographyFactor, 2, 7, 7, 4, 1, 7>
+        return (new ceres::AutoDiffCostFunction<HomographyFactor, 2, 7, 7, 3, 1, 7>
                 (new HomographyFactor(_pts_i, _pts_j)));
     }
 
diff --git a/vins_estimator/src/initial/initial_sfm.cpp b/vins_estimator/src/initial/initial_sfm.cpp
index ad9f729..7726410 100644
--- a/vins_estimator/src/initial/initial_sfm.cpp
+++ b/vins_estimator/src/initial/initial_sfm.cpp
@@ -342,7 +342,7 @@ bool GlobalSFM::constructH(int frame_num, Quaterniond* q, Vector3d* T, int l,
 	Quaterniond c_Quat[frame_num];
 	double c_rotation[frame_num][4];
 	double c_translation[frame_num][3];
-    double c_normal[4];
+    double c_normal[3];
 	Eigen::Matrix<double, 3, 4> Pose[frame_num];
 
 	c_Quat[l] = q[l].inverse();
@@ -420,7 +420,7 @@ bool GlobalSFM::constructH(int frame_num, Quaterniond* q, Vector3d* T, int l,
             sfm_f[j].image[1] = point1[1];
             sfm_f[j].image[2] = 1;
 			//cout << "trangulated : " << frame_0 << " " << frame_1 << "  3d point : "  << j << "  " << point_3d.transpose() << endl;
-		}		
+		}	
 	}
 
 /*
@@ -439,6 +439,7 @@ bool GlobalSFM::constructH(int frame_num, Quaterniond* q, Vector3d* T, int l,
 	//full BA
 	ceres::Problem problem;
 	ceres::LocalParameterization* local_parameterization = new ceres::QuaternionParameterization();
+	ceres::LocalParameterization* local_n_parameterization = new ceres::HomogeneousVectorParameterization(3);
 	//cout << " begin full BA " << endl;
 	for (int i = 0; i < frame_num; i++)
 	{
@@ -462,11 +463,10 @@ bool GlobalSFM::constructH(int frame_num, Quaterniond* q, Vector3d* T, int l,
 		}
 	}
 
-	c_normal[0] = 0;
-    c_normal[1] = n[0];
-    c_normal[2] = n[1];
-    c_normal[3] = n[2];
-    problem.AddParameterBlock(c_normal, 4, local_parameterization);
+    c_normal[0] = n[0];
+    c_normal[1] = n[1];
+    c_normal[2] = n[2];
+    problem.AddParameterBlock(c_normal, 3, local_n_parameterization);
 
 	for (int i = 0; i < feature_num; i++)
 	{
@@ -527,9 +527,9 @@ bool GlobalSFM::constructH(int frame_num, Quaterniond* q, Vector3d* T, int l,
 			sfm_tracked_points[sfm_f[i].id] = Vector3d(sfm_f[i].position[0], sfm_f[i].position[1], sfm_f[i].position[2]);
 	}
 
-    n[0] = c_normal[1];
-    n[1] = c_normal[2];
-    n[2] = c_normal[3];
+    n[0] = c_normal[0];
+    n[1] = c_normal[1];
+    n[2] = c_normal[2];
 
 	return true;
 
diff --git a/vins_estimator/src/initial/initial_sfm.h b/vins_estimator/src/initial/initial_sfm.h
index 3c51e44..2ee3418 100644
--- a/vins_estimator/src/initial/initial_sfm.h
+++ b/vins_estimator/src/initial/initial_sfm.h
@@ -67,7 +67,7 @@ struct ReprojectionErrorH
 	{
 		T rp[3];
         ceres::QuaternionRotatePoint(R, point, rp);
-        T np = n[1] * point[0] + n[2] * point[1] + n[3] * point[2];
+        T np = n[0] * point[0] + n[1] * point[1] + n[2] * point[2];
        rp[0] += t[0]*np; rp[1] += t[1]*np; rp[2] += t[2]*np;
		T xp = rp[0] / rp[2];
    	T yp = rp[1] / rp[2];
@@ -80,7 +80,7 @@ struct ReprojectionErrorH
 	                                   const double observed_y) 
 	{
 	  return (new ceres::AutoDiffCostFunction<
-	          ReprojectionErrorH, 2, 4, 3, 4, 3>(
+	          ReprojectionErrorH, 2, 4, 3, 3, 3>(
 	          	new ReprojectionErrorH(observed_x,observed_y)));
 	}
 
-- 
2.17.1


From 8e05539a6ebdaa1ebe9e867551124d7cd7867ee3 Mon Sep 17 00:00:00 2001
From: Karnik Ram <karnikram@gmail.com>
Date: Tue, 9 Feb 2021 09:22:40 +0530
Subject: [PATCH 33/42] Fix bug in normal transformation

---
 vins_estimator/src/factor/homography_factor.h | 4 ++--
 vins_estimator/src/initial/initial_sfm.cpp    | 2 +-
 vins_estimator/src/initial/initial_sfm.h      | 7 +++----
 3 files changed, 6 insertions(+), 7 deletions(-)

diff --git a/vins_estimator/src/factor/homography_factor.h b/vins_estimator/src/factor/homography_factor.h
index adb6397..7107036 100644
--- a/vins_estimator/src/factor/homography_factor.h
+++ b/vins_estimator/src/factor/homography_factor.h
@@ -24,8 +24,8 @@ struct HomographyFactor
 
             Eigen::Matrix<T, 3, 1> n;
             n << para_n[0], para_n[1], para_n[2];
-            Eigen::Matrix<T, 3, 1> n_imu_1 = qic*n + tic;
-            Eigen::Matrix<T, 3, 1> n_imu_i = qi.inverse()*n_imu_1 - qi.inverse()*pi;
+            Eigen::Matrix<T, 3, 1> n_imu_1 = qic*n;// + tic;
+            Eigen::Matrix<T, 3, 1> n_imu_i = qi.inverse()*n_imu_1;// - qi.inverse()*pi;
 
             Eigen::Map<const Eigen::Matrix<T, 1, 1>> inv_depth(para_inv_depth);
 
diff --git a/vins_estimator/src/initial/initial_sfm.cpp b/vins_estimator/src/initial/initial_sfm.cpp
index 7726410..f319bd9 100644
--- a/vins_estimator/src/initial/initial_sfm.cpp
+++ b/vins_estimator/src/initial/initial_sfm.cpp
@@ -481,7 +481,7 @@ bool GlobalSFM::constructH(int frame_num, Quaterniond* q, Vector3d* T, int l,
 												sfm_f[i].observation[j].second.y());
 
     		problem.AddResidualBlock(b_cost_function, NULL, c_rotation[l], c_translation[l],
-    								sfm_f[i].position); 
+    								sfm_f[i].position);
 
 			ceres::CostFunction* h_cost_function = ReprojectionErrorH::Create(
 												sfm_f[i].observation[j].second.x(),
diff --git a/vins_estimator/src/initial/initial_sfm.h b/vins_estimator/src/initial/initial_sfm.h
index 2ee3418..9699356 100644
--- a/vins_estimator/src/initial/initial_sfm.h
+++ b/vins_estimator/src/initial/initial_sfm.h
@@ -11,8 +11,6 @@
 using namespace Eigen;
 using namespace std;
 
-
-
 struct SFMFeature
 {
     bool state;
@@ -66,8 +64,9 @@ struct ReprojectionErrorH
 	bool operator()(const T* const R, const T* const t, const T* const n, const T* point, T* residuals) const
 	{
 		T rp[3];
-        ceres::QuaternionRotatePoint(R, point, rp);
-        T np = n[0] * point[0] + n[1] * point[1] + n[2] * point[2];
+		T norm_point[3] = {point[0]/point[2], point[1]/point[2], point[2]/point[2]};
+        ceres::QuaternionRotatePoint(R, norm_point, rp);
+        T np = n[0] * norm_point[0] + n[1] * norm_point[1] + n[2] * norm_point[2];
         rp[0] += t[0]*np; rp[1] += t[1]*np; rp[2] += t[2]*np;
 		T xp = rp[0] / rp[2];
     	T yp = rp[1] / rp[2];
-- 
2.17.1


From e5f5255debd246caf27345f43b185b5bc58d968b Mon Sep 17 00:00:00 2001
From: Karnik Ram <karnikram@gmail.com>
Date: Tue, 9 Feb 2021 09:23:13 +0530
Subject: [PATCH 34/42] Modify config

---
 config/hvins_airsim_config.yaml               | 83 +++++++++++++++++++
 .../launch/playback_vins_airsim.launch        |  4 +-
 2 files changed, 85 insertions(+), 2 deletions(-)
 create mode 100644 config/hvins_airsim_config.yaml

diff --git a/config/hvins_airsim_config.yaml b/config/hvins_airsim_config.yaml
new file mode 100644
index 0000000..5af6480
--- /dev/null
+++ b/config/hvins_airsim_config.yaml
@@ -0,0 +1,83 @@
+%YAML:1.0
+
+#common parameters
+imu_topic: "/imu_throttled"
+image_topic: "/image"
+mask_topic: "/mask"
+output_path: "/home/karnik/output/"
+
+#camera calibration 
+model_type: PINHOLE
+camera_name: camera
+image_width: 640
+image_height: 480
+distortion_parameters:
+   k1: 0
+   k2: 0
+   p1: 0
+   p2: 0
+projection_parameters:
+   fx: 320
+   fy: 320
+   cx: 320
+   cy: 240
+
+# Extrinsic parameter between IMU and Camera.
+estimate_extrinsic: 0   # 0  Have an accurate extrinsic parameters. We will trust the following imu^R_cam, imu^T_cam, don't change it.
+                        # 1  Have an initial guess about extrinsic parameters. We will optimize around your initial guess.
+                        # 2  Don't know anything about extrinsic parameters. You don't need to give R,T. We will try to calibrate it. Do some rotation movement at beginning.                        
+#If you choose 0 or 1, you should write down the following matrix.
+
+#Rotation from camera frame to imu frame, imu^R_cam
+extrinsicRotation: !!opencv-matrix
+   rows: 3
+   cols: 3
+   dt: d
+   data: [-0.00017913,-0.00042042,0.9999999,0.99999996,-0.00023746, 0.00017903,0.00023739,0.99999988,0.00042046] #[0,0,1,1,0,0,0,1,0] 
+
+#Translation from camera frame to imu frame, imu^T_cam
+extrinsicTranslation: !!opencv-matrix
+   rows: 3
+   cols: 1
+   dt: d
+   data: [0.49118794, 0.02176473, 0.01374164] #[0.50,0,0]
+
+#feature traker parameters
+max_cnt: 150          # max feature number in feature tracking
+min_dist: 30            # min distance between two features 
+freq: 10                # frequence (Hz) of publish tracking result. At least 10Hz for good estimation. If set 0, the frequence will be same as raw image 
+F_threshold: 1.0        # ransac threshold (pixel)
+show_track: 1           # publish tracking image as topic
+equalize: 1             # if image is too dark or light, trun on equalize to find enough features
+fisheye: 0              # if using fisheye, trun on it. A circle mask will be loaded to remove edge noisy points
+
+#optimization parameters
+max_solver_time: 0.04  # max solver itration time (ms), to guarantee real time
+max_num_iterations: 8   # max solver itrations, to guarantee real time
+keyframe_parallax: 10.0 # keyframe selection threshold (pixel)
+
+#imu parameters       The more accurate parameters you provide, the better performance
+acc_n: 0.074427240          # accelerometer measurement noise standard deviation. #0.2   0.04
+gyr_n: 0.002759607         # gyroscope measurement noise standard deviation.     #0.05  0.004
+acc_w: 3.9471004e-7         # accelerometer bias random work noise standard deviation.  #0.02
+gyr_w: 3.1538983e-8       # gyroscope bias random work noise standard deviation.     #4.0e-5
+g_norm: 9.80665     # gravity magnitude
+
+#loop closure parameters
+loop_closure: 1                    # start loop closure
+load_previous_pose_graph: 0        # load and reuse previous pose graph; load from 'pose_graph_save_path'
+fast_relocalization: 0             # useful in real-time and large project
+pose_graph_save_path: "/home/karnik/" # save and load path
+
+#unsynchronization parameters
+estimate_td: 0                      # online estimate time offset between camera and imu
+td: -0.03079132514112524 # -0.04109                          # initial value of time offset. unit: s. readed image clock + td = real image clock (IMU clock)
+
+#rolling shutter parameters
+rolling_shutter: 0                  # 0: global shutter camera, 1: rolling shutter camera
+rolling_shutter_tr: 0               # unit: s. rolling shutter read out time per frame (from data sheet). 
+
+#visualization parameters
+save_image: 1                   # save image in pose graph for visualization prupose; you can close this function by setting 0 
+visualize_imu_forward: 0        # output imu forward propogation to achieve low latency and high frequence results
+visualize_camera_size: 0.4      # size of camera marker in RVIZ
diff --git a/vins_estimator/launch/playback_vins_airsim.launch b/vins_estimator/launch/playback_vins_airsim.launch
index 61d8cac..9925a21 100644
--- a/vins_estimator/launch/playback_vins_airsim.launch
+++ b/vins_estimator/launch/playback_vins_airsim.launch
@@ -8,10 +8,10 @@
   <arg name="vins_path" default = "$(find feature_tracker)/../config/../" />
 
   <!-- Use sim time -->
-  <param name ="/use_sim_time" value="true"/>
+  <param name ="/use_sim_time" value="false"/>
   
   <!-- Bag playback -->
-  <node name="rosbag_playback" pkg="rosbag" type="play" required="true" args=" --delay=10 --clock --queue=1000 -r $(arg playback_rate) -s $(arg start_from) $(arg bagfile_path)"/>
+  <node name="rosbag_playback" pkg="rosbag" type="play" required="true" args=" --delay=10 --queue=1000 -r $(arg playback_rate) -s $(arg start_from) $(arg bagfile_path)"/>
 
   <!-- VINS-Mono nodes -->
   <node name="feature_tracker" pkg="feature_tracker" type="feature_tracker" output="log">
-- 
2.17.1


From 2de4f06206bb221405ffa3e10f99766d6e0847e0 Mon Sep 17 00:00:00 2001
From: Karnik Ram <karnikram@gmail.com>
Date: Thu, 11 Feb 2021 11:13:51 +0530
Subject: [PATCH 35/42] Modify config, use only H inliers for mean vector

---
 config/hvins_airsim_config.yaml           |  2 +-
 config/vins_rviz_config.rviz              | 10 +++++-----
 vins_estimator/src/initial/solve_5pts.cpp | 18 +++++++++++-------
 3 files changed, 17 insertions(+), 13 deletions(-)

diff --git a/config/hvins_airsim_config.yaml b/config/hvins_airsim_config.yaml
index 5af6480..55546be 100644
--- a/config/hvins_airsim_config.yaml
+++ b/config/hvins_airsim_config.yaml
@@ -43,7 +43,7 @@ extrinsicTranslation: !!opencv-matrix
    data: [0.49118794, 0.02176473, 0.01374164] #[0.50,0,0]
 
 #feature traker parameters
-max_cnt: 150          # max feature number in feature tracking
+max_cnt: 200          # max feature number in feature tracking
 min_dist: 30            # min distance between two features 
 freq: 10                # frequence (Hz) of publish tracking result. At least 10Hz for good estimation. If set 0, the frequence will be same as raw image 
 F_threshold: 1.0        # ransac threshold (pixel)
diff --git a/config/vins_rviz_config.rviz b/config/vins_rviz_config.rviz
index 55ed131..e0c3a51 100644
--- a/config/vins_rviz_config.rviz
+++ b/config/vins_rviz_config.rviz
@@ -6,7 +6,7 @@ Panels:
       Expanded:
         - /VIO1
       Splitter Ratio: 0.4651159942150116
-    Tree Height: 151
+    Tree Height: 140
   - Class: rviz/Selection
     Name: Selection
   - Class: rviz/Tool Properties
@@ -295,7 +295,7 @@ Visualization Manager:
          Queue Size: 100
          Value: true
        - Class: rviz/Image
-          Enabled: true
+          Enabled: false
          Image Topic: /pose_graph/match_image
          Max Value: 1
          Median window: 5
@@ -305,7 +305,7 @@ Visualization Manager:
          Queue Size: 2
          Transport Hint: raw
          Unreliable: false
-          Value: true
+          Value: false
        - Class: rviz/Marker
          Enabled: true
          Marker Topic: /pose_graph/key_odometrys
@@ -506,7 +506,7 @@ Window Geometry:
   Height: 734
   Hide Left Dock: false
   Hide Right Dock: true
-  QMainWindow State: 000000ff00000000fd00000004000000000000015600000243fc0200000010fb0000000a0049006d00610067006501000000280000013d0000000000000000fb0000001200530065006c0065006300740069006f006e00000001e10000009b0000005c00fffffffb0000001e0054006f006f006c002000500072006f007000650072007400690065007302000001ed000001df00000185000000a3fb000000120056006900650077007300200054006f006f02000001df000002110000018500000122fb000000200054006f006f006c002000500072006f0070006500720074006900650073003203000002880000011d000002210000017afb0000002000730065006c0065006300740069006f006e00200062007500660066006500720200000138000000aa0000023a00000294fb00000014005700690064006500530074006500720065006f02000000e6000000d2000003ee0000030bfb0000000c004b0069006e0065006300740200000186000001060000030c00000261fb0000001200720061007700200049006d0061006700650000000028000000300000000000000000fb00000012007200610077005f0069006d0061006700650000000028000000f90000001600fffffffb0000001a0074007200610063006b0065006400200069006d006100670065010000003d000000de0000001600fffffffb00000020006c006f006f0070005f006d0061007400630068005f0069006d0061006700650100000121000000850000001600fffffffb000000100044006900730070006c00610079007301000001ac000000d4000000c900fffffffc000000280000011e0000000000fffffffa000000000100000002fb0000001200720061007700200049006d0061006700650000000000ffffffff0000000000000000fb0000001a0074007200610063006b0065006400200069006d0061006700650100000000000002370000000000000000fb0000001000410052005f0069006d0061006700650100000373000000160000000000000000fb0000001200720061007700200069006d006100670065010000038f000000160000000000000000000000010000020800000399fc0200000003fb0000001e0054006f006f006c002000500072006f00700065007200740069006500730100000041000000780000000000000000fc00000028000003990000000000fffffffaffffffff0100000002fb000000100044006900730070006c0061007900730000000000ffffffff0000015600fffffffb0000000a00560069006500770073000000023f0000016a0000010000fffffffb0000001200530065006c0065006300740069006f006e010000025a000000b200000000000000000000000200000490000000a9fc0100000001fb0000000a00560069006500770073030000004e00000080000002e10000019700000003000002a30000003bfc0100000002fb0000000800540069006d00650100000000000002a3000002eb00fffffffb0000000800540069006d00650100000000000004500000000000000000000001470000024300000004000000040000000800000008fc0000000100000002000000010000000a0054006f006f006c00730100000000ffffffff0000000000000000
+  QMainWindow State: 000000ff00000000fd00000004000000000000015600000243fc0200000010fb0000000a0049006d00610067006501000000280000013d0000000000000000fb0000001200530065006c0065006300740069006f006e00000001e10000009b0000005c00fffffffb0000001e0054006f006f006c002000500072006f007000650072007400690065007302000001ed000001df00000185000000a3fb000000120056006900650077007300200054006f006f02000001df000002110000018500000122fb000000200054006f006f006c002000500072006f0070006500720074006900650073003203000002880000011d000002210000017afb0000002000730065006c0065006300740069006f006e00200062007500660066006500720200000138000000aa0000023a00000294fb00000014005700690064006500530074006500720065006f02000000e6000000d2000003ee0000030bfb0000000c004b0069006e0065006300740200000186000001060000030c00000261fb0000001200720061007700200049006d0061006700650000000028000000300000000000000000fb00000012007200610077005f0069006d0061006700650000000028000000f90000001600fffffffb0000001a0074007200610063006b0065006400200069006d006100670065010000003d000001740000001600fffffffb00000020006c006f006f0070005f006d0061007400630068005f0069006d0061006700650000000121000000850000001600fffffffb000000100044006900730070006c00610079007301000001b7000000c9000000c900fffffffc000000280000011e0000000000fffffffa000000000100000002fb0000001200720061007700200049006d0061006700650000000000ffffffff0000000000000000fb0000001a0074007200610063006b0065006400200069006d0061006700650100000000000002370000000000000000fb0000001000410052005f0069006d0061006700650100000373000000160000000000000000fb0000001200720061007700200069006d006100670065010000038f000000160000000000000000000000010000020800000399fc0200000003fb0000001e0054006f006f006c002000500072006f00700065007200740069006500730100000041000000780000000000000000fc00000028000003990000000000fffffffaffffffff0100000002fb000000100044006900730070006c0061007900730000000000ffffffff0000015600fffffffb0000000a00560069006500770073000000023f0000016a0000010000fffffffb0000001200530065006c0065006300740069006f006e010000025a000000b200000000000000000000000200000490000000a9fc0100000001fb0000000a00560069006500770073030000004e00000080000002e10000019700000003000002eb0000003bfc0100000002fb0000000800540069006d00650100000000000002eb000002eb00fffffffb0000000800540069006d006501000000000000045000000000000000000000018f0000024300000004000000040000000800000008fc0000000100000002000000010000000a0054006f006f006c00730100000000ffffffff0000000000000000
   Selection:
     collapsed: false
   Time:
@@ -515,7 +515,7 @@ Window Geometry:
     collapsed: false
   Views:
     collapsed: true
-  Width: 675
+  Width: 747
   X: 685
   Y: 4
   loop_match_image:
diff --git a/vins_estimator/src/initial/solve_5pts.cpp b/vins_estimator/src/initial/solve_5pts.cpp
index a14ec1c..f7aa380 100644
--- a/vins_estimator/src/initial/solve_5pts.cpp
+++ b/vins_estimator/src/initial/solve_5pts.cpp
@@ -240,19 +240,23 @@ bool MotionEstimator::solveRelativeHRT(const vector<pair<Vector3d, Vector3d>> &c
         }
         cv::Mat mask;
         //cv::Mat E = cv::findFundamentalMat(ll, rr, cv::FM_RANSAC, 0.3 / 460, 0.99, mask);
-        cv::Mat H = cv::findHomography(ll, rr, cv::RANSAC, 0.3/480);
+        cv::Mat H = cv::findHomography(ll, rr, cv::RANSAC, 0.3/460, mask, 2000, 0.99);
         cv::Mat K = (cv::Mat_<double>(3, 3) << 1, 0, 0, 0, 1, 0, 0, 0, 1);
 
         // Compute mean point vector
         Eigen::Vector3d mean_l(0, 0, 1);
-        for(cv::Point2f &point : ll)
+        int inlier_count = 0;
+        for(int i = 0; i < ll.size(); i++)
         {
-            mean_l(0) += point.x;
-            mean_l(1) += point.y;
+            if(mask.at<uchar>(i,0) != 0)
+            {
+                mean_l(0) += ll[i].x;
+                mean_l(1) += ll[i].y;
+                inlier_count++;
+            }
         }
-
-        mean_l(0) /= int(ll.size());
-        mean_l(1) /= int(ll.size());
+        mean_l(0) /= int(inlier_count);
+        mean_l(1) /= int(inlier_count);
 
         Eigen::Matrix4d est_Tr;
         Eigen::Vector3d est_n;
-- 
2.17.1


From 518208852f13b10d7ebc3efaa254564c09f35a46 Mon Sep 17 00:00:00 2001
From: Karnik Ram <karnikram@gmail.com>
Date: Thu, 11 Feb 2021 20:22:00 +0530
Subject: [PATCH 36/42] Consider mask during feature tracking as well, use H
 for feature rejection instead of F

---
 config/hvins_airsim_config.yaml         |  4 +--
 feature_tracker/src/feature_tracker.cpp | 45 +++++++++++++++++--------
 feature_tracker/src/feature_tracker.h   |  4 +--
 3 files changed, 35 insertions(+), 18 deletions(-)

diff --git a/config/hvins_airsim_config.yaml b/config/hvins_airsim_config.yaml
index 55546be..7847070 100644
--- a/config/hvins_airsim_config.yaml
+++ b/config/hvins_airsim_config.yaml
@@ -43,10 +43,10 @@ extrinsicTranslation: !!opencv-matrix
    data: [0.49118794, 0.02176473, 0.01374164] #[0.50,0,0]
 
 #feature traker parameters
-max_cnt: 200          # max feature number in feature tracking
+max_cnt: 250          # max feature number in feature tracking
 min_dist: 30            # min distance between two features 
 freq: 10                # frequence (Hz) of publish tracking result. At least 10Hz for good estimation. If set 0, the frequence will be same as raw image 
-F_threshold: 1.0        # ransac threshold (pixel)
+H_threshold: 1.0        # ransac threshold (pixel)
 show_track: 1           # publish tracking image as topic
 equalize: 1             # if image is too dark or light, trun on equalize to find enough features
 fisheye: 0              # if using fisheye, trun on it. A circle mask will be loaded to remove edge noisy points
diff --git a/feature_tracker/src/feature_tracker.cpp b/feature_tracker/src/feature_tracker.cpp
index 6548c36..5537c6b 100644
--- a/feature_tracker/src/feature_tracker.cpp
+++ b/feature_tracker/src/feature_tracker.cpp
@@ -32,13 +32,12 @@ FeatureTracker::FeatureTracker()
 {
 }
 
-void FeatureTracker::setMask()
+void FeatureTracker::setMask(const cv::Mat &_mask)
 {
-    if(FISHEYE)
-        mask = fisheye_mask.clone();
-    else
-        mask = cv::Mat(ROW, COL, CV_8UC1, cv::Scalar(255));
-    
+    //mask = cv::Mat(ROW, COL, CV_8UC1, cv::Scalar(255));
+    mask = _mask.clone();
+    //cv::Mat erosion_mat = cv::getStructuringElement(cv::MORPH_RECT, cv::Size(21,21));
+    cv::erode(mask, mask, cv::Mat(), cv::Point(-1, -1), 2, 1, 1);
 
     // prefer to keep features that are tracked for long time
     vector<pair<int, pair<cv::Point2f, pair<int,int>>>> cnt_pts_id_pid;
@@ -61,7 +60,7 @@ void FeatureTracker::setMask()
 
     for (auto &it : cnt_pts_id_pid)
     {
-        if (mask.at<uchar>(it.second.first) == 255)
+        if (mask.at<uchar>(it.second.first) != 0)
         {
             forw_pts.push_back(it.second.first);
             ids.push_back(it.second.second.first);
@@ -165,10 +164,10 @@ void FeatureTracker::readImage(const cv::Mat &_img, const cv::Mat &_mask, double
 
     if (PUB_THIS_FRAME)
     {
-        rejectWithF();
+        rejectWithH();
         ROS_DEBUG("set mask begins");
         TicToc t_m;
-        setMask();
+        setMask(_mask);
         ROS_DEBUG("set mask costs %fms", t_m.toc());
 
         ROS_DEBUG("detect feature begins");
@@ -202,11 +201,11 @@ void FeatureTracker::readImage(const cv::Mat &_img, const cv::Mat &_mask, double
     prev_time = cur_time;
 }
 
-void FeatureTracker::rejectWithF()
+void FeatureTracker::rejectWithH()
 {
-    if (forw_pts.size() >= 8)
+    if (forw_pts.size() >= 4)
     {
-        ROS_DEBUG("FM ransac begins");
+        ROS_DEBUG("HM ransac begins");
         TicToc t_f;
         vector<cv::Point2f> un_cur_pts(cur_pts.size()), un_forw_pts(forw_pts.size());
         for (unsigned int i = 0; i < cur_pts.size(); i++)
@@ -223,9 +222,27 @@ void FeatureTracker::rejectWithF()
             un_forw_pts[i] = cv::Point2f(tmp_p.x(), tmp_p.y());
         }
 
-        vector<uchar> status;
-        cv::findFundamentalMat(un_cur_pts, un_forw_pts, cv::FM_RANSAC, F_THRESHOLD, 0.99, status);
+        vector<uchar> status(un_cur_pts.size());
+        //cv::findFundamentalMat(un_cur_pts, un_forw_pts, cv::FM_RANSAC, F_THRESHOLD, 0.99, status);
         //cv::findHomography(un_cur_pts, un_forw_pts, cv::RANSAC, 3, status);
+
+        map<int, vector<cv::Point2f>> pid_un_cur_pts, pid_un_forw_pts;
+        map<pair<int,int>, int> new_ids_to_old;
+        for(uint i = 0; i < un_cur_pts.size(); i++)
+        {
+            pid_un_cur_pts[plane_ids[i]].push_back(un_cur_pts[i]);
+            pid_un_forw_pts[plane_ids[i]].push_back(un_forw_pts[i]);
+            new_ids_to_old[make_pair(plane_ids[i], pid_un_cur_pts[plane_ids[i]].size()-1)] = i;
+        }
+
+        for(auto &imap : pid_un_cur_pts)
+        {
+            vector<uchar> pid_status;
+            cv::findHomography(pid_un_cur_pts[imap.first], pid_un_forw_pts[imap.first], cv::RANSAC, 1.0, pid_status, 2000, 0.99);
+            for(int i = 0; i < pid_status.size(); i++)
+                status[new_ids_to_old[make_pair(imap.first, i)]] = pid_status[i];
+        }
+
         int size_a = cur_pts.size();
         reduceVector(prev_pts, status);
         reduceVector(cur_pts, status);
diff --git a/feature_tracker/src/feature_tracker.h b/feature_tracker/src/feature_tracker.h
index 5db3810..72cc8d8 100644
--- a/feature_tracker/src/feature_tracker.h
+++ b/feature_tracker/src/feature_tracker.h
@@ -32,7 +32,7 @@ class FeatureTracker
 
     void readImage(const cv::Mat &_img, const cv::Mat &_mask, double _cur_time);
 
-    void setMask();
+    void setMask(const cv::Mat &_mask);
 
     void addPoints(const cv::Mat &_mask);
 
@@ -42,7 +42,7 @@ class FeatureTracker
 
     void showUndistortion(const string &name);
 
-    void rejectWithF();
+    void rejectWithH();
 
     void undistortedPoints();
 
-- 
2.17.1


From 911c4c709cd356869c9a3394bb3a2cf4602de65c Mon Sep 17 00:00:00 2001
From: Karnik Ram <karnikram@gmail.com>
Date: Fri, 12 Feb 2021 10:58:47 +0530
Subject: [PATCH 37/42] Revert to airsim spatial extrinsics

---
 config/hvins_airsim_config.yaml | 4 ++--
 1 file changed, 2 insertions(+), 2 deletions(-)

diff --git a/config/hvins_airsim_config.yaml b/config/hvins_airsim_config.yaml
index 7847070..f8e2908 100644
--- a/config/hvins_airsim_config.yaml
+++ b/config/hvins_airsim_config.yaml
@@ -33,14 +33,14 @@ extrinsicRotation: !!opencv-matrix
    rows: 3
    cols: 3
    dt: d
-   data: [-0.00017913,-0.00042042,0.9999999,0.99999996,-0.00023746, 0.00017903,0.00023739,0.99999988,0.00042046] #[0,0,1,1,0,0,0,1,0] 
+   data: [0,0,1,1,0,0,0,1,0] #[-0.00017913,-0.00042042,0.9999999,0.99999996,-0.00023746, 0.00017903,0.00023739,0.99999988,0.00042046] #[0,0,1,1,0,0,0,1,0] 
 
 #Translation from camera frame to imu frame, imu^T_cam
 extrinsicTranslation: !!opencv-matrix
    rows: 3
    cols: 1
    dt: d
-   data: [0.49118794, 0.02176473, 0.01374164] #[0.50,0,0]
+   data: [0.50, 0, 0] #[0.49118794, 0.02176473, 0.01374164] #[0.50,0,0]
 
 #feature traker parameters
 max_cnt: 250          # max feature number in feature tracking
-- 
2.17.1


From 592768ea351956ff4a573292e78bc67449438db2 Mon Sep 17 00:00:00 2001
From: Karnik Ram <karnikram@gmail.com>
Date: Sun, 14 Feb 2021 12:48:22 +0530
Subject: [PATCH 38/42] Handle possible decomposeH exception

---
 vins_estimator/src/initial/solve_5pts.cpp | 29 +++++++++++++----------
 1 file changed, 17 insertions(+), 12 deletions(-)

diff --git a/vins_estimator/src/initial/solve_5pts.cpp b/vins_estimator/src/initial/solve_5pts.cpp
index f7aa380..bfba3f3 100644
--- a/vins_estimator/src/initial/solve_5pts.cpp
+++ b/vins_estimator/src/initial/solve_5pts.cpp
@@ -259,11 +259,14 @@ bool MotionEstimator::solveRelativeHRT(const vector<pair<Vector3d, Vector3d>> &c
         mean_l(1) /= int(inlier_count);
 
         Eigen::Matrix4d est_Tr;
-        Eigen::Vector3d est_n;
+        Eigen::Vector3d est_n(0,0,0);
         decomposeH(H, K, R_imu, TrIC, mean_l, est_Tr, est_n);
         Rotation = est_Tr.block(0,0,3,3);
         Translation = est_Tr.block(0,3,3,1);
         n = est_n;
+        if(n.isZero())
+            return false;
+
         return true;
     }
     return false;
@@ -295,7 +298,6 @@ void MotionEstimator::decomposeH(const cv::Mat &H, const cv::Mat &K, const Matri
             //Tr = TrIC * Tr * TrIC.inverse();
             Tr = Tr.inverse().eval();
 
-            Vector3d e3(0, 0, 1);
             Vector3d n;
             cv::cv2eigen(cv_ns[i], n);
             n.normalize();
@@ -306,18 +308,21 @@ void MotionEstimator::decomposeH(const cv::Mat &H, const cv::Mat &K, const Matri
             }
         }
 
-        vector<double> rot_diff;
-        for(size_t i = 0; i < positive_depth_transforms.size(); i++)
+        if(positive_depth_transforms.size() > 0)
         {
-            Eigen::Matrix4d Tr = TrIC * positive_depth_transforms[i] * TrIC.inverse();
-            Eigen::Matrix3d R = Tr.block(0,0,3,3);
-            double f = (R.transpose()*R_imu - MatrixXd::Identity(3,3)).norm();
-            rot_diff.push_back(f);
-        }
+            vector<double> rot_diff;
+            for(size_t i = 0; i < positive_depth_transforms.size(); i++)
+            {
+                Eigen::Matrix4d Tr = TrIC * positive_depth_transforms[i] * TrIC.inverse();
+                Eigen::Matrix3d R = Tr.block(0,0,3,3);
+                double f = (R.transpose()*R_imu - MatrixXd::Identity(3,3)).norm();
+                rot_diff.push_back(f);
+            }
 
-        int min_index = std::min_element(rot_diff.begin(), rot_diff.end()) - rot_diff.begin();
-        est_Tr = positive_depth_transforms[min_index];
-        est_n = positive_depth_normals[min_index];
+            int min_index = std::min_element(rot_diff.begin(), rot_diff.end()) - rot_diff.begin();
+            est_Tr = positive_depth_transforms[min_index];
+            est_n = positive_depth_normals[min_index];
+        }
     }
 
     else
-- 
2.17.1


From 1d0449b500e822224680d90503fcf785a2b28ab2 Mon Sep 17 00:00:00 2001
From: Karnik Ram <karnikram@gmail.com>
Date: Sun, 14 Feb 2021 12:48:49 +0530
Subject: [PATCH 39/42] Move plane variables to camera frame properly

---
 vins_estimator/src/estimator.cpp | 4 ++--
 vins_estimator/src/estimator.h   | 2 ++
 2 files changed, 4 insertions(+), 2 deletions(-)

diff --git a/vins_estimator/src/estimator.cpp b/vins_estimator/src/estimator.cpp
index d4ec043..86033e9 100644
--- a/vins_estimator/src/estimator.cpp
+++ b/vins_estimator/src/estimator.cpp
@@ -618,11 +618,11 @@ void Estimator::initializeNewPlanes()
                     if(average_parallax * 460 > 30 && m_estimator.solveRelativeHRT(corres, R_imu, TrIC, est_R, est_t, est_n))
                     {
                         // Transform est_n to global frame
-                        est_n = Rs[i] * est_n;
+                        est_n = ric[0].transpose() * Rs[i] * ric[0] * est_n;
                         para_N[plane_id] = {est_n(0), est_n(1), est_n(2)};
 
                         // Estimate plane d using known metric t from vio 
-                        vi_t = Rs[i].transpose() * (Ps[WINDOW_SIZE - 1] - Ps[i]);
+                        vi_t = ric[0].transpose() * Rs[i].transpose() * (Ps[WINDOW_SIZE - 1] - Ps[i]);
                         para_d[plane_id] = {(1.0*vi_t.norm()/est_t.norm())};
                         init_pids.push_back(plane_id);
                         ROS_INFO("Initialized plane %d features", plane_id);
diff --git a/vins_estimator/src/estimator.h b/vins_estimator/src/estimator.h
index e794866..6295758 100644
--- a/vins_estimator/src/estimator.h
+++ b/vins_estimator/src/estimator.h
@@ -117,6 +117,8 @@ class Estimator
     double para_Retrive_Pose[SIZE_POSE];
     double para_Td[1][1];
     double para_Tr[1][1];
+    
+    // In global camera frame (c_0)
     map<int, array<double,3>> para_N;
     map<int, array<double,1>> para_d;
 
-- 
2.17.1


From fe8857b092935da13daeee267612bc1a6e6c11bf Mon Sep 17 00:00:00 2001
From: Karnik Ram <karnikram@gmail.com>
Date: Sun, 14 Feb 2021 19:50:16 +0530
Subject: [PATCH 40/42] Transform d variable between frames properly

---
 vins_estimator/src/estimator.cpp              |  7 +++---
 vins_estimator/src/factor/homography_factor.h | 24 ++++++++++++++-----
 2 files changed, 22 insertions(+), 9 deletions(-)

diff --git a/vins_estimator/src/estimator.cpp b/vins_estimator/src/estimator.cpp
index 86033e9..26836fc 100644
--- a/vins_estimator/src/estimator.cpp
+++ b/vins_estimator/src/estimator.cpp
@@ -621,9 +621,10 @@ void Estimator::initializeNewPlanes()
                         est_n = ric[0].transpose() * Rs[i] * ric[0] * est_n;
                         para_N[plane_id] = {est_n(0), est_n(1), est_n(2)};
 
-                        // Estimate plane d using known metric t from vio 
-                        vi_t = ric[0].transpose() * Rs[i].transpose() * (Ps[WINDOW_SIZE - 1] - Ps[i]);
-                        para_d[plane_id] = {(1.0*vi_t.norm()/est_t.norm())};
+                        // Estimate plane d (in global frame) using known metric t from vio 
+                        vi_t = (Ps[WINDOW_SIZE - 1] - Ps[i]);
+                        double di = (1.0*vi_t.norm())/est_t.norm();
+                        para_d[plane_id] = {di + (ric[0].transpose() * Ps[i]).dot(est_n)};
                         init_pids.push_back(plane_id);
                         ROS_INFO("Initialized plane %d features", plane_id);
                         break;
diff --git a/vins_estimator/src/factor/homography_factor.h b/vins_estimator/src/factor/homography_factor.h
index 7107036..f5eb3dc 100644
--- a/vins_estimator/src/factor/homography_factor.h
+++ b/vins_estimator/src/factor/homography_factor.h
@@ -7,7 +7,7 @@ struct HomographyFactor
     HomographyFactor(const Eigen::Vector3d &_pts_i, const Eigen::Vector3d &_pts_j) : pts_i(_pts_i), pts_j(_pts_j) {}
 
     template <typename T>
-        bool operator()(const T* const pose_i, const T* const pose_j, const T* const para_n, const T* const para_inv_depth, const T* const ex_pose, 
+        bool operator()(const T* const pose_i, const T* const pose_j, const T* const para_n, const T* const para_depth, const T* const ex_pose, 
         T* residuals) const
         {
             Eigen::Map<const Eigen::Matrix<T, 3, 1>> pi(pose_i);
@@ -24,17 +24,29 @@ struct HomographyFactor
 
             Eigen::Matrix<T, 3, 1> n;
             n << para_n[0], para_n[1], para_n[2];
-            Eigen::Matrix<T, 3, 1> n_imu_1 = qic*n;// + tic;
-            Eigen::Matrix<T, 3, 1> n_imu_i = qi.inverse()*n_imu_1;// - qi.inverse()*pi;
 
-            Eigen::Map<const Eigen::Matrix<T, 1, 1>> inv_depth(para_inv_depth);
+            // transform camera normal to imu normal
+            Eigen::Matrix<T, 3, 1> n_imu_0 = qic*n;// + tic;
+
+            // transform imu 0 normal to imu i normal
+            Eigen::Matrix<T, 3, 1> n_imu_i = qi.inverse()*n_imu_0;// - qi.inverse()*pi;
+
+            Eigen::Map<const Eigen::Matrix<T, 1, 1>> depth(para_depth);
 
             Eigen::Quaternion<T> qji = qj.inverse() * qi;
             Eigen::Matrix<T, 3, 1> tji = qj.inverse() * (pi - pj);
-            Eigen::Matrix<T, 1, 1> di;
-            di(0,0) = inv_depth(0,0) - pi.dot(n_imu_i);
+            Eigen::Matrix<T, 1, 1> di, di0;
+
+            // convert camera depth to imu frame
+            di0(0,0) = depth(0,0) + tic.dot(n_imu_0);
+            // convert imu 0 depth to imu i depth
+            di(0,0) = di0(0,0) - pi.dot(n_imu_0);
+
             Eigen::Matrix<T, 3, 1> pts_imu_i = qic * pts_i.cast<T>() + tic;
+
+            // homography mapping
             Eigen::Matrix<T, 3, 1> pts_imu_j = qji * pts_imu_i + (tji*(1.0/di(0,0)) * n_imu_i.transpose()) * pts_imu_i;
+
             Eigen::Matrix<T, 3, 1> pts_cam_j = qic.inverse() * (pts_imu_j - tic);
 
             pts_cam_j = (pts_cam_j / pts_cam_j[2]);
-- 
2.17.1


From c5fa6e1369311dc5f762802c62a201590b9a5a34 Mon Sep 17 00:00:00 2001
From: Karnik Ram <karnikram@gmail.com>
Date: Sun, 21 Feb 2021 17:07:23 +0530
Subject: [PATCH 41/42] Modify benchmark publisher

---
 benchmark_publisher/launch/publish.launch     |  4 +-
 .../src/benchmark_publisher_node.cpp          | 27 +++-------
 config/vins_rviz_config.rviz                  | 49 ++++++++++++++-----
 .../launch/playback_vins_airsim.launch        |  3 ++
 4 files changed, 49 insertions(+), 34 deletions(-)

diff --git a/benchmark_publisher/launch/publish.launch b/benchmark_publisher/launch/publish.launch
index 5e066fa..5d52c78 100644
--- a/benchmark_publisher/launch/publish.launch
+++ b/benchmark_publisher/launch/publish.launch
@@ -1,9 +1,9 @@
 <launch>
 <!--MH_01_easy MH_02_easy MH_03_medium MH_04_difficult MH_05_difficult V1_01_easy V1_02_medium V1_03_difficult V2_01_easy V2_02_medium V2_03_difficult  -->
-	<arg name="sequence_name" default = "MH_01_easy" />
+	<arg name="sequence_name" default = "C4" />
 
     <node name="benchmark_publisher" pkg="benchmark_publisher" type="benchmark_publisher" output="screen">
-        <param name="data_name" type="string" value="$(find benchmark_publisher)/config/$(arg sequence_name)/data.csv" />
+        <param name="data_name" type="string" 
value=\"/home/karnik/dataset/airsim/CustomWarehouse/sine_tests/final2/c4_new/gt_data.txt\" />\n         <remap from=\"~estimated_odometry\" to=\"/vins_estimator/odometry\" />\n     </node>\n <!--\ndiff --git a/benchmark_publisher/src/benchmark_publisher_node.cpp b/benchmark_publisher/src/benchmark_publisher_node.cpp\nindex c405c87..04e5d0f 100644\n--- a/benchmark_publisher/src/benchmark_publisher_node.cpp\n+++ b/benchmark_publisher/src/benchmark_publisher_node.cpp\n@@ -11,7 +11,7 @@\n using namespace std;\n using namespace Eigen;\n \n-const int SKIP = 50;\n+const int SKIP = 100;\n string benchmark_output_path;\n string estimate_output_path;\n template <typename T>\n@@ -34,22 +34,16 @@ struct Data\n {\n     Data(FILE *f)\n     {\n-        if (fscanf(f, \" %lf,%f,%f,%f,%f,%f,%f,%f,%f,%f,%f,%f,%f,%f,%f,%f,%f\", &t,\n+        if (fscanf(f, \" %lf %f %f %f %f %f %f %f\", &t,\n                &px, &py, &pz,\n-               &qw, &qx, &qy, &qz,\n-               &vx, &vy, &vz,\n-               &wx, &wy, &wz,\n-               &ax, &ay, &az) != EOF)\n+               &qx, &qy, &qz, &qw) != EOF)\n         {\n-            t /= 1e9;\n+            //t /= 1e9;\n         }\n     }\n     double t;\n     float px, py, pz;\n     float qw, qx, qy, qz;\n-    float vx, vy, vz;\n-    float wx, wy, wz;\n-    float ax, ay, az;\n };\n int idx = 1;\n vector<Data> benchmark;\n@@ -65,7 +59,6 @@ tf::Transform trans;\n \n void odom_callback(const nav_msgs::OdometryConstPtr &odom_msg)\n {\n-    //ROS_INFO(\"odom callback!\");\n     if (odom_msg->header.stamp.toSec() > benchmark.back().t)\n       return;\n   \n@@ -109,12 +102,6 @@ void odom_callback(const nav_msgs::OdometryConstPtr &odom_msg)\n     odometry.pose.pose.orientation.y = tmp_R.y();\n     odometry.pose.pose.orientation.z = tmp_R.z();\n \n-    Vector3d tmp_V = baseRgt * Vector3d{benchmark[idx - 1].vx,\n-                                        benchmark[idx - 1].vy,\n-                                        benchmark[idx - 1].vz};\n-    
odometry.twist.twist.linear.x = tmp_V.x();\n-    odometry.twist.twist.linear.y = tmp_V.y();\n-    odometry.twist.twist.linear.z = tmp_V.z();\n     pub_odom.publish(odometry);\n \n     geometry_msgs::PoseStamped pose_stamped;\n@@ -130,9 +117,9 @@ int main(int argc, char **argv)\n     ros::init(argc, argv, \"benchmark_publisher\");\n     ros::NodeHandle n(\"~\");\n \n-    string csv_file = readParam<string>(n, \"data_name\");\n-    std::cout << \"load ground truth \" << csv_file << std::endl;\n-    FILE *f = fopen(csv_file.c_str(), \"r\");\n+    string txt_file = readParam<string>(n, \"data_name\");\n+    std::cout << \"load ground truth \" << txt_file << std::endl;\n+    FILE *f = fopen(txt_file.c_str(), \"r\");\n     if (f==NULL)\n     {\n       ROS_WARN(\"can't load ground truth; wrong path\");\ndiff --git a/config/vins_rviz_config.rviz b/config/vins_rviz_config.rviz\nindex e0c3a51..649da9d 100644\n--- a/config/vins_rviz_config.rviz\n+++ b/config/vins_rviz_config.rviz\n@@ -4,7 +4,9 @@ Panels:\n     Name: Displays\n     Property Tree Widget:\n       Expanded:\n+        - /Global Options1\n         - /VIO1\n+        - /Path1\n       Splitter Ratio: 0.4651159942150116\n     Tree Height: 140\n   - Class: rviz/Selection\n@@ -44,7 +46,7 @@ Visualization Manager:\n       Cell Size: 1\n       Class: rviz/Grid\n       Color: 130; 130; 130\n-      Enabled: true\n+      Enabled: false\n       Line Style:\n         Line Width: 0.029999999329447746\n         Value: Lines\n@@ -57,14 +59,14 @@ Visualization Manager:\n       Plane: XY\n       Plane Cell Count: 10\n       Reference Frame: <Fixed Frame>\n-      Value: true\n+      Value: false\n     - Class: rviz/Axes\n-      Enabled: true\n+      Enabled: false\n       Length: 1\n       Name: Axes\n       Radius: 0.10000000149011612\n       Reference Frame: <Fixed Frame>\n-      Value: true\n+      Value: false\n     - Alpha: 1\n       Buffer Length: 1\n       Class: rviz/Path\n@@ -454,9 +456,32 @@ Visualization Manager:\n         
  Value: true\n       Enabled: true\n       Name: pose_graph\n+    - Alpha: 1\n+      Buffer Length: 1\n+      Class: rviz/Path\n+      Color: 164; 0; 0\n+      Enabled: true\n+      Head Diameter: 0.30000001192092896\n+      Head Length: 0.20000000298023224\n+      Length: 0.30000001192092896\n+      Line Style: Lines\n+      Line Width: 0.029999999329447746\n+      Name: Path\n+      Offset:\n+        X: 0\n+        Y: 0\n+        Z: 0\n+      Pose Color: 255; 85; 255\n+      Pose Style: None\n+      Radius: 0.029999999329447746\n+      Shaft Diameter: 0.10000000149011612\n+      Shaft Length: 0.10000000149011612\n+      Topic: /benchmark_publisher/path\n+      Unreliable: false\n+      Value: true\n   Enabled: true\n   Global Options:\n-    Background Color: 0; 0; 0\n+    Background Color: 255; 255; 255\n     Default Light: true\n     Fixed Frame: world\n     Frame Rate: 30\n@@ -480,25 +505,25 @@ Visualization Manager:\n   Views:\n     Current:\n       Class: rviz/XYOrbit\n-      Distance: 117.1939697265625\n+      Distance: 63.901485443115234\n       Enable Stereo Rendering:\n         Stereo Eye Separation: 0.05999999865889549\n         Stereo Focal Distance: 1\n         Swap Stereo Eyes: false\n         Value: false\n       Focal Point:\n-        X: -2.70693039894104\n-        Y: -1.2697441577911377\n+        X: 7.038261890411377\n+        Y: -13.610830307006836\n         Z: 2.141062395821791e-5\n       Focal Shape Fixed Size: true\n       Focal Shape Size: 0.05000000074505806\n       Invert Z Axis: false\n       Name: Current View\n       Near Clip Distance: 0.009999999776482582\n-      Pitch: 0.01979677751660347\n+      Pitch: 0.5597963333129883\n       Target Frame: <Fixed Frame>\n       Value: XYOrbit (rviz)\n-      Yaw: 3.0572268962860107\n+      Yaw: 4.052229404449463\n     Saved: ~\n Window Geometry:\n   Displays:\n@@ -506,7 +531,7 @@ Window Geometry:\n   Height: 734\n   Hide Left Dock: false\n   Hide Right Dock: true\n-  QMainWindow State: 
000000ff00000000fd00000004000000000000015600000243fc0200000010fb0000000a0049006d00610067006501000000280000013d0000000000000000fb0000001200530065006c0065006300740069006f006e00000001e10000009b0000005c00fffffffb0000001e0054006f006f006c002000500072006f007000650072007400690065007302000001ed000001df00000185000000a3fb000000120056006900650077007300200054006f006f02000001df000002110000018500000122fb000000200054006f006f006c002000500072006f0070006500720074006900650073003203000002880000011d000002210000017afb0000002000730065006c0065006300740069006f006e00200062007500660066006500720200000138000000aa0000023a00000294fb00000014005700690064006500530074006500720065006f02000000e6000000d2000003ee0000030bfb0000000c004b0069006e0065006300740200000186000001060000030c00000261fb0000001200720061007700200049006d0061006700650000000028000000300000000000000000fb00000012007200610077005f0069006d0061006700650000000028000000f90000001600fffffffb0000001a0074007200610063006b0065006400200069006d006100670065010000003d000001740000001600fffffffb00000020006c006f006f0070005f006d0061007400630068005f0069006d0061006700650000000121000000850000001600fffffffb000000100044006900730070006c00610079007301000001b7000000c9000000c900fffffffc000000280000011e0000000000fffffffa000000000100000002fb0000001200720061007700200049006d0061006700650000000000ffffffff0000000000000000fb0000001a0074007200610063006b0065006400200069006d0061006700650100000000000002370000000000000000fb0000001000410052005f0069006d0061006700650100000373000000160000000000000000fb0000001200720061007700200069006d006100670065010000038f000000160000000000000000000000010000020800000399fc0200000003fb0000001e0054006f006f006c002000500072006f00700065007200740069006500730100000041000000780000000000000000fc00000028000003990000000000fffffffaffffffff0100000002fb000000100044006900730070006c0061007900730000000000ffffffff0000015600fffffffb0000000a00560069006500770073000000023f0000016a0000010000fffffffb0000001200530065006c0065006300740069006f006e010000025a000000b2000000000000000000
00000200000490000000a9fc0100000001fb0000000a00560069006500770073030000004e00000080000002e10000019700000003000002eb0000003bfc0100000002fb0000000800540069006d00650100000000000002eb000002eb00fffffffb0000000800540069006d006501000000000000045000000000000000000000018f0000024300000004000000040000000800000008fc0000000100000002000000010000000a0054006f006f006c00730100000000ffffffff0000000000000000\n+  QMainWindow State: 000000ff00000000fd00000004000000000000015600000243fc0200000010fb0000000a0049006d00610067006501000000280000013d0000000000000000fb0000001200530065006c0065006300740069006f006e00000001e10000009b0000005c00fffffffb0000001e0054006f006f006c002000500072006f007000650072007400690065007302000001ed000001df00000185000000a3fb000000120056006900650077007300200054006f006f02000001df000002110000018500000122fb000000200054006f006f006c002000500072006f0070006500720074006900650073003203000002880000011d000002210000017afb0000002000730065006c0065006300740069006f006e00200062007500660066006500720200000138000000aa0000023a00000294fb00000014005700690064006500530074006500720065006f02000000e6000000d2000003ee0000030bfb0000000c004b0069006e0065006300740200000186000001060000030c00000261fb0000001200720061007700200049006d0061006700650000000028000000300000000000000000fb00000012007200610077005f0069006d0061006700650000000028000000f90000001600fffffffb0000001a0074007200610063006b0065006400200069006d006100670065010000003d000001740000001600fffffffb00000020006c006f006f0070005f006d0061007400630068005f0069006d0061006700650000000121000000850000001600fffffffb000000100044006900730070006c00610079007301000001b7000000c9000000c900fffffffc000000280000011e0000000000fffffffa000000000100000002fb0000001200720061007700200049006d0061006700650000000000ffffffff0000000000000000fb0000001a0074007200610063006b0065006400200069006d0061006700650100000000000002370000000000000000fb0000001000410052005f0069006d0061006700650100000373000000160000000000000000fb0000001200720061007700200069006d006100670065010000038f00000016000000000000000000
0000010000020800000399fc0200000003fb0000001e0054006f006f006c002000500072006f00700065007200740069006500730100000041000000780000000000000000fc00000028000003990000000000fffffffaffffffff0100000002fb000000100044006900730070006c0061007900730000000000ffffffff0000015600fffffffb0000000a00560069006500770073000000023f0000016a0000010000fffffffb0000001200530065006c0065006300740069006f006e010000025a000000b200000000000000000000000200000490000000a9fc0100000001fb0000000a00560069006500770073030000004e00000080000002e10000019700000003000002a30000003bfc0100000002fb0000000800540069006d00650100000000000002a3000002eb00fffffffb0000000800540069006d00650100000000000004500000000000000000000001470000024300000004000000040000000800000008fc0000000100000002000000010000000a0054006f006f006c00730100000000ffffffff0000000000000000\n   Selection:\n     collapsed: false\n   Time:\n@@ -515,7 +540,7 @@ Window Geometry:\n     collapsed: false\n   Views:\n     collapsed: true\n-  Width: 747\n+  Width: 675\n   X: 685\n   Y: 4\n   loop_match_image:\ndiff --git a/vins_estimator/launch/playback_vins_airsim.launch b/vins_estimator/launch/playback_vins_airsim.launch\nindex 9925a21..972de2a 100644\n--- a/vins_estimator/launch/playback_vins_airsim.launch\n+++ b/vins_estimator/launch/playback_vins_airsim.launch\n@@ -27,4 +27,7 @@\n   <node name=\"rvizvisualisation\" pkg=\"rviz\" type=\"rviz\" output=\"log\" args=\"-d $(find vins_estimator)/../config/vins_rviz_config.rviz\" />\n \n  <node name=\"throttle\" pkg=\"topic_tools\" type=\"throttle\" args=\"messages /imu 200 /imu_throttled\"/>\n+\n+ <include file=\"$(find benchmark_publisher)/launch/publish.launch\"/>\n+\n </launch>\n-- \n2.17.1\n\n\nFrom 6283cbf075ecba7a2a188d06de362901cfd8429b Mon Sep 17 00:00:00 2001\nFrom: Karnik Ram <karnikram@gmail.com>\nDate: Sat, 27 Feb 2021 09:16:02 +0530\nSubject: [PATCH 42/42] Viz largest plane\n\n---\n config/vins_rviz_config.rviz                  | 10 +++---\n feature_tracker/src/feature_tracker_node.cpp  | 31 
+++++++++++++++++--\n .../launch/playback_vins_airsim.launch        |  2 +-\n 3 files changed, 34 insertions(+), 9 deletions(-)\n\ndiff --git a/config/vins_rviz_config.rviz b/config/vins_rviz_config.rviz\nindex 649da9d..2d6da66 100644\n--- a/config/vins_rviz_config.rviz\n+++ b/config/vins_rviz_config.rviz\n@@ -528,10 +528,10 @@ Visualization Manager:\n Window Geometry:\n   Displays:\n     collapsed: false\n-  Height: 734\n+  Height: 711\n   Hide Left Dock: false\n   Hide Right Dock: true\n-  QMainWindow State: 000000ff00000000fd00000004000000000000015600000243fc0200000010fb0000000a0049006d00610067006501000000280000013d0000000000000000fb0000001200530065006c0065006300740069006f006e00000001e10000009b0000005c00fffffffb0000001e0054006f006f006c002000500072006f007000650072007400690065007302000001ed000001df00000185000000a3fb000000120056006900650077007300200054006f006f02000001df000002110000018500000122fb000000200054006f006f006c002000500072006f0070006500720074006900650073003203000002880000011d000002210000017afb0000002000730065006c0065006300740069006f006e00200062007500660066006500720200000138000000aa0000023a00000294fb00000014005700690064006500530074006500720065006f02000000e6000000d2000003ee0000030bfb0000000c004b0069006e0065006300740200000186000001060000030c00000261fb0000001200720061007700200049006d0061006700650000000028000000300000000000000000fb00000012007200610077005f0069006d0061006700650000000028000000f90000001600fffffffb0000001a0074007200610063006b0065006400200069006d006100670065010000003d000001740000001600fffffffb00000020006c006f006f0070005f006d0061007400630068005f0069006d0061006700650000000121000000850000001600fffffffb000000100044006900730070006c00610079007301000001b7000000c9000000c900fffffffc000000280000011e0000000000fffffffa000000000100000002fb0000001200720061007700200049006d0061006700650000000000ffffffff0000000000000000fb0000001a0074007200610063006b0065006400200069006d0061006700650100000000000002370000000000000000fb0000001000410052005f0069006d00610067006501000003730000
00160000000000000000fb0000001200720061007700200069006d006100670065010000038f000000160000000000000000000000010000020800000399fc0200000003fb0000001e0054006f006f006c002000500072006f00700065007200740069006500730100000041000000780000000000000000fc00000028000003990000000000fffffffaffffffff0100000002fb000000100044006900730070006c0061007900730000000000ffffffff0000015600fffffffb0000000a00560069006500770073000000023f0000016a0000010000fffffffb0000001200530065006c0065006300740069006f006e010000025a000000b200000000000000000000000200000490000000a9fc0100000001fb0000000a00560069006500770073030000004e00000080000002e10000019700000003000002a30000003bfc0100000002fb0000000800540069006d00650100000000000002a3000002eb00fffffffb0000000800540069006d00650100000000000004500000000000000000000001470000024300000004000000040000000800000008fc0000000100000002000000010000000a0054006f006f006c00730100000000ffffffff0000000000000000\n+  QMainWindow State: 000000ff00000000fd0000000400000000000001fb0000022cfc0200000010fb0000000a0049006d00610067006501000000280000013d0000000000000000fb0000001200530065006c0065006300740069006f006e00000001e10000009b0000005c00fffffffb0000001e0054006f006f006c002000500072006f007000650072007400690065007302000001ed000001df00000185000000a3fb000000120056006900650077007300200054006f006f02000001df000002110000018500000122fb000000200054006f006f006c002000500072006f0070006500720074006900650073003203000002880000011d000002210000017afb0000002000730065006c0065006300740069006f006e00200062007500660066006500720200000138000000aa0000023a00000294fb00000014005700690064006500530074006500720065006f02000000e6000000d2000003ee0000030bfb0000000c004b0069006e0065006300740200000186000001060000030c00000261fb0000001200720061007700200049006d0061006700650000000028000000300000000000000000fb00000012007200610077005f0069006d0061006700650000000028000000f90000001600fffffffb0000001a0074007200610063006b0065006400200069006d006100670065010000003d0000015d0000001600fffffffb00000020006c006f006f0070005f006d0061007400630068005f00
69006d0061006700650000000121000000850000001600fffffffb000000100044006900730070006c00610079007301000001a0000000c9000000c900fffffffc000000280000011e0000000000fffffffa000000000100000002fb0000001200720061007700200049006d0061006700650000000000ffffffff0000000000000000fb0000001a0074007200610063006b0065006400200069006d0061006700650100000000000002370000000000000000fb0000001000410052005f0069006d0061006700650100000373000000160000000000000000fb0000001200720061007700200069006d006100670065010000038f000000160000000000000000000000010000020800000399fc0200000003fb0000001e0054006f006f006c002000500072006f00700065007200740069006500730100000041000000780000000000000000fc00000028000003990000000000fffffffaffffffff0100000002fb000000100044006900730070006c0061007900730000000000ffffffff0000015600fffffffb0000000a00560069006500770073000000023f0000016a0000010000fffffffb0000001200530065006c0065006300740069006f006e010000025a000000b200000000000000000000000200000490000000a9fc0100000001fb0000000a00560069006500770073030000004e00000080000002e10000019700000003000005130000003bfc0100000002fb0000000800540069006d0065010000000000000513000002eb00fffffffb0000000800540069006d00650100000000000004500000000000000000000003120000022c00000004000000040000000800000008fc0000000100000002000000010000000a0054006f006f006c00730100000000ffffffff0000000000000000\n   Selection:\n     collapsed: false\n   Time:\n@@ -540,9 +540,9 @@ Window Geometry:\n     collapsed: false\n   Views:\n     collapsed: true\n-  Width: 675\n-  X: 685\n-  Y: 4\n+  Width: 1299\n+  X: 67\n+  Y: 27\n   loop_match_image:\n     collapsed: false\n   raw_image:\ndiff --git a/feature_tracker/src/feature_tracker_node.cpp b/feature_tracker/src/feature_tracker_node.cpp\nindex ba64fed..96b37fd 100644\n--- a/feature_tracker/src/feature_tracker_node.cpp\n+++ b/feature_tracker/src/feature_tracker_node.cpp\n@@ -176,11 +176,36 @@ void callback(const sensor_msgs::ImageConstPtr &img_msg, const sensor_msgs::Imag\n             ptr = cv_bridge::cvtColor(ptr, 
sensor_msgs::image_encodings::BGR8);\n            //cv::Mat stereo_img(ROW * NUM_OF_CAM, COL, CV_8UC3);\n            cv::Mat stereo_img = ptr->image;\n+            cv::Mat mask_img = mask_ptr->image;\n+            std::vector<int> pids = trackerData[0].plane_ids;\n+\n+            std::set<int> unique_pids(pids.begin(), pids.end());\n+            std::map<int, int> plane_counts;\n+            for(auto &pid : unique_pids)\n+                plane_counts[pid] = 0;\n+\n+            for(auto &pid : pids)\n+                plane_counts[pid]++;\n+\n+            int largest_count = 0, largest_pid = -1; // initialize so an empty pids list cannot leave largest_pid indeterminate\n+            for(auto &plane_count : plane_counts)\n+            {   \n+                if(plane_count.second > largest_count)\n+                {\n+                    largest_count = plane_count.second;\n+                    largest_pid = plane_count.first;\n+                }\n+            }\n+    \n+            cv::Mat mask = mask_img == largest_pid;\n+            cv::Mat mask_viz(ROW, COL, CV_8UC3, cv::Scalar(0,0,0));\n+            mask_viz.setTo(cv::Scalar(0,255,0), mask);\n \n             for (int i = 0; i < NUM_OF_CAM; i++)\n             {\n                 cv::Mat tmp_img = stereo_img.rowRange(i * ROW, (i + 1) * ROW);\n                 cv::cvtColor(show_img, tmp_img, CV_GRAY2RGB);\n+                cv::addWeighted(tmp_img, 1.0, mask_viz, 0.3, 0.0, tmp_img);\n \n                 for (unsigned int j = 0; j < trackerData[i].cur_pts.size(); j++)\n                 {\n@@ -197,9 +222,9 @@ void callback(const sensor_msgs::ImageConstPtr &img_msg, const sensor_msgs::Imag\n                     trackerData[i].m_camera->spaceToPlane(tmp_prev_un_pts, tmp_prev_uv);\n                     cv::line(tmp_img, trackerData[i].cur_pts[j], cv::Point2f(tmp_prev_uv.x(), tmp_prev_uv.y()), cv::Scalar(255 , 0, 0), 1 , 8, 0);\n                     */\n-                    char name[10];\n-                    sprintf(name, \"%d\", trackerData[i].plane_ids[j]);\n-                    
cv::putText(tmp_img, name, trackerData[i].cur_pts[j], cv::FONT_HERSHEY_SIMPLEX, 0.5, cv::Scalar(0, 0, 0));\n+                    //char name[10];\n+                    //sprintf(name, \"%d\", trackerData[i].plane_ids[j]);\n+                    //cv::putText(tmp_img, name, trackerData[i].cur_pts[j], cv::FONT_HERSHEY_SIMPLEX, 0.5, cv::Scalar(0, 0, 0));\n                 }\n             }\n             //cv::imshow(\"vis\", stereo_img);\ndiff --git a/vins_estimator/launch/playback_vins_airsim.launch b/vins_estimator/launch/playback_vins_airsim.launch\nindex 972de2a..49b4b1e 100644\n--- a/vins_estimator/launch/playback_vins_airsim.launch\n+++ b/vins_estimator/launch/playback_vins_airsim.launch\n@@ -28,6 +28,6 @@\n \n  <node name=\"throttle\" pkg=\"topic_tools\" type=\"throttle\" args=\"messages /imu 200 /imu_throttled\"/>\n \n- <include file=\"$(find benchmark_publisher)/launch/publish.launch\"/>\n+ <!--<include file=\"$(find benchmark_publisher)/launch/publish.launch\"/>-->\n \n </launch>\n-- \n2.17.1\n\n"
  },
  {
    "path": "rpvio_estimator/CMakeLists.txt",
    "content": "cmake_minimum_required(VERSION 2.8.3)\nproject(rpvio_estimator)\n\nset(CMAKE_BUILD_TYPE \"Release\")\nset(CMAKE_CXX_FLAGS \"-std=c++11\")\n# Optionally append -DEIGEN_USE_MKL_ALL to CMAKE_CXX_FLAGS to build Eigen against Intel MKL.\nset(CMAKE_CXX_FLAGS_RELEASE \"-O3 -Wall -g\")\n\nfind_package(catkin REQUIRED COMPONENTS\n    roscpp\n    std_msgs\n    geometry_msgs\n    nav_msgs\n    tf\n    cv_bridge\n    )\n\nfind_package(OpenCV REQUIRED)\n\n# message(WARNING \"OpenCV_VERSION: ${OpenCV_VERSION}\")\n\nfind_package(Ceres REQUIRED)\n\ninclude_directories(${catkin_INCLUDE_DIRS} ${CERES_INCLUDE_DIRS})\n\nset(CMAKE_MODULE_PATH ${PROJECT_SOURCE_DIR}/cmake)\nfind_package(Eigen3)\ninclude_directories(\n  ${catkin_INCLUDE_DIRS}\n  ${EIGEN3_INCLUDE_DIR}\n)\n\ncatkin_package()\n\nadd_executable(rpvio_estimator\n    src/estimator_node.cpp\n    src/parameters.cpp\n    src/estimator.cpp\n    src/feature_manager.cpp\n    src/factor/pose_local_parameterization.cpp\n    src/factor/projection_factor.cpp\n    src/factor/projection_td_factor.cpp\n    src/factor/marginalization_factor.cpp\n    src/utility/utility.cpp\n    src/utility/visualization.cpp\n    src/utility/CameraPoseVisualization.cpp\n    src/initial/solve_5pts.cpp\n    src/initial/initial_aligment.cpp\n    src/initial/initial_sfm.cpp\n    src/initial/initial_ex_rotation.cpp\n    )\n\n\ntarget_link_libraries(rpvio_estimator ${catkin_LIBRARIES} ${OpenCV_LIBS} ${CERES_LIBRARIES}) \n\n\n"
  },
  {
    "path": "rpvio_estimator/cmake/FindEigen.cmake",
    "content": "# Ceres Solver - A fast non-linear least squares minimizer\n# Copyright 2015 Google Inc. All rights reserved.\n# http://ceres-solver.org/\n#\n# Redistribution and use in source and binary forms, with or without\n# modification, are permitted provided that the following conditions are met:\n#\n# * Redistributions of source code must retain the above copyright notice,\n#   this list of conditions and the following disclaimer.\n# * Redistributions in binary form must reproduce the above copyright notice,\n#   this list of conditions and the following disclaimer in the documentation\n#   and/or other materials provided with the distribution.\n# * Neither the name of Google Inc. nor the names of its contributors may be\n#   used to endorse or promote products derived from this software without\n#   specific prior written permission.\n#\n# THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS \"AS IS\"\n# AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE\n# IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE\n# ARE DISCLAIMED. 
IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE\n# LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR\n# CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF\n# SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS\n# INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN\n# CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE)\n# ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE\n# POSSIBILITY OF SUCH DAMAGE.\n#\n# Author: alexs.mac@gmail.com (Alex Stewart)\n#\n\n# FindEigen.cmake - Find Eigen library, version >= 3.\n#\n# This module defines the following variables:\n#\n# EIGEN_FOUND: TRUE iff Eigen is found.\n# EIGEN_INCLUDE_DIRS: Include directories for Eigen.\n#\n# EIGEN_VERSION: Extracted from Eigen/src/Core/util/Macros.h\n# EIGEN_WORLD_VERSION: Equal to 3 if EIGEN_VERSION = 3.2.0\n# EIGEN_MAJOR_VERSION: Equal to 2 if EIGEN_VERSION = 3.2.0\n# EIGEN_MINOR_VERSION: Equal to 0 if EIGEN_VERSION = 3.2.0\n#\n# The following variables control the behaviour of this module:\n#\n# EIGEN_INCLUDE_DIR_HINTS: List of additional directories in which to\n#                          search for eigen includes, e.g: /timbuktu/eigen3.\n#\n# The following variables are also defined by this module, but in line with\n# CMake recommended FindPackage() module style should NOT be referenced directly\n# by callers (use the plural variables detailed above instead).  These variables\n# do however affect the behaviour of the module via FIND_[PATH/LIBRARY]() which\n# are NOT re-called (i.e. search for library is not repeated) if these variables\n# are set with valid values _in the CMake cache_. 
This means that if these\n# variables are set directly in the cache, either by the user in the CMake GUI,\n# or by the user passing -DVAR=VALUE directives to CMake when called (which\n# explicitly defines a cache variable), then they will be used verbatim,\n# bypassing the HINTS variables and other hard-coded search locations.\n#\n# EIGEN_INCLUDE_DIR: Include directory for Eigen, not including the\n#                    include directory of any dependencies.\n\n# Called if we failed to find Eigen or any of its required dependencies,\n# unsets all public (designed to be used externally) variables and reports\n# error message at priority depending upon [REQUIRED/QUIET/<NONE>] argument.\nmacro(EIGEN_REPORT_NOT_FOUND REASON_MSG)\n  unset(EIGEN_FOUND)\n  unset(EIGEN_INCLUDE_DIRS)\n  # Make results of search visible in the CMake GUI if Eigen has not\n  # been found so that user does not have to toggle to advanced view.\n  mark_as_advanced(CLEAR EIGEN_INCLUDE_DIR)\n  # Note <package>_FIND_[REQUIRED/QUIETLY] variables defined by FindPackage()\n  # use the camelcase library name, not uppercase.\n  if (Eigen_FIND_QUIETLY)\n    message(STATUS \"Failed to find Eigen - \" ${REASON_MSG} ${ARGN})\n  elseif (Eigen_FIND_REQUIRED)\n    message(FATAL_ERROR \"Failed to find Eigen - \" ${REASON_MSG} ${ARGN})\n  else()\n    # Neither QUIETLY nor REQUIRED, use no priority which emits a message\n    # but continues configuration and allows generation.\n    message(\"-- Failed to find Eigen - \" ${REASON_MSG} ${ARGN})\n  endif ()\n  return()\nendmacro(EIGEN_REPORT_NOT_FOUND)\n\n# Protect against any alternative find_package scripts for this library having\n# been called previously (in a client project) which set EIGEN_FOUND, but not\n# the other variables we require / set here which could cause the search logic\n# here to fail.\nunset(EIGEN_FOUND)\n\n# Search user-installed locations first, so that we prefer user installs\n# to system installs where both exist.\nlist(APPEND 
EIGEN_CHECK_INCLUDE_DIRS\n  /usr/local/include\n  /usr/local/homebrew/include # Mac OS X\n  /opt/local/var/macports/software # Mac OS X.\n  /opt/local/include\n  /usr/include)\n# Additional suffixes to try appending to each search path.\nlist(APPEND EIGEN_CHECK_PATH_SUFFIXES\n  eigen3 # Default root directory for Eigen.\n  Eigen/include/eigen3 # Windows (for C:/Program Files prefix) < 3.3\n  Eigen3/include/eigen3 ) # Windows (for C:/Program Files prefix) >= 3.3\n\n# Search supplied hint directories first if supplied.\nfind_path(EIGEN_INCLUDE_DIR\n  NAMES Eigen/Core\n  PATHS ${EIGEN_INCLUDE_DIR_HINTS}\n  ${EIGEN_CHECK_INCLUDE_DIRS}\n  PATH_SUFFIXES ${EIGEN_CHECK_PATH_SUFFIXES})\n\nif (NOT EIGEN_INCLUDE_DIR OR\n    NOT EXISTS ${EIGEN_INCLUDE_DIR})\n  eigen_report_not_found(\n    \"Could not find eigen3 include directory, set EIGEN_INCLUDE_DIR to \"\n    \"path to eigen3 include directory, e.g. /usr/local/include/eigen3.\")\nendif (NOT EIGEN_INCLUDE_DIR OR\n       NOT EXISTS ${EIGEN_INCLUDE_DIR})\n\n# Mark internally as found, then verify. 
EIGEN_REPORT_NOT_FOUND() unsets\n# if called.\nset(EIGEN_FOUND TRUE)\n\n# Extract Eigen version from Eigen/src/Core/util/Macros.h\nif (EIGEN_INCLUDE_DIR)\n  set(EIGEN_VERSION_FILE ${EIGEN_INCLUDE_DIR}/Eigen/src/Core/util/Macros.h)\n  if (NOT EXISTS ${EIGEN_VERSION_FILE})\n    eigen_report_not_found(\n      \"Could not find file: ${EIGEN_VERSION_FILE} \"\n      \"containing version information in Eigen install located at: \"\n      \"${EIGEN_INCLUDE_DIR}.\")\n  else (NOT EXISTS ${EIGEN_VERSION_FILE})\n    file(READ ${EIGEN_VERSION_FILE} EIGEN_VERSION_FILE_CONTENTS)\n\n    string(REGEX MATCH \"#define EIGEN_WORLD_VERSION [0-9]+\"\n      EIGEN_WORLD_VERSION \"${EIGEN_VERSION_FILE_CONTENTS}\")\n    string(REGEX REPLACE \"#define EIGEN_WORLD_VERSION ([0-9]+)\" \"\\\\1\"\n      EIGEN_WORLD_VERSION \"${EIGEN_WORLD_VERSION}\")\n\n    string(REGEX MATCH \"#define EIGEN_MAJOR_VERSION [0-9]+\"\n      EIGEN_MAJOR_VERSION \"${EIGEN_VERSION_FILE_CONTENTS}\")\n    string(REGEX REPLACE \"#define EIGEN_MAJOR_VERSION ([0-9]+)\" \"\\\\1\"\n      EIGEN_MAJOR_VERSION \"${EIGEN_MAJOR_VERSION}\")\n\n    string(REGEX MATCH \"#define EIGEN_MINOR_VERSION [0-9]+\"\n      EIGEN_MINOR_VERSION \"${EIGEN_VERSION_FILE_CONTENTS}\")\n    string(REGEX REPLACE \"#define EIGEN_MINOR_VERSION ([0-9]+)\" \"\\\\1\"\n      EIGEN_MINOR_VERSION \"${EIGEN_MINOR_VERSION}\")\n\n    # This is on a single line s/t CMake does not interpret it as a list of\n    # elements and insert ';' separators which would result in 3.;2.;0 nonsense.\n    set(EIGEN_VERSION \"${EIGEN_WORLD_VERSION}.${EIGEN_MAJOR_VERSION}.${EIGEN_MINOR_VERSION}\")\n  endif (NOT EXISTS ${EIGEN_VERSION_FILE})\nendif (EIGEN_INCLUDE_DIR)\n\n# Set standard CMake FindPackage variables if found.\nif (EIGEN_FOUND)\n  set(EIGEN_INCLUDE_DIRS ${EIGEN_INCLUDE_DIR})\nendif (EIGEN_FOUND)\n\n# Handle REQUIRED / QUIET optional arguments and version.\ninclude(FindPackageHandleStandardArgs)\nfind_package_handle_standard_args(Eigen\n  REQUIRED_VARS 
EIGEN_INCLUDE_DIRS\n  VERSION_VAR EIGEN_VERSION)\n\n# Only mark internal variables as advanced if we found Eigen, otherwise\n# leave it visible in the standard GUI for the user to set manually.\nif (EIGEN_FOUND)\n  mark_as_advanced(FORCE EIGEN_INCLUDE_DIR)\nendif (EIGEN_FOUND)\n"
  },
  {
    "path": "rpvio_estimator/launch/advio_12.launch",
    "content": "<launch>\n\n  <arg name=\"playback_rate\" default=\"1.0\" />\n  <arg name=\"start_from\" default=\"0\" />\n  <arg name=\"bagfile_path\" default=\"/home/karnik/dataset/ADVIO/12-mask.bag\"/>\n\n  <arg name=\"config_path\" default = \"$(find rpvio_estimator)/../config/advio_12_config.yaml\" />\n  <arg name=\"rpvio_path\" default = \"$(find rpvio_feature_tracker)/../config/../\" />\n\n  <!-- Bag playback -->\n  <node name=\"rosbag_playback\" pkg=\"rosbag\" type=\"play\" required=\"true\" args=\" --delay=10 --queue=1000 -r $(arg playback_rate) -s $(arg start_from) $(arg bagfile_path)\"/>\n\n  <!-- VINS-Mono nodes -->\n  <node name=\"rpvio_feature_tracker\" pkg=\"rpvio_feature_tracker\" type=\"rpvio_feature_tracker\" output=\"log\">\n    <param name=\"config_file\" type=\"string\" value=\"$(arg config_path)\" />\n    <param name=\"rpvio_folder\" type=\"string\" value=\"$(arg rpvio_path)\" />\n  </node>\n\n  <node name=\"rpvio_estimator\" pkg=\"rpvio_estimator\" type=\"rpvio_estimator\" output=\"screen\">\n    <param name=\"config_file\" type=\"string\" value=\"$(arg config_path)\" />\n    <param name=\"rpvio_folder\" type=\"string\" value=\"$(arg rpvio_path)\" />\n  </node>\n\n  <node name=\"rvizvisualisation\" pkg=\"rviz\" type=\"rviz\" output=\"log\" args=\"-d $(find rpvio_estimator)/../config/rpvio_rviz_config.rviz\" />\n\n</launch>\n"
  },
  {
    "path": "rpvio_estimator/launch/ol_market1.launch",
    "content": "<launch>\n\n  <arg name=\"playback_rate\" default=\"1.0\" />\n  <arg name=\"start_from\" default=\"0\" />\n  <arg name=\"bagfile_path\" default=\"/home/karnik/dataset/OpenLORIS/market1-mask.bag\"/>\n\n  <arg name=\"config_path\" default = \"$(find rpvio_estimator)/../config/ol_market1_config.yaml\" />\n  <arg name=\"rpvio_path\" default = \"$(find rpvio_feature_tracker)/../config/../\" />\n\n  <!-- Bag playback -->\n  <node name=\"rosbag_playback\" pkg=\"rosbag\" type=\"play\" required=\"true\" args=\" --delay=10 --queue=1000 -r $(arg playback_rate) -s $(arg start_from) $(arg bagfile_path)\"/>\n\n  <!-- RP-VIO nodes -->\n  <node name=\"rpvio_feature_tracker\" pkg=\"rpvio_feature_tracker\" type=\"rpvio_feature_tracker\" output=\"log\">\n    <param name=\"config_file\" type=\"string\" value=\"$(arg config_path)\" />\n    <param name=\"rpvio_folder\" type=\"string\" value=\"$(arg rpvio_path)\" />\n  </node>\n\n  <node name=\"rpvio_estimator\" pkg=\"rpvio_estimator\" type=\"rpvio_estimator\" output=\"screen\">\n    <param name=\"config_file\" type=\"string\" value=\"$(arg config_path)\" />\n    <param name=\"rpvio_folder\" type=\"string\" value=\"$(arg rpvio_path)\" />\n  </node>\n\n  <node name=\"rvizvisualisation\" pkg=\"rviz\" type=\"rviz\" output=\"log\" args=\"-d $(find rpvio_estimator)/../config/rpvio_rviz_config.rviz\" />\n\n</launch>\n"
  },
  {
    "path": "rpvio_estimator/launch/rpvio_rviz.launch",
    "content": "<launch>\n    <node name=\"rvizvisualisation\" pkg=\"rviz\" type=\"rviz\" output=\"log\" args=\"-d $(find rpvio_estimator)/../config/rpvio_rviz_config.rviz\" />\n</launch>\n"
  },
  {
    "path": "rpvio_estimator/launch/rpvio_sim.launch",
    "content": "<launch>\n\n  <arg name=\"playback_rate\" default=\"1.0\" />\n  <arg name=\"start_from\" default=\"0\" />\n  <arg name=\"bagfile_path\" default=\"/home/karnik/dataset/rpvio-sim/static/static.bag\"/>\n\n  <arg name=\"config_path\" default = \"$(find rpvio_estimator)/../config/rpvio_sim_config.yaml\" />\n  <arg name=\"rpvio_path\" default = \"$(find rpvio_feature_tracker)/../config/../\" />\n  \n  <!-- Bag playback -->\n  <node name=\"rosbag_playback\" pkg=\"rosbag\" type=\"play\" required=\"true\" args=\" --delay=10 --queue=1000 -r $(arg playback_rate) -s $(arg start_from) $(arg bagfile_path)\"/>\n\n  <!-- RP-VIO nodes -->\n  <node name=\"rpvio_feature_tracker\" pkg=\"rpvio_feature_tracker\" type=\"rpvio_feature_tracker\" output=\"log\">\n    <param name=\"config_file\" type=\"string\" value=\"$(arg config_path)\" />\n    <param name=\"rpvio_folder\" type=\"string\" value=\"$(arg rpvio_path)\" />\n  </node>\n\n  <node name=\"rpvio_estimator\" pkg=\"rpvio_estimator\" type=\"rpvio_estimator\" output=\"screen\">\n    <param name=\"config_file\" type=\"string\" value=\"$(arg config_path)\" />\n    <param name=\"rpvio_folder\" type=\"string\" value=\"$(arg rpvio_path)\" />\n  </node>\n\n  <node name=\"rvizvisualisation\" pkg=\"rviz\" type=\"rviz\" output=\"log\" args=\"-d $(find rpvio_estimator)/../config/rpvio_rviz_config.rviz\" />\n\n  <node name=\"throttle\" pkg=\"topic_tools\" type=\"throttle\" args=\"messages /imu 200 /imu_throttled\"/>\n\n</launch>\n"
  },
  {
    "path": "rpvio_estimator/package.xml",
    "content": "<?xml version=\"1.0\"?>\n<package>\n  <name>rpvio_estimator</name>\n  <version>0.0.0</version>\n  <description>The vins_estimator package</description>\n\n  <maintainer email=\"karnikram@gmail.com\">karnikram</maintainer>\n  <license>GPLv3</license>\n\n  <buildtool_depend>catkin</buildtool_depend>\n  <build_depend>roscpp</build_depend>\n  <run_depend>roscpp</run_depend>\n\n</package>\n"
  },
  {
    "path": "rpvio_estimator/src/estimator.cpp",
    "content": "#include \"estimator.h\"\n\nEstimator::Estimator(): f_manager{Rs}\n{\n    ROS_INFO(\"init begins\");\n    clearState();\n}\n\nvoid Estimator::setParameter()\n{\n    for (int i = 0; i < NUM_OF_CAM; i++)\n    {\n        tic[i] = TIC[i];\n        ric[i] = RIC[i];\n    }\n    f_manager.setRic(ric);\n    ProjectionFactor::sqrt_info = FOCAL_LENGTH / 1.5 * Matrix2d::Identity();\n    ProjectionTdFactor::sqrt_info = FOCAL_LENGTH / 1.5 * Matrix2d::Identity();\n    td = TD;\n}\n\nvoid Estimator::clearState()\n{\n    for (int i = 0; i < WINDOW_SIZE + 1; i++)\n    {\n        Rs[i].setIdentity();\n        Ps[i].setZero();\n        Vs[i].setZero();\n        Bas[i].setZero();\n        Bgs[i].setZero();\n        dt_buf[i].clear();\n        linear_acceleration_buf[i].clear();\n        angular_velocity_buf[i].clear();\n\n        if (pre_integrations[i] != nullptr)\n            delete pre_integrations[i];\n        pre_integrations[i] = nullptr;\n    }\n\n    for (int i = 0; i < NUM_OF_CAM; i++)\n    {\n        tic[i] = Vector3d::Zero();\n        ric[i] = Matrix3d::Identity();\n    }\n\n    for (auto &it : all_image_frame)\n    {\n        if (it.second.pre_integration != nullptr)\n        {\n            delete it.second.pre_integration;\n            it.second.pre_integration = nullptr;\n        }\n    }\n\n    para_N.clear();\n    para_d.clear();\n    init_pids.clear();\n\n    solver_flag = INITIAL;\n    first_imu = false,\n    sum_of_back = 0;\n    sum_of_front = 0;\n    frame_count = 0;\n    solver_flag = INITIAL;\n    initial_timestamp = 0;\n    all_image_frame.clear();\n    td = TD;\n\n\n    if (tmp_pre_integration != nullptr)\n        delete tmp_pre_integration;\n    if (last_marginalization_info != nullptr)\n        delete last_marginalization_info;\n\n    tmp_pre_integration = nullptr;\n    last_marginalization_info = nullptr;\n    last_marginalization_parameter_blocks.clear();\n\n    f_manager.clearState();\n\n    failure_occur = 0;\n    relocalization_info = 
0;\n\n    drift_correct_r = Matrix3d::Identity();\n    drift_correct_t = Vector3d::Zero();\n}\n\nvoid Estimator::processIMU(double dt, const Vector3d &linear_acceleration, const Vector3d &angular_velocity)\n{\n    if (!first_imu)\n    {\n        first_imu = true;\n        acc_0 = linear_acceleration;\n        gyr_0 = angular_velocity;\n    }\n\n    if (!pre_integrations[frame_count])\n    {\n        pre_integrations[frame_count] = new IntegrationBase{acc_0, gyr_0, Bas[frame_count], Bgs[frame_count]};\n    }\n    if (frame_count != 0)\n    {\n        pre_integrations[frame_count]->push_back(dt, linear_acceleration, angular_velocity);\n        //if(solver_flag != NON_LINEAR)\n            tmp_pre_integration->push_back(dt, linear_acceleration, angular_velocity);\n\n        dt_buf[frame_count].push_back(dt);\n        linear_acceleration_buf[frame_count].push_back(linear_acceleration);\n        angular_velocity_buf[frame_count].push_back(angular_velocity);\n\n        int j = frame_count;         \n        Vector3d un_acc_0 = Rs[j] * (acc_0 - Bas[j]) - g;\n        Vector3d un_gyr = 0.5 * (gyr_0 + angular_velocity) - Bgs[j];\n        Rs[j] *= Utility::deltaQ(un_gyr * dt).toRotationMatrix();\n        Vector3d un_acc_1 = Rs[j] * (linear_acceleration - Bas[j]) - g;\n        Vector3d un_acc = 0.5 * (un_acc_0 + un_acc_1);\n        Ps[j] += dt * Vs[j] + 0.5 * dt * dt * un_acc;\n        Vs[j] += dt * un_acc;\n    }\n    acc_0 = linear_acceleration;\n    gyr_0 = angular_velocity;\n}\n\nvoid Estimator::processImage(const map<int, vector<pair<int, Eigen::Matrix<double, 8, 1>>>> &image, const std_msgs::Header &header)\n{\n    ROS_DEBUG(\"new image coming ------------------------------------------\");\n    ROS_DEBUG(\"Adding feature points %lu\", image.size());\n    if (f_manager.addFeatureCheckParallax(frame_count, image, td))\n        marginalization_flag = MARGIN_OLD;\n    else\n        marginalization_flag = MARGIN_SECOND_NEW;\n\n    ROS_DEBUG(\"this frame 
is--------------------%s\", marginalization_flag ? \"reject\" : \"accept\");\n    ROS_DEBUG(\"%s\", marginalization_flag ? \"Non-keyframe\" : \"Keyframe\");\n    ROS_DEBUG(\"Solving %d\", frame_count);\n    ROS_DEBUG(\"number of feature: %d\", f_manager.getFeatureCount());\n    Headers[frame_count] = header;\n\n    ImageFrame imageframe(image, header.stamp.toSec());\n    imageframe.pre_integration = tmp_pre_integration;\n    all_image_frame.insert(make_pair(header.stamp.toSec(), imageframe));\n    tmp_pre_integration = new IntegrationBase{acc_0, gyr_0, Bas[frame_count], Bgs[frame_count]};\n\n    //if(ESTIMATE_EXTRINSIC == 2)\n    //{\n    //    ROS_INFO(\"calibrating extrinsic param, rotation movement is needed\");\n    //    if (frame_count != 0)\n    //    {\n    //        vector<pair<Vector3d, Vector3d>> corres = f_manager.getCorresponding(frame_count - 1, frame_count);\n    //        Matrix3d calib_ric;\n    //        if (initial_ex_rotation.CalibrationExRotation(corres, pre_integrations[frame_count]->delta_q, calib_ric))\n    //        {\n    //            ROS_WARN(\"initial extrinsic rotation calib success\");\n    //            ROS_WARN_STREAM(\"initial extrinsic rotation: \" << endl << calib_ric);\n    //            ric[0] = calib_ric;\n    //            RIC[0] = calib_ric;\n    //            ESTIMATE_EXTRINSIC = 1;\n    //        }\n    //    }\n    //}\n\n    if (solver_flag == INITIAL)\n    {\n        if (frame_count == WINDOW_SIZE)\n        {\n            bool result = false;\n            if( ESTIMATE_EXTRINSIC != 2 && (header.stamp.toSec() - initial_timestamp) > 0.1)\n            {\n               result = initialStructure();\n               initial_timestamp = header.stamp.toSec();\n            }\n            if(result)\n            {\n                solver_flag = NON_LINEAR;\n                solveOdometry();\n                slideWindow();\n                f_manager.removeFailures();\n                ROS_INFO(\"Initialization finish!\");\n           
     last_R = Rs[WINDOW_SIZE];\n                last_P = Ps[WINDOW_SIZE];\n                last_R0 = Rs[0];\n                last_P0 = Ps[0];\n            }\n            else\n                slideWindow();\n        }\n        else\n            frame_count++;\n    }\n    else\n    {\n        TicToc t_solve;\n        initializeNewPlanes();\n        solveOdometry();\n        ROS_DEBUG(\"solver costs: %fms\", t_solve.toc());\n\n        if (failureDetection())\n        {\n            ROS_WARN(\"failure detection!\");\n            failure_occur = 1;\n            clearState();\n            setParameter();\n            ROS_WARN(\"system reboot!\");\n            return;\n        }\n\n        TicToc t_margin;\n        slideWindow();\n        f_manager.removeFailures();\n        ROS_DEBUG(\"marginalization costs: %fms\", t_margin.toc());\n        // prepare output of VINS\n        key_poses.clear();\n        for (int i = 0; i <= WINDOW_SIZE; i++)\n            key_poses.push_back(Ps[i]);\n\n        last_R = Rs[WINDOW_SIZE];\n        last_P = Ps[WINDOW_SIZE];\n        last_R0 = Rs[0];\n        last_P0 = Ps[0];\n    }\n}\n\nbool Estimator::initialStructure()\n{\n    TicToc t_sfm;\n    //check imu observibility\n    {\n        map<double, ImageFrame>::iterator frame_it;\n        Vector3d sum_g;\n        for (frame_it = all_image_frame.begin(), frame_it++; frame_it != all_image_frame.end(); frame_it++)\n        {\n            double dt = frame_it->second.pre_integration->sum_dt;\n            Vector3d tmp_g = frame_it->second.pre_integration->delta_v / dt;\n            sum_g += tmp_g;\n        }\n        Vector3d aver_g;\n        aver_g = sum_g * 1.0 / ((int)all_image_frame.size() - 1);\n        double var = 0;\n        for (frame_it = all_image_frame.begin(), frame_it++; frame_it != all_image_frame.end(); frame_it++)\n        {\n            double dt = frame_it->second.pre_integration->sum_dt;\n            Vector3d tmp_g = frame_it->second.pre_integration->delta_v / dt;\n         
   var += (tmp_g - aver_g).transpose() * (tmp_g - aver_g);\n            //cout << \"frame g \" << tmp_g.transpose() << endl;\n        }\n        var = sqrt(var / ((int)all_image_frame.size() - 1));\n        //ROS_WARN(\"IMU variation %f!\", var);\n        if(var < 0.25)\n        {\n            ROS_INFO(\"IMU excitation not enough!\");\n            //return false;\n        }\n    }\n    \n    // Bootstrap SfM\n    map<int, Matrix3d> relative_R;\n    map<int, Vector3d> relative_T;\n    map<int, Vector3d> n;\n    int l;\n\n    if (!relativeHPose(relative_R, relative_T, n, l, lplane_id))\n    {\n        ROS_INFO(\"Not enough features or parallax; Move device around\");\n        return false;\n    }\n\n    // global sfm\n    Quaterniond Q[frame_count + 1];\n    Vector3d T[frame_count + 1];\n    map<int, Vector3d> sfm_tracked_points;\n    vector<SFMFeature> sfm_f;\n    for (auto &it_per_id : f_manager.feature)\n    {\n        // Consider only features from largest plane\n        if(it_per_id.plane_id != lplane_id)\n            continue;\n\n        int imu_j = it_per_id.start_frame - 1;\n        SFMFeature tmp_feature;\n        tmp_feature.state = false;\n        tmp_feature.id = it_per_id.feature_id;\n        tmp_feature.plane_id = it_per_id.plane_id;\n        for (auto &it_per_frame : it_per_id.feature_per_frame)\n        {\n            imu_j++;\n            Vector3d pts_j = it_per_frame.point;\n            tmp_feature.observation.push_back(make_pair(imu_j, Eigen::Vector2d{pts_j.x(), pts_j.y()}));\n        }\n        sfm_f.push_back(tmp_feature);\n    }\n\n    GlobalSFM sfm;\n    if(!sfm.constructH(frame_count + 1, Q, T, l,\n              relative_R[lplane_id], relative_T[lplane_id], n[lplane_id],\n              sfm_f, sfm_tracked_points))\n    {\n        ROS_DEBUG(\"global SFM failed!\");\n        marginalization_flag = MARGIN_OLD;\n        return false;\n    }\n\n    // Save estimated normal variables\n    for(auto &est_n : n)\n    {\n        para_N[est_n.first] = 
{est_n.second(0), est_n.second(1), est_n.second(2)};\n    }\n\n    //solve pnp for all frame\n    map<double, ImageFrame>::iterator frame_it;\n    map<int, Vector3d>::iterator it;\n    frame_it = all_image_frame.begin( );\n    for (int i = 0; frame_it != all_image_frame.end( ); frame_it++)\n    {\n        // provide initial guess\n        cv::Mat r, rvec, t, D, tmp_r;\n        if((frame_it->first) == Headers[i].stamp.toSec())\n        {\n            frame_it->second.is_key_frame = true;\n            frame_it->second.R = Q[i].toRotationMatrix() * RIC[0].transpose();\n            frame_it->second.T = T[i];\n            i++;\n            continue;\n        }\n        if((frame_it->first) > Headers[i].stamp.toSec())\n        {\n            i++;\n        }\n        Matrix3d R_inital = (Q[i].inverse()).toRotationMatrix();\n        Vector3d P_inital = - R_inital * T[i];\n        cv::eigen2cv(R_inital, tmp_r);\n        cv::Rodrigues(tmp_r, rvec);\n        cv::eigen2cv(P_inital, t);\n\n        frame_it->second.is_key_frame = false;\n        vector<cv::Point3f> pts_3_vector;\n        vector<cv::Point2f> pts_2_vector;\n        for (auto &id_pts : frame_it->second.points)\n        {\n            int feature_id = id_pts.first;\n            for (auto &i_p : id_pts.second)\n            {\n                it = sfm_tracked_points.find(feature_id);\n                if(it != sfm_tracked_points.end())\n                {\n                    Vector3d world_pts = it->second;\n                    cv::Point3f pts_3(world_pts(0), world_pts(1), world_pts(2));\n                    pts_3_vector.push_back(pts_3);\n                    Vector2d img_pts = i_p.second.head<2>();\n                    cv::Point2f pts_2(img_pts(0), img_pts(1));\n                    pts_2_vector.push_back(pts_2);\n                }\n            }\n        }\n        cv::Mat K = (cv::Mat_<double>(3, 3) << 1, 0, 0, 0, 1, 0, 0, 0, 1);     \n        if(pts_3_vector.size() < 6)\n        {\n            cout << \"pts_3_vector 
size \" << pts_3_vector.size() << endl;\n            ROS_DEBUG(\"Not enough points for solve pnp!\");\n            return false;\n        }\n        if (! cv::solvePnP(pts_3_vector, pts_2_vector, K, D, rvec, t, 1))\n        {\n            ROS_DEBUG(\"solve pnp fail!\");\n            return false;\n        }\n        cv::Rodrigues(rvec, r);\n        MatrixXd R_pnp,tmp_R_pnp;\n        cv::cv2eigen(r, tmp_R_pnp);\n        R_pnp = tmp_R_pnp.transpose();\n        MatrixXd T_pnp;\n        cv::cv2eigen(t, T_pnp);\n        T_pnp = R_pnp * (-T_pnp);\n        frame_it->second.R = R_pnp * RIC[0].transpose();\n        frame_it->second.T = T_pnp;\n    }\n    if (visualInitialAlign(relative_T, T[frame_count]))\n        return true;\n    else\n    {\n        para_N.clear();\n        para_d.clear();\n        ROS_INFO(\"misalign visual structure with IMU\");\n        return false;\n    }\n}\n\nbool Estimator::visualInitialAlign(map<int, Eigen::Vector3d> &relative_T, Eigen::Vector3d &lt)\n{\n    TicToc t_g;\n    VectorXd x;\n    //solve scale\n    bool result = VisualIMUAlignment(all_image_frame, Bgs, g, x);\n    if(!result)\n    {\n        ROS_DEBUG(\"solve g failed!\");\n        return false;\n    }\n\n    // change state\n    for (int i = 0; i <= frame_count; i++)\n    {\n        Matrix3d Ri = all_image_frame[Headers[i].stamp.toSec()].R;\n        Vector3d Pi = all_image_frame[Headers[i].stamp.toSec()].T;\n        Ps[i] = Pi;\n        Rs[i] = Ri;\n        all_image_frame[Headers[i].stamp.toSec()].is_key_frame = true;\n    }\n\n    VectorXd dep = f_manager.getDepthVector();\n    for (int i = 0; i < dep.size(); i++)\n        dep[i] = -1;\n    f_manager.clearDepth(dep);\n\n    //triangulate on cam pose, no tic\n    Vector3d TIC_TMP[NUM_OF_CAM];\n    for(int i = 0; i < NUM_OF_CAM; i++)\n        TIC_TMP[i].setZero();\n    ric[0] = RIC[0];\n    f_manager.setRic(ric);\n    f_manager.triangulate(Ps, &(TIC_TMP[0]), &(RIC[0]));\n\n    s = (x.tail<1>())(0);\n\n    for (int i = 0; i <= 
WINDOW_SIZE; i++)\n    {\n        pre_integrations[i]->repropagate(Vector3d::Zero(), Bgs[i]);\n    }\n    for (int i = frame_count; i >= 0; i--)\n        Ps[i] = s * Ps[i] - Rs[i] * TIC[0] - (s * Ps[0] - Rs[0] * TIC[0]);\n    int kv = -1;\n    map<double, ImageFrame>::iterator frame_i;\n    for (frame_i = all_image_frame.begin(); frame_i != all_image_frame.end(); frame_i++)\n    {\n        if(frame_i->second.is_key_frame)\n        {\n            kv++;\n            Vs[kv] = frame_i->second.R * x.segment<3>(kv * 3);\n        }\n    }\n    for (auto &it_per_id : f_manager.feature)\n    {\n        it_per_id.used_num = it_per_id.feature_per_frame.size();\n        if (!(it_per_id.used_num >= 2 && it_per_id.start_frame < WINDOW_SIZE - 2))\n            continue;\n\n        it_per_id.estimated_depth *= s;\n    }\n\n    //estimate plane d variables\n    for(auto &t : relative_T)\n    {\n        if(t.first == lplane_id)\n            para_d[lplane_id] = {s};\n        else\n           para_d[t.first] = {s*lt.norm()/t.second.norm()}; \n\n        init_pids.push_back(t.first);\n        ROS_INFO(\"Initialized plane %d features\", t.first);\n    } \n\n    Matrix3d R0 = Utility::g2R(g);\n    double yaw = Utility::R2ypr(R0 * Rs[0]).x();\n    R0 = Utility::ypr2R(Eigen::Vector3d{-yaw, 0, 0}) * R0;\n    g = R0 * g;\n    //Matrix3d rot_diff = R0 * Rs[0].transpose();\n    Matrix3d rot_diff = R0;\n    for (int i = 0; i <= frame_count; i++)\n    {\n        Ps[i] = rot_diff * Ps[i];\n        Rs[i] = rot_diff * Rs[i];\n        Vs[i] = rot_diff * Vs[i];\n    }\n    ROS_DEBUG_STREAM(\"g0     \" << g.transpose());\n    ROS_DEBUG_STREAM(\"my R0  \" << Utility::R2ypr(Rs[0]).transpose());\n\n    return true;\n}\n\nbool Estimator::relativeHPose(map<int, Matrix3d> &relative_R, map<int, Vector3d> &relative_t, map<int, Vector3d> &n, int &l, int &lplane_id)\n{\n    // find previous frame which contains enough correspondence and parallax with newest frame\n    for (int i = 0; i < WINDOW_SIZE; i++)\n    
{\n        int lplane_size = 0;\n        vector<pair<Vector3d, Vector3d>> lplane_corres;\n        map<int, vector<pair<Vector3d, Vector3d>>> plane_corres = f_manager.getCorresponding(i, WINDOW_SIZE);\n        for(auto &corres : plane_corres)\n        {\n            if(corres.second.size() > lplane_size)\n            {\n                lplane_id = corres.first;\n                lplane_size = corres.second.size();\n                lplane_corres = corres.second;\n            }\n        }\n\n        if (lplane_size > 20)\n        {\n            double sum_parallax = 0;\n            double average_parallax;\n            for (int j = 0; j < int(lplane_corres.size()); j++)\n            {\n                Vector2d pts_0(lplane_corres[j].first(0), lplane_corres[j].first(1));\n                Vector2d pts_1(lplane_corres[j].second(0), lplane_corres[j].second(1));\n                double parallax = (pts_0 - pts_1).norm();\n                sum_parallax = sum_parallax + parallax;\n            }\n\n            average_parallax = 1.0 * sum_parallax / int(lplane_corres.size());\n\n            Matrix4d TrIC = Matrix4d::Identity();\n            TrIC.block(0,0,3,3) = ric[0];\n            TrIC.block(0,3,3,1) = tic[0];\n\n            //compute corresponding preintegrated rotation\n            Matrix3d R_imu = Matrix3d::Identity();\n            for(int k = WINDOW_SIZE - 1; k > i; k--)\n                R_imu = R_imu * (pre_integrations[k]->delta_q).toRotationMatrix();\n\n            Eigen::Matrix3d est_R;\n            Eigen::Vector3d est_t, est_n;\n            if(average_parallax * 460 > 30 && m_estimator.solveRelativeHRT(lplane_corres, R_imu, TrIC, est_R, est_t, est_n))\n            {\n                l = i;\n                ROS_DEBUG(\"average_parallax %f choose l %d and newest frame to triangulate the whole structure\", average_parallax * 460, l);\n                relative_R[lplane_id] = est_R;\n                relative_t[lplane_id] = est_t;\n                n[lplane_id] = est_n;\n\n  
              for(auto &corres : plane_corres)\n                {\n                    if(corres.first == lplane_id)\n                        continue;\n\n                    if(m_estimator.solveRelativeHRT(corres.second, R_imu, TrIC, est_R, est_t, est_n))\n                    {\n                        relative_R[corres.first] = est_R;\n                        relative_t[corres.first] = est_t;\n                        n[corres.first] = est_n;\n                    }\n                }\n                return true;\n            }\n        }\n    }\n    return false;\n}\n\nvoid Estimator::initializeNewPlanes()\n{\n    // check for new plane ids\n    for(auto &plane_id : f_manager.plane_ids)\n    {\n        if(para_N.count(plane_id) > 0)\n            continue;\n        else\n        {\n            for(int i = 0; i < WINDOW_SIZE - 1; i++)\n            {\n                vector<pair<Vector3d, Vector3d>> corres = f_manager.getCorrespondingByPlane(\n                    i, WINDOW_SIZE-1, plane_id);\n\n                if(corres.size() > 20)\n                {\n                    double sum_parallax = 0;\n                    double average_parallax;\n                    for (int j = 0; j < int(corres.size()); j++)\n                    {\n                        Vector2d pts_0(corres[j].first(0), corres[j].first(1));\n                        Vector2d pts_1(corres[j].second(0), corres[j].second(1));\n                        double parallax = (pts_0 - pts_1).norm();\n                        sum_parallax = sum_parallax + parallax;\n                    }\n\n                    average_parallax = 1.0 * sum_parallax / int(corres.size());\n\n                    Matrix4d TrIC = Matrix4d::Identity();\n                    TrIC.block(0,0,3,3) = ric[0];\n                    TrIC.block(0,3,3,1) = tic[0];\n\n                    //compute corresponding preintegrated rotation\n                    Matrix3d R_imu = Matrix3d::Identity();\n                    for(int k = WINDOW_SIZE - 2; k > i; 
k--)\n                        R_imu = R_imu * (pre_integrations[k]->delta_q).toRotationMatrix();\n\n                    Eigen::Matrix3d est_R;\n                    Eigen::Vector3d est_t, est_n, vi_t;\n                    if(average_parallax * 460 > 30 && m_estimator.solveRelativeHRT(corres, R_imu, TrIC, est_R, est_t, est_n))\n                    {\n                        // Transform est_n to global frame\n                        est_n = ric[0].transpose() * Rs[i] * ric[0] * est_n;\n                        para_N[plane_id] = {est_n(0), est_n(1), est_n(2)};\n\n                        // Estimate plane d (in global frame) using known metric t from vio \n                        vi_t = (Ps[WINDOW_SIZE - 1] - Ps[i]);\n                        double di = (1.0*vi_t.norm())/est_t.norm();\n                        para_d[plane_id] = {di + (ric[0].transpose() * Ps[i]).dot(est_n)};\n                        init_pids.push_back(plane_id);\n                        ROS_INFO(\"Initialized plane %d features\", plane_id);\n                        break;\n                    }\n                }\n            }\n        }\n    }\n}\n\nvoid Estimator::solveOdometry()\n{\n    if (frame_count < WINDOW_SIZE)\n        return;\n    if (solver_flag == NON_LINEAR)\n    {\n        TicToc t_tri;\n        f_manager.triangulate(Ps, tic, ric);\n        ROS_DEBUG(\"triangulation costs %f\", t_tri.toc());\n        optimization();\n    }\n}\n\nvoid Estimator::vector2double()\n{\n    for (int i = 0; i <= WINDOW_SIZE; i++)\n    {\n        para_Pose[i][0] = Ps[i].x();\n        para_Pose[i][1] = Ps[i].y();\n        para_Pose[i][2] = Ps[i].z();\n        Quaterniond q{Rs[i]};\n        para_Pose[i][3] = q.x();\n        para_Pose[i][4] = q.y();\n        para_Pose[i][5] = q.z();\n        para_Pose[i][6] = q.w();\n\n        para_SpeedBias[i][0] = Vs[i].x();\n        para_SpeedBias[i][1] = Vs[i].y();\n        para_SpeedBias[i][2] = Vs[i].z();\n\n        para_SpeedBias[i][3] = Bas[i].x();\n        
para_SpeedBias[i][4] = Bas[i].y();\n        para_SpeedBias[i][5] = Bas[i].z();\n\n        para_SpeedBias[i][6] = Bgs[i].x();\n        para_SpeedBias[i][7] = Bgs[i].y();\n        para_SpeedBias[i][8] = Bgs[i].z();\n    }\n    for (int i = 0; i < NUM_OF_CAM; i++)\n    {\n        para_Ex_Pose[i][0] = tic[i].x();\n        para_Ex_Pose[i][1] = tic[i].y();\n        para_Ex_Pose[i][2] = tic[i].z();\n        Quaterniond q{ric[i]};\n        para_Ex_Pose[i][3] = q.x();\n        para_Ex_Pose[i][4] = q.y();\n        para_Ex_Pose[i][5] = q.z();\n        para_Ex_Pose[i][6] = q.w();\n    }\n\n    VectorXd dep = f_manager.getDepthVector();\n    for (int i = 0; i < f_manager.getFeatureCount(); i++)\n        para_Feature[i][0] = dep(i);\n    if (ESTIMATE_TD)\n        para_Td[0][0] = td;\n}\n\nvoid Estimator::double2vector()\n{\n    Vector3d origin_R0 = Utility::R2ypr(Rs[0]);\n    Vector3d origin_P0 = Ps[0];\n\n    if (failure_occur)\n    {\n        origin_R0 = Utility::R2ypr(last_R0);\n        origin_P0 = last_P0;\n        failure_occur = 0;\n    }\n    Vector3d origin_R00 = Utility::R2ypr(Quaterniond(para_Pose[0][6],\n                                                      para_Pose[0][3],\n                                                      para_Pose[0][4],\n                                                      para_Pose[0][5]).toRotationMatrix());\n    double y_diff = origin_R0.x() - origin_R00.x();\n    //TODO\n    Matrix3d rot_diff = Utility::ypr2R(Vector3d(y_diff, 0, 0));\n    if (abs(abs(origin_R0.y()) - 90) < 1.0 || abs(abs(origin_R00.y()) - 90) < 1.0)\n    {\n        ROS_DEBUG(\"euler singular point!\");\n        rot_diff = Rs[0] * Quaterniond(para_Pose[0][6],\n                                       para_Pose[0][3],\n                                       para_Pose[0][4],\n                                       para_Pose[0][5]).toRotationMatrix().transpose();\n    }\n\n    for (int i = 0; i <= WINDOW_SIZE; i++)\n    {\n\n        Rs[i] = rot_diff * 
Quaterniond(para_Pose[i][6], para_Pose[i][3], para_Pose[i][4], para_Pose[i][5]).normalized().toRotationMatrix();\n        \n        Ps[i] = rot_diff * Vector3d(para_Pose[i][0] - para_Pose[0][0],\n                                para_Pose[i][1] - para_Pose[0][1],\n                                para_Pose[i][2] - para_Pose[0][2]) + origin_P0;\n\n        Vs[i] = rot_diff * Vector3d(para_SpeedBias[i][0],\n                                    para_SpeedBias[i][1],\n                                    para_SpeedBias[i][2]);\n\n        Bas[i] = Vector3d(para_SpeedBias[i][3],\n                          para_SpeedBias[i][4],\n                          para_SpeedBias[i][5]);\n\n        Bgs[i] = Vector3d(para_SpeedBias[i][6],\n                          para_SpeedBias[i][7],\n                          para_SpeedBias[i][8]);\n    }\n\n    for (int i = 0; i < NUM_OF_CAM; i++)\n    {\n        tic[i] = Vector3d(para_Ex_Pose[i][0],\n                          para_Ex_Pose[i][1],\n                          para_Ex_Pose[i][2]);\n        ric[i] = Quaterniond(para_Ex_Pose[i][6],\n                             para_Ex_Pose[i][3],\n                             para_Ex_Pose[i][4],\n                             para_Ex_Pose[i][5]).toRotationMatrix();\n    }\n\n    VectorXd dep = f_manager.getDepthVector();\n    for (int i = 0; i < f_manager.getFeatureCount(); i++)\n        dep(i) = para_Feature[i][0];\n    f_manager.setDepth(dep);\n    if (ESTIMATE_TD)\n        td = para_Td[0][0];\n\n    // relative info between two loop frame\n    if(relocalization_info)\n    { \n        Matrix3d relo_r;\n        Vector3d relo_t;\n        relo_r = rot_diff * Quaterniond(relo_Pose[6], relo_Pose[3], relo_Pose[4], relo_Pose[5]).normalized().toRotationMatrix();\n        relo_t = rot_diff * Vector3d(relo_Pose[0] - para_Pose[0][0],\n                                     relo_Pose[1] - para_Pose[0][1],\n                                     relo_Pose[2] - para_Pose[0][2]) + origin_P0;\n        double 
drift_correct_yaw;\n        drift_correct_yaw = Utility::R2ypr(prev_relo_r).x() - Utility::R2ypr(relo_r).x();\n        drift_correct_r = Utility::ypr2R(Vector3d(drift_correct_yaw, 0, 0));\n        drift_correct_t = prev_relo_t - drift_correct_r * relo_t;   \n        relo_relative_t = relo_r.transpose() * (Ps[relo_frame_local_index] - relo_t);\n        relo_relative_q = relo_r.transpose() * Rs[relo_frame_local_index];\n        relo_relative_yaw = Utility::normalizeAngle(Utility::R2ypr(Rs[relo_frame_local_index]).x() - Utility::R2ypr(relo_r).x());\n        //cout << \"vins relo \" << endl;\n        //cout << \"vins relative_t \" << relo_relative_t.transpose() << endl;\n        //cout << \"vins relative_yaw \" <<relo_relative_yaw << endl;\n        relocalization_info = 0;\n\n    }\n}\n\nbool Estimator::failureDetection()\n{\n    if (f_manager.last_track_num < 2)\n    {\n        ROS_INFO(\" too few tracked features: %d\", f_manager.last_track_num);\n        //return true;\n    }\n    if (Bas[WINDOW_SIZE].norm() > 2.5)\n    {\n        ROS_INFO(\" big IMU acc bias estimation %f\", Bas[WINDOW_SIZE].norm());\n        return true;\n    }\n    if (Bgs[WINDOW_SIZE].norm() > 1.0)\n    {\n        ROS_INFO(\" big IMU gyr bias estimation %f\", Bgs[WINDOW_SIZE].norm());\n        return true;\n    }\n    /*\n    if (tic(0) > 1)\n    {\n        ROS_INFO(\" big extri param estimation %d\", tic(0) > 1);\n        return true;\n    }\n    */\n    Vector3d tmp_P = Ps[WINDOW_SIZE];\n    if ((tmp_P - last_P).norm() > 5)\n    {\n        ROS_INFO(\" big translation\");\n        return true;\n    }\n    if (abs(tmp_P.z() - last_P.z()) > 1)\n    {\n        ROS_INFO(\" big z translation\");\n        return true; \n    }\n    Matrix3d tmp_R = Rs[WINDOW_SIZE];\n    Matrix3d delta_R = tmp_R.transpose() * last_R;\n    Quaterniond delta_Q(delta_R);\n    double delta_angle;\n    delta_angle = acos(delta_Q.w()) * 2.0 / M_PI * 180.0; // use M_PI rather than hard-coded 3.14\n    if (delta_angle > 50)\n    {\n        ROS_INFO(\" big delta_angle 
\");\n        //return true;\n    }\n    return false;\n}\n\n\nvoid Estimator::optimization()\n{\n    ceres::Problem problem;\n    ceres::LossFunction *loss_function;\n    //loss_function = new ceres::HuberLoss(1.0);\n    loss_function = new ceres::CauchyLoss(1.0);\n    for (int i = 0; i < WINDOW_SIZE + 1; i++)\n    {\n        ceres::LocalParameterization *local_parameterization = new PoseLocalParameterization();\n        problem.AddParameterBlock(para_Pose[i], SIZE_POSE, local_parameterization);\n        problem.AddParameterBlock(para_SpeedBias[i], SIZE_SPEEDBIAS);\n    }\n\n    for(auto &N : para_N)\n    {\n        ceres::LocalParameterization *local_n_parameterization = new ceres::HomogeneousVectorParameterization(3);\n        problem.AddParameterBlock(N.second.data(), 3, local_n_parameterization);\n    }\n\n    for (int i = 0; i < NUM_OF_CAM; i++)\n    {\n        ceres::LocalParameterization *local_parameterization = new PoseLocalParameterization();\n        problem.AddParameterBlock(para_Ex_Pose[i], SIZE_POSE, local_parameterization);\n        if (!ESTIMATE_EXTRINSIC)\n        {\n            ROS_DEBUG(\"fix extinsic param\");\n            problem.SetParameterBlockConstant(para_Ex_Pose[i]);\n        }\n        else\n            ROS_DEBUG(\"estimate extinsic param\");\n    }\n    if (ESTIMATE_TD)\n    {\n        problem.AddParameterBlock(para_Td[0], 1);\n        //problem.SetParameterBlockConstant(para_Td[0]);\n    }\n\n    TicToc t_whole, t_prepare;\n    vector2double();\n\n    if (last_marginalization_info)\n    {\n        // construct new marginlization_factor\n        MarginalizationFactor *marginalization_factor = new MarginalizationFactor(last_marginalization_info);\n        problem.AddResidualBlock(marginalization_factor, NULL,\n                                 last_marginalization_parameter_blocks);\n    }\n\n    for (int i = 0; i < WINDOW_SIZE; i++)\n    {\n        int j = i + 1;\n        if (pre_integrations[j]->sum_dt > 10.0)\n            continue;\n  
      IMUFactor* imu_factor = new IMUFactor(pre_integrations[j]);\n        problem.AddResidualBlock(imu_factor, NULL, para_Pose[i], para_SpeedBias[i], para_Pose[j], para_SpeedBias[j]);\n    }\n    int f_m_cnt = 0;\n    int feature_index = -1;\n    int largest_pid = f_manager.getLargestPlaneId(init_pids);\n    ROS_INFO(\"Using features from plane %d\", largest_pid);\n    for (auto &it_per_id : f_manager.feature)\n    {\n        it_per_id.used_num = it_per_id.feature_per_frame.size();\n        if (!(it_per_id.used_num >= 2 && it_per_id.start_frame < WINDOW_SIZE - 2))\n            continue;\n\n        ++feature_index;\n\n        int imu_i = it_per_id.start_frame, imu_j = imu_i - 1;\n        \n        Vector3d pts_i = it_per_id.feature_per_frame[0].point;\n        int pid = it_per_id.plane_id;\n\n        if(pid != largest_pid)\n            continue;\n\n        for (auto &it_per_frame : it_per_id.feature_per_frame)\n        {\n            imu_j++;\n            if (imu_i == imu_j)\n            {\n                continue;\n            }\n\n            Vector3d pts_j = it_per_frame.point;\n\n            if (ESTIMATE_TD)\n            {\n                    ProjectionTdFactor *f_td = new ProjectionTdFactor(pts_i, pts_j, it_per_id.feature_per_frame[0].velocity, it_per_frame.velocity,\n                                                                     it_per_id.feature_per_frame[0].cur_td, it_per_frame.cur_td,\n                                                                     it_per_id.feature_per_frame[0].uv.y(), it_per_frame.uv.y());\n                    problem.AddResidualBlock(f_td, loss_function, para_Pose[imu_i], para_Pose[imu_j], para_Ex_Pose[0], para_Feature[feature_index], para_Td[0]);\n                    /*\n                    double **para = new double *[5];\n                    para[0] = para_Pose[imu_i];\n                    para[1] = para_Pose[imu_j];\n                    para[2] = para_Ex_Pose[0];\n                    para[3] = 
para_Feature[feature_index];\n                    para[4] = para_Td[0];\n                    f_td->check(para);\n                    */\n            }\n            else\n            {\n                ProjectionFactor *f = new ProjectionFactor(pts_i, pts_j);\n                problem.AddResidualBlock(f, loss_function, para_Pose[imu_i], para_Pose[imu_j], para_Ex_Pose[0], para_Feature[feature_index]);\n\n                ceres::CostFunction *h_cost_function = HomographyFactor::Create(pts_i, pts_j);\n                problem.AddResidualBlock(h_cost_function, loss_function, para_Pose[imu_i], para_Pose[imu_j], para_N[pid].data(), para_d[pid].data(), para_Ex_Pose[0]);\n            }\n            f_m_cnt++;\n        }\n    }\n\n    ROS_DEBUG(\"visual measurement count: %d\", f_m_cnt);\n    ROS_DEBUG(\"prepare for ceres: %f\", t_prepare.toc());\n\n    if(relocalization_info)\n    {\n        //printf(\"set relocalization factor! \\n\");\n        ceres::LocalParameterization *local_parameterization = new PoseLocalParameterization();\n        problem.AddParameterBlock(relo_Pose, SIZE_POSE, local_parameterization);\n        int retrive_feature_index = 0;\n        int feature_index = -1;\n        for (auto &it_per_id : f_manager.feature)\n        {\n            it_per_id.used_num = it_per_id.feature_per_frame.size();\n            if (!(it_per_id.used_num >= 2 && it_per_id.start_frame < WINDOW_SIZE - 2))\n                continue;\n            ++feature_index;\n            int start = it_per_id.start_frame;\n            if(start <= relo_frame_local_index)\n            {   \n                while((int)match_points[retrive_feature_index].z() < it_per_id.feature_id)\n                {\n                    retrive_feature_index++;\n                }\n                if((int)match_points[retrive_feature_index].z() == it_per_id.feature_id)\n                {\n                    Vector3d pts_j = Vector3d(match_points[retrive_feature_index].x(), match_points[retrive_feature_index].y(), 
1.0);\n                    Vector3d pts_i = it_per_id.feature_per_frame[0].point;\n                    \n                    ProjectionFactor *f = new ProjectionFactor(pts_i, pts_j);\n                    problem.AddResidualBlock(f, loss_function, para_Pose[start], relo_Pose, para_Ex_Pose[0], para_Feature[feature_index]);\n                    retrive_feature_index++;\n                }\n            }\n        }\n    }\n\n    ceres::Solver::Options options;\n\n    options.linear_solver_type = ceres::DENSE_SCHUR;\n    //options.num_threads = 2;\n    options.trust_region_strategy_type = ceres::DOGLEG;\n    options.max_num_iterations = NUM_ITERATIONS;\n    //options.use_explicit_schur_complement = true;\n    //options.minimizer_progress_to_stdout = true;\n    //options.use_nonmonotonic_steps = true;\n    if (marginalization_flag == MARGIN_OLD)\n        options.max_solver_time_in_seconds = SOLVER_TIME * 4.0 / 5.0;\n    else\n        options.max_solver_time_in_seconds = SOLVER_TIME;\n    TicToc t_solver;\n    ceres::Solver::Summary summary;\n    ceres::Solve(options, &problem, &summary);\n    //cout << summary.BriefReport() << endl;\n    ROS_DEBUG(\"Iterations : %d\", static_cast<int>(summary.iterations.size()));\n    ROS_DEBUG(\"solver costs: %f\", t_solver.toc());\n\n    double2vector();\n\n    TicToc t_whole_marginalization;\n    if (marginalization_flag == MARGIN_OLD)\n    {\n        MarginalizationInfo *marginalization_info = new MarginalizationInfo();\n        vector2double();\n\n        if (last_marginalization_info)\n        {\n            vector<int> drop_set; \n            for (int i = 0; i < static_cast<int>(last_marginalization_parameter_blocks.size()); i++)\n            {\n                if (last_marginalization_parameter_blocks[i] == para_Pose[0] ||\n                    last_marginalization_parameter_blocks[i] == para_SpeedBias[0])\n                    drop_set.push_back(i);\n            }\n            // construct new marginalization factor\n            
MarginalizationFactor *marginalization_factor = new MarginalizationFactor(last_marginalization_info);\n            ResidualBlockInfo *residual_block_info = new ResidualBlockInfo(marginalization_factor, NULL,\n                                                                           last_marginalization_parameter_blocks,\n                                                                           drop_set);\n\n            marginalization_info->addResidualBlockInfo(residual_block_info);\n        }\n\n        {\n            if (pre_integrations[1]->sum_dt < 10.0)\n            {\n                IMUFactor* imu_factor = new IMUFactor(pre_integrations[1]);\n                ResidualBlockInfo *residual_block_info = new ResidualBlockInfo(imu_factor, NULL,\n                                                                           vector<double *>{para_Pose[0], para_SpeedBias[0], para_Pose[1], para_SpeedBias[1]},\n                                                                           vector<int>{0, 1});\n                marginalization_info->addResidualBlockInfo(residual_block_info);\n            }\n        }\n\n        {\n            int feature_index = -1;\n            for (auto &it_per_id : f_manager.feature)\n            {\n                it_per_id.used_num = it_per_id.feature_per_frame.size();\n                if (!(it_per_id.used_num >= 2 && it_per_id.start_frame < WINDOW_SIZE - 2))\n                    continue;\n\n                ++feature_index;\n\n                int imu_i = it_per_id.start_frame, imu_j = imu_i - 1;\n                if (imu_i != 0)\n                    continue;\n\n                Vector3d pts_i = it_per_id.feature_per_frame[0].point;\n\n                for (auto &it_per_frame : it_per_id.feature_per_frame)\n                {\n                    imu_j++;\n                    if (imu_i == imu_j)\n                        continue;\n\n                    Vector3d pts_j = it_per_frame.point;\n                    if (ESTIMATE_TD)\n                 
   {\n                        ProjectionTdFactor *f_td = new ProjectionTdFactor(pts_i, pts_j, it_per_id.feature_per_frame[0].velocity, it_per_frame.velocity,\n                                                                          it_per_id.feature_per_frame[0].cur_td, it_per_frame.cur_td,\n                                                                          it_per_id.feature_per_frame[0].uv.y(), it_per_frame.uv.y());\n                        ResidualBlockInfo *residual_block_info = new ResidualBlockInfo(f_td, loss_function,\n                                                                                        vector<double *>{para_Pose[imu_i], para_Pose[imu_j], para_Ex_Pose[0], para_Feature[feature_index], para_Td[0]},\n                                                                                        vector<int>{0, 3});\n                        marginalization_info->addResidualBlockInfo(residual_block_info);\n                    }\n                    else\n                    {\n                        ProjectionFactor *f = new ProjectionFactor(pts_i, pts_j);\n                        ResidualBlockInfo *residual_block_info = new ResidualBlockInfo(f, loss_function,\n                                                                                       vector<double *>{para_Pose[imu_i], para_Pose[imu_j], para_Ex_Pose[0], para_Feature[feature_index]},\n                                                                                       vector<int>{0, 3});\n                        marginalization_info->addResidualBlockInfo(residual_block_info);\n                    }\n                }\n            }\n        }\n\n        TicToc t_pre_margin;\n        marginalization_info->preMarginalize();\n        ROS_DEBUG(\"pre marginalization %f ms\", t_pre_margin.toc());\n        \n        TicToc t_margin;\n        marginalization_info->marginalize();\n        ROS_DEBUG(\"marginalization %f ms\", t_margin.toc());\n\n        std::unordered_map<long, double *> 
addr_shift;\n        for (int i = 1; i <= WINDOW_SIZE; i++)\n        {\n            addr_shift[reinterpret_cast<long>(para_Pose[i])] = para_Pose[i - 1];\n            addr_shift[reinterpret_cast<long>(para_SpeedBias[i])] = para_SpeedBias[i - 1];\n        }\n        for (int i = 0; i < NUM_OF_CAM; i++)\n            addr_shift[reinterpret_cast<long>(para_Ex_Pose[i])] = para_Ex_Pose[i];\n        if (ESTIMATE_TD)\n        {\n            addr_shift[reinterpret_cast<long>(para_Td[0])] = para_Td[0];\n        }\n        vector<double *> parameter_blocks = marginalization_info->getParameterBlocks(addr_shift);\n\n        if (last_marginalization_info)\n            delete last_marginalization_info;\n        last_marginalization_info = marginalization_info;\n        last_marginalization_parameter_blocks = parameter_blocks;\n        \n    }\n    else\n    {\n        if (last_marginalization_info &&\n            std::count(std::begin(last_marginalization_parameter_blocks), std::end(last_marginalization_parameter_blocks), para_Pose[WINDOW_SIZE - 1]))\n        {\n\n            MarginalizationInfo *marginalization_info = new MarginalizationInfo();\n            vector2double();\n            if (last_marginalization_info)\n            {\n                vector<int> drop_set;\n                for (int i = 0; i < static_cast<int>(last_marginalization_parameter_blocks.size()); i++)\n                {\n                    ROS_ASSERT(last_marginalization_parameter_blocks[i] != para_SpeedBias[WINDOW_SIZE - 1]);\n                    if (last_marginalization_parameter_blocks[i] == para_Pose[WINDOW_SIZE - 1])\n                        drop_set.push_back(i);\n                }\n                // construct new marginalization factor\n                MarginalizationFactor *marginalization_factor = new MarginalizationFactor(last_marginalization_info);\n                ResidualBlockInfo *residual_block_info = new ResidualBlockInfo(marginalization_factor, NULL,\n                                       
                                        last_marginalization_parameter_blocks,\n                                                                               drop_set);\n\n                marginalization_info->addResidualBlockInfo(residual_block_info);\n            }\n\n            TicToc t_pre_margin;\n            ROS_DEBUG(\"begin marginalization\");\n            marginalization_info->preMarginalize();\n            ROS_DEBUG(\"end pre marginalization, %f ms\", t_pre_margin.toc());\n\n            TicToc t_margin;\n            ROS_DEBUG(\"begin marginalization\");\n            marginalization_info->marginalize();\n            ROS_DEBUG(\"end marginalization, %f ms\", t_margin.toc());\n            \n            std::unordered_map<long, double *> addr_shift;\n            for (int i = 0; i <= WINDOW_SIZE; i++)\n            {\n                if (i == WINDOW_SIZE - 1)\n                    continue;\n                else if (i == WINDOW_SIZE)\n                {\n                    addr_shift[reinterpret_cast<long>(para_Pose[i])] = para_Pose[i - 1];\n                    addr_shift[reinterpret_cast<long>(para_SpeedBias[i])] = para_SpeedBias[i - 1];\n                }\n                else\n                {\n                    addr_shift[reinterpret_cast<long>(para_Pose[i])] = para_Pose[i];\n                    addr_shift[reinterpret_cast<long>(para_SpeedBias[i])] = para_SpeedBias[i];\n                }\n            }\n            for (int i = 0; i < NUM_OF_CAM; i++)\n                addr_shift[reinterpret_cast<long>(para_Ex_Pose[i])] = para_Ex_Pose[i];\n            if (ESTIMATE_TD)\n            {\n                addr_shift[reinterpret_cast<long>(para_Td[0])] = para_Td[0];\n            }\n            \n            vector<double *> parameter_blocks = marginalization_info->getParameterBlocks(addr_shift);\n            if (last_marginalization_info)\n                delete last_marginalization_info;\n            last_marginalization_info = marginalization_info;\n          
  last_marginalization_parameter_blocks = parameter_blocks;\n            \n        }\n    }\n    ROS_DEBUG(\"whole marginalization costs: %f\", t_whole_marginalization.toc());\n    \n    ROS_DEBUG(\"whole time for ceres: %f\", t_whole.toc());\n}\n\nvoid Estimator::slideWindow()\n{\n    TicToc t_margin;\n    if (marginalization_flag == MARGIN_OLD)\n    {\n        double t_0 = Headers[0].stamp.toSec();\n        back_R0 = Rs[0];\n        back_P0 = Ps[0];\n        if (frame_count == WINDOW_SIZE)\n        {\n            for (int i = 0; i < WINDOW_SIZE; i++)\n            {\n                Rs[i].swap(Rs[i + 1]);\n\n                std::swap(pre_integrations[i], pre_integrations[i + 1]);\n\n                dt_buf[i].swap(dt_buf[i + 1]);\n                linear_acceleration_buf[i].swap(linear_acceleration_buf[i + 1]);\n                angular_velocity_buf[i].swap(angular_velocity_buf[i + 1]);\n\n                Headers[i] = Headers[i + 1];\n                Ps[i].swap(Ps[i + 1]);\n                Vs[i].swap(Vs[i + 1]);\n                Bas[i].swap(Bas[i + 1]);\n                Bgs[i].swap(Bgs[i + 1]);\n            }\n            Headers[WINDOW_SIZE] = Headers[WINDOW_SIZE - 1];\n            Ps[WINDOW_SIZE] = Ps[WINDOW_SIZE - 1];\n            Vs[WINDOW_SIZE] = Vs[WINDOW_SIZE - 1];\n            Rs[WINDOW_SIZE] = Rs[WINDOW_SIZE - 1];\n            Bas[WINDOW_SIZE] = Bas[WINDOW_SIZE - 1];\n            Bgs[WINDOW_SIZE] = Bgs[WINDOW_SIZE - 1];\n\n            delete pre_integrations[WINDOW_SIZE];\n            pre_integrations[WINDOW_SIZE] = new IntegrationBase{acc_0, gyr_0, Bas[WINDOW_SIZE], Bgs[WINDOW_SIZE]};\n\n            dt_buf[WINDOW_SIZE].clear();\n            linear_acceleration_buf[WINDOW_SIZE].clear();\n            angular_velocity_buf[WINDOW_SIZE].clear();\n\n            if (true || solver_flag == INITIAL)\n            {\n                map<double, ImageFrame>::iterator it_0;\n                it_0 = all_image_frame.find(t_0);\n                delete 
it_0->second.pre_integration;\n                it_0->second.pre_integration = nullptr;\n \n                for (map<double, ImageFrame>::iterator it = all_image_frame.begin(); it != it_0; ++it)\n                {\n                    if (it->second.pre_integration)\n                        delete it->second.pre_integration;\n                    it->second.pre_integration = NULL;\n                }\n\n                all_image_frame.erase(all_image_frame.begin(), it_0);\n                all_image_frame.erase(t_0);\n\n            }\n            slideWindowOld();\n        }\n    }\n    else\n    {\n        if (frame_count == WINDOW_SIZE)\n        {\n            for (unsigned int i = 0; i < dt_buf[frame_count].size(); i++)\n            {\n                double tmp_dt = dt_buf[frame_count][i];\n                Vector3d tmp_linear_acceleration = linear_acceleration_buf[frame_count][i];\n                Vector3d tmp_angular_velocity = angular_velocity_buf[frame_count][i];\n\n                pre_integrations[frame_count - 1]->push_back(tmp_dt, tmp_linear_acceleration, tmp_angular_velocity);\n\n                dt_buf[frame_count - 1].push_back(tmp_dt);\n                linear_acceleration_buf[frame_count - 1].push_back(tmp_linear_acceleration);\n                angular_velocity_buf[frame_count - 1].push_back(tmp_angular_velocity);\n            }\n\n            Headers[frame_count - 1] = Headers[frame_count];\n            Ps[frame_count - 1] = Ps[frame_count];\n            Vs[frame_count - 1] = Vs[frame_count];\n            Rs[frame_count - 1] = Rs[frame_count];\n            Bas[frame_count - 1] = Bas[frame_count];\n            Bgs[frame_count - 1] = Bgs[frame_count];\n\n            delete pre_integrations[WINDOW_SIZE];\n            pre_integrations[WINDOW_SIZE] = new IntegrationBase{acc_0, gyr_0, Bas[WINDOW_SIZE], Bgs[WINDOW_SIZE]};\n\n            dt_buf[WINDOW_SIZE].clear();\n            linear_acceleration_buf[WINDOW_SIZE].clear();\n            
angular_velocity_buf[WINDOW_SIZE].clear();\n\n            slideWindowNew();\n        }\n    }\n}\n\n// real marginalization is removed in solve_ceres()\nvoid Estimator::slideWindowNew()\n{\n    sum_of_front++;\n    f_manager.removeFront(frame_count);\n}\n// real marginalization is removed in solve_ceres()\nvoid Estimator::slideWindowOld()\n{\n    sum_of_back++;\n\n    bool shift_depth = (solver_flag == NON_LINEAR);\n    if (shift_depth)\n    {\n        Matrix3d R0, R1;\n        Vector3d P0, P1;\n        R0 = back_R0 * ric[0];\n        R1 = Rs[0] * ric[0];\n        P0 = back_P0 + back_R0 * tic[0];\n        P1 = Ps[0] + Rs[0] * tic[0];\n        f_manager.removeBackShiftDepth(R0, P0, R1, P1);\n    }\n    else\n        f_manager.removeBack();\n}\n\nvoid Estimator::setReloFrame(double _frame_stamp, int _frame_index, vector<Vector3d> &_match_points, Vector3d _relo_t, Matrix3d _relo_r)\n{\n    relo_frame_stamp = _frame_stamp;\n    relo_frame_index = _frame_index;\n    match_points.clear();\n    match_points = _match_points;\n    prev_relo_t = _relo_t;\n    prev_relo_r = _relo_r;\n    for(int i = 0; i < WINDOW_SIZE; i++)\n    {\n        if(relo_frame_stamp == Headers[i].stamp.toSec())\n        {\n            relo_frame_local_index = i;\n            relocalization_info = 1;\n            for (int j = 0; j < SIZE_POSE; j++)\n                relo_Pose[j] = para_Pose[i][j];\n        }\n    }\n}\n\n"
  },
  {
    "path": "rpvio_estimator/src/estimator.h",
    "content": "#pragma once\n\n#include \"parameters.h\"\n#include \"feature_manager.h\"\n#include \"utility/utility.h\"\n#include \"utility/tic_toc.h\"\n#include \"initial/solve_5pts.h\"\n#include \"initial/initial_sfm.h\"\n#include \"initial/initial_alignment.h\"\n#include \"initial/initial_ex_rotation.h\"\n#include <std_msgs/Header.h>\n#include <std_msgs/Float32.h>\n\n#include <ceres/ceres.h>\n#include \"factor/imu_factor.h\"\n#include \"factor/pose_local_parameterization.h\"\n#include \"factor/projection_factor.h\"\n#include \"factor/projection_td_factor.h\"\n#include \"factor/marginalization_factor.h\"\n#include \"factor/homography_factor.h\"\n\n#include <unordered_map>\n#include <queue>\n#include <opencv2/core/eigen.hpp>\n\n\nclass Estimator\n{\n  public:\n    Estimator();\n\n    void setParameter();\n\n    // interface\n    void processIMU(double t, const Vector3d &linear_acceleration, const Vector3d &angular_velocity);\n    void processImage(const map<int, vector<pair<int, Eigen::Matrix<double, 8, 1>>>> &image, const std_msgs::Header &header);\n    void setReloFrame(double _frame_stamp, int _frame_index, vector<Vector3d> &_match_points, Vector3d _relo_t, Matrix3d _relo_r);\n\n    // internal\n    void clearState();\n    bool initialStructure();\n    bool visualInitialAlign(map<int,Eigen::Vector3d> &relative_T, Eigen::Vector3d &lt);\n    bool relativePose(Matrix3d &relative_R, Vector3d &relative_T, int &l);\n    bool relativeHPose(map<int, Matrix3d> &relative_R, map<int, Vector3d> &relative_T, map<int, Vector3d> &n, int &l, int &lplane_id);\n    void initializeNewPlanes();\n    void slideWindow();\n    void solveOdometry();\n    void slideWindowNew();\n    void slideWindowOld();\n    void optimization();\n    void vector2double();\n    void double2vector();\n    bool failureDetection();\n\n\n    enum SolverFlag\n    {\n        INITIAL,\n        NON_LINEAR\n    };\n\n    enum MarginalizationFlag\n    {\n        MARGIN_OLD = 0,\n        MARGIN_SECOND_NEW = 
1\n    };\n\n    SolverFlag solver_flag;\n    MarginalizationFlag  marginalization_flag;\n    Vector3d g;\n    MatrixXd Ap[2], backup_A;\n    VectorXd bp[2], backup_b;\n\n    Matrix3d ric[NUM_OF_CAM];\n    Vector3d tic[NUM_OF_CAM];\n\n    Vector3d Ps[(WINDOW_SIZE + 1)];\n    Vector3d Vs[(WINDOW_SIZE + 1)];\n    Matrix3d Rs[(WINDOW_SIZE + 1)];\n    Vector3d Bas[(WINDOW_SIZE + 1)];\n    Vector3d Bgs[(WINDOW_SIZE + 1)];\n    double td;\n    double s;\n    int lplane_id;\n\n    Matrix3d back_R0, last_R, last_R0;\n    Vector3d back_P0, last_P, last_P0;\n    std_msgs::Header Headers[(WINDOW_SIZE + 1)];\n\n    IntegrationBase *pre_integrations[(WINDOW_SIZE + 1)];\n    Vector3d acc_0, gyr_0;\n\n    vector<double> dt_buf[(WINDOW_SIZE + 1)];\n    vector<Vector3d> linear_acceleration_buf[(WINDOW_SIZE + 1)];\n    vector<Vector3d> angular_velocity_buf[(WINDOW_SIZE + 1)];\n\n    int frame_count;\n    int sum_of_outlier, sum_of_back, sum_of_front, sum_of_invalid;\n\n    FeatureManager f_manager;\n    MotionEstimator m_estimator;\n    InitialEXRotation initial_ex_rotation;\n\n    bool first_imu;\n    bool is_valid, is_key;\n    bool failure_occur;\n\n    vector<Vector3d> point_cloud;\n    vector<Vector3d> margin_cloud;\n    vector<Vector3d> key_poses;\n    double initial_timestamp;\n\n    double para_Pose[WINDOW_SIZE + 1][SIZE_POSE];\n    double para_SpeedBias[WINDOW_SIZE + 1][SIZE_SPEEDBIAS];\n    double para_Feature[NUM_OF_F][SIZE_FEATURE];\n    double para_Ex_Pose[NUM_OF_CAM][SIZE_POSE];\n    double para_Retrive_Pose[SIZE_POSE];\n    double para_Td[1][1];\n    double para_Tr[1][1];\n    \n    // In global camera frame (c_0)\n    map<int, array<double,3>> para_N;\n    map<int, array<double,1>> para_d;\n\n    vector<int> init_pids;\n\n    int loop_window_index;\n\n    MarginalizationInfo *last_marginalization_info;\n    vector<double *> last_marginalization_parameter_blocks;\n\n    map<double, ImageFrame> all_image_frame;\n    IntegrationBase *tmp_pre_integration;\n\n    
//relocalization variables\n    bool relocalization_info;\n    double relo_frame_stamp;\n    double relo_frame_index;\n    int relo_frame_local_index;\n    vector<Vector3d> match_points;\n    double relo_Pose[SIZE_POSE];\n    Matrix3d drift_correct_r;\n    Vector3d drift_correct_t;\n    Vector3d prev_relo_t;\n    Matrix3d prev_relo_r;\n    Vector3d relo_relative_t;\n    Quaterniond relo_relative_q;\n    double relo_relative_yaw;\n};\n"
  },
  {
    "path": "rpvio_estimator/src/estimator_node.cpp",
    "content": "#include <stdio.h>\n#include <queue>\n#include <map>\n#include <thread>\n#include <mutex>\n#include <condition_variable>\n#include <ros/ros.h>\n#include <cv_bridge/cv_bridge.h>\n#include <opencv2/opencv.hpp>\n\n#include \"estimator.h\"\n#include \"parameters.h\"\n#include \"utility/visualization.h\"\n\n\nEstimator estimator;\n\nstd::condition_variable con;\ndouble current_time = -1;\nqueue<sensor_msgs::ImuConstPtr> imu_buf;\nqueue<sensor_msgs::PointCloudConstPtr> feature_buf;\nqueue<sensor_msgs::PointCloudConstPtr> relo_buf;\nint sum_of_wait = 0;\n\nstd::mutex m_buf;\nstd::mutex m_state;\nstd::mutex i_buf;\nstd::mutex m_estimator;\n\ndouble latest_time;\nEigen::Vector3d tmp_P;\nEigen::Quaterniond tmp_Q;\nEigen::Vector3d tmp_V;\nEigen::Vector3d tmp_Ba;\nEigen::Vector3d tmp_Bg;\nEigen::Vector3d acc_0;\nEigen::Vector3d gyr_0;\nbool init_feature = 0;\nbool init_imu = 1;\ndouble last_imu_t = 0;\n\nvoid predict(const sensor_msgs::ImuConstPtr &imu_msg)\n{\n    double t = imu_msg->header.stamp.toSec();\n    if (init_imu)\n    {\n        latest_time = t;\n        init_imu = 0;\n        return;\n    }\n    double dt = t - latest_time;\n    latest_time = t;\n\n    double dx = imu_msg->linear_acceleration.x;\n    double dy = imu_msg->linear_acceleration.y;\n    double dz = imu_msg->linear_acceleration.z;\n    Eigen::Vector3d linear_acceleration{dx, dy, dz};\n\n    double rx = imu_msg->angular_velocity.x;\n    double ry = imu_msg->angular_velocity.y;\n    double rz = imu_msg->angular_velocity.z;\n    Eigen::Vector3d angular_velocity{rx, ry, rz};\n\n    Eigen::Vector3d un_acc_0 = tmp_Q * (acc_0 - tmp_Ba) - estimator.g;\n\n    Eigen::Vector3d un_gyr = 0.5 * (gyr_0 + angular_velocity) - tmp_Bg;\n    tmp_Q = tmp_Q * Utility::deltaQ(un_gyr * dt);\n\n    Eigen::Vector3d un_acc_1 = tmp_Q * (linear_acceleration - tmp_Ba) - estimator.g;\n\n    Eigen::Vector3d un_acc = 0.5 * (un_acc_0 + un_acc_1);\n\n    tmp_P = tmp_P + dt * tmp_V + 0.5 * dt * dt * un_acc;\n    tmp_V = 
tmp_V + dt * un_acc;\n\n    acc_0 = linear_acceleration;\n    gyr_0 = angular_velocity;\n}\n\nvoid update()\n{\n    TicToc t_predict;\n    latest_time = current_time;\n    tmp_P = estimator.Ps[WINDOW_SIZE];\n    tmp_Q = estimator.Rs[WINDOW_SIZE];\n    tmp_V = estimator.Vs[WINDOW_SIZE];\n    tmp_Ba = estimator.Bas[WINDOW_SIZE];\n    tmp_Bg = estimator.Bgs[WINDOW_SIZE];\n    acc_0 = estimator.acc_0;\n    gyr_0 = estimator.gyr_0;\n\n    queue<sensor_msgs::ImuConstPtr> tmp_imu_buf = imu_buf;\n    while (!tmp_imu_buf.empty())\n    {\n        predict(tmp_imu_buf.front());\n        tmp_imu_buf.pop();\n    }\n}\n\nstd::vector<std::pair<std::vector<sensor_msgs::ImuConstPtr>, sensor_msgs::PointCloudConstPtr>>\ngetMeasurements()\n{\n    std::vector<std::pair<std::vector<sensor_msgs::ImuConstPtr>, sensor_msgs::PointCloudConstPtr>> measurements;\n\n    while (true)\n    {\n        if (imu_buf.empty() || feature_buf.empty())\n            return measurements;\n\n        if (!(imu_buf.back()->header.stamp.toSec() > feature_buf.front()->header.stamp.toSec() + estimator.td))\n        {\n            //ROS_WARN(\"wait for imu, should only happen at the beginning\");\n            sum_of_wait++;\n            return measurements;\n        }\n\n        if (!(imu_buf.front()->header.stamp.toSec() < feature_buf.front()->header.stamp.toSec() + estimator.td))\n        {\n            ROS_WARN(\"throw image, should only happen at the beginning\");\n            feature_buf.pop();\n            continue;\n        }\n        sensor_msgs::PointCloudConstPtr img_msg = feature_buf.front();\n        feature_buf.pop();\n\n        std::vector<sensor_msgs::ImuConstPtr> IMUs;\n        while (imu_buf.front()->header.stamp.toSec() < img_msg->header.stamp.toSec() + estimator.td)\n        {\n            IMUs.emplace_back(imu_buf.front());\n            imu_buf.pop();\n        }\n        IMUs.emplace_back(imu_buf.front());\n        if (IMUs.empty())\n            ROS_WARN(\"no imu between two 
image\");\n        measurements.emplace_back(IMUs, img_msg);\n    }\n    return measurements;\n}\n\nvoid imu_callback(const sensor_msgs::ImuConstPtr &imu_msg)\n{\n    if (imu_msg->header.stamp.toSec() <= last_imu_t)\n    {\n        ROS_WARN(\"imu message in disorder!\");\n        return;\n    }\n    last_imu_t = imu_msg->header.stamp.toSec();\n\n    m_buf.lock();\n    imu_buf.push(imu_msg);\n    m_buf.unlock();\n    con.notify_one();\n\n    last_imu_t = imu_msg->header.stamp.toSec();\n\n    {\n        std::lock_guard<std::mutex> lg(m_state);\n        predict(imu_msg);\n        std_msgs::Header header = imu_msg->header;\n        header.frame_id = \"world\";\n        if (estimator.solver_flag == Estimator::SolverFlag::NON_LINEAR)\n            pubLatestOdometry(tmp_P, tmp_Q, tmp_V, header);\n    }\n}\n\n\nvoid feature_callback(const sensor_msgs::PointCloudConstPtr &feature_msg)\n{\n    if (!init_feature)\n    {\n        //skip the first detected feature, which doesn't contain optical flow speed\n        init_feature = 1;\n        return;\n    }\n    m_buf.lock();\n    feature_buf.push(feature_msg);\n    m_buf.unlock();\n    con.notify_one();\n}\n\nvoid restart_callback(const std_msgs::BoolConstPtr &restart_msg)\n{\n    if (restart_msg->data == true)\n    {\n        ROS_WARN(\"restart the estimator!\");\n        m_buf.lock();\n        while(!feature_buf.empty())\n            feature_buf.pop();\n        while(!imu_buf.empty())\n            imu_buf.pop();\n        m_buf.unlock();\n        m_estimator.lock();\n        estimator.clearState();\n        estimator.setParameter();\n        m_estimator.unlock();\n        current_time = -1;\n        last_imu_t = 0;\n    }\n    return;\n}\n\nvoid relocalization_callback(const sensor_msgs::PointCloudConstPtr &points_msg)\n{\n    //printf(\"relocalization callback! 
\\n\");\n    m_buf.lock();\n    relo_buf.push(points_msg);\n    m_buf.unlock();\n}\n\n// thread: visual-inertial odometry\nvoid process()\n{\n    while (true)\n    {\n        std::vector<std::pair<std::vector<sensor_msgs::ImuConstPtr>, sensor_msgs::PointCloudConstPtr>> measurements;\n        std::unique_lock<std::mutex> lk(m_buf);\n        con.wait(lk, [&]\n                 {\n            return (measurements = getMeasurements()).size() != 0;\n                 });\n        lk.unlock();\n        m_estimator.lock();\n        for (auto &measurement : measurements)\n        {\n            auto img_msg = measurement.second;\n            double dx = 0, dy = 0, dz = 0, rx = 0, ry = 0, rz = 0;\n            for (auto &imu_msg : measurement.first)\n            {\n                double t = imu_msg->header.stamp.toSec();\n                double img_t = img_msg->header.stamp.toSec() + estimator.td;\n                if (t <= img_t)\n                { \n                    if (current_time < 0)\n                        current_time = t;\n                    double dt = t - current_time;\n                    ROS_ASSERT(dt >= 0);\n                    current_time = t;\n                    dx = imu_msg->linear_acceleration.x;\n                    dy = imu_msg->linear_acceleration.y;\n                    dz = imu_msg->linear_acceleration.z;\n                    rx = imu_msg->angular_velocity.x;\n                    ry = imu_msg->angular_velocity.y;\n                    rz = imu_msg->angular_velocity.z;\n                    estimator.processIMU(dt, Vector3d(dx, dy, dz), Vector3d(rx, ry, rz));\n                    //printf(\"imu: dt:%f a: %f %f %f w: %f %f %f\\n\",dt, dx, dy, dz, rx, ry, rz);\n\n                }\n                else\n                {\n                    double dt_1 = img_t - current_time;\n                    double dt_2 = t - img_t;\n                    current_time = img_t;\n                    ROS_ASSERT(dt_1 >= 0);\n                    ROS_ASSERT(dt_2 >= 0);\n 
                   ROS_ASSERT(dt_1 + dt_2 > 0);\n                    double w1 = dt_2 / (dt_1 + dt_2);\n                    double w2 = dt_1 / (dt_1 + dt_2);\n                    dx = w1 * dx + w2 * imu_msg->linear_acceleration.x;\n                    dy = w1 * dy + w2 * imu_msg->linear_acceleration.y;\n                    dz = w1 * dz + w2 * imu_msg->linear_acceleration.z;\n                    rx = w1 * rx + w2 * imu_msg->angular_velocity.x;\n                    ry = w1 * ry + w2 * imu_msg->angular_velocity.y;\n                    rz = w1 * rz + w2 * imu_msg->angular_velocity.z;\n                    estimator.processIMU(dt_1, Vector3d(dx, dy, dz), Vector3d(rx, ry, rz));\n                    //printf(\"dimu: dt:%f a: %f %f %f w: %f %f %f\\n\",dt_1, dx, dy, dz, rx, ry, rz);\n                }\n            }\n            // set relocalization frame\n            sensor_msgs::PointCloudConstPtr relo_msg = NULL;\n            while (!relo_buf.empty())\n            {\n                relo_msg = relo_buf.front();\n                relo_buf.pop();\n            }\n            if (relo_msg != NULL)\n            {\n                vector<Vector3d> match_points;\n                double frame_stamp = relo_msg->header.stamp.toSec();\n                for (unsigned int i = 0; i < relo_msg->points.size(); i++)\n                {\n                    Vector3d u_v_id;\n                    u_v_id.x() = relo_msg->points[i].x;\n                    u_v_id.y() = relo_msg->points[i].y;\n                    u_v_id.z() = relo_msg->points[i].z;\n                    match_points.push_back(u_v_id);\n                }\n                Vector3d relo_t(relo_msg->channels[0].values[0], relo_msg->channels[0].values[1], relo_msg->channels[0].values[2]);\n                Quaterniond relo_q(relo_msg->channels[0].values[3], relo_msg->channels[0].values[4], relo_msg->channels[0].values[5], relo_msg->channels[0].values[6]);\n                Matrix3d relo_r = relo_q.toRotationMatrix();\n                int 
frame_index;\n                frame_index = relo_msg->channels[0].values[7];\n                estimator.setReloFrame(frame_stamp, frame_index, match_points, relo_t, relo_r);\n            }\n\n            ROS_DEBUG(\"processing vision data with stamp %f \\n\", img_msg->header.stamp.toSec());\n\n            TicToc t_s;\n            map<int, vector<pair<int, Eigen::Matrix<double, 8, 1>>>> image;\n            for (unsigned int i = 0; i < img_msg->points.size(); i++)\n            {\n                int v = img_msg->channels[0].values[i] + 0.5;\n                int feature_id = v / NUM_OF_CAM;\n                int camera_id = v % NUM_OF_CAM;\n                int plane_id = img_msg->channels[1].values[i];\n                double x = img_msg->points[i].x;\n                double y = img_msg->points[i].y;\n                double z = img_msg->points[i].z;\n                double p_u = img_msg->channels[2].values[i];\n                double p_v = img_msg->channels[3].values[i];\n                double velocity_x = img_msg->channels[4].values[i];\n                double velocity_y = img_msg->channels[5].values[i];\n                ROS_ASSERT(z == 1);\n                Eigen::Matrix<double, 8, 1> xyz_uv_velocity_pid;\n                xyz_uv_velocity_pid << x, y, z, p_u, p_v, velocity_x, velocity_y, plane_id;\n                image[feature_id].emplace_back(camera_id,  xyz_uv_velocity_pid);\n            }\n            estimator.processImage(image, img_msg->header);\n\n            double whole_t = t_s.toc();\n            printStatistics(estimator, whole_t);\n            std_msgs::Header header = img_msg->header;\n            header.frame_id = \"world\";\n\n            pubOdometry(estimator, header);\n            pubKeyPoses(estimator, header);\n            pubCameraPose(estimator, header);\n            pubPointCloud(estimator, header);\n            pubTF(estimator, header);\n            pubKeyframe(estimator);\n            if (relo_msg != NULL)\n                
pubRelocalization(estimator);\n            //ROS_ERROR(\"end: %f, at %f\", img_msg->header.stamp.toSec(), ros::Time::now().toSec());\n        }\n        m_estimator.unlock();\n        m_buf.lock();\n        m_state.lock();\n        if (estimator.solver_flag == Estimator::SolverFlag::NON_LINEAR)\n            update();\n        m_state.unlock();\n        m_buf.unlock();\n    }\n}\n\nint main(int argc, char **argv)\n{\n    ros::init(argc, argv, \"rpvio_estimator\");\n    ros::NodeHandle n(\"~\");\n    ros::console::set_logger_level(ROSCONSOLE_DEFAULT_NAME, ros::console::levels::Info);\n    readParameters(n);\n    estimator.setParameter();\n#ifdef EIGEN_DONT_PARALLELIZE\n    ROS_DEBUG(\"EIGEN_DONT_PARALLELIZE\");\n#endif\n    ROS_WARN(\"waiting for image and imu...\");\n\n    registerPub(n);\n\n    ros::Subscriber sub_imu = n.subscribe(IMU_TOPIC, 2000, imu_callback, ros::TransportHints().tcpNoDelay());\n    ros::Subscriber sub_image = n.subscribe(\"/rpvio_feature_tracker/feature\", 2000, feature_callback);\n    ros::Subscriber sub_restart = n.subscribe(\"/rpvio_feature_tracker/restart\", 2000, restart_callback);\n    ros::Subscriber sub_relo_points = n.subscribe(\"/pose_graph/match_points\", 2000, relocalization_callback);\n\n    std::thread measurement_process{process};\n    ros::spin();\n\n    return 0;\n}\n"
  },
  {
    "path": "rpvio_estimator/src/factor/homography_factor.h",
    "content": "#pragma once\n#include <ceres/ceres.h>\n#include <eigen3/Eigen/Dense>\n\nstruct HomographyFactor\n{\n    HomographyFactor(const Eigen::Vector3d &_pts_i, const Eigen::Vector3d &_pts_j) : pts_i(_pts_i), pts_j(_pts_j) {}\n\n    template <typename T>\n        bool operator()(const T* const pose_i, const T* const pose_j, const T* const para_n, const T* const para_depth, const T* const ex_pose, \n        T* residuals) const\n        {\n            Eigen::Map<const Eigen::Matrix<T, 3, 1>> pi(pose_i);\n            Eigen::Quaternion<T> qi;\n            qi.coeffs() << pose_i[3], pose_i[4], pose_i[5], pose_i[6];\n\n            Eigen::Map<const Eigen::Matrix<T, 3, 1>> pj(pose_j);\n            Eigen::Quaternion<T> qj;\n            qj.coeffs() << pose_j[3], pose_j[4], pose_j[5], pose_j[6];\n\n            Eigen::Map<const Eigen::Matrix<T, 3, 1>> tic(ex_pose);\n            Eigen::Quaternion<T> qic;\n            qic.coeffs() << ex_pose[3], ex_pose[4], ex_pose[5], ex_pose[6];\n\n            Eigen::Matrix<T, 3, 1> n;\n            n << para_n[0], para_n[1], para_n[2];\n\n            // transform camera normal to imu normal\n            Eigen::Matrix<T, 3, 1> n_imu_0 = qic*n;// + tic;\n\n            // transform imu 0 normal to imu i normal\n            Eigen::Matrix<T, 3, 1> n_imu_i = qi.inverse()*n_imu_0;// - qi.inverse()*pi;\n\n            Eigen::Map<const Eigen::Matrix<T, 1, 1>> depth(para_depth);\n\n            Eigen::Quaternion<T> qji = qj.inverse() * qi;\n            Eigen::Matrix<T, 3, 1> tji = qj.inverse() * (pi - pj);\n            Eigen::Matrix<T, 1, 1> di, di0;\n\n            // convert camera depth to imu frame\n            di0(0,0) = depth(0,0) + tic.dot(n_imu_0);\n            // convert imu 0 depth to imu i depth\n            di(0,0) = di0(0,0) - pi.dot(n_imu_0);\n\n            Eigen::Matrix<T, 3, 1> pts_imu_i = qic * pts_i.cast<T>() + tic;\n\n            // homography mapping\n            Eigen::Matrix<T, 3, 1> pts_imu_j = qji * pts_imu_i + 
(tji*(1.0/di(0,0)) * n_imu_i.transpose()) * pts_imu_i;\n\n            Eigen::Matrix<T, 3, 1> pts_cam_j = qic.inverse() * (pts_imu_j - tic);\n\n            pts_cam_j = (pts_cam_j / pts_cam_j[2]);\n            residuals[0] = pts_cam_j[0] - pts_j[0];\n            residuals[1] = pts_cam_j[1] - pts_j[1];\n\n            return true;\n        }\n\n    static ceres::CostFunction* Create(const Eigen::Vector3d &_pts_i, const Eigen::Vector3d &_pts_j)\n    {\n        return (new ceres::AutoDiffCostFunction<HomographyFactor, 2, 7, 7, 3, 1, 7>\n                (new HomographyFactor(_pts_i, _pts_j)));\n    }\n\n    Eigen::Vector3d pts_i, pts_j;\n};\n"
  },
  {
    "path": "rpvio_estimator/src/factor/imu_factor.h",
    "content": "#pragma once\n#include <ros/assert.h>\n#include <iostream>\n#include <eigen3/Eigen/Dense>\n\n#include \"../utility/utility.h\"\n#include \"../parameters.h\"\n#include \"integration_base.h\"\n\n#include <ceres/ceres.h>\n\nclass IMUFactor : public ceres::SizedCostFunction<15, 7, 9, 7, 9>\n{\n  public:\n    IMUFactor() = delete;\n    IMUFactor(IntegrationBase* _pre_integration):pre_integration(_pre_integration)\n    {\n    }\n    virtual bool Evaluate(double const *const *parameters, double *residuals, double **jacobians) const\n    {\n\n        Eigen::Vector3d Pi(parameters[0][0], parameters[0][1], parameters[0][2]);\n        Eigen::Quaterniond Qi(parameters[0][6], parameters[0][3], parameters[0][4], parameters[0][5]);\n\n        Eigen::Vector3d Vi(parameters[1][0], parameters[1][1], parameters[1][2]);\n        Eigen::Vector3d Bai(parameters[1][3], parameters[1][4], parameters[1][5]);\n        Eigen::Vector3d Bgi(parameters[1][6], parameters[1][7], parameters[1][8]);\n\n        Eigen::Vector3d Pj(parameters[2][0], parameters[2][1], parameters[2][2]);\n        Eigen::Quaterniond Qj(parameters[2][6], parameters[2][3], parameters[2][4], parameters[2][5]);\n\n        Eigen::Vector3d Vj(parameters[3][0], parameters[3][1], parameters[3][2]);\n        Eigen::Vector3d Baj(parameters[3][3], parameters[3][4], parameters[3][5]);\n        Eigen::Vector3d Bgj(parameters[3][6], parameters[3][7], parameters[3][8]);\n\n//Eigen::Matrix<double, 15, 15> Fd;\n//Eigen::Matrix<double, 15, 12> Gd;\n\n//Eigen::Vector3d pPj = Pi + Vi * sum_t - 0.5 * g * sum_t * sum_t + corrected_delta_p;\n//Eigen::Quaterniond pQj = Qi * delta_q;\n//Eigen::Vector3d pVj = Vi - g * sum_t + corrected_delta_v;\n//Eigen::Vector3d pBaj = Bai;\n//Eigen::Vector3d pBgj = Bgi;\n\n//Vi + Qi * delta_v - g * sum_dt = Vj;\n//Qi * delta_q = Qj;\n\n//delta_p = Qi.inverse() * (0.5 * g * sum_dt * sum_dt + Pj - Pi);\n//delta_v = Qi.inverse() * (g * sum_dt + Vj - Vi);\n//delta_q = Qi.inverse() * Qj;\n\n#if 0\n    
    if ((Bai - pre_integration->linearized_ba).norm() > 0.10 ||\n            (Bgi - pre_integration->linearized_bg).norm() > 0.01)\n        {\n            pre_integration->repropagate(Bai, Bgi);\n        }\n#endif\n\n        Eigen::Map<Eigen::Matrix<double, 15, 1>> residual(residuals);\n        residual = pre_integration->evaluate(Pi, Qi, Vi, Bai, Bgi,\n                                            Pj, Qj, Vj, Baj, Bgj);\n\n        Eigen::Matrix<double, 15, 15> sqrt_info = Eigen::LLT<Eigen::Matrix<double, 15, 15>>(pre_integration->covariance.inverse()).matrixL().transpose();\n        //sqrt_info.setIdentity();\n        residual = sqrt_info * residual;\n\n        if (jacobians)\n        {\n            double sum_dt = pre_integration->sum_dt;\n            Eigen::Matrix3d dp_dba = pre_integration->jacobian.template block<3, 3>(O_P, O_BA);\n            Eigen::Matrix3d dp_dbg = pre_integration->jacobian.template block<3, 3>(O_P, O_BG);\n\n            Eigen::Matrix3d dq_dbg = pre_integration->jacobian.template block<3, 3>(O_R, O_BG);\n\n            Eigen::Matrix3d dv_dba = pre_integration->jacobian.template block<3, 3>(O_V, O_BA);\n            Eigen::Matrix3d dv_dbg = pre_integration->jacobian.template block<3, 3>(O_V, O_BG);\n\n            if (pre_integration->jacobian.maxCoeff() > 1e8 || pre_integration->jacobian.minCoeff() < -1e8)\n            {\n                ROS_WARN(\"numerical unstable in preintegration\");\n                //std::cout << pre_integration->jacobian << std::endl;\n///                ROS_BREAK();\n            }\n\n            if (jacobians[0])\n            {\n                Eigen::Map<Eigen::Matrix<double, 15, 7, Eigen::RowMajor>> jacobian_pose_i(jacobians[0]);\n                jacobian_pose_i.setZero();\n\n                jacobian_pose_i.block<3, 3>(O_P, O_P) = -Qi.inverse().toRotationMatrix();\n                jacobian_pose_i.block<3, 3>(O_P, O_R) = Utility::skewSymmetric(Qi.inverse() * (0.5 * G * sum_dt * sum_dt + Pj - Pi - Vi * sum_dt));\n\n#if 
0\n            jacobian_pose_i.block<3, 3>(O_R, O_R) = -(Qj.inverse() * Qi).toRotationMatrix();\n#else\n                Eigen::Quaterniond corrected_delta_q = pre_integration->delta_q * Utility::deltaQ(dq_dbg * (Bgi - pre_integration->linearized_bg));\n                jacobian_pose_i.block<3, 3>(O_R, O_R) = -(Utility::Qleft(Qj.inverse() * Qi) * Utility::Qright(corrected_delta_q)).bottomRightCorner<3, 3>();\n#endif\n\n                jacobian_pose_i.block<3, 3>(O_V, O_R) = Utility::skewSymmetric(Qi.inverse() * (G * sum_dt + Vj - Vi));\n\n                jacobian_pose_i = sqrt_info * jacobian_pose_i;\n\n                if (jacobian_pose_i.maxCoeff() > 1e8 || jacobian_pose_i.minCoeff() < -1e8)\n                {\n                    ROS_WARN(\"numerical unstable in preintegration\");\n                    //std::cout << sqrt_info << std::endl;\n                    //ROS_BREAK();\n                }\n            }\n            if (jacobians[1])\n            {\n                Eigen::Map<Eigen::Matrix<double, 15, 9, Eigen::RowMajor>> jacobian_speedbias_i(jacobians[1]);\n                jacobian_speedbias_i.setZero();\n                jacobian_speedbias_i.block<3, 3>(O_P, O_V - O_V) = -Qi.inverse().toRotationMatrix() * sum_dt;\n                jacobian_speedbias_i.block<3, 3>(O_P, O_BA - O_V) = -dp_dba;\n                jacobian_speedbias_i.block<3, 3>(O_P, O_BG - O_V) = -dp_dbg;\n\n#if 0\n            jacobian_speedbias_i.block<3, 3>(O_R, O_BG - O_V) = -dq_dbg;\n#else\n                //Eigen::Quaterniond corrected_delta_q = pre_integration->delta_q * Utility::deltaQ(dq_dbg * (Bgi - pre_integration->linearized_bg));\n                //jacobian_speedbias_i.block<3, 3>(O_R, O_BG - O_V) = -Utility::Qleft(Qj.inverse() * Qi * corrected_delta_q).bottomRightCorner<3, 3>() * dq_dbg;\n                jacobian_speedbias_i.block<3, 3>(O_R, O_BG - O_V) = -Utility::Qleft(Qj.inverse() * Qi * pre_integration->delta_q).bottomRightCorner<3, 3>() * dq_dbg;\n#endif\n\n                
jacobian_speedbias_i.block<3, 3>(O_V, O_V - O_V) = -Qi.inverse().toRotationMatrix();\n                jacobian_speedbias_i.block<3, 3>(O_V, O_BA - O_V) = -dv_dba;\n                jacobian_speedbias_i.block<3, 3>(O_V, O_BG - O_V) = -dv_dbg;\n\n                jacobian_speedbias_i.block<3, 3>(O_BA, O_BA - O_V) = -Eigen::Matrix3d::Identity();\n\n                jacobian_speedbias_i.block<3, 3>(O_BG, O_BG - O_V) = -Eigen::Matrix3d::Identity();\n\n                jacobian_speedbias_i = sqrt_info * jacobian_speedbias_i;\n\n                //ROS_ASSERT(fabs(jacobian_speedbias_i.maxCoeff()) < 1e8);\n                //ROS_ASSERT(fabs(jacobian_speedbias_i.minCoeff()) < 1e8);\n            }\n            if (jacobians[2])\n            {\n                Eigen::Map<Eigen::Matrix<double, 15, 7, Eigen::RowMajor>> jacobian_pose_j(jacobians[2]);\n                jacobian_pose_j.setZero();\n\n                jacobian_pose_j.block<3, 3>(O_P, O_P) = Qi.inverse().toRotationMatrix();\n\n#if 0\n            jacobian_pose_j.block<3, 3>(O_R, O_R) = Eigen::Matrix3d::Identity();\n#else\n                Eigen::Quaterniond corrected_delta_q = pre_integration->delta_q * Utility::deltaQ(dq_dbg * (Bgi - pre_integration->linearized_bg));\n                jacobian_pose_j.block<3, 3>(O_R, O_R) = Utility::Qleft(corrected_delta_q.inverse() * Qi.inverse() * Qj).bottomRightCorner<3, 3>();\n#endif\n\n                jacobian_pose_j = sqrt_info * jacobian_pose_j;\n\n                //ROS_ASSERT(fabs(jacobian_pose_j.maxCoeff()) < 1e8);\n                //ROS_ASSERT(fabs(jacobian_pose_j.minCoeff()) < 1e8);\n            }\n            if (jacobians[3])\n            {\n                Eigen::Map<Eigen::Matrix<double, 15, 9, Eigen::RowMajor>> jacobian_speedbias_j(jacobians[3]);\n                jacobian_speedbias_j.setZero();\n\n                jacobian_speedbias_j.block<3, 3>(O_V, O_V - O_V) = Qi.inverse().toRotationMatrix();\n\n                jacobian_speedbias_j.block<3, 3>(O_BA, O_BA - O_V) = 
Eigen::Matrix3d::Identity();\n\n                jacobian_speedbias_j.block<3, 3>(O_BG, O_BG - O_V) = Eigen::Matrix3d::Identity();\n\n                jacobian_speedbias_j = sqrt_info * jacobian_speedbias_j;\n\n                //ROS_ASSERT(fabs(jacobian_speedbias_j.maxCoeff()) < 1e8);\n                //ROS_ASSERT(fabs(jacobian_speedbias_j.minCoeff()) < 1e8);\n            }\n        }\n\n        return true;\n    }\n\n    //bool Evaluate_Direct(double const *const *parameters, Eigen::Matrix<double, 15, 1> &residuals, Eigen::Matrix<double, 15, 30> &jacobians);\n\n    //void checkCorrection();\n    //void checkTransition();\n    //void checkJacobian(double **parameters);\n    IntegrationBase* pre_integration;\n\n};\n\n"
  },
  {
    "path": "rpvio_estimator/src/factor/integration_base.h",
    "content": "#pragma once\n\n#include \"../utility/utility.h\"\n#include \"../parameters.h\"\n\n#include <ceres/ceres.h>\nusing namespace Eigen;\n\nclass IntegrationBase\n{\n  public:\n    IntegrationBase() = delete;\n    IntegrationBase(const Eigen::Vector3d &_acc_0, const Eigen::Vector3d &_gyr_0,\n                    const Eigen::Vector3d &_linearized_ba, const Eigen::Vector3d &_linearized_bg)\n        : acc_0{_acc_0}, gyr_0{_gyr_0}, linearized_acc{_acc_0}, linearized_gyr{_gyr_0},\n          linearized_ba{_linearized_ba}, linearized_bg{_linearized_bg},\n            jacobian{Eigen::Matrix<double, 15, 15>::Identity()}, covariance{Eigen::Matrix<double, 15, 15>::Zero()},\n          sum_dt{0.0}, delta_p{Eigen::Vector3d::Zero()}, delta_q{Eigen::Quaterniond::Identity()}, delta_v{Eigen::Vector3d::Zero()}\n\n    {\n        noise = Eigen::Matrix<double, 18, 18>::Zero();\n        noise.block<3, 3>(0, 0) =  (ACC_N * ACC_N) * Eigen::Matrix3d::Identity();\n        noise.block<3, 3>(3, 3) =  (GYR_N * GYR_N) * Eigen::Matrix3d::Identity();\n        noise.block<3, 3>(6, 6) =  (ACC_N * ACC_N) * Eigen::Matrix3d::Identity();\n        noise.block<3, 3>(9, 9) =  (GYR_N * GYR_N) * Eigen::Matrix3d::Identity();\n        noise.block<3, 3>(12, 12) =  (ACC_W * ACC_W) * Eigen::Matrix3d::Identity();\n        noise.block<3, 3>(15, 15) =  (GYR_W * GYR_W) * Eigen::Matrix3d::Identity();\n    }\n\n    void push_back(double dt, const Eigen::Vector3d &acc, const Eigen::Vector3d &gyr)\n    {\n        dt_buf.push_back(dt);\n        acc_buf.push_back(acc);\n        gyr_buf.push_back(gyr);\n        propagate(dt, acc, gyr);\n    }\n\n    void repropagate(const Eigen::Vector3d &_linearized_ba, const Eigen::Vector3d &_linearized_bg)\n    {\n        sum_dt = 0.0;\n        acc_0 = linearized_acc;\n        gyr_0 = linearized_gyr;\n        delta_p.setZero();\n        delta_q.setIdentity();\n        delta_v.setZero();\n        linearized_ba = _linearized_ba;\n        linearized_bg = _linearized_bg;\n        
jacobian.setIdentity();\n        covariance.setZero();\n        for (int i = 0; i < static_cast<int>(dt_buf.size()); i++)\n            propagate(dt_buf[i], acc_buf[i], gyr_buf[i]);\n    }\n\n    void midPointIntegration(double _dt, \n                            const Eigen::Vector3d &_acc_0, const Eigen::Vector3d &_gyr_0,\n                            const Eigen::Vector3d &_acc_1, const Eigen::Vector3d &_gyr_1,\n                            const Eigen::Vector3d &delta_p, const Eigen::Quaterniond &delta_q, const Eigen::Vector3d &delta_v,\n                            const Eigen::Vector3d &linearized_ba, const Eigen::Vector3d &linearized_bg,\n                            Eigen::Vector3d &result_delta_p, Eigen::Quaterniond &result_delta_q, Eigen::Vector3d &result_delta_v,\n                            Eigen::Vector3d &result_linearized_ba, Eigen::Vector3d &result_linearized_bg, bool update_jacobian)\n    {\n        //ROS_INFO(\"midpoint integration\");\n        Vector3d un_acc_0 = delta_q * (_acc_0 - linearized_ba);\n        Vector3d un_gyr = 0.5 * (_gyr_0 + _gyr_1) - linearized_bg;\n        result_delta_q = delta_q * Quaterniond(1, un_gyr(0) * _dt / 2, un_gyr(1) * _dt / 2, un_gyr(2) * _dt / 2);\n        Vector3d un_acc_1 = result_delta_q * (_acc_1 - linearized_ba);\n        Vector3d un_acc = 0.5 * (un_acc_0 + un_acc_1);\n        result_delta_p = delta_p + delta_v * _dt + 0.5 * un_acc * _dt * _dt;\n        result_delta_v = delta_v + un_acc * _dt;\n        result_linearized_ba = linearized_ba;\n        result_linearized_bg = linearized_bg;         \n\n        if(update_jacobian)\n        {\n            Vector3d w_x = 0.5 * (_gyr_0 + _gyr_1) - linearized_bg;\n            Vector3d a_0_x = _acc_0 - linearized_ba;\n            Vector3d a_1_x = _acc_1 - linearized_ba;\n            Matrix3d R_w_x, R_a_0_x, R_a_1_x;\n\n            R_w_x<<0, -w_x(2), w_x(1),\n                w_x(2), 0, -w_x(0),\n                -w_x(1), w_x(0), 0;\n            R_a_0_x<<0, -a_0_x(2), a_0_x(1),\n  
              a_0_x(2), 0, -a_0_x(0),\n                -a_0_x(1), a_0_x(0), 0;\n            R_a_1_x<<0, -a_1_x(2), a_1_x(1),\n                a_1_x(2), 0, -a_1_x(0),\n                -a_1_x(1), a_1_x(0), 0;\n\n            MatrixXd F = MatrixXd::Zero(15, 15);\n            F.block<3, 3>(0, 0) = Matrix3d::Identity();\n            F.block<3, 3>(0, 3) = -0.25 * delta_q.toRotationMatrix() * R_a_0_x * _dt * _dt + \n                                  -0.25 * result_delta_q.toRotationMatrix() * R_a_1_x * (Matrix3d::Identity() - R_w_x * _dt) * _dt * _dt;\n            F.block<3, 3>(0, 6) = MatrixXd::Identity(3,3) * _dt;\n            F.block<3, 3>(0, 9) = -0.25 * (delta_q.toRotationMatrix() + result_delta_q.toRotationMatrix()) * _dt * _dt;\n            F.block<3, 3>(0, 12) = -0.25 * result_delta_q.toRotationMatrix() * R_a_1_x * _dt * _dt * -_dt;\n            F.block<3, 3>(3, 3) = Matrix3d::Identity() - R_w_x * _dt;\n            F.block<3, 3>(3, 12) = -1.0 * MatrixXd::Identity(3,3) * _dt;\n            F.block<3, 3>(6, 3) = -0.5 * delta_q.toRotationMatrix() * R_a_0_x * _dt + \n                                  -0.5 * result_delta_q.toRotationMatrix() * R_a_1_x * (Matrix3d::Identity() - R_w_x * _dt) * _dt;\n            F.block<3, 3>(6, 6) = Matrix3d::Identity();\n            F.block<3, 3>(6, 9) = -0.5 * (delta_q.toRotationMatrix() + result_delta_q.toRotationMatrix()) * _dt;\n            F.block<3, 3>(6, 12) = -0.5 * result_delta_q.toRotationMatrix() * R_a_1_x * _dt * -_dt;\n            F.block<3, 3>(9, 9) = Matrix3d::Identity();\n            F.block<3, 3>(12, 12) = Matrix3d::Identity();\n            //cout<<\"A\"<<endl<<A<<endl;\n\n            MatrixXd V = MatrixXd::Zero(15,18);\n            V.block<3, 3>(0, 0) =  0.25 * delta_q.toRotationMatrix() * _dt * _dt;\n            V.block<3, 3>(0, 3) =  0.25 * -result_delta_q.toRotationMatrix() * R_a_1_x  * _dt * _dt * 0.5 * _dt;\n            V.block<3, 3>(0, 6) =  0.25 * result_delta_q.toRotationMatrix() * _dt * _dt;\n            
V.block<3, 3>(0, 9) =  V.block<3, 3>(0, 3);\n            V.block<3, 3>(3, 3) =  0.5 * MatrixXd::Identity(3,3) * _dt;\n            V.block<3, 3>(3, 9) =  0.5 * MatrixXd::Identity(3,3) * _dt;\n            V.block<3, 3>(6, 0) =  0.5 * delta_q.toRotationMatrix() * _dt;\n            V.block<3, 3>(6, 3) =  0.5 * -result_delta_q.toRotationMatrix() * R_a_1_x  * _dt * 0.5 * _dt;\n            V.block<3, 3>(6, 6) =  0.5 * result_delta_q.toRotationMatrix() * _dt;\n            V.block<3, 3>(6, 9) =  V.block<3, 3>(6, 3);\n            V.block<3, 3>(9, 12) = MatrixXd::Identity(3,3) * _dt;\n            V.block<3, 3>(12, 15) = MatrixXd::Identity(3,3) * _dt;\n\n            //step_jacobian = F;\n            //step_V = V;\n            jacobian = F * jacobian;\n            covariance = F * covariance * F.transpose() + V * noise * V.transpose();\n        }\n\n    }\n\n    void propagate(double _dt, const Eigen::Vector3d &_acc_1, const Eigen::Vector3d &_gyr_1)\n    {\n        dt = _dt;\n        acc_1 = _acc_1;\n        gyr_1 = _gyr_1;\n        Vector3d result_delta_p;\n        Quaterniond result_delta_q;\n        Vector3d result_delta_v;\n        Vector3d result_linearized_ba;\n        Vector3d result_linearized_bg;\n\n        midPointIntegration(_dt, acc_0, gyr_0, _acc_1, _gyr_1, delta_p, delta_q, delta_v,\n                            linearized_ba, linearized_bg,\n                            result_delta_p, result_delta_q, result_delta_v,\n                            result_linearized_ba, result_linearized_bg, 1);\n\n        //checkJacobian(_dt, acc_0, gyr_0, acc_1, gyr_1, delta_p, delta_q, delta_v,\n        //                    linearized_ba, linearized_bg);\n        delta_p = result_delta_p;\n        delta_q = result_delta_q;\n        delta_v = result_delta_v;\n        linearized_ba = result_linearized_ba;\n        linearized_bg = result_linearized_bg;\n        delta_q.normalize();\n        sum_dt += dt;\n        acc_0 = acc_1;\n        gyr_0 = gyr_1;  \n     \n    }\n\n    
Eigen::Matrix<double, 15, 1> evaluate(const Eigen::Vector3d &Pi, const Eigen::Quaterniond &Qi, const Eigen::Vector3d &Vi, const Eigen::Vector3d &Bai, const Eigen::Vector3d &Bgi,\n                                          const Eigen::Vector3d &Pj, const Eigen::Quaterniond &Qj, const Eigen::Vector3d &Vj, const Eigen::Vector3d &Baj, const Eigen::Vector3d &Bgj)\n    {\n        Eigen::Matrix<double, 15, 1> residuals;\n\n        Eigen::Matrix3d dp_dba = jacobian.block<3, 3>(O_P, O_BA);\n        Eigen::Matrix3d dp_dbg = jacobian.block<3, 3>(O_P, O_BG);\n\n        Eigen::Matrix3d dq_dbg = jacobian.block<3, 3>(O_R, O_BG);\n\n        Eigen::Matrix3d dv_dba = jacobian.block<3, 3>(O_V, O_BA);\n        Eigen::Matrix3d dv_dbg = jacobian.block<3, 3>(O_V, O_BG);\n\n        Eigen::Vector3d dba = Bai - linearized_ba;\n        Eigen::Vector3d dbg = Bgi - linearized_bg;\n\n        Eigen::Quaterniond corrected_delta_q = delta_q * Utility::deltaQ(dq_dbg * dbg);\n        Eigen::Vector3d corrected_delta_v = delta_v + dv_dba * dba + dv_dbg * dbg;\n        Eigen::Vector3d corrected_delta_p = delta_p + dp_dba * dba + dp_dbg * dbg;\n\n        residuals.block<3, 1>(O_P, 0) = Qi.inverse() * (0.5 * G * sum_dt * sum_dt + Pj - Pi - Vi * sum_dt) - corrected_delta_p;\n        residuals.block<3, 1>(O_R, 0) = 2 * (corrected_delta_q.inverse() * (Qi.inverse() * Qj)).vec();\n        residuals.block<3, 1>(O_V, 0) = Qi.inverse() * (G * sum_dt + Vj - Vi) - corrected_delta_v;\n        residuals.block<3, 1>(O_BA, 0) = Baj - Bai;\n        residuals.block<3, 1>(O_BG, 0) = Bgj - Bgi;\n        return residuals;\n    }\n\n    double dt;\n    Eigen::Vector3d acc_0, gyr_0;\n    Eigen::Vector3d acc_1, gyr_1;\n\n    const Eigen::Vector3d linearized_acc, linearized_gyr;\n    Eigen::Vector3d linearized_ba, linearized_bg;\n\n    Eigen::Matrix<double, 15, 15> jacobian, covariance;\n    Eigen::Matrix<double, 15, 15> step_jacobian;\n    Eigen::Matrix<double, 15, 18> step_V;\n    Eigen::Matrix<double, 18, 18> noise;\n\n    
double sum_dt;\n    Eigen::Vector3d delta_p;\n    Eigen::Quaterniond delta_q;\n    Eigen::Vector3d delta_v;\n\n    std::vector<double> dt_buf;\n    std::vector<Eigen::Vector3d> acc_buf;\n    std::vector<Eigen::Vector3d> gyr_buf;\n\n};\n/*\n\n    void eulerIntegration(double _dt, const Eigen::Vector3d &_acc_0, const Eigen::Vector3d &_gyr_0,\n                            const Eigen::Vector3d &_acc_1, const Eigen::Vector3d &_gyr_1,\n                            const Eigen::Vector3d &delta_p, const Eigen::Quaterniond &delta_q, const Eigen::Vector3d &delta_v,\n                            const Eigen::Vector3d &linearized_ba, const Eigen::Vector3d &linearized_bg,\n                            Eigen::Vector3d &result_delta_p, Eigen::Quaterniond &result_delta_q, Eigen::Vector3d &result_delta_v,\n                            Eigen::Vector3d &result_linearized_ba, Eigen::Vector3d &result_linearized_bg, bool update_jacobian)\n    {\n        result_delta_p = delta_p + delta_v * _dt + 0.5 * (delta_q * (_acc_1 - linearized_ba)) * _dt * _dt;\n        result_delta_v = delta_v + delta_q * (_acc_1 - linearized_ba) * _dt;\n        Vector3d omg = _gyr_1 - linearized_bg;\n        omg = omg * _dt / 2;\n        Quaterniond dR(1, omg(0), omg(1), omg(2));\n        result_delta_q = (delta_q * dR);   \n        result_linearized_ba = linearized_ba;\n        result_linearized_bg = linearized_bg;         \n\n        if(update_jacobian)\n        {\n            Vector3d w_x = _gyr_1 - linearized_bg;\n            Vector3d a_x = _acc_1 - linearized_ba;\n            Matrix3d R_w_x, R_a_x;\n\n            R_w_x<<0, -w_x(2), w_x(1),\n                w_x(2), 0, -w_x(0),\n                -w_x(1), w_x(0), 0;\n            R_a_x<<0, -a_x(2), a_x(1),\n                a_x(2), 0, -a_x(0),\n                -a_x(1), a_x(0), 0;\n\n            MatrixXd A = MatrixXd::Zero(15, 15);\n            // one step euler 0.5\n            A.block<3, 3>(0, 3) = 0.5 * (-1 * delta_q.toRotationMatrix()) * R_a_x * _dt;\n            
A.block<3, 3>(0, 6) = MatrixXd::Identity(3,3);\n            A.block<3, 3>(0, 9) = 0.5 * (-1 * delta_q.toRotationMatrix()) * _dt;\n            A.block<3, 3>(3, 3) = -R_w_x;\n            A.block<3, 3>(3, 12) = -1 * MatrixXd::Identity(3,3);\n            A.block<3, 3>(6, 3) = (-1 * delta_q.toRotationMatrix()) * R_a_x;\n            A.block<3, 3>(6, 9) = (-1 * delta_q.toRotationMatrix());\n            //cout<<\"A\"<<endl<<A<<endl;\n\n            MatrixXd U = MatrixXd::Zero(15,12);\n            U.block<3, 3>(0, 0) =  0.5 * delta_q.toRotationMatrix() * _dt;\n            U.block<3, 3>(3, 3) =  MatrixXd::Identity(3,3);\n            U.block<3, 3>(6, 0) =  delta_q.toRotationMatrix();\n            U.block<3, 3>(9, 6) = MatrixXd::Identity(3,3);\n            U.block<3, 3>(12, 9) = MatrixXd::Identity(3,3);\n\n            // put outside\n            Eigen::Matrix<double, 12, 12> noise = Eigen::Matrix<double, 12, 12>::Zero();\n            noise.block<3, 3>(0, 0) =  (ACC_N * ACC_N) * Eigen::Matrix3d::Identity();\n            noise.block<3, 3>(3, 3) =  (GYR_N * GYR_N) * Eigen::Matrix3d::Identity();\n            noise.block<3, 3>(6, 6) =  (ACC_W * ACC_W) * Eigen::Matrix3d::Identity();\n            noise.block<3, 3>(9, 9) =  (GYR_W * GYR_W) * Eigen::Matrix3d::Identity();\n\n            //write F directly\n            MatrixXd F, V;\n            F = (MatrixXd::Identity(15,15) + _dt * A);\n            V = _dt * U;\n            step_jacobian = F;\n            step_V = V;\n            jacobian = F * jacobian;\n            covariance = F * covariance * F.transpose() + V * noise * V.transpose();\n        }\n\n    }     \n\n\n    void checkJacobian(double _dt, const Eigen::Vector3d &_acc_0, const Eigen::Vector3d &_gyr_0, \n                                   const Eigen::Vector3d &_acc_1, const Eigen::Vector3d &_gyr_1,\n                            const Eigen::Vector3d &delta_p, const Eigen::Quaterniond &delta_q, const Eigen::Vector3d &delta_v,\n                            const Eigen::Vector3d 
&linearized_ba, const Eigen::Vector3d &linearized_bg)\n    {\n        Vector3d result_delta_p;\n        Quaterniond result_delta_q;\n        Vector3d result_delta_v;\n        Vector3d result_linearized_ba;\n        Vector3d result_linearized_bg;\n        midPointIntegration(_dt, _acc_0, _gyr_0, _acc_1, _gyr_1, delta_p, delta_q, delta_v,\n                            linearized_ba, linearized_bg,\n                            result_delta_p, result_delta_q, result_delta_v,\n                            result_linearized_ba, result_linearized_bg, 0);\n\n        Vector3d turb_delta_p;\n        Quaterniond turb_delta_q;\n        Vector3d turb_delta_v;\n        Vector3d turb_linearized_ba;\n        Vector3d turb_linearized_bg;\n\n        Vector3d turb(0.0001, -0.003, 0.003);\n\n        midPointIntegration(_dt, _acc_0, _gyr_0, _acc_1, _gyr_1, delta_p + turb, delta_q, delta_v,\n                            linearized_ba, linearized_bg,\n                            turb_delta_p, turb_delta_q, turb_delta_v,\n                            turb_linearized_ba, turb_linearized_bg, 0);\n        cout << \"turb p       \" << endl;\n        cout << \"p diff       \" << (turb_delta_p - result_delta_p).transpose() << endl;\n        cout << \"p jacob diff \" << (step_jacobian.block<3, 3>(0, 0) * turb).transpose() << endl;\n        cout << \"q diff       \" << ((result_delta_q.inverse() * turb_delta_q).vec() * 2).transpose() << endl;\n        cout << \"q jacob diff \" << (step_jacobian.block<3, 3>(3, 0) * turb).transpose() << endl;\n        cout << \"v diff       \" << (turb_delta_v - result_delta_v).transpose() << endl;\n        cout << \"v jacob diff \" << (step_jacobian.block<3, 3>(6, 0) * turb).transpose() << endl;\n        cout << \"ba diff      \" << (turb_linearized_ba - result_linearized_ba).transpose() << endl;\n        cout << \"ba jacob diff\" << (step_jacobian.block<3, 3>(9, 0) * turb).transpose() << endl;\n        cout << \"bg diff \" << (turb_linearized_bg - 
result_linearized_bg).transpose() << endl;\n        cout << \"bg jacob diff \" << (step_jacobian.block<3, 3>(12, 0) * turb).transpose() << endl;\n\n        midPointIntegration(_dt, _acc_0, _gyr_0, _acc_1, _gyr_1, delta_p, delta_q * Quaterniond(1, turb(0) / 2, turb(1) / 2, turb(2) / 2), delta_v,\n                            linearized_ba, linearized_bg,\n                            turb_delta_p, turb_delta_q, turb_delta_v,\n                            turb_linearized_ba, turb_linearized_bg, 0);\n        cout << \"turb q       \" << endl;\n        cout << \"p diff       \" << (turb_delta_p - result_delta_p).transpose() << endl;\n        cout << \"p jacob diff \" << (step_jacobian.block<3, 3>(0, 3) * turb).transpose() << endl;\n        cout << \"q diff       \" << ((result_delta_q.inverse() * turb_delta_q).vec() * 2).transpose() << endl;\n        cout << \"q jacob diff \" << (step_jacobian.block<3, 3>(3, 3) * turb).transpose() << endl;\n        cout << \"v diff       \" << (turb_delta_v - result_delta_v).transpose() << endl;\n        cout << \"v jacob diff \" << (step_jacobian.block<3, 3>(6, 3) * turb).transpose() << endl;\n        cout << \"ba diff      \" << (turb_linearized_ba - result_linearized_ba).transpose() << endl;\n        cout << \"ba jacob diff\" << (step_jacobian.block<3, 3>(9, 3) * turb).transpose() << endl;\n        cout << \"bg diff      \" << (turb_linearized_bg - result_linearized_bg).transpose() << endl;\n        cout << \"bg jacob diff\" << (step_jacobian.block<3, 3>(12, 3) * turb).transpose() << endl;\n\n        midPointIntegration(_dt, _acc_0, _gyr_0, _acc_1, _gyr_1, delta_p, delta_q, delta_v + turb,\n                            linearized_ba, linearized_bg,\n                            turb_delta_p, turb_delta_q, turb_delta_v,\n                            turb_linearized_ba, turb_linearized_bg, 0);\n        cout << \"turb v       \" << endl;\n        cout << \"p diff       \" << (turb_delta_p - result_delta_p).transpose() << endl;\n        cout 
<< \"p jacob diff \" << (step_jacobian.block<3, 3>(0, 6) * turb).transpose() << endl;\n        cout << \"q diff       \" << ((result_delta_q.inverse() * turb_delta_q).vec() * 2).transpose() << endl;\n        cout << \"q jacob diff \" << (step_jacobian.block<3, 3>(3, 6) * turb).transpose() << endl;\n        cout << \"v diff       \" << (turb_delta_v - result_delta_v).transpose() << endl;\n        cout << \"v jacob diff \" << (step_jacobian.block<3, 3>(6, 6) * turb).transpose() << endl;\n        cout << \"ba diff      \" << (turb_linearized_ba - result_linearized_ba).transpose() << endl;\n        cout << \"ba jacob diff\" << (step_jacobian.block<3, 3>(9, 6) * turb).transpose() << endl;\n        cout << \"bg diff      \" << (turb_linearized_bg - result_linearized_bg).transpose() << endl;\n        cout << \"bg jacob diff\" << (step_jacobian.block<3, 3>(12, 6) * turb).transpose() << endl;\n\n        midPointIntegration(_dt, _acc_0, _gyr_0, _acc_1, _gyr_1, delta_p, delta_q, delta_v,\n                            linearized_ba + turb, linearized_bg,\n                            turb_delta_p, turb_delta_q, turb_delta_v,\n                            turb_linearized_ba, turb_linearized_bg, 0);\n        cout << \"turb ba       \" << endl;\n        cout << \"p diff       \" << (turb_delta_p - result_delta_p).transpose() << endl;\n        cout << \"p jacob diff \" << (step_jacobian.block<3, 3>(0, 9) * turb).transpose() << endl;\n        cout << \"q diff       \" << ((result_delta_q.inverse() * turb_delta_q).vec() * 2).transpose() << endl;\n        cout << \"q jacob diff \" << (step_jacobian.block<3, 3>(3, 9) * turb).transpose() << endl;\n        cout << \"v diff       \" << (turb_delta_v - result_delta_v).transpose() << endl;\n        cout << \"v jacob diff \" << (step_jacobian.block<3, 3>(6, 9) * turb).transpose() << endl;\n        cout << \"ba diff      \" << (turb_linearized_ba - result_linearized_ba).transpose() << endl;\n        cout << \"ba jacob diff\" << 
(step_jacobian.block<3, 3>(9, 9) * turb).transpose() << endl;\n        cout << \"bg diff      \" << (turb_linearized_bg - result_linearized_bg).transpose() << endl;\n        cout << \"bg jacob diff\" << (step_jacobian.block<3, 3>(12, 9) * turb).transpose() << endl;\n\n        midPointIntegration(_dt, _acc_0, _gyr_0, _acc_1, _gyr_1, delta_p, delta_q, delta_v,\n                            linearized_ba, linearized_bg + turb,\n                            turb_delta_p, turb_delta_q, turb_delta_v,\n                            turb_linearized_ba, turb_linearized_bg, 0);\n        cout << \"turb bg       \" << endl;\n        cout << \"p diff       \" << (turb_delta_p - result_delta_p).transpose() << endl;\n        cout << \"p jacob diff \" << (step_jacobian.block<3, 3>(0, 12) * turb).transpose() << endl;\n        cout << \"q diff       \" << ((result_delta_q.inverse() * turb_delta_q).vec() * 2).transpose() << endl;\n        cout << \"q jacob diff \" << (step_jacobian.block<3, 3>(3, 12) * turb).transpose() << endl;\n        cout << \"v diff       \" << (turb_delta_v - result_delta_v).transpose() << endl;\n        cout << \"v jacob diff \" << (step_jacobian.block<3, 3>(6, 12) * turb).transpose() << endl;\n        cout << \"ba diff      \" << (turb_linearized_ba - result_linearized_ba).transpose() << endl;\n        cout << \"ba jacob diff\" << (step_jacobian.block<3, 3>(9, 12) * turb).transpose() << endl;\n        cout << \"bg diff      \" << (turb_linearized_bg - result_linearized_bg).transpose() << endl;\n        cout << \"bg jacob diff\" << (step_jacobian.block<3, 3>(12, 12) * turb).transpose() << endl;\n\n        midPointIntegration(_dt, _acc_0 + turb, _gyr_0, _acc_1 , _gyr_1, delta_p, delta_q, delta_v,\n                            linearized_ba, linearized_bg,\n                            turb_delta_p, turb_delta_q, turb_delta_v,\n                            turb_linearized_ba, turb_linearized_bg, 0);\n        cout << \"turb acc_0       \" << endl;\n        cout << \"p 
diff       \" << (turb_delta_p - result_delta_p).transpose() << endl;\n        cout << \"p jacob diff \" << (step_V.block<3, 3>(0, 0) * turb).transpose() << endl;\n        cout << \"q diff       \" << ((result_delta_q.inverse() * turb_delta_q).vec() * 2).transpose() << endl;\n        cout << \"q jacob diff \" << (step_V.block<3, 3>(3, 0) * turb).transpose() << endl;\n        cout << \"v diff       \" << (turb_delta_v - result_delta_v).transpose() << endl;\n        cout << \"v jacob diff \" << (step_V.block<3, 3>(6, 0) * turb).transpose() << endl;\n        cout << \"ba diff      \" << (turb_linearized_ba - result_linearized_ba).transpose() << endl;\n        cout << \"ba jacob diff\" << (step_V.block<3, 3>(9, 0) * turb).transpose() << endl;\n        cout << \"bg diff      \" << (turb_linearized_bg - result_linearized_bg).transpose() << endl;\n        cout << \"bg jacob diff\" << (step_V.block<3, 3>(12, 0) * turb).transpose() << endl;\n\n        midPointIntegration(_dt, _acc_0, _gyr_0 + turb, _acc_1 , _gyr_1, delta_p, delta_q, delta_v,\n                            linearized_ba, linearized_bg,\n                            turb_delta_p, turb_delta_q, turb_delta_v,\n                            turb_linearized_ba, turb_linearized_bg, 0);\n        cout << \"turb _gyr_0       \" << endl;\n        cout << \"p diff       \" << (turb_delta_p - result_delta_p).transpose() << endl;\n        cout << \"p jacob diff \" << (step_V.block<3, 3>(0, 3) * turb).transpose() << endl;\n        cout << \"q diff       \" << ((result_delta_q.inverse() * turb_delta_q).vec() * 2).transpose() << endl;\n        cout << \"q jacob diff \" << (step_V.block<3, 3>(3, 3) * turb).transpose() << endl;\n        cout << \"v diff       \" << (turb_delta_v - result_delta_v).transpose() << endl;\n        cout << \"v jacob diff \" << (step_V.block<3, 3>(6, 3) * turb).transpose() << endl;\n        cout << \"ba diff      \" << (turb_linearized_ba - result_linearized_ba).transpose() << endl;\n        cout << \"ba 
jacob diff\" << (step_V.block<3, 3>(9, 3) * turb).transpose() << endl;\n        cout << \"bg diff      \" << (turb_linearized_bg - result_linearized_bg).transpose() << endl;\n        cout << \"bg jacob diff\" << (step_V.block<3, 3>(12, 3) * turb).transpose() << endl;\n\n        midPointIntegration(_dt, _acc_0, _gyr_0, _acc_1 + turb, _gyr_1, delta_p, delta_q, delta_v,\n                            linearized_ba, linearized_bg,\n                            turb_delta_p, turb_delta_q, turb_delta_v,\n                            turb_linearized_ba, turb_linearized_bg, 0);\n        cout << \"turb acc_1       \" << endl;\n        cout << \"p diff       \" << (turb_delta_p - result_delta_p).transpose() << endl;\n        cout << \"p jacob diff \" << (step_V.block<3, 3>(0, 6) * turb).transpose() << endl;\n        cout << \"q diff       \" << ((result_delta_q.inverse() * turb_delta_q).vec() * 2).transpose() << endl;\n        cout << \"q jacob diff \" << (step_V.block<3, 3>(3, 6) * turb).transpose() << endl;\n        cout << \"v diff       \" << (turb_delta_v - result_delta_v).transpose() << endl;\n        cout << \"v jacob diff \" << (step_V.block<3, 3>(6, 6) * turb).transpose() << endl;\n        cout << \"ba diff      \" << (turb_linearized_ba - result_linearized_ba).transpose() << endl;\n        cout << \"ba jacob diff\" << (step_V.block<3, 3>(9, 6) * turb).transpose() << endl;\n        cout << \"bg diff      \" << (turb_linearized_bg - result_linearized_bg).transpose() << endl;\n        cout << \"bg jacob diff\" << (step_V.block<3, 3>(12, 6) * turb).transpose() << endl;\n\n        midPointIntegration(_dt, _acc_0, _gyr_0, _acc_1 , _gyr_1 + turb, delta_p, delta_q, delta_v,\n                            linearized_ba, linearized_bg,\n                            turb_delta_p, turb_delta_q, turb_delta_v,\n                            turb_linearized_ba, turb_linearized_bg, 0);\n        cout << \"turb _gyr_1       \" << endl;\n        cout << \"p diff       \" << (turb_delta_p - 
result_delta_p).transpose() << endl;\n        cout << \"p jacob diff \" << (step_V.block<3, 3>(0, 9) * turb).transpose() << endl;\n        cout << \"q diff       \" << ((result_delta_q.inverse() * turb_delta_q).vec() * 2).transpose() << endl;\n        cout << \"q jacob diff \" << (step_V.block<3, 3>(3, 9) * turb).transpose() << endl;\n        cout << \"v diff       \" << (turb_delta_v - result_delta_v).transpose() << endl;\n        cout << \"v jacob diff \" << (step_V.block<3, 3>(6, 9) * turb).transpose() << endl;\n        cout << \"ba diff      \" << (turb_linearized_ba - result_linearized_ba).transpose() << endl;\n        cout << \"ba jacob diff\" << (step_V.block<3, 3>(9, 9) * turb).transpose() << endl;\n        cout << \"bg diff      \" << (turb_linearized_bg - result_linearized_bg).transpose() << endl;\n        cout << \"bg jacob diff\" << (step_V.block<3, 3>(12, 9) * turb).transpose() << endl;\n    }\n    */"
  },
  {
    "path": "rpvio_estimator/src/factor/marginalization_factor.cpp",
    "content": "#include \"marginalization_factor.h\"\n\nvoid ResidualBlockInfo::Evaluate()\n{\n    residuals.resize(cost_function->num_residuals());\n\n    std::vector<int> block_sizes = cost_function->parameter_block_sizes();\n    raw_jacobians = new double *[block_sizes.size()];\n    jacobians.resize(block_sizes.size());\n\n    for (int i = 0; i < static_cast<int>(block_sizes.size()); i++)\n    {\n        jacobians[i].resize(cost_function->num_residuals(), block_sizes[i]);\n        raw_jacobians[i] = jacobians[i].data();\n        //dim += block_sizes[i] == 7 ? 6 : block_sizes[i];\n    }\n    cost_function->Evaluate(parameter_blocks.data(), residuals.data(), raw_jacobians);\n\n    //std::vector<int> tmp_idx(block_sizes.size());\n    //Eigen::MatrixXd tmp(dim, dim);\n    //for (int i = 0; i < static_cast<int>(parameter_blocks.size()); i++)\n    //{\n    //    int size_i = localSize(block_sizes[i]);\n    //    Eigen::MatrixXd jacobian_i = jacobians[i].leftCols(size_i);\n    //    for (int j = 0, sub_idx = 0; j < static_cast<int>(parameter_blocks.size()); sub_idx += block_sizes[j] == 7 ? 
6 : block_sizes[j], j++)\n    //    {\n    //        int size_j = localSize(block_sizes[j]);\n    //        Eigen::MatrixXd jacobian_j = jacobians[j].leftCols(size_j);\n    //        tmp_idx[j] = sub_idx;\n    //        tmp.block(tmp_idx[i], tmp_idx[j], size_i, size_j) = jacobian_i.transpose() * jacobian_j;\n    //    }\n    //}\n    //Eigen::SelfAdjointEigenSolver<Eigen::MatrixXd> saes(tmp);\n    //std::cout << saes.eigenvalues() << std::endl;\n    //ROS_ASSERT(saes.eigenvalues().minCoeff() >= -1e-6);\n\n    if (loss_function)\n    {\n        double residual_scaling_, alpha_sq_norm_;\n\n        double sq_norm, rho[3];\n\n        sq_norm = residuals.squaredNorm();\n        loss_function->Evaluate(sq_norm, rho);\n        //printf(\"sq_norm: %f, rho[0]: %f, rho[1]: %f, rho[2]: %f\\n\", sq_norm, rho[0], rho[1], rho[2]);\n\n        double sqrt_rho1_ = sqrt(rho[1]);\n\n        if ((sq_norm == 0.0) || (rho[2] <= 0.0))\n        {\n            residual_scaling_ = sqrt_rho1_;\n            alpha_sq_norm_ = 0.0;\n        }\n        else\n        {\n            const double D = 1.0 + 2.0 * sq_norm * rho[2] / rho[1];\n            const double alpha = 1.0 - sqrt(D);\n            residual_scaling_ = sqrt_rho1_ / (1 - alpha);\n            alpha_sq_norm_ = alpha / sq_norm;\n        }\n\n        for (int i = 0; i < static_cast<int>(parameter_blocks.size()); i++)\n        {\n            jacobians[i] = sqrt_rho1_ * (jacobians[i] - alpha_sq_norm_ * residuals * (residuals.transpose() * jacobians[i]));\n        }\n\n        residuals *= residual_scaling_;\n    }\n}\n\nMarginalizationInfo::~MarginalizationInfo()\n{\n    //ROS_WARN(\"release marginlizationinfo\");\n    \n    for (auto it = parameter_block_data.begin(); it != parameter_block_data.end(); ++it)\n        delete[] it->second;\n\n    for (int i = 0; i < (int)factors.size(); i++)\n    {\n\n        delete[] factors[i]->raw_jacobians;\n        \n        delete factors[i]->cost_function;\n\n        delete factors[i];\n    
}\n}\n\nvoid MarginalizationInfo::addResidualBlockInfo(ResidualBlockInfo *residual_block_info)\n{\n    factors.emplace_back(residual_block_info);\n\n    std::vector<double *> &parameter_blocks = residual_block_info->parameter_blocks;\n    std::vector<int> parameter_block_sizes = residual_block_info->cost_function->parameter_block_sizes();\n\n    for (int i = 0; i < static_cast<int>(residual_block_info->parameter_blocks.size()); i++)\n    {\n        double *addr = parameter_blocks[i];\n        int size = parameter_block_sizes[i];\n        parameter_block_size[reinterpret_cast<long>(addr)] = size;\n    }\n\n    for (int i = 0; i < static_cast<int>(residual_block_info->drop_set.size()); i++)\n    {\n        double *addr = parameter_blocks[residual_block_info->drop_set[i]];\n        parameter_block_idx[reinterpret_cast<long>(addr)] = 0;\n    }\n}\n\nvoid MarginalizationInfo::preMarginalize()\n{\n    for (auto it : factors)\n    {\n        it->Evaluate();\n\n        std::vector<int> block_sizes = it->cost_function->parameter_block_sizes();\n        for (int i = 0; i < static_cast<int>(block_sizes.size()); i++)\n        {\n            long addr = reinterpret_cast<long>(it->parameter_blocks[i]);\n            int size = block_sizes[i];\n            if (parameter_block_data.find(addr) == parameter_block_data.end())\n            {\n                double *data = new double[size];\n                memcpy(data, it->parameter_blocks[i], sizeof(double) * size);\n                parameter_block_data[addr] = data;\n            }\n        }\n    }\n}\n\nint MarginalizationInfo::localSize(int size) const\n{\n    return size == 7 ? 6 : size;\n}\n\nint MarginalizationInfo::globalSize(int size) const\n{\n    return size == 6 ? 
7 : size;\n}\n\nvoid* ThreadsConstructA(void* threadsstruct)\n{\n    ThreadsStruct* p = ((ThreadsStruct*)threadsstruct);\n    for (auto it : p->sub_factors)\n    {\n        for (int i = 0; i < static_cast<int>(it->parameter_blocks.size()); i++)\n        {\n            int idx_i = p->parameter_block_idx[reinterpret_cast<long>(it->parameter_blocks[i])];\n            int size_i = p->parameter_block_size[reinterpret_cast<long>(it->parameter_blocks[i])];\n            if (size_i == 7)\n                size_i = 6;\n            Eigen::MatrixXd jacobian_i = it->jacobians[i].leftCols(size_i);\n            for (int j = i; j < static_cast<int>(it->parameter_blocks.size()); j++)\n            {\n                int idx_j = p->parameter_block_idx[reinterpret_cast<long>(it->parameter_blocks[j])];\n                int size_j = p->parameter_block_size[reinterpret_cast<long>(it->parameter_blocks[j])];\n                if (size_j == 7)\n                    size_j = 6;\n                Eigen::MatrixXd jacobian_j = it->jacobians[j].leftCols(size_j);\n                if (i == j)\n                    p->A.block(idx_i, idx_j, size_i, size_j) += jacobian_i.transpose() * jacobian_j;\n                else\n                {\n                    p->A.block(idx_i, idx_j, size_i, size_j) += jacobian_i.transpose() * jacobian_j;\n                    p->A.block(idx_j, idx_i, size_j, size_i) = p->A.block(idx_i, idx_j, size_i, size_j).transpose();\n                }\n            }\n            p->b.segment(idx_i, size_i) += jacobian_i.transpose() * it->residuals;\n        }\n    }\n    return threadsstruct;\n}\n\nvoid MarginalizationInfo::marginalize()\n{\n    int pos = 0;\n    for (auto &it : parameter_block_idx)\n    {\n        it.second = pos;\n        pos += localSize(parameter_block_size[it.first]);\n    }\n\n    m = pos;\n\n    for (const auto &it : parameter_block_size)\n    {\n        if (parameter_block_idx.find(it.first) == parameter_block_idx.end())\n        {\n            
parameter_block_idx[it.first] = pos;\n            pos += localSize(it.second);\n        }\n    }\n\n    n = pos - m;\n\n    //ROS_DEBUG(\"marginalization, pos: %d, m: %d, n: %d, size: %d\", pos, m, n, (int)parameter_block_idx.size());\n\n    TicToc t_summing;\n    Eigen::MatrixXd A(pos, pos);\n    Eigen::VectorXd b(pos);\n    A.setZero();\n    b.setZero();\n    /*\n    for (auto it : factors)\n    {\n        for (int i = 0; i < static_cast<int>(it->parameter_blocks.size()); i++)\n        {\n            int idx_i = parameter_block_idx[reinterpret_cast<long>(it->parameter_blocks[i])];\n            int size_i = localSize(parameter_block_size[reinterpret_cast<long>(it->parameter_blocks[i])]);\n            Eigen::MatrixXd jacobian_i = it->jacobians[i].leftCols(size_i);\n            for (int j = i; j < static_cast<int>(it->parameter_blocks.size()); j++)\n            {\n                int idx_j = parameter_block_idx[reinterpret_cast<long>(it->parameter_blocks[j])];\n                int size_j = localSize(parameter_block_size[reinterpret_cast<long>(it->parameter_blocks[j])]);\n                Eigen::MatrixXd jacobian_j = it->jacobians[j].leftCols(size_j);\n                if (i == j)\n                    A.block(idx_i, idx_j, size_i, size_j) += jacobian_i.transpose() * jacobian_j;\n                else\n                {\n                    A.block(idx_i, idx_j, size_i, size_j) += jacobian_i.transpose() * jacobian_j;\n                    A.block(idx_j, idx_i, size_j, size_i) = A.block(idx_i, idx_j, size_i, size_j).transpose();\n                }\n            }\n            b.segment(idx_i, size_i) += jacobian_i.transpose() * it->residuals;\n        }\n    }\n    ROS_INFO(\"summing up costs %f ms\", t_summing.toc());\n    */\n    //multi thread\n\n\n    TicToc t_thread_summing;\n    pthread_t tids[NUM_THREADS];\n    ThreadsStruct threadsstruct[NUM_THREADS];\n    int i = 0;\n    for (auto it : factors)\n    {\n        threadsstruct[i].sub_factors.push_back(it);\n        
i++;\n        i = i % NUM_THREADS;\n    }\n    for (int i = 0; i < NUM_THREADS; i++)\n    {\n        TicToc zero_matrix;\n        threadsstruct[i].A = Eigen::MatrixXd::Zero(pos,pos);\n        threadsstruct[i].b = Eigen::VectorXd::Zero(pos);\n        threadsstruct[i].parameter_block_size = parameter_block_size;\n        threadsstruct[i].parameter_block_idx = parameter_block_idx;\n        int ret = pthread_create( &tids[i], NULL, ThreadsConstructA ,(void*)&(threadsstruct[i]));\n        if (ret != 0)\n        {\n            ROS_WARN(\"pthread_create error\");\n            ROS_BREAK();\n        }\n    }\n    for( int i = NUM_THREADS - 1; i >= 0; i--)  \n    {\n        pthread_join( tids[i], NULL ); \n        A += threadsstruct[i].A;\n        b += threadsstruct[i].b;\n    }\n    //ROS_DEBUG(\"thread summing up costs %f ms\", t_thread_summing.toc());\n    //ROS_INFO(\"A diff %f , b diff %f \", (A - tmp_A).sum(), (b - tmp_b).sum());\n\n\n    //TODO\n    Eigen::MatrixXd Amm = 0.5 * (A.block(0, 0, m, m) + A.block(0, 0, m, m).transpose());\n    Eigen::SelfAdjointEigenSolver<Eigen::MatrixXd> saes(Amm);\n\n    //ROS_ASSERT_MSG(saes.eigenvalues().minCoeff() >= -1e-4, \"min eigenvalue %f\", saes.eigenvalues().minCoeff());\n\n    Eigen::MatrixXd Amm_inv = saes.eigenvectors() * Eigen::VectorXd((saes.eigenvalues().array() > eps).select(saes.eigenvalues().array().inverse(), 0)).asDiagonal() * saes.eigenvectors().transpose();\n    //printf(\"error1: %f\\n\", (Amm * Amm_inv - Eigen::MatrixXd::Identity(m, m)).sum());\n\n    Eigen::VectorXd bmm = b.segment(0, m);\n    Eigen::MatrixXd Amr = A.block(0, m, m, n);\n    Eigen::MatrixXd Arm = A.block(m, 0, n, m);\n    Eigen::MatrixXd Arr = A.block(m, m, n, n);\n    Eigen::VectorXd brr = b.segment(m, n);\n    A = Arr - Arm * Amm_inv * Amr;\n    b = brr - Arm * Amm_inv * bmm;\n\n    Eigen::SelfAdjointEigenSolver<Eigen::MatrixXd> saes2(A);\n    Eigen::VectorXd S = Eigen::VectorXd((saes2.eigenvalues().array() > 
eps).select(saes2.eigenvalues().array(), 0));\n    Eigen::VectorXd S_inv = Eigen::VectorXd((saes2.eigenvalues().array() > eps).select(saes2.eigenvalues().array().inverse(), 0));\n\n    Eigen::VectorXd S_sqrt = S.cwiseSqrt();\n    Eigen::VectorXd S_inv_sqrt = S_inv.cwiseSqrt();\n\n    linearized_jacobians = S_sqrt.asDiagonal() * saes2.eigenvectors().transpose();\n    linearized_residuals = S_inv_sqrt.asDiagonal() * saes2.eigenvectors().transpose() * b;\n    //std::cout << A << std::endl\n    //          << std::endl;\n    //std::cout << linearized_jacobians << std::endl;\n    //printf(\"error2: %f %f\\n\", (linearized_jacobians.transpose() * linearized_jacobians - A).sum(),\n    //      (linearized_jacobians.transpose() * linearized_residuals - b).sum());\n}\n\nstd::vector<double *> MarginalizationInfo::getParameterBlocks(std::unordered_map<long, double *> &addr_shift)\n{\n    std::vector<double *> keep_block_addr;\n    keep_block_size.clear();\n    keep_block_idx.clear();\n    keep_block_data.clear();\n\n    for (const auto &it : parameter_block_idx)\n    {\n        if (it.second >= m)\n        {\n            keep_block_size.push_back(parameter_block_size[it.first]);\n            keep_block_idx.push_back(parameter_block_idx[it.first]);\n            keep_block_data.push_back(parameter_block_data[it.first]);\n            keep_block_addr.push_back(addr_shift[it.first]);\n        }\n    }\n    sum_block_size = std::accumulate(std::begin(keep_block_size), std::end(keep_block_size), 0);\n\n    return keep_block_addr;\n}\n\nMarginalizationFactor::MarginalizationFactor(MarginalizationInfo* _marginalization_info):marginalization_info(_marginalization_info)\n{\n    int cnt = 0;\n    for (auto it : marginalization_info->keep_block_size)\n    {\n        mutable_parameter_block_sizes()->push_back(it);\n        cnt += it;\n    }\n    //printf(\"residual size: %d, %d\\n\", cnt, n);\n    set_num_residuals(marginalization_info->n);\n};\n\nbool MarginalizationFactor::Evaluate(double 
const *const *parameters, double *residuals, double **jacobians) const\n{\n    //printf(\"internal addr,%d, %d\\n\", (int)parameter_block_sizes().size(), num_residuals());\n    //for (int i = 0; i < static_cast<int>(keep_block_size.size()); i++)\n    //{\n    //    //printf(\"unsigned %x\\n\", reinterpret_cast<unsigned long>(parameters[i]));\n    //    //printf(\"signed %x\\n\", reinterpret_cast<long>(parameters[i]));\n    //printf(\"jacobian %x\\n\", reinterpret_cast<long>(jacobians));\n    //printf(\"residual %x\\n\", reinterpret_cast<long>(residuals));\n    //}\n    int n = marginalization_info->n;\n    int m = marginalization_info->m;\n    Eigen::VectorXd dx(n);\n    for (int i = 0; i < static_cast<int>(marginalization_info->keep_block_size.size()); i++)\n    {\n        int size = marginalization_info->keep_block_size[i];\n        int idx = marginalization_info->keep_block_idx[i] - m;\n        Eigen::VectorXd x = Eigen::Map<const Eigen::VectorXd>(parameters[i], size);\n        Eigen::VectorXd x0 = Eigen::Map<const Eigen::VectorXd>(marginalization_info->keep_block_data[i], size);\n        if (size != 7)\n            dx.segment(idx, size) = x - x0;\n        else\n        {\n            dx.segment<3>(idx + 0) = x.head<3>() - x0.head<3>();\n            dx.segment<3>(idx + 3) = 2.0 * Utility::positify(Eigen::Quaterniond(x0(6), x0(3), x0(4), x0(5)).inverse() * Eigen::Quaterniond(x(6), x(3), x(4), x(5))).vec();\n            if (!((Eigen::Quaterniond(x0(6), x0(3), x0(4), x0(5)).inverse() * Eigen::Quaterniond(x(6), x(3), x(4), x(5))).w() >= 0))\n            {\n                dx.segment<3>(idx + 3) = 2.0 * -Utility::positify(Eigen::Quaterniond(x0(6), x0(3), x0(4), x0(5)).inverse() * Eigen::Quaterniond(x(6), x(3), x(4), x(5))).vec();\n            }\n        }\n    }\n    Eigen::Map<Eigen::VectorXd>(residuals, n) = marginalization_info->linearized_residuals + marginalization_info->linearized_jacobians * dx;\n    if (jacobians)\n    {\n\n        for (int i = 0; i < 
static_cast<int>(marginalization_info->keep_block_size.size()); i++)\n        {\n            if (jacobians[i])\n            {\n                int size = marginalization_info->keep_block_size[i], local_size = marginalization_info->localSize(size);\n                int idx = marginalization_info->keep_block_idx[i] - m;\n                Eigen::Map<Eigen::Matrix<double, Eigen::Dynamic, Eigen::Dynamic, Eigen::RowMajor>> jacobian(jacobians[i], n, size);\n                jacobian.setZero();\n                jacobian.leftCols(local_size) = marginalization_info->linearized_jacobians.middleCols(idx, local_size);\n            }\n        }\n    }\n    return true;\n}\n"
  },
  {
    "path": "rpvio_estimator/src/factor/marginalization_factor.h",
    "content": "#pragma once\n\n#include <ros/ros.h>\n#include <ros/console.h>\n#include <cstdlib>\n#include <pthread.h>\n#include <ceres/ceres.h>\n#include <unordered_map>\n\n#include \"../utility/utility.h\"\n#include \"../utility/tic_toc.h\"\n\nconst int NUM_THREADS = 4;\n\nstruct ResidualBlockInfo\n{\n    ResidualBlockInfo(ceres::CostFunction *_cost_function, ceres::LossFunction *_loss_function, std::vector<double *> _parameter_blocks, std::vector<int> _drop_set)\n        : cost_function(_cost_function), loss_function(_loss_function), parameter_blocks(_parameter_blocks), drop_set(_drop_set) {}\n\n    void Evaluate();\n\n    ceres::CostFunction *cost_function;\n    ceres::LossFunction *loss_function;\n    std::vector<double *> parameter_blocks;\n    std::vector<int> drop_set;\n\n    double **raw_jacobians;\n    std::vector<Eigen::Matrix<double, Eigen::Dynamic, Eigen::Dynamic, Eigen::RowMajor>> jacobians;\n    Eigen::VectorXd residuals;\n\n    int localSize(int size)\n    {\n        return size == 7 ? 
6 : size;\n    }\n};\n\nstruct ThreadsStruct\n{\n    std::vector<ResidualBlockInfo *> sub_factors;\n    Eigen::MatrixXd A;\n    Eigen::VectorXd b;\n    std::unordered_map<long, int> parameter_block_size; //global size\n    std::unordered_map<long, int> parameter_block_idx; //start index in the local-size ordering\n};\n\nclass MarginalizationInfo\n{\n  public:\n    ~MarginalizationInfo();\n    int localSize(int size) const;\n    int globalSize(int size) const;\n    void addResidualBlockInfo(ResidualBlockInfo *residual_block_info);\n    void preMarginalize();\n    void marginalize();\n    std::vector<double *> getParameterBlocks(std::unordered_map<long, double *> &addr_shift);\n\n    std::vector<ResidualBlockInfo *> factors;\n    int m, n; //m: local dimension to marginalize, n: local dimension to keep\n    std::unordered_map<long, int> parameter_block_size; //global size\n    int sum_block_size;\n    std::unordered_map<long, int> parameter_block_idx; //start index in the local-size ordering\n    std::unordered_map<long, double *> parameter_block_data;\n\n    std::vector<int> keep_block_size; //global size\n    std::vector<int> keep_block_idx;  //start index in the local-size ordering\n    std::vector<double *> keep_block_data;\n\n    Eigen::MatrixXd linearized_jacobians;\n    Eigen::VectorXd linearized_residuals;\n    const double eps = 1e-8;\n\n};\n\nclass MarginalizationFactor : public ceres::CostFunction\n{\n  public:\n    MarginalizationFactor(MarginalizationInfo* _marginalization_info);\n    virtual bool Evaluate(double const *const *parameters, double *residuals, double **jacobians) const;\n\n    MarginalizationInfo* marginalization_info;\n};\n"
  },
  {
    "path": "rpvio_estimator/src/factor/pose_local_parameterization.cpp",
"content": "#include \"pose_local_parameterization.h\"\n\n// Box-plus on the pose manifold: the translation increment is added directly,\n// the rotation increment is applied as a right quaternion perturbation and the\n// result is re-normalized to stay on the unit-quaternion manifold.\nbool PoseLocalParameterization::Plus(const double *x, const double *delta, double *x_plus_delta) const\n{\n    Eigen::Map<const Eigen::Vector3d> _p(x);\n    Eigen::Map<const Eigen::Quaterniond> _q(x + 3);\n\n    Eigen::Map<const Eigen::Vector3d> dp(delta);\n\n    Eigen::Quaterniond dq = Utility::deltaQ(Eigen::Map<const Eigen::Vector3d>(delta + 3));\n\n    Eigen::Map<Eigen::Vector3d> p(x_plus_delta);\n    Eigen::Map<Eigen::Quaterniond> q(x_plus_delta + 3);\n\n    p = _p + dp;\n    q = (_q * dq).normalized();\n\n    return true;\n}\n\n// Lift jacobian d(global 7-dof)/d(local 6-dof): identity on the top six rows,\n// zero on the last row; the analytic factors supply the rest of the chain rule.\nbool PoseLocalParameterization::ComputeJacobian(const double *x, double *jacobian) const\n{\n    Eigen::Map<Eigen::Matrix<double, 7, 6, Eigen::RowMajor>> j(jacobian);\n    j.topRows<6>().setIdentity();\n    j.bottomRows<1>().setZero();\n\n    return true;\n}\n"
  },
  {
    "path": "rpvio_estimator/src/factor/pose_local_parameterization.h",
    "content": "#pragma once\n\n#include <eigen3/Eigen/Dense>\n#include <ceres/ceres.h>\n#include \"../utility/utility.h\"\n\nclass PoseLocalParameterization : public ceres::LocalParameterization\n{\n    virtual bool Plus(const double *x, const double *delta, double *x_plus_delta) const;\n    virtual bool ComputeJacobian(const double *x, double *jacobian) const;\n    virtual int GlobalSize() const { return 7; };\n    virtual int LocalSize() const { return 6; };\n};\n"
  },
  {
    "path": "rpvio_estimator/src/factor/projection_factor.cpp",
    "content": "#include \"projection_factor.h\"\n\nEigen::Matrix2d ProjectionFactor::sqrt_info;\ndouble ProjectionFactor::sum_t;\n\nProjectionFactor::ProjectionFactor(const Eigen::Vector3d &_pts_i, const Eigen::Vector3d &_pts_j) : pts_i(_pts_i), pts_j(_pts_j)\n{\n#ifdef UNIT_SPHERE_ERROR\n    Eigen::Vector3d b1, b2;\n    Eigen::Vector3d a = pts_j.normalized();\n    Eigen::Vector3d tmp(0, 0, 1);\n    if(a == tmp)\n        tmp << 1, 0, 0;\n    b1 = (tmp - a * (a.transpose() * tmp)).normalized();\n    b2 = a.cross(b1);\n    tangent_base.block<1, 3>(0, 0) = b1.transpose();\n    tangent_base.block<1, 3>(1, 0) = b2.transpose();\n#endif\n};\n\nbool ProjectionFactor::Evaluate(double const *const *parameters, double *residuals, double **jacobians) const\n{\n    TicToc tic_toc;\n    Eigen::Vector3d Pi(parameters[0][0], parameters[0][1], parameters[0][2]);\n    Eigen::Quaterniond Qi(parameters[0][6], parameters[0][3], parameters[0][4], parameters[0][5]);\n\n    Eigen::Vector3d Pj(parameters[1][0], parameters[1][1], parameters[1][2]);\n    Eigen::Quaterniond Qj(parameters[1][6], parameters[1][3], parameters[1][4], parameters[1][5]);\n\n    Eigen::Vector3d tic(parameters[2][0], parameters[2][1], parameters[2][2]);\n    Eigen::Quaterniond qic(parameters[2][6], parameters[2][3], parameters[2][4], parameters[2][5]);\n\n    double inv_dep_i = parameters[3][0];\n\n    Eigen::Vector3d pts_camera_i = pts_i / inv_dep_i;\n    Eigen::Vector3d pts_imu_i = qic * pts_camera_i + tic;\n    Eigen::Vector3d pts_w = Qi * pts_imu_i + Pi;\n    Eigen::Vector3d pts_imu_j = Qj.inverse() * (pts_w - Pj);\n    Eigen::Vector3d pts_camera_j = qic.inverse() * (pts_imu_j - tic);\n    Eigen::Map<Eigen::Vector2d> residual(residuals);\n\n#ifdef UNIT_SPHERE_ERROR \n    residual =  tangent_base * (pts_camera_j.normalized() - pts_j.normalized());\n#else\n    double dep_j = pts_camera_j.z();\n    residual = (pts_camera_j / dep_j).head<2>() - pts_j.head<2>();\n#endif\n\n    residual = sqrt_info * residual;\n\n    
if (jacobians)\n    {\n        Eigen::Matrix3d Ri = Qi.toRotationMatrix();\n        Eigen::Matrix3d Rj = Qj.toRotationMatrix();\n        Eigen::Matrix3d ric = qic.toRotationMatrix();\n        Eigen::Matrix<double, 2, 3> reduce(2, 3);\n#ifdef UNIT_SPHERE_ERROR\n        double norm = pts_camera_j.norm();\n        Eigen::Matrix3d norm_jaco;\n        double x1, x2, x3;\n        x1 = pts_camera_j(0);\n        x2 = pts_camera_j(1);\n        x3 = pts_camera_j(2);\n        norm_jaco << 1.0 / norm - x1 * x1 / pow(norm, 3), - x1 * x2 / pow(norm, 3),            - x1 * x3 / pow(norm, 3),\n                     - x1 * x2 / pow(norm, 3),            1.0 / norm - x2 * x2 / pow(norm, 3), - x2 * x3 / pow(norm, 3),\n                     - x1 * x3 / pow(norm, 3),            - x2 * x3 / pow(norm, 3),            1.0 / norm - x3 * x3 / pow(norm, 3);\n        reduce = tangent_base * norm_jaco;\n#else\n        reduce << 1. / dep_j, 0, -pts_camera_j(0) / (dep_j * dep_j),\n            0, 1. / dep_j, -pts_camera_j(1) / (dep_j * dep_j);\n#endif\n        reduce = sqrt_info * reduce;\n\n        if (jacobians[0])\n        {\n            Eigen::Map<Eigen::Matrix<double, 2, 7, Eigen::RowMajor>> jacobian_pose_i(jacobians[0]);\n\n            Eigen::Matrix<double, 3, 6> jaco_i;\n            jaco_i.leftCols<3>() = ric.transpose() * Rj.transpose();\n            jaco_i.rightCols<3>() = ric.transpose() * Rj.transpose() * Ri * -Utility::skewSymmetric(pts_imu_i);\n\n            jacobian_pose_i.leftCols<6>() = reduce * jaco_i;\n            jacobian_pose_i.rightCols<1>().setZero();\n        }\n\n        if (jacobians[1])\n        {\n            Eigen::Map<Eigen::Matrix<double, 2, 7, Eigen::RowMajor>> jacobian_pose_j(jacobians[1]);\n\n            Eigen::Matrix<double, 3, 6> jaco_j;\n            jaco_j.leftCols<3>() = ric.transpose() * -Rj.transpose();\n            jaco_j.rightCols<3>() = ric.transpose() * Utility::skewSymmetric(pts_imu_j);\n\n            jacobian_pose_j.leftCols<6>() = reduce * jaco_j;\n        
    jacobian_pose_j.rightCols<1>().setZero();\n        }\n        if (jacobians[2])\n        {\n            Eigen::Map<Eigen::Matrix<double, 2, 7, Eigen::RowMajor>> jacobian_ex_pose(jacobians[2]);\n            Eigen::Matrix<double, 3, 6> jaco_ex;\n            jaco_ex.leftCols<3>() = ric.transpose() * (Rj.transpose() * Ri - Eigen::Matrix3d::Identity());\n            Eigen::Matrix3d tmp_r = ric.transpose() * Rj.transpose() * Ri * ric;\n            jaco_ex.rightCols<3>() = -tmp_r * Utility::skewSymmetric(pts_camera_i) + Utility::skewSymmetric(tmp_r * pts_camera_i) +\n                                     Utility::skewSymmetric(ric.transpose() * (Rj.transpose() * (Ri * tic + Pi - Pj) - tic));\n            jacobian_ex_pose.leftCols<6>() = reduce * jaco_ex;\n            jacobian_ex_pose.rightCols<1>().setZero();\n        }\n        if (jacobians[3])\n        {\n            Eigen::Map<Eigen::Vector2d> jacobian_feature(jacobians[3]);\n#if 1\n            jacobian_feature = reduce * ric.transpose() * Rj.transpose() * Ri * ric * pts_i * -1.0 / (inv_dep_i * inv_dep_i);\n#else\n            jacobian_feature = reduce * ric.transpose() * Rj.transpose() * Ri * ric * pts_i;\n#endif\n        }\n    }\n    sum_t += tic_toc.toc();\n\n    return true;\n}\n\nvoid ProjectionFactor::check(double **parameters)\n{\n    double *res = new double[15];\n    double **jaco = new double *[4];\n    jaco[0] = new double[2 * 7];\n    jaco[1] = new double[2 * 7];\n    jaco[2] = new double[2 * 7];\n    jaco[3] = new double[2 * 1];\n    Evaluate(parameters, res, jaco);\n    puts(\"check begins\");\n\n    puts(\"my\");\n\n    std::cout << Eigen::Map<Eigen::Matrix<double, 2, 1>>(res).transpose() << std::endl\n              << std::endl;\n    std::cout << Eigen::Map<Eigen::Matrix<double, 2, 7, Eigen::RowMajor>>(jaco[0]) << std::endl\n              << std::endl;\n    std::cout << Eigen::Map<Eigen::Matrix<double, 2, 7, Eigen::RowMajor>>(jaco[1]) << std::endl\n              << std::endl;\n    std::cout << 
Eigen::Map<Eigen::Matrix<double, 2, 7, Eigen::RowMajor>>(jaco[2]) << std::endl\n              << std::endl;\n    std::cout << Eigen::Map<Eigen::Vector2d>(jaco[3]) << std::endl\n              << std::endl;\n\n    Eigen::Vector3d Pi(parameters[0][0], parameters[0][1], parameters[0][2]);\n    Eigen::Quaterniond Qi(parameters[0][6], parameters[0][3], parameters[0][4], parameters[0][5]);\n\n    Eigen::Vector3d Pj(parameters[1][0], parameters[1][1], parameters[1][2]);\n    Eigen::Quaterniond Qj(parameters[1][6], parameters[1][3], parameters[1][4], parameters[1][5]);\n\n    Eigen::Vector3d tic(parameters[2][0], parameters[2][1], parameters[2][2]);\n    Eigen::Quaterniond qic(parameters[2][6], parameters[2][3], parameters[2][4], parameters[2][5]);\n    double inv_dep_i = parameters[3][0];\n\n    Eigen::Vector3d pts_camera_i = pts_i / inv_dep_i;\n    Eigen::Vector3d pts_imu_i = qic * pts_camera_i + tic;\n    Eigen::Vector3d pts_w = Qi * pts_imu_i + Pi;\n    Eigen::Vector3d pts_imu_j = Qj.inverse() * (pts_w - Pj);\n    Eigen::Vector3d pts_camera_j = qic.inverse() * (pts_imu_j - tic);\n\n\n    Eigen::Vector2d residual;\n#ifdef UNIT_SPHERE_ERROR \n    residual =  tangent_base * (pts_camera_j.normalized() - pts_j.normalized());\n#else\n    double dep_j = pts_camera_j.z();\n    residual = (pts_camera_j / dep_j).head<2>() - pts_j.head<2>();\n#endif\n    residual = sqrt_info * residual;\n\n    puts(\"num\");\n    std::cout << residual.transpose() << std::endl;\n\n    const double eps = 1e-6;\n    Eigen::Matrix<double, 2, 19> num_jacobian;\n    for (int k = 0; k < 19; k++)\n    {\n        Eigen::Vector3d Pi(parameters[0][0], parameters[0][1], parameters[0][2]);\n        Eigen::Quaterniond Qi(parameters[0][6], parameters[0][3], parameters[0][4], parameters[0][5]);\n\n        Eigen::Vector3d Pj(parameters[1][0], parameters[1][1], parameters[1][2]);\n        Eigen::Quaterniond Qj(parameters[1][6], parameters[1][3], parameters[1][4], parameters[1][5]);\n\n        Eigen::Vector3d 
tic(parameters[2][0], parameters[2][1], parameters[2][2]);\n        Eigen::Quaterniond qic(parameters[2][6], parameters[2][3], parameters[2][4], parameters[2][5]);\n        double inv_dep_i = parameters[3][0];\n\n        int a = k / 3, b = k % 3;\n        Eigen::Vector3d delta = Eigen::Vector3d(b == 0, b == 1, b == 2) * eps;\n\n        if (a == 0)\n            Pi += delta;\n        else if (a == 1)\n            Qi = Qi * Utility::deltaQ(delta);\n        else if (a == 2)\n            Pj += delta;\n        else if (a == 3)\n            Qj = Qj * Utility::deltaQ(delta);\n        else if (a == 4)\n            tic += delta;\n        else if (a == 5)\n            qic = qic * Utility::deltaQ(delta);\n        else if (a == 6)\n            inv_dep_i += delta.x();\n\n        Eigen::Vector3d pts_camera_i = pts_i / inv_dep_i;\n        Eigen::Vector3d pts_imu_i = qic * pts_camera_i + tic;\n        Eigen::Vector3d pts_w = Qi * pts_imu_i + Pi;\n        Eigen::Vector3d pts_imu_j = Qj.inverse() * (pts_w - Pj);\n        Eigen::Vector3d pts_camera_j = qic.inverse() * (pts_imu_j - tic);\n\n        Eigen::Vector2d tmp_residual;\n#ifdef UNIT_SPHERE_ERROR \n        tmp_residual =  tangent_base * (pts_camera_j.normalized() - pts_j.normalized());\n#else\n        double dep_j = pts_camera_j.z();\n        tmp_residual = (pts_camera_j / dep_j).head<2>() - pts_j.head<2>();\n#endif\n        tmp_residual = sqrt_info * tmp_residual;\n        num_jacobian.col(k) = (tmp_residual - residual) / eps;\n    }\n    std::cout << num_jacobian << std::endl;\n}\n"
  },
  {
    "path": "rpvio_estimator/src/factor/projection_factor.h",
    "content": "#pragma once\n\n#include <ros/assert.h>\n#include <ceres/ceres.h>\n#include <Eigen/Dense>\n#include \"../utility/utility.h\"\n#include \"../utility/tic_toc.h\"\n#include \"../parameters.h\"\n\nclass ProjectionFactor : public ceres::SizedCostFunction<2, 7, 7, 7, 1>\n{\n  public:\n    ProjectionFactor(const Eigen::Vector3d &_pts_i, const Eigen::Vector3d &_pts_j);\n    virtual bool Evaluate(double const *const *parameters, double *residuals, double **jacobians) const;\n    void check(double **parameters);\n\n    Eigen::Vector3d pts_i, pts_j;\n    Eigen::Matrix<double, 2, 3> tangent_base;\n    static Eigen::Matrix2d sqrt_info;\n    static double sum_t;\n};\n"
  },
  {
    "path": "rpvio_estimator/src/factor/projection_td_factor.cpp",
    "content": "#include \"projection_td_factor.h\"\n\nEigen::Matrix2d ProjectionTdFactor::sqrt_info;\ndouble ProjectionTdFactor::sum_t;\n\nProjectionTdFactor::ProjectionTdFactor(const Eigen::Vector3d &_pts_i, const Eigen::Vector3d &_pts_j, \n                                       const Eigen::Vector2d &_velocity_i, const Eigen::Vector2d &_velocity_j,\n                                       const double _td_i, const double _td_j, const double _row_i, const double _row_j) : \n                                       pts_i(_pts_i), pts_j(_pts_j), \n                                       td_i(_td_i), td_j(_td_j)\n{\n    velocity_i.x() = _velocity_i.x();\n    velocity_i.y() = _velocity_i.y();\n    velocity_i.z() = 0;\n    velocity_j.x() = _velocity_j.x();\n    velocity_j.y() = _velocity_j.y();\n    velocity_j.z() = 0;\n    row_i = _row_i - ROW / 2;\n    row_j = _row_j - ROW / 2;\n\n#ifdef UNIT_SPHERE_ERROR\n    Eigen::Vector3d b1, b2;\n    Eigen::Vector3d a = pts_j.normalized();\n    Eigen::Vector3d tmp(0, 0, 1);\n    if(a == tmp)\n        tmp << 1, 0, 0;\n    b1 = (tmp - a * (a.transpose() * tmp)).normalized();\n    b2 = a.cross(b1);\n    tangent_base.block<1, 3>(0, 0) = b1.transpose();\n    tangent_base.block<1, 3>(1, 0) = b2.transpose();\n#endif\n};\n\nbool ProjectionTdFactor::Evaluate(double const *const *parameters, double *residuals, double **jacobians) const\n{\n    TicToc tic_toc;\n    Eigen::Vector3d Pi(parameters[0][0], parameters[0][1], parameters[0][2]);\n    Eigen::Quaterniond Qi(parameters[0][6], parameters[0][3], parameters[0][4], parameters[0][5]);\n\n    Eigen::Vector3d Pj(parameters[1][0], parameters[1][1], parameters[1][2]);\n    Eigen::Quaterniond Qj(parameters[1][6], parameters[1][3], parameters[1][4], parameters[1][5]);\n\n    Eigen::Vector3d tic(parameters[2][0], parameters[2][1], parameters[2][2]);\n    Eigen::Quaterniond qic(parameters[2][6], parameters[2][3], parameters[2][4], parameters[2][5]);\n\n    double inv_dep_i = parameters[3][0];\n\n    
double td = parameters[4][0];\n\n    Eigen::Vector3d pts_i_td, pts_j_td;\n    pts_i_td = pts_i - (td - td_i + TR / ROW * row_i) * velocity_i;\n    pts_j_td = pts_j - (td - td_j + TR / ROW * row_j) * velocity_j;\n    Eigen::Vector3d pts_camera_i = pts_i_td / inv_dep_i;\n    Eigen::Vector3d pts_imu_i = qic * pts_camera_i + tic;\n    Eigen::Vector3d pts_w = Qi * pts_imu_i + Pi;\n    Eigen::Vector3d pts_imu_j = Qj.inverse() * (pts_w - Pj);\n    Eigen::Vector3d pts_camera_j = qic.inverse() * (pts_imu_j - tic);\n    Eigen::Map<Eigen::Vector2d> residual(residuals);\n\n#ifdef UNIT_SPHERE_ERROR \n    residual =  tangent_base * (pts_camera_j.normalized() - pts_j_td.normalized());\n#else\n    double dep_j = pts_camera_j.z();\n    residual = (pts_camera_j / dep_j).head<2>() - pts_j_td.head<2>();\n#endif\n\n    residual = sqrt_info * residual;\n\n    if (jacobians)\n    {\n        Eigen::Matrix3d Ri = Qi.toRotationMatrix();\n        Eigen::Matrix3d Rj = Qj.toRotationMatrix();\n        Eigen::Matrix3d ric = qic.toRotationMatrix();\n        Eigen::Matrix<double, 2, 3> reduce(2, 3);\n#ifdef UNIT_SPHERE_ERROR\n        double norm = pts_camera_j.norm();\n        Eigen::Matrix3d norm_jaco;\n        double x1, x2, x3;\n        x1 = pts_camera_j(0);\n        x2 = pts_camera_j(1);\n        x3 = pts_camera_j(2);\n        norm_jaco << 1.0 / norm - x1 * x1 / pow(norm, 3), - x1 * x2 / pow(norm, 3),            - x1 * x3 / pow(norm, 3),\n                     - x1 * x2 / pow(norm, 3),            1.0 / norm - x2 * x2 / pow(norm, 3), - x2 * x3 / pow(norm, 3),\n                     - x1 * x3 / pow(norm, 3),            - x2 * x3 / pow(norm, 3),            1.0 / norm - x3 * x3 / pow(norm, 3);\n        reduce = tangent_base * norm_jaco;\n#else\n        reduce << 1. / dep_j, 0, -pts_camera_j(0) / (dep_j * dep_j),\n            0, 1. 
/ dep_j, -pts_camera_j(1) / (dep_j * dep_j);\n#endif\n        reduce = sqrt_info * reduce;\n\n        if (jacobians[0])\n        {\n            Eigen::Map<Eigen::Matrix<double, 2, 7, Eigen::RowMajor>> jacobian_pose_i(jacobians[0]);\n\n            Eigen::Matrix<double, 3, 6> jaco_i;\n            jaco_i.leftCols<3>() = ric.transpose() * Rj.transpose();\n            jaco_i.rightCols<3>() = ric.transpose() * Rj.transpose() * Ri * -Utility::skewSymmetric(pts_imu_i);\n\n            jacobian_pose_i.leftCols<6>() = reduce * jaco_i;\n            jacobian_pose_i.rightCols<1>().setZero();\n        }\n\n        if (jacobians[1])\n        {\n            Eigen::Map<Eigen::Matrix<double, 2, 7, Eigen::RowMajor>> jacobian_pose_j(jacobians[1]);\n\n            Eigen::Matrix<double, 3, 6> jaco_j;\n            jaco_j.leftCols<3>() = ric.transpose() * -Rj.transpose();\n            jaco_j.rightCols<3>() = ric.transpose() * Utility::skewSymmetric(pts_imu_j);\n\n            jacobian_pose_j.leftCols<6>() = reduce * jaco_j;\n            jacobian_pose_j.rightCols<1>().setZero();\n        }\n        if (jacobians[2])\n        {\n            Eigen::Map<Eigen::Matrix<double, 2, 7, Eigen::RowMajor>> jacobian_ex_pose(jacobians[2]);\n            Eigen::Matrix<double, 3, 6> jaco_ex;\n            jaco_ex.leftCols<3>() = ric.transpose() * (Rj.transpose() * Ri - Eigen::Matrix3d::Identity());\n            Eigen::Matrix3d tmp_r = ric.transpose() * Rj.transpose() * Ri * ric;\n            jaco_ex.rightCols<3>() = -tmp_r * Utility::skewSymmetric(pts_camera_i) + Utility::skewSymmetric(tmp_r * pts_camera_i) +\n                                     Utility::skewSymmetric(ric.transpose() * (Rj.transpose() * (Ri * tic + Pi - Pj) - tic));\n            jacobian_ex_pose.leftCols<6>() = reduce * jaco_ex;\n            jacobian_ex_pose.rightCols<1>().setZero();\n        }\n        if (jacobians[3])\n        {\n            Eigen::Map<Eigen::Vector2d> jacobian_feature(jacobians[3]);\n            jacobian_feature = reduce 
* ric.transpose() * Rj.transpose() * Ri * ric * pts_i_td * -1.0 / (inv_dep_i * inv_dep_i);\n        }\n        if (jacobians[4])\n        {\n            Eigen::Map<Eigen::Vector2d> jacobian_td(jacobians[4]);\n            jacobian_td = reduce * ric.transpose() * Rj.transpose() * Ri * ric * velocity_i / inv_dep_i * -1.0  +\n                          sqrt_info * velocity_j.head(2);\n        }\n    }\n    sum_t += tic_toc.toc();\n\n    return true;\n}\n\nvoid ProjectionTdFactor::check(double **parameters)\n{\n    double *res = new double[2];\n    double **jaco = new double *[5];\n    jaco[0] = new double[2 * 7];\n    jaco[1] = new double[2 * 7];\n    jaco[2] = new double[2 * 7];\n    jaco[3] = new double[2 * 1];\n    jaco[4] = new double[2 * 1];\n    Evaluate(parameters, res, jaco);\n    puts(\"check begins\");\n\n    puts(\"my\");\n\n    std::cout << Eigen::Map<Eigen::Matrix<double, 2, 1>>(res).transpose() << std::endl\n              << std::endl;\n    std::cout << Eigen::Map<Eigen::Matrix<double, 2, 7, Eigen::RowMajor>>(jaco[0]) << std::endl\n              << std::endl;\n    std::cout << Eigen::Map<Eigen::Matrix<double, 2, 7, Eigen::RowMajor>>(jaco[1]) << std::endl\n              << std::endl;\n    std::cout << Eigen::Map<Eigen::Matrix<double, 2, 7, Eigen::RowMajor>>(jaco[2]) << std::endl\n              << std::endl;\n    std::cout << Eigen::Map<Eigen::Vector2d>(jaco[3]) << std::endl\n              << std::endl;\n    std::cout << Eigen::Map<Eigen::Vector2d>(jaco[4]) << std::endl\n              << std::endl;\n\n    Eigen::Vector3d Pi(parameters[0][0], parameters[0][1], parameters[0][2]);\n    Eigen::Quaterniond Qi(parameters[0][6], parameters[0][3], parameters[0][4], parameters[0][5]);\n\n    Eigen::Vector3d Pj(parameters[1][0], parameters[1][1], parameters[1][2]);\n    Eigen::Quaterniond Qj(parameters[1][6], parameters[1][3], parameters[1][4], parameters[1][5]);\n\n    Eigen::Vector3d tic(parameters[2][0], parameters[2][1], parameters[2][2]);\n    Eigen::Quaterniond 
qic(parameters[2][6], parameters[2][3], parameters[2][4], parameters[2][5]);\n    double inv_dep_i = parameters[3][0];\n    double td = parameters[4][0];\n\n    Eigen::Vector3d pts_i_td, pts_j_td;\n    pts_i_td = pts_i - (td - td_i + TR / ROW * row_i) * velocity_i;\n    pts_j_td = pts_j - (td - td_j + TR / ROW * row_j) * velocity_j;\n    Eigen::Vector3d pts_camera_i = pts_i_td / inv_dep_i;\n    Eigen::Vector3d pts_imu_i = qic * pts_camera_i + tic;\n    Eigen::Vector3d pts_w = Qi * pts_imu_i + Pi;\n    Eigen::Vector3d pts_imu_j = Qj.inverse() * (pts_w - Pj);\n    Eigen::Vector3d pts_camera_j = qic.inverse() * (pts_imu_j - tic);\n    Eigen::Vector2d residual;\n\n#ifdef UNIT_SPHERE_ERROR \n    residual =  tangent_base * (pts_camera_j.normalized() - pts_j_td.normalized());\n#else\n    double dep_j = pts_camera_j.z();\n    residual = (pts_camera_j / dep_j).head<2>() - pts_j_td.head<2>();\n#endif\n    residual = sqrt_info * residual;\n\n    puts(\"num\");\n    std::cout << residual.transpose() << std::endl;\n\n    const double eps = 1e-6;\n    Eigen::Matrix<double, 2, 20> num_jacobian;\n    for (int k = 0; k < 20; k++)\n    {\n        Eigen::Vector3d Pi(parameters[0][0], parameters[0][1], parameters[0][2]);\n        Eigen::Quaterniond Qi(parameters[0][6], parameters[0][3], parameters[0][4], parameters[0][5]);\n\n        Eigen::Vector3d Pj(parameters[1][0], parameters[1][1], parameters[1][2]);\n        Eigen::Quaterniond Qj(parameters[1][6], parameters[1][3], parameters[1][4], parameters[1][5]);\n\n        Eigen::Vector3d tic(parameters[2][0], parameters[2][1], parameters[2][2]);\n        Eigen::Quaterniond qic(parameters[2][6], parameters[2][3], parameters[2][4], parameters[2][5]);\n        double inv_dep_i = parameters[3][0];\n        double td = parameters[4][0];\n\n\n        int a = k / 3, b = k % 3;\n        Eigen::Vector3d delta = Eigen::Vector3d(b == 0, b == 1, b == 2) * eps;\n\n        if (a == 0)\n            Pi += delta;\n        else if (a == 1)\n            Qi 
= Qi * Utility::deltaQ(delta);\n        else if (a == 2)\n            Pj += delta;\n        else if (a == 3)\n            Qj = Qj * Utility::deltaQ(delta);\n        else if (a == 4)\n            tic += delta;\n        else if (a == 5)\n            qic = qic * Utility::deltaQ(delta);\n        else if (a == 6 && b == 0)\n            inv_dep_i += delta.x();\n        else if (a == 6 && b == 1)\n            td += delta.y();\n\n        Eigen::Vector3d pts_i_td, pts_j_td;\n        pts_i_td = pts_i - (td - td_i + TR / ROW * row_i) * velocity_i;\n        pts_j_td = pts_j - (td - td_j + TR / ROW * row_j) * velocity_j;\n        Eigen::Vector3d pts_camera_i = pts_i_td / inv_dep_i;\n        Eigen::Vector3d pts_imu_i = qic * pts_camera_i + tic;\n        Eigen::Vector3d pts_w = Qi * pts_imu_i + Pi;\n        Eigen::Vector3d pts_imu_j = Qj.inverse() * (pts_w - Pj);\n        Eigen::Vector3d pts_camera_j = qic.inverse() * (pts_imu_j - tic);\n        Eigen::Vector2d tmp_residual;\n\n#ifdef UNIT_SPHERE_ERROR \n        tmp_residual =  tangent_base * (pts_camera_j.normalized() - pts_j_td.normalized());\n#else\n        double dep_j = pts_camera_j.z();\n        tmp_residual = (pts_camera_j / dep_j).head<2>() - pts_j_td.head<2>();\n#endif\n        tmp_residual = sqrt_info * tmp_residual;\n\n        num_jacobian.col(k) = (tmp_residual - residual) / eps;\n    }\n    std::cout << num_jacobian << std::endl;\n}\n"
  },
  {
    "path": "rpvio_estimator/src/factor/projection_td_factor.h",
    "content": "#pragma once\n\n#include <ros/assert.h>\n#include <ceres/ceres.h>\n#include <Eigen/Dense>\n#include \"../utility/utility.h\"\n#include \"../utility/tic_toc.h\"\n#include \"../parameters.h\"\n\nclass ProjectionTdFactor : public ceres::SizedCostFunction<2, 7, 7, 7, 1, 1>\n{\n  public:\n    ProjectionTdFactor(const Eigen::Vector3d &_pts_i, const Eigen::Vector3d &_pts_j,\n    \t\t\t\t   const Eigen::Vector2d &_velocity_i, const Eigen::Vector2d &_velocity_j,\n    \t\t\t\t   const double _td_i, const double _td_j, const double _row_i, const double _row_j);\n    virtual bool Evaluate(double const *const *parameters, double *residuals, double **jacobians) const;\n    void check(double **parameters);\n\n    Eigen::Vector3d pts_i, pts_j;\n    Eigen::Vector3d velocity_i, velocity_j;\n    double td_i, td_j;\n    Eigen::Matrix<double, 2, 3> tangent_base;\n    double row_i, row_j;\n    static Eigen::Matrix2d sqrt_info;\n    static double sum_t;\n};\n"
  },
  {
    "path": "rpvio_estimator/src/feature_manager.cpp",
    "content": "#include \"feature_manager.h\"\n\nint FeaturePerId::endFrame()\n{\n    return start_frame + feature_per_frame.size() - 1;\n}\n\nFeatureManager::FeatureManager(Matrix3d _Rs[])\n    : Rs(_Rs)\n{\n    for (int i = 0; i < NUM_OF_CAM; i++)\n        ric[i].setIdentity();\n}\n\nvoid FeatureManager::setRic(Matrix3d _ric[])\n{\n    for (int i = 0; i < NUM_OF_CAM; i++)\n    {\n        ric[i] = _ric[i];\n    }\n}\n\nvoid FeatureManager::clearState()\n{\n    feature.clear();\n    plane_ids.clear();\n}\n\nint FeatureManager::getFeatureCount()\n{\n    int cnt = 0;\n    for (auto &it : feature)\n    {\n\n        it.used_num = it.feature_per_frame.size();\n\n        if (it.used_num >= 2 && it.start_frame < WINDOW_SIZE - 2)\n        {\n            cnt++;\n        }\n    }\n    return cnt;\n}\n\nint FeatureManager::getLargestPlaneId(std::vector<int> &init_pids)\n{\n    std::map<int, int> plane_counts;\n    for(auto &pid : init_pids)\n        plane_counts[pid] = 0;\n\n    for(auto &it : feature)\n    {\n        if(plane_counts.count(it.plane_id) > 0)\n            plane_counts[it.plane_id]++;\n    }\n\n    int largest_count = 0, largest_pid;\n    for(auto &plane_count : plane_counts)\n    {   \n        if(plane_count.second > largest_count)\n        {\n            largest_count = plane_count.second;\n            largest_pid = plane_count.first;\n        }\n    }\n    \n    return largest_pid;\n}\n\nbool FeatureManager::addFeatureCheckParallax(int frame_count, const map<int, vector<pair<int, Eigen::Matrix<double, 8, 1>>>> &image, double td)\n{\n    ROS_DEBUG(\"input feature: %d\", (int)image.size());\n    ROS_DEBUG(\"num of feature: %d\", getFeatureCount());\n    double parallax_sum = 0;\n    int parallax_num = 0;\n    last_track_num = 0;\n    for (auto &id_pts : image)\n    {\n        FeaturePerFrame f_per_fra(id_pts.second[0].second, td);\n\n        int feature_id = id_pts.first;\n        auto it = find_if(feature.begin(), feature.end(), [feature_id](const 
FeaturePerId &it)\n                          {\n            return it.feature_id == feature_id;\n                          });\n\n        if (it == feature.end())\n        {\n            int plane_id = id_pts.second[0].second(7);\n            plane_ids.insert(plane_id);\n            feature.push_back(FeaturePerId(feature_id, plane_id, frame_count));\n            feature.back().feature_per_frame.push_back(f_per_fra);\n        }\n        else if (it->feature_id == feature_id)\n        {\n            it->feature_per_frame.push_back(f_per_fra);\n            last_track_num++;\n        }\n    }\n\n    if (frame_count < 2 || last_track_num < 20)\n        return true;\n\n    for (auto &it_per_id : feature)\n    {\n        if (it_per_id.start_frame <= frame_count - 2 &&\n            it_per_id.start_frame + int(it_per_id.feature_per_frame.size()) - 1 >= frame_count - 1)\n        {\n            parallax_sum += compensatedParallax2(it_per_id, frame_count);\n            parallax_num++;\n        }\n    }\n\n    if (parallax_num == 0)\n    {\n        return true;\n    }\n    else\n    {\n        ROS_DEBUG(\"parallax_sum: %lf, parallax_num: %d\", parallax_sum, parallax_num);\n        ROS_DEBUG(\"current parallax: %lf\", parallax_sum / parallax_num * FOCAL_LENGTH);\n        return parallax_sum / parallax_num >= MIN_PARALLAX;\n    }\n}\n\nvoid FeatureManager::debugShow()\n{\n    ROS_DEBUG(\"debug show\");\n    for (auto &it : feature)\n    {\n        ROS_ASSERT(it.feature_per_frame.size() != 0);\n        ROS_ASSERT(it.start_frame >= 0);\n        ROS_ASSERT(it.used_num >= 0);\n\n        ROS_DEBUG(\"%d,%d,%d \", it.feature_id, it.used_num, it.start_frame);\n        int sum = 0;\n        for (auto &j : it.feature_per_frame)\n        {\n            ROS_DEBUG(\"%d,\", int(j.is_used));\n            sum += j.is_used;\n            printf(\"(%lf,%lf) \",j.point(0), j.point(1));\n        }\n        ROS_ASSERT(it.used_num == sum);\n    }\n}\n\nmap<int, vector<pair<Vector3d, Vector3d>>> 
FeatureManager::getCorresponding(int frame_count_l, int frame_count_r)\n{\n    map<int, vector<pair<Vector3d, Vector3d>>> corres;\n    for (auto &it : feature)\n    {\n        if (it.start_frame <= frame_count_l && it.endFrame() >= frame_count_r)\n        {\n            Vector3d a = Vector3d::Zero(), b = Vector3d::Zero();\n            int idx_l = frame_count_l - it.start_frame;\n            int idx_r = frame_count_r - it.start_frame;\n\n            a = it.feature_per_frame[idx_l].point;\n\n            b = it.feature_per_frame[idx_r].point;\n            \n            corres[it.plane_id].push_back(make_pair(a, b));\n        }\n    }\n    return corres;\n}\n\nvector<pair<Vector3d, Vector3d>> FeatureManager::getCorrespondingByPlane(int frame_count_l, int frame_count_r, int plane_id)\n{\n    vector<pair<Vector3d, Vector3d>> corres;\n    for (auto &it : feature)\n    {\n        if(it.plane_id != plane_id)\n            continue;\n\n        if (it.start_frame <= frame_count_l && it.endFrame() >= frame_count_r)\n        {\n            Vector3d a = Vector3d::Zero(), b = Vector3d::Zero();\n            int idx_l = frame_count_l - it.start_frame;\n            int idx_r = frame_count_r - it.start_frame;\n\n            a = it.feature_per_frame[idx_l].point;\n\n            b = it.feature_per_frame[idx_r].point;\n            \n            corres.push_back(make_pair(a, b));\n        }\n    }\n    return corres;\n}\n\nvoid FeatureManager::setDepth(const VectorXd &x)\n{\n    int feature_index = -1;\n    for (auto &it_per_id : feature)\n    {\n        it_per_id.used_num = it_per_id.feature_per_frame.size();\n        if (!(it_per_id.used_num >= 2 && it_per_id.start_frame < WINDOW_SIZE - 2))\n            continue;\n\n        it_per_id.estimated_depth = 1.0 / x(++feature_index);\n        //ROS_INFO(\"feature id %d , start_frame %d, depth %f \", it_per_id->feature_id, it_per_id-> start_frame, it_per_id->estimated_depth);\n        if (it_per_id.estimated_depth < 0)\n        {\n            
it_per_id.solve_flag = 2;\n        }\n        else\n            it_per_id.solve_flag = 1;\n    }\n}\n\nvoid FeatureManager::removeFailures()\n{\n    for (auto it = feature.begin(), it_next = feature.begin();\n         it != feature.end(); it = it_next)\n    {\n        it_next++;\n        if (it->solve_flag == 2)\n            feature.erase(it);\n    }\n}\n\nvoid FeatureManager::clearDepth(const VectorXd &x)\n{\n    int feature_index = -1;\n    for (auto &it_per_id : feature)\n    {\n        it_per_id.used_num = it_per_id.feature_per_frame.size();\n        if (!(it_per_id.used_num >= 2 && it_per_id.start_frame < WINDOW_SIZE - 2))\n            continue;\n        it_per_id.estimated_depth = 1.0 / x(++feature_index);\n    }\n}\n\nVectorXd FeatureManager::getDepthVector()\n{\n    VectorXd dep_vec(getFeatureCount());\n    int feature_index = -1;\n    for (auto &it_per_id : feature)\n    {\n        it_per_id.used_num = it_per_id.feature_per_frame.size();\n        if (!(it_per_id.used_num >= 2 && it_per_id.start_frame < WINDOW_SIZE - 2))\n            continue;\n#if 1\n        dep_vec(++feature_index) = 1. 
/ it_per_id.estimated_depth;\n#else\n        dep_vec(++feature_index) = it_per_id->estimated_depth;\n#endif\n    }\n    return dep_vec;\n}\n\nvoid FeatureManager::triangulate(Vector3d Ps[], Vector3d tic[], Matrix3d ric[])\n{\n    for (auto &it_per_id : feature)\n    {\n        it_per_id.used_num = it_per_id.feature_per_frame.size();\n        if (!(it_per_id.used_num >= 2 && it_per_id.start_frame < WINDOW_SIZE - 2))\n            continue;\n\n        if (it_per_id.estimated_depth > 0)\n            continue;\n\n        int imu_i = it_per_id.start_frame, imu_j = imu_i - 1;\n\n        ROS_ASSERT(NUM_OF_CAM == 1);\n        Eigen::MatrixXd svd_A(2 * it_per_id.feature_per_frame.size(), 4);\n        int svd_idx = 0;\n\n        Eigen::Matrix<double, 3, 4> P0;\n        Eigen::Vector3d t0 = Ps[imu_i] + Rs[imu_i] * tic[0];\n        Eigen::Matrix3d R0 = Rs[imu_i] * ric[0];\n        P0.leftCols<3>() = Eigen::Matrix3d::Identity();\n        P0.rightCols<1>() = Eigen::Vector3d::Zero();\n\n        for (auto &it_per_frame : it_per_id.feature_per_frame)\n        {\n            imu_j++;\n\n            Eigen::Vector3d t1 = Ps[imu_j] + Rs[imu_j] * tic[0];\n            Eigen::Matrix3d R1 = Rs[imu_j] * ric[0];\n            Eigen::Vector3d t = R0.transpose() * (t1 - t0);\n            Eigen::Matrix3d R = R0.transpose() * R1;\n            Eigen::Matrix<double, 3, 4> P;\n            P.leftCols<3>() = R.transpose();\n            P.rightCols<1>() = -R.transpose() * t;\n            Eigen::Vector3d f = it_per_frame.point.normalized();\n            svd_A.row(svd_idx++) = f[0] * P.row(2) - f[2] * P.row(0);\n            svd_A.row(svd_idx++) = f[1] * P.row(2) - f[2] * P.row(1);\n\n            if (imu_i == imu_j)\n                continue;\n        }\n        ROS_ASSERT(svd_idx == svd_A.rows());\n        Eigen::Vector4d svd_V = Eigen::JacobiSVD<Eigen::MatrixXd>(svd_A, Eigen::ComputeThinV).matrixV().rightCols<1>();\n        double svd_method = svd_V[2] / svd_V[3];\n        //it_per_id->estimated_depth = 
-b / A;\n        //it_per_id->estimated_depth = svd_V[2] / svd_V[3];\n\n        it_per_id.estimated_depth = svd_method;\n        //it_per_id->estimated_depth = INIT_DEPTH;\n\n        if (it_per_id.estimated_depth < 0.1)\n        {\n            it_per_id.estimated_depth = INIT_DEPTH;\n        }\n\n    }\n}\n\nvoid FeatureManager::removeOutlier()\n{\n    ROS_BREAK();\n    int i = -1;\n    for (auto it = feature.begin(), it_next = feature.begin();\n         it != feature.end(); it = it_next)\n    {\n        it_next++;\n        i += it->used_num != 0;\n        if (it->used_num != 0 && it->is_outlier == true)\n        {\n            feature.erase(it);\n        }\n    }\n}\n\nvoid FeatureManager::removeBackShiftDepth(Eigen::Matrix3d marg_R, Eigen::Vector3d marg_P, Eigen::Matrix3d new_R, Eigen::Vector3d new_P)\n{\n    for (auto it = feature.begin(), it_next = feature.begin();\n         it != feature.end(); it = it_next)\n    {\n        it_next++;\n\n        if (it->start_frame != 0)\n            it->start_frame--;\n        else\n        {\n            Eigen::Vector3d uv_i = it->feature_per_frame[0].point;  \n            it->feature_per_frame.erase(it->feature_per_frame.begin());\n            if (it->feature_per_frame.size() < 2)\n            {\n                feature.erase(it);\n                continue;\n            }\n            else\n            {\n                Eigen::Vector3d pts_i = uv_i * it->estimated_depth;\n                Eigen::Vector3d w_pts_i = marg_R * pts_i + marg_P;\n                Eigen::Vector3d pts_j = new_R.transpose() * (w_pts_i - new_P);\n                double dep_j = pts_j(2);\n                if (dep_j > 0)\n                    it->estimated_depth = dep_j;\n                else\n                    it->estimated_depth = INIT_DEPTH;\n            }\n        }\n        // remove tracking-lost feature after marginalize\n        /*\n        if (it->endFrame() < WINDOW_SIZE - 1)\n        {\n            feature.erase(it);\n        }\n        */\n   
 }\n}\n\nvoid FeatureManager::removeBack()\n{\n    for (auto it = feature.begin(), it_next = feature.begin();\n         it != feature.end(); it = it_next)\n    {\n        it_next++;\n\n        if (it->start_frame != 0)\n            it->start_frame--;\n        else\n        {\n            it->feature_per_frame.erase(it->feature_per_frame.begin());\n            if (it->feature_per_frame.size() == 0)\n                feature.erase(it);\n        }\n    }\n}\n\nvoid FeatureManager::removeFront(int frame_count)\n{\n    for (auto it = feature.begin(), it_next = feature.begin(); it != feature.end(); it = it_next)\n    {\n        it_next++;\n\n        if (it->start_frame == frame_count)\n        {\n            it->start_frame--;\n        }\n        else\n        {\n            int j = WINDOW_SIZE - 1 - it->start_frame;\n            if (it->endFrame() < frame_count - 1)\n                continue;\n            it->feature_per_frame.erase(it->feature_per_frame.begin() + j);\n            if (it->feature_per_frame.size() == 0)\n                feature.erase(it);\n        }\n    }\n}\n\ndouble FeatureManager::compensatedParallax2(const FeaturePerId &it_per_id, int frame_count)\n{\n    //check whether the second-to-last frame is a keyframe\n    //parallax between the second-to-last frame and the third-to-last frame\n    const FeaturePerFrame &frame_i = it_per_id.feature_per_frame[frame_count - 2 - it_per_id.start_frame];\n    const FeaturePerFrame &frame_j = it_per_id.feature_per_frame[frame_count - 1 - it_per_id.start_frame];\n\n    double ans = 0;\n    Vector3d p_j = frame_j.point;\n\n    double u_j = p_j(0);\n    double v_j = p_j(1);\n\n    Vector3d p_i = frame_i.point;\n    Vector3d p_i_comp;\n\n    //int r_i = frame_count - 2;\n    //int r_j = frame_count - 1;\n    //p_i_comp = ric[camera_id_j].transpose() * Rs[r_j].transpose() * Rs[r_i] * ric[camera_id_i] * p_i;\n    p_i_comp = p_i;\n    double dep_i = p_i(2);\n    double u_i = p_i(0) / dep_i;\n    double v_i = p_i(1) / dep_i;\n    double du = 
u_i - u_j, dv = v_i - v_j;\n\n    double dep_i_comp = p_i_comp(2);\n    double u_i_comp = p_i_comp(0) / dep_i_comp;\n    double v_i_comp = p_i_comp(1) / dep_i_comp;\n    double du_comp = u_i_comp - u_j, dv_comp = v_i_comp - v_j;\n\n    ans = max(ans, sqrt(min(du * du + dv * dv, du_comp * du_comp + dv_comp * dv_comp)));\n\n    return ans;\n}"
  },
  {
    "path": "rpvio_estimator/src/feature_manager.h",
    "content": "#ifndef FEATURE_MANAGER_H\n#define FEATURE_MANAGER_H\n\n#include <list>\n#include <algorithm>\n#include <vector>\n#include <numeric>\nusing namespace std;\n\n#include <eigen3/Eigen/Dense>\nusing namespace Eigen;\n\n#include <ros/console.h>\n#include <ros/assert.h>\n\n#include \"parameters.h\"\n\nclass FeaturePerFrame\n{\n  public:\n    FeaturePerFrame(const Eigen::Matrix<double, 8, 1> &_point, double td)\n    {\n        point.x() = _point(0);\n        point.y() = _point(1);\n        point.z() = _point(2);\n        uv.x() = _point(3);\n        uv.y() = _point(4);\n        velocity.x() = _point(5); \n        velocity.y() = _point(6); \n        cur_td = td;\n    }\n    double cur_td;\n    Vector3d point;\n    Vector2d uv;\n    Vector2d velocity;\n    double z;\n    bool is_used;\n    double parallax;\n    MatrixXd A;\n    VectorXd b;\n    double dep_gradient;\n};\n\nclass FeaturePerId\n{\n  public:\n    const int feature_id, plane_id;\n    int start_frame;\n    vector<FeaturePerFrame> feature_per_frame;\n\n    int used_num;\n    bool is_outlier;\n    bool is_margin;\n    double estimated_depth;\n    int solve_flag; // 0: not solved yet; 1: solve succeeded; 2: solve failed;\n\n    Vector3d gt_p;\n\n    FeaturePerId(int _feature_id, int _plane_id, int _start_frame)\n        : feature_id(_feature_id), plane_id(_plane_id), start_frame(_start_frame),\n          used_num(0), estimated_depth(-1.0), solve_flag(0)\n    {\n    }\n\n    int endFrame();\n};\n\nclass FeatureManager\n{\n  public:\n    FeatureManager(Matrix3d _Rs[]);\n\n    void setRic(Matrix3d _ric[]);\n\n    void clearState();\n\n    int getFeatureCount();\n    int getLargestPlaneId(std::vector<int> &init_pids);\n\n    bool addFeatureCheckParallax(int frame_count, const map<int, vector<pair<int, Eigen::Matrix<double, 8, 1>>>> &image, double td);\n    void debugShow();\n    map<int, vector<pair<Vector3d, Vector3d>>> getCorresponding(int frame_count_l, int frame_count_r);\n    vector<pair<Vector3d, 
Vector3d>> getCorrespondingByPlane(int frame_count_l, int frame_count_r, int plane_id);\n    set<int> plane_ids;\n\n    //void updateDepth(const VectorXd &x);\n    void setDepth(const VectorXd &x);\n    void removeFailures();\n    void clearDepth(const VectorXd &x);\n    VectorXd getDepthVector();\n    void triangulate(Vector3d Ps[], Vector3d tic[], Matrix3d ric[]);\n    void removeBackShiftDepth(Eigen::Matrix3d marg_R, Eigen::Vector3d marg_P, Eigen::Matrix3d new_R, Eigen::Vector3d new_P);\n    void removeBack();\n    void removeFront(int frame_count);\n    void removeOutlier();\n    list<FeaturePerId> feature;\n    int last_track_num;\n\n  private:\n    double compensatedParallax2(const FeaturePerId &it_per_id, int frame_count);\n    const Matrix3d *Rs;\n    Matrix3d ric[NUM_OF_CAM];\n};\n\n#endif"
  },
  {
    "path": "rpvio_estimator/src/initial/initial_aligment.cpp",
    "content": "#include \"initial_alignment.h\"\n\nvoid solveGyroscopeBias(map<double, ImageFrame> &all_image_frame, Vector3d* Bgs)\n{\n    Matrix3d A;\n    Vector3d b;\n    Vector3d delta_bg;\n    A.setZero();\n    b.setZero();\n    map<double, ImageFrame>::iterator frame_i;\n    map<double, ImageFrame>::iterator frame_j;\n    for (frame_i = all_image_frame.begin(); next(frame_i) != all_image_frame.end(); frame_i++)\n    {\n        frame_j = next(frame_i);\n        MatrixXd tmp_A(3, 3);\n        tmp_A.setZero();\n        VectorXd tmp_b(3);\n        tmp_b.setZero();\n        Eigen::Quaterniond q_ij(frame_i->second.R.transpose() * frame_j->second.R);\n        tmp_A = frame_j->second.pre_integration->jacobian.template block<3, 3>(O_R, O_BG);\n        tmp_b = 2 * (frame_j->second.pre_integration->delta_q.inverse() * q_ij).vec();\n        A += tmp_A.transpose() * tmp_A;\n        b += tmp_A.transpose() * tmp_b;\n\n    }\n    delta_bg = A.ldlt().solve(b);\n    ROS_WARN_STREAM(\"gyroscope bias initial calibration \" << delta_bg.transpose());\n\n    for (int i = 0; i <= WINDOW_SIZE; i++)\n        Bgs[i] += delta_bg;\n\n    for (frame_i = all_image_frame.begin(); next(frame_i) != all_image_frame.end( ); frame_i++)\n    {\n        frame_j = next(frame_i);\n        frame_j->second.pre_integration->repropagate(Vector3d::Zero(), Bgs[0]);\n    }\n}\n\n\nMatrixXd TangentBasis(Vector3d &g0)\n{\n    Vector3d b, c;\n    Vector3d a = g0.normalized();\n    Vector3d tmp(0, 0, 1);\n    if(a == tmp)\n        tmp << 1, 0, 0;\n    b = (tmp - a * (a.transpose() * tmp)).normalized();\n    c = a.cross(b);\n    MatrixXd bc(3, 2);\n    bc.block<3, 1>(0, 0) = b;\n    bc.block<3, 1>(0, 1) = c;\n    return bc;\n}\n\nvoid RefineGravity(map<double, ImageFrame> &all_image_frame, Vector3d &g, VectorXd &x)\n{\n    Vector3d g0 = g.normalized() * G.norm();\n    Vector3d lx, ly;\n    //VectorXd x;\n    int all_frame_count = all_image_frame.size();\n    int n_state = all_frame_count * 3 + 2 + 1;\n\n    
MatrixXd A{n_state, n_state};\n    A.setZero();\n    VectorXd b{n_state};\n    b.setZero();\n\n    map<double, ImageFrame>::iterator frame_i;\n    map<double, ImageFrame>::iterator frame_j;\n    for(int k = 0; k < 4; k++)\n    {\n        MatrixXd lxly(3, 2);\n        lxly = TangentBasis(g0);\n        int i = 0;\n        for (frame_i = all_image_frame.begin(); next(frame_i) != all_image_frame.end(); frame_i++, i++)\n        {\n            frame_j = next(frame_i);\n\n            MatrixXd tmp_A(6, 9);\n            tmp_A.setZero();\n            VectorXd tmp_b(6);\n            tmp_b.setZero();\n\n            double dt = frame_j->second.pre_integration->sum_dt;\n\n\n            tmp_A.block<3, 3>(0, 0) = -dt * Matrix3d::Identity();\n            tmp_A.block<3, 2>(0, 6) = frame_i->second.R.transpose() * dt * dt / 2 * Matrix3d::Identity() * lxly;\n            tmp_A.block<3, 1>(0, 8) = frame_i->second.R.transpose() * (frame_j->second.T - frame_i->second.T) / 100.0;     \n            tmp_b.block<3, 1>(0, 0) = frame_j->second.pre_integration->delta_p + frame_i->second.R.transpose() * frame_j->second.R * TIC[0] - TIC[0] - frame_i->second.R.transpose() * dt * dt / 2 * g0;\n\n            tmp_A.block<3, 3>(3, 0) = -Matrix3d::Identity();\n            tmp_A.block<3, 3>(3, 3) = frame_i->second.R.transpose() * frame_j->second.R;\n            tmp_A.block<3, 2>(3, 6) = frame_i->second.R.transpose() * dt * Matrix3d::Identity() * lxly;\n            tmp_b.block<3, 1>(3, 0) = frame_j->second.pre_integration->delta_v - frame_i->second.R.transpose() * dt * Matrix3d::Identity() * g0;\n\n\n            Matrix<double, 6, 6> cov_inv = Matrix<double, 6, 6>::Zero();\n            //cov.block<6, 6>(0, 0) = IMU_cov[i + 1];\n            //MatrixXd cov_inv = cov.inverse();\n            cov_inv.setIdentity();\n\n            MatrixXd r_A = tmp_A.transpose() * cov_inv * tmp_A;\n            VectorXd r_b = tmp_A.transpose() * cov_inv * tmp_b;\n\n            A.block<6, 6>(i * 3, i * 3) += r_A.topLeftCorner<6, 
6>();\n            b.segment<6>(i * 3) += r_b.head<6>();\n\n            A.bottomRightCorner<3, 3>() += r_A.bottomRightCorner<3, 3>();\n            b.tail<3>() += r_b.tail<3>();\n\n            A.block<6, 3>(i * 3, n_state - 3) += r_A.topRightCorner<6, 3>();\n            A.block<3, 6>(n_state - 3, i * 3) += r_A.bottomLeftCorner<3, 6>();\n        }\n            A = A * 1000.0;\n            b = b * 1000.0;\n            x = A.ldlt().solve(b);\n            VectorXd dg = x.segment<2>(n_state - 3);\n            g0 = (g0 + lxly * dg).normalized() * G.norm();\n            //double s = x(n_state - 1);\n    }   \n    g = g0;\n}\n\nbool LinearAlignment(map<double, ImageFrame> &all_image_frame, Vector3d &g, VectorXd &x)\n{\n    int all_frame_count = all_image_frame.size();\n    int n_state = all_frame_count * 3 + 3 + 1;\n\n    MatrixXd A{n_state, n_state};\n    A.setZero();\n    VectorXd b{n_state};\n    b.setZero();\n\n    map<double, ImageFrame>::iterator frame_i;\n    map<double, ImageFrame>::iterator frame_j;\n    int i = 0;\n    for (frame_i = all_image_frame.begin(); next(frame_i) != all_image_frame.end(); frame_i++, i++)\n    {\n        frame_j = next(frame_i);\n\n        MatrixXd tmp_A(6, 10);\n        tmp_A.setZero();\n        VectorXd tmp_b(6);\n        tmp_b.setZero();\n\n        double dt = frame_j->second.pre_integration->sum_dt;\n\n        tmp_A.block<3, 3>(0, 0) = -dt * Matrix3d::Identity();\n        tmp_A.block<3, 3>(0, 6) = frame_i->second.R.transpose() * dt * dt / 2 * Matrix3d::Identity();\n        tmp_A.block<3, 1>(0, 9) = frame_i->second.R.transpose() * (frame_j->second.T - frame_i->second.T) / 100.0;     \n        tmp_b.block<3, 1>(0, 0) = frame_j->second.pre_integration->delta_p + frame_i->second.R.transpose() * frame_j->second.R * TIC[0] - TIC[0];\n        //cout << \"delta_p   \" << frame_j->second.pre_integration->delta_p.transpose() << endl;\n        tmp_A.block<3, 3>(3, 0) = -Matrix3d::Identity();\n        tmp_A.block<3, 3>(3, 3) = 
frame_i->second.R.transpose() * frame_j->second.R;\n        tmp_A.block<3, 3>(3, 6) = frame_i->second.R.transpose() * dt * Matrix3d::Identity();\n        tmp_b.block<3, 1>(3, 0) = frame_j->second.pre_integration->delta_v;\n        //cout << \"delta_v   \" << frame_j->second.pre_integration->delta_v.transpose() << endl;\n\n        Matrix<double, 6, 6> cov_inv = Matrix<double, 6, 6>::Zero();\n        //cov.block<6, 6>(0, 0) = IMU_cov[i + 1];\n        //MatrixXd cov_inv = cov.inverse();\n        cov_inv.setIdentity();\n\n        MatrixXd r_A = tmp_A.transpose() * cov_inv * tmp_A;\n        VectorXd r_b = tmp_A.transpose() * cov_inv * tmp_b;\n\n        A.block<6, 6>(i * 3, i * 3) += r_A.topLeftCorner<6, 6>();\n        b.segment<6>(i * 3) += r_b.head<6>();\n\n        A.bottomRightCorner<4, 4>() += r_A.bottomRightCorner<4, 4>();\n        b.tail<4>() += r_b.tail<4>();\n\n        A.block<6, 4>(i * 3, n_state - 4) += r_A.topRightCorner<6, 4>();\n        A.block<4, 6>(n_state - 4, i * 3) += r_A.bottomLeftCorner<4, 6>();\n    }\n    A = A * 1000.0;\n    b = b * 1000.0;\n    x = A.ldlt().solve(b);\n    double s = x(n_state - 1) / 100.0;\n    ROS_DEBUG(\"estimated scale: %f\", s);\n    g = x.segment<3>(n_state - 4);\n    ROS_DEBUG_STREAM(\" result g     \" << g.norm() << \" \" << g.transpose());\n    if(fabs(g.norm() - G.norm()) > 1.0 || s < 0)\n    {\n        return false;\n    }\n\n    RefineGravity(all_image_frame, g, x);\n    s = (x.tail<1>())(0) / 100.0;\n    (x.tail<1>())(0) = s;\n    ROS_DEBUG_STREAM(\" refine     \" << g.norm() << \" \" << g.transpose());\n    if(s < 0.0 )\n        return false;   \n    else\n        return true;\n}\n\nbool VisualIMUAlignment(map<double, ImageFrame> &all_image_frame, Vector3d* Bgs, Vector3d &g, VectorXd &x)\n{\n    solveGyroscopeBias(all_image_frame, Bgs);\n\n    if(LinearAlignment(all_image_frame, g, x))\n        return true;\n    else \n        return false;\n}\n"
  },
  {
    "path": "rpvio_estimator/src/initial/initial_alignment.h",
    "content": "#pragma once\n#include <eigen3/Eigen/Dense>\n#include <iostream>\n#include \"../factor/imu_factor.h\"\n#include \"../utility/utility.h\"\n#include <ros/ros.h>\n#include <map>\n#include \"../feature_manager.h\"\n\nusing namespace Eigen;\nusing namespace std;\n\nclass ImageFrame\n{\n    public:\n        ImageFrame(){};\n        ImageFrame(const map<int, vector<pair<int, Eigen::Matrix<double, 8, 1>>>>& _points, double _t):t{_t},is_key_frame{false}\n        {\n            points = _points;\n        };\n        map<int, vector<pair<int, Eigen::Matrix<double, 8, 1>> > > points;\n        double t;\n        Matrix3d R;\n        Vector3d T;\n        IntegrationBase *pre_integration;\n        bool is_key_frame;\n};\n\nbool VisualIMUAlignment(map<double, ImageFrame> &all_image_frame, Vector3d* Bgs, Vector3d &g, VectorXd &x);"
  },
  {
    "path": "rpvio_estimator/src/initial/initial_ex_rotation.cpp",
    "content": "#include \"initial_ex_rotation.h\"\n\nInitialEXRotation::InitialEXRotation(){\n    frame_count = 0;\n    Rc.push_back(Matrix3d::Identity());\n    Rc_g.push_back(Matrix3d::Identity());\n    Rimu.push_back(Matrix3d::Identity());\n    ric = Matrix3d::Identity();\n}\n\nbool InitialEXRotation::CalibrationExRotation(vector<pair<Vector3d, Vector3d>> corres, Quaterniond delta_q_imu, Matrix3d &calib_ric_result)\n{\n    frame_count++;\n    Rc.push_back(solveRelativeR(corres));\n    Rimu.push_back(delta_q_imu.toRotationMatrix());\n    Rc_g.push_back(ric.inverse() * delta_q_imu * ric);\n\n    Eigen::MatrixXd A(frame_count * 4, 4);\n    A.setZero();\n    int sum_ok = 0;\n    for (int i = 1; i <= frame_count; i++)\n    {\n        Quaterniond r1(Rc[i]);\n        Quaterniond r2(Rc_g[i]);\n\n        double angular_distance = 180 / M_PI * r1.angularDistance(r2);\n        ROS_DEBUG(\n            \"%d %f\", i, angular_distance);\n\n        double huber = angular_distance > 5.0 ? 5.0 / angular_distance : 1.0;\n        ++sum_ok;\n        Matrix4d L, R;\n\n        double w = Quaterniond(Rc[i]).w();\n        Vector3d q = Quaterniond(Rc[i]).vec();\n        L.block<3, 3>(0, 0) = w * Matrix3d::Identity() + Utility::skewSymmetric(q);\n        L.block<3, 1>(0, 3) = q;\n        L.block<1, 3>(3, 0) = -q.transpose();\n        L(3, 3) = w;\n\n        Quaterniond R_ij(Rimu[i]);\n        w = R_ij.w();\n        q = R_ij.vec();\n        R.block<3, 3>(0, 0) = w * Matrix3d::Identity() - Utility::skewSymmetric(q);\n        R.block<3, 1>(0, 3) = q;\n        R.block<1, 3>(3, 0) = -q.transpose();\n        R(3, 3) = w;\n\n        A.block<4, 4>((i - 1) * 4, 0) = huber * (L - R);\n    }\n\n    JacobiSVD<MatrixXd> svd(A, ComputeFullU | ComputeFullV);\n    Matrix<double, 4, 1> x = svd.matrixV().col(3);\n    Quaterniond estimated_R(x);\n    ric = estimated_R.toRotationMatrix().inverse();\n    //cout << svd.singularValues().transpose() << endl;\n    //cout << ric << endl;\n    Vector3d ric_cov;\n   
 ric_cov = svd.singularValues().tail<3>();\n    if (frame_count >= WINDOW_SIZE && ric_cov(1) > 0.25)\n    {\n        calib_ric_result = ric;\n        return true;\n    }\n    else\n        return false;\n}\n\nMatrix3d InitialEXRotation::solveRelativeR(const vector<pair<Vector3d, Vector3d>> &corres)\n{\n    if (corres.size() >= 9)\n    {\n        vector<cv::Point2f> ll, rr;\n        for (int i = 0; i < int(corres.size()); i++)\n        {\n            ll.push_back(cv::Point2f(corres[i].first(0), corres[i].first(1)));\n            rr.push_back(cv::Point2f(corres[i].second(0), corres[i].second(1)));\n        }\n        cv::Mat E = cv::findFundamentalMat(ll, rr);\n        cv::Mat_<double> R1, R2, t1, t2;\n        decomposeE(E, R1, R2, t1, t2);\n\n        if (determinant(R1) + 1.0 < 1e-09)\n        {\n            E = -E;\n            decomposeE(E, R1, R2, t1, t2);\n        }\n        double ratio1 = max(testTriangulation(ll, rr, R1, t1), testTriangulation(ll, rr, R1, t2));\n        double ratio2 = max(testTriangulation(ll, rr, R2, t1), testTriangulation(ll, rr, R2, t2));\n        cv::Mat_<double> ans_R_cv = ratio1 > ratio2 ? 
R1 : R2;\n\n        Matrix3d ans_R_eigen;\n        for (int i = 0; i < 3; i++)\n            for (int j = 0; j < 3; j++)\n                ans_R_eigen(j, i) = ans_R_cv(i, j);\n        return ans_R_eigen;\n    }\n    return Matrix3d::Identity();\n}\n\ndouble InitialEXRotation::testTriangulation(const vector<cv::Point2f> &l,\n                                          const vector<cv::Point2f> &r,\n                                          cv::Mat_<double> R, cv::Mat_<double> t)\n{\n    cv::Mat pointcloud;\n    cv::Matx34f P = cv::Matx34f(1, 0, 0, 0,\n                                0, 1, 0, 0,\n                                0, 0, 1, 0);\n    cv::Matx34f P1 = cv::Matx34f(R(0, 0), R(0, 1), R(0, 2), t(0),\n                                 R(1, 0), R(1, 1), R(1, 2), t(1),\n                                 R(2, 0), R(2, 1), R(2, 2), t(2));\n    cv::triangulatePoints(P, P1, l, r, pointcloud);\n    int front_count = 0;\n    for (int i = 0; i < pointcloud.cols; i++)\n    {\n        double normal_factor = pointcloud.col(i).at<float>(3);\n\n        cv::Mat_<double> p_3d_l = cv::Mat(P) * (pointcloud.col(i) / normal_factor);\n        cv::Mat_<double> p_3d_r = cv::Mat(P1) * (pointcloud.col(i) / normal_factor);\n        if (p_3d_l(2) > 0 && p_3d_r(2) > 0)\n            front_count++;\n    }\n    ROS_DEBUG(\"MotionEstimator: %f\", 1.0 * front_count / pointcloud.cols);\n    return 1.0 * front_count / pointcloud.cols;\n}\n\nvoid InitialEXRotation::decomposeE(cv::Mat E,\n                                 cv::Mat_<double> &R1, cv::Mat_<double> &R2,\n                                 cv::Mat_<double> &t1, cv::Mat_<double> &t2)\n{\n    cv::SVD svd(E, cv::SVD::MODIFY_A);\n    cv::Matx33d W(0, -1, 0,\n                  1, 0, 0,\n                  0, 0, 1);\n    cv::Matx33d Wt(0, 1, 0,\n                   -1, 0, 0,\n                   0, 0, 1);\n    R1 = svd.u * cv::Mat(W) * svd.vt;\n    R2 = svd.u * cv::Mat(Wt) * svd.vt;\n    t1 = svd.u.col(2);\n    t2 = -svd.u.col(2);\n}\n"
  },
  {
    "path": "rpvio_estimator/src/initial/initial_ex_rotation.h",
    "content": "#pragma once \n\n#include <vector>\n#include \"../parameters.h\"\nusing namespace std;\n\n#include <opencv2/opencv.hpp>\n\n#include <eigen3/Eigen/Dense>\nusing namespace Eigen;\n#include <ros/console.h>\n\n/* This class helps you calibrate the extrinsic rotation between the IMU and the camera when the extrinsic parameters are completely unknown */\nclass InitialEXRotation\n{\npublic:\n\tInitialEXRotation();\n    bool CalibrationExRotation(vector<pair<Vector3d, Vector3d>> corres, Quaterniond delta_q_imu, Matrix3d &calib_ric_result);\nprivate:\n\tMatrix3d solveRelativeR(const vector<pair<Vector3d, Vector3d>> &corres);\n\n    double testTriangulation(const vector<cv::Point2f> &l,\n                             const vector<cv::Point2f> &r,\n                             cv::Mat_<double> R, cv::Mat_<double> t);\n    void decomposeE(cv::Mat E,\n                    cv::Mat_<double> &R1, cv::Mat_<double> &R2,\n                    cv::Mat_<double> &t1, cv::Mat_<double> &t2);\n    int frame_count;\n\n    vector< Matrix3d > Rc;\n    vector< Matrix3d > Rimu;\n    vector< Matrix3d > Rc_g;\n    Matrix3d ric;\n};\n\n\n"
  },
  {
    "path": "rpvio_estimator/src/initial/initial_sfm.cpp",
    "content": "#include \"initial_sfm.h\"\n\nGlobalSFM::GlobalSFM(){}\n\nvoid GlobalSFM::triangulatePoint(Eigen::Matrix<double, 3, 4> &Pose0, Eigen::Matrix<double, 3, 4> &Pose1,\n\t\t\t\t\t\tVector2d &point0, Vector2d &point1, Vector3d &point_3d)\n{\n\tMatrix4d design_matrix = Matrix4d::Zero();\n\tdesign_matrix.row(0) = point0[0] * Pose0.row(2) - Pose0.row(0);\n\tdesign_matrix.row(1) = point0[1] * Pose0.row(2) - Pose0.row(1);\n\tdesign_matrix.row(2) = point1[0] * Pose1.row(2) - Pose1.row(0);\n\tdesign_matrix.row(3) = point1[1] * Pose1.row(2) - Pose1.row(1);\n\tVector4d triangulated_point;\n\ttriangulated_point =\n\t\t      design_matrix.jacobiSvd(Eigen::ComputeFullV).matrixV().rightCols<1>();\n\tpoint_3d(0) = triangulated_point(0) / triangulated_point(3);\n\tpoint_3d(1) = triangulated_point(1) / triangulated_point(3);\n\tpoint_3d(2) = triangulated_point(2) / triangulated_point(3);\n}\n\n\nbool GlobalSFM::solveFrameByPnP(Matrix3d &R_initial, Vector3d &P_initial, int i,\n\t\t\t\t\t\t\t\tvector<SFMFeature> &sfm_f)\n{\n\tvector<cv::Point2f> pts_2_vector;\n\tvector<cv::Point3f> pts_3_vector;\n\tfor (int j = 0; j < feature_num; j++)\n\t{\n\t\tif (sfm_f[j].state != true)\n\t\t\tcontinue;\n\t\tVector2d point2d;\n\t\tfor (int k = 0; k < (int)sfm_f[j].observation.size(); k++)\n\t\t{\n\t\t\tif (sfm_f[j].observation[k].first == i)\n\t\t\t{\n\t\t\t\tVector2d img_pts = sfm_f[j].observation[k].second;\n\t\t\t\tcv::Point2f pts_2(img_pts(0), img_pts(1));\n\t\t\t\tpts_2_vector.push_back(pts_2);\n\t\t\t\tcv::Point3f pts_3(sfm_f[j].position[0], sfm_f[j].position[1], sfm_f[j].position[2]);\n\t\t\t\tpts_3_vector.push_back(pts_3);\n\t\t\t\tbreak;\n\t\t\t}\n\t\t}\n\t}\n\tif (int(pts_2_vector.size()) < 15)\n\t{\n\t\tprintf(\"unstable feature tracking, please move your device slowly!\\n\");\n\t\tif (int(pts_2_vector.size()) < 10)\n\t\t\treturn false;\n\t}\n\tcv::Mat r, rvec, t, D, tmp_r;\n\tcv::eigen2cv(R_initial, tmp_r);\n\tcv::Rodrigues(tmp_r, rvec);\n\tcv::eigen2cv(P_initial, 
t);\n\tcv::Mat K = (cv::Mat_<double>(3, 3) << 1, 0, 0, 0, 1, 0, 0, 0, 1);\n\tbool pnp_succ;\n\tpnp_succ = cv::solvePnP(pts_3_vector, pts_2_vector, K, D, rvec, t, 1);\n\tif(!pnp_succ)\n\t{\n\t\treturn false;\n\t}\n\tcv::Rodrigues(rvec, r);\n\t//cout << \"r \" << endl << r << endl;\n\tMatrixXd R_pnp;\n\tcv::cv2eigen(r, R_pnp);\n\tMatrixXd T_pnp;\n\tcv::cv2eigen(t, T_pnp);\n\tR_initial = R_pnp;\n\tP_initial = T_pnp;\n\treturn true;\n\n}\n\nvoid GlobalSFM::triangulateTwoFrames(int frame0, Eigen::Matrix<double, 3, 4> &Pose0, \n\t\t\t\t\t\t\t\t\t int frame1, Eigen::Matrix<double, 3, 4> &Pose1,\n\t\t\t\t\t\t\t\t\t vector<SFMFeature> &sfm_f)\n{\n\tassert(frame0 != frame1);\n\tfor (int j = 0; j < feature_num; j++)\n\t{\n\t\tif (sfm_f[j].state == true)\n\t\t\tcontinue;\n\t\tbool has_0 = false, has_1 = false;\n\t\tVector2d point0;\n\t\tVector2d point1;\n\t\tfor (int k = 0; k < (int)sfm_f[j].observation.size(); k++)\n\t\t{\n\t\t\tif (sfm_f[j].observation[k].first == frame0)\n\t\t\t{\n\t\t\t\tpoint0 = sfm_f[j].observation[k].second;\n\t\t\t\thas_0 = true;\n\t\t\t}\n\t\t\tif (sfm_f[j].observation[k].first == frame1)\n\t\t\t{\n\t\t\t\tpoint1 = sfm_f[j].observation[k].second;\n\t\t\t\thas_1 = true;\n\t\t\t}\n\t\t}\n\t\tif (has_0 && has_1)\n\t\t{\n\t\t\tVector3d point_3d;\n\t\t\ttriangulatePoint(Pose0, Pose1, point0, point1, point_3d);\n\t\t\tsfm_f[j].state = true;\n\t\t\tsfm_f[j].position[0] = point_3d(0);\n\t\t\tsfm_f[j].position[1] = point_3d(1);\n\t\t\tsfm_f[j].position[2] = point_3d(2);\n            sfm_f[j].image[0] = point1[0];\n            sfm_f[j].image[1] = point1[1];\n            sfm_f[j].image[2] = 1;\n\t\t\t//cout << \"trangulated : \" << frame1 << \"  3d point : \"  << j << \"  \" << point_3d.transpose() << endl;\n\t\t}\t\t\t\t\t\t\t  \n\t}\n}\n\n// \t q w_R_cam t w_R_cam\n//  c_rotation cam_R_w \n//  c_translation cam_R_w\n// relative_q[i][j]  j_q_i\n// relative_t[i][j]  j_t_ji  (j < i)\nbool GlobalSFM::construct(int frame_num, Quaterniond* q, Vector3d* T, int 
l,\n\t\t\t  const Matrix3d relative_R, const Vector3d relative_T,\n\t\t\t  vector<SFMFeature> &sfm_f, map<int, Vector3d> &sfm_tracked_points)\n{\n\tfeature_num = sfm_f.size();\n\t//cout << \"set 0 and \" << l << \" as known \" << endl;\n\t// have relative_r relative_t\n\t// intial two view\n\tq[l].w() = 1;\n\tq[l].x() = 0;\n\tq[l].y() = 0;\n\tq[l].z() = 0;\n\tT[l].setZero();\n\tq[frame_num - 1] = q[l] * Quaterniond(relative_R);\n\tT[frame_num - 1] = relative_T;\n\t//cout << \"init q_l \" << q[l].w() << \" \" << q[l].vec().transpose() << endl;\n\t//cout << \"init t_l \" << T[l].transpose() << endl;\n\n\t//rotate to cam frame\n\tMatrix3d c_Rotation[frame_num];\n\tVector3d c_Translation[frame_num];\n\tQuaterniond c_Quat[frame_num];\n\tdouble c_rotation[frame_num][4];\n\tdouble c_translation[frame_num][3];\n\tEigen::Matrix<double, 3, 4> Pose[frame_num];\n\n\tc_Quat[l] = q[l].inverse();\n\tc_Rotation[l] = c_Quat[l].toRotationMatrix();\n\tc_Translation[l] = -1 * (c_Rotation[l] * T[l]);\n\tPose[l].block<3, 3>(0, 0) = c_Rotation[l];\n\tPose[l].block<3, 1>(0, 3) = c_Translation[l];\n\n\tc_Quat[frame_num - 1] = q[frame_num - 1].inverse();\n\tc_Rotation[frame_num - 1] = c_Quat[frame_num - 1].toRotationMatrix();\n\tc_Translation[frame_num - 1] = -1 * (c_Rotation[frame_num - 1] * T[frame_num - 1]);\n\tPose[frame_num - 1].block<3, 3>(0, 0) = c_Rotation[frame_num - 1];\n\tPose[frame_num - 1].block<3, 1>(0, 3) = c_Translation[frame_num - 1];\n\n\n\t//1: trangulate between l ----- frame_num - 1\n\t//2: solve pnp l + 1; trangulate l + 1 ------- frame_num - 1; \n\tfor (int i = l; i < frame_num - 1 ; i++)\n\t{\n\t\t// solve pnp\n\t\tif (i > l)\n\t\t{\n\t\t\tMatrix3d R_initial = c_Rotation[i - 1];\n\t\t\tVector3d P_initial = c_Translation[i - 1];\n\t\t\tif(!solveFrameByPnP(R_initial, P_initial, i, sfm_f))\n\t\t\t\treturn false;\n\t\t\tc_Rotation[i] = R_initial;\n\t\t\tc_Translation[i] = P_initial;\n\t\t\tc_Quat[i] = c_Rotation[i];\n\t\t\tPose[i].block<3, 3>(0, 0) = 
c_Rotation[i];\n\t\t\tPose[i].block<3, 1>(0, 3) = c_Translation[i];\n\t\t}\n\n\t\t// triangulate point based on the solve pnp result\n\t\ttriangulateTwoFrames(i, Pose[i], frame_num - 1, Pose[frame_num - 1], sfm_f);\n\t}\n\t//3: triangulate l-----l+1 l+2 ... frame_num -2\n\tfor (int i = l + 1; i < frame_num - 1; i++)\n\t\ttriangulateTwoFrames(l, Pose[l], i, Pose[i], sfm_f);\n\t//4: solve pnp l-1; triangulate l-1 ----- l\n\t//             l-2              l-2 ----- l\n\tfor (int i = l - 1; i >= 0; i--)\n\t{\n\t\t//solve pnp\n\t\tMatrix3d R_initial = c_Rotation[i + 1];\n\t\tVector3d P_initial = c_Translation[i + 1];\n\t\tif(!solveFrameByPnP(R_initial, P_initial, i, sfm_f))\n\t\t\treturn false;\n\t\tc_Rotation[i] = R_initial;\n\t\tc_Translation[i] = P_initial;\n\t\tc_Quat[i] = c_Rotation[i];\n\t\tPose[i].block<3, 3>(0, 0) = c_Rotation[i];\n\t\tPose[i].block<3, 1>(0, 3) = c_Translation[i];\n\t\t//triangulate\n\t\ttriangulateTwoFrames(i, Pose[i], l, Pose[l], sfm_f);\n\t}\n\t//5: triangulate all other points\n\tfor (int j = 0; j < feature_num; j++)\n\t{\n\t\tif (sfm_f[j].state == true)\n\t\t\tcontinue;\n\t\tif ((int)sfm_f[j].observation.size() >= 2)\n\t\t{\n\t\t\tVector2d point0, point1;\n\t\t\tint frame_0 = sfm_f[j].observation[0].first;\n\t\t\tpoint0 = sfm_f[j].observation[0].second;\n\t\t\tint frame_1 = sfm_f[j].observation.back().first;\n\t\t\tpoint1 = sfm_f[j].observation.back().second;\n\t\t\tVector3d point_3d;\n\t\t\ttriangulatePoint(Pose[frame_0], Pose[frame_1], point0, point1, point_3d);\n\t\t\tsfm_f[j].state = true;\n\t\t\tsfm_f[j].position[0] = point_3d(0);\n\t\t\tsfm_f[j].position[1] = point_3d(1);\n\t\t\tsfm_f[j].position[2] = point_3d(2);\n            sfm_f[j].image[0] = point1[0];\n            sfm_f[j].image[1] = point1[1];\n            sfm_f[j].image[2] = 1;\n\t\t\t//cout << \"trangulated : \" << frame_0 << \" \" << frame_1 << \"  3d point : \"  << j << \"  \" << point_3d.transpose() << endl;\n\t\t}\t\t\n\t}\n\n/*\n\tfor (int i = 0; i < frame_num; 
i++)\n\t{\n\t\tq[i] = c_Rotation[i].transpose(); \n\t\tcout << \"solvePnP  q\" << \" i \" << i <<\"  \" <<q[i].w() << \"  \" << q[i].vec().transpose() << endl;\n\t}\n\tfor (int i = 0; i < frame_num; i++)\n\t{\n\t\tVector3d t_tmp;\n\t\tt_tmp = -1 * (q[i] * c_Translation[i]);\n\t\tcout << \"solvePnP  t\" << \" i \" << i <<\"  \" << t_tmp.x() <<\"  \"<< t_tmp.y() <<\"  \"<< t_tmp.z() << endl;\n\t}\n*/\n\t//full BA\n\tceres::Problem problem;\n\tceres::LocalParameterization* local_parameterization = new ceres::QuaternionParameterization();\n\t//cout << \" begin full BA \" << endl;\n\tfor (int i = 0; i < frame_num; i++)\n\t{\n\t\t//double array for ceres\n\t\tc_translation[i][0] = c_Translation[i].x();\n\t\tc_translation[i][1] = c_Translation[i].y();\n\t\tc_translation[i][2] = c_Translation[i].z();\n\t\tc_rotation[i][0] = c_Quat[i].w();\n\t\tc_rotation[i][1] = c_Quat[i].x();\n\t\tc_rotation[i][2] = c_Quat[i].y();\n\t\tc_rotation[i][3] = c_Quat[i].z();\n\t\tproblem.AddParameterBlock(c_rotation[i], 4, local_parameterization);\n\t\tproblem.AddParameterBlock(c_translation[i], 3);\n\t\tif (i == l)\n\t\t{\n\t\t\tproblem.SetParameterBlockConstant(c_rotation[i]);\n\t\t}\n\t\tif (i == l || i == frame_num - 1)\n\t\t{\n\t\t\tproblem.SetParameterBlockConstant(c_translation[i]);\n\t\t}\n\t}\n\n\tfor (int i = 0; i < feature_num; i++)\n\t{\n\t\tif (sfm_f[i].state != true)\n\t\t\tcontinue;\n\t\tfor (int j = 0; j < int(sfm_f[i].observation.size()); j++)\n\t\t{\n\t\t\tint l = sfm_f[i].observation[j].first;\n\t\t\tceres::CostFunction* cost_function = ReprojectionError3D::Create(\n\t\t\t\t\t\t\t\t\t\t\t\tsfm_f[i].observation[j].second.x(),\n\t\t\t\t\t\t\t\t\t\t\t\tsfm_f[i].observation[j].second.y());\n\n    \t\tproblem.AddResidualBlock(cost_function, NULL, c_rotation[l], c_translation[l], \n    \t\t\t\t\t\t\t\tsfm_f[i].position);\t \n\t\t}\n\n\t}\n\tceres::Solver::Options options;\n\toptions.linear_solver_type = ceres::DENSE_SCHUR;\n\t//options.minimizer_progress_to_stdout = 
true;\n\toptions.max_solver_time_in_seconds = 0.2;\n\tceres::Solver::Summary summary;\n\tceres::Solve(options, &problem, &summary);\n\t//std::cout << summary.BriefReport() << \"\\n\";\n\tif (summary.termination_type == ceres::CONVERGENCE || summary.final_cost < 5e-03)\n\t{\n\t\tcout << \"vision only BA converge\" << endl;\n\t}\n\telse\n\t{\n\t\tcout << \"vision only BA not converge \" << endl;\n\t\treturn false;\n\t}\n\tfor (int i = 0; i < frame_num; i++)\n\t{\n\t\tq[i].w() = c_rotation[i][0]; \n\t\tq[i].x() = c_rotation[i][1]; \n\t\tq[i].y() = c_rotation[i][2]; \n\t\tq[i].z() = c_rotation[i][3]; \n\t\tq[i] = q[i].inverse();\n\t\t//cout << \"final  q\" << \" i \" << i <<\"  \" <<q[i].w() << \"  \" << q[i].vec().transpose() << endl;\n\t}\n\tfor (int i = 0; i < frame_num; i++)\n\t{\n\n\t\tT[i] = -1 * (q[i] * Vector3d(c_translation[i][0], c_translation[i][1], c_translation[i][2]));\n\t\t//cout << \"final  t\" << \" i \" << i <<\"  \" << T[i](0) <<\"  \"<< T[i](1) <<\"  \"<< T[i](2) << endl;\n\t}\n\tfor (int i = 0; i < (int)sfm_f.size(); i++)\n\t{\n\t\tif(sfm_f[i].state)\n\t\t\tsfm_tracked_points[sfm_f[i].id] = Vector3d(sfm_f[i].position[0], sfm_f[i].position[1], sfm_f[i].position[2]);\n\t}\n\treturn true;\n\n}\n\n\nbool GlobalSFM::constructH(int frame_num, Quaterniond* q, Vector3d* T, int l,\n\t\t\t  const Matrix3d relative_R, const Vector3d relative_T, Vector3d &n,\n\t\t\t  vector<SFMFeature> &sfm_f, map<int, Vector3d> &sfm_tracked_points)\n{\n\tfeature_num = sfm_f.size();\n\t//cout << \"set 0 and \" << l << \" as known \" << endl;\n\t// have relative_r relative_t\n\t// intial two view\n\tq[l].w() = 1;\n\tq[l].x() = 0;\n\tq[l].y() = 0;\n\tq[l].z() = 0;\n\tT[l].setZero();\n\tq[frame_num - 1] = q[l] * Quaterniond(relative_R);\n\tT[frame_num - 1] = relative_T;\n\t//cout << \"init q_l \" << q[l].w() << \" \" << q[l].vec().transpose() << endl;\n\t//cout << \"init t_l \" << T[l].transpose() << endl;\n\n\t//rotate to cam frame\n\tMatrix3d c_Rotation[frame_num];\n\tVector3d 
c_Translation[frame_num];\n\tQuaterniond c_Quat[frame_num];\n\tdouble c_rotation[frame_num][4];\n\tdouble c_translation[frame_num][3];\n    double c_normal[3];\n\tEigen::Matrix<double, 3, 4> Pose[frame_num];\n\n\tc_Quat[l] = q[l].inverse();\n\tc_Rotation[l] = c_Quat[l].toRotationMatrix();\n\tc_Translation[l] = -1 * (c_Rotation[l] * T[l]);\n\tPose[l].block<3, 3>(0, 0) = c_Rotation[l];\n\tPose[l].block<3, 1>(0, 3) = c_Translation[l];\n\n\tc_Quat[frame_num - 1] = q[frame_num - 1].inverse();\n\tc_Rotation[frame_num - 1] = c_Quat[frame_num - 1].toRotationMatrix();\n\tc_Translation[frame_num - 1] = -1 * (c_Rotation[frame_num - 1] * T[frame_num - 1]);\n\tPose[frame_num - 1].block<3, 3>(0, 0) = c_Rotation[frame_num - 1];\n\tPose[frame_num - 1].block<3, 1>(0, 3) = c_Translation[frame_num - 1];\n\n\t//1: triangulate between l ----- frame_num - 1\n\t//2: solve pnp l + 1; triangulate l + 1 ------- frame_num - 1; \n\tfor (int i = l; i < frame_num - 1 ; i++)\n\t{\n\t\t// solve pnp\n\t\tif (i > l)\n\t\t{\n\t\t\tMatrix3d R_initial = c_Rotation[i - 1];\n\t\t\tVector3d P_initial = c_Translation[i - 1];\n\t\t\tif(!solveFrameByPnP(R_initial, P_initial, i, sfm_f))\n\t\t\t\treturn false;\n\t\t\tc_Rotation[i] = R_initial;\n\t\t\tc_Translation[i] = P_initial;\n\t\t\tc_Quat[i] = c_Rotation[i];\n\t\t\tPose[i].block<3, 3>(0, 0) = c_Rotation[i];\n\t\t\tPose[i].block<3, 1>(0, 3) = c_Translation[i];\n\t\t}\n\n\t\t// triangulate points based on the solvePnP result\n\t\ttriangulateTwoFrames(i, Pose[i], frame_num - 1, Pose[frame_num - 1], sfm_f);\n\t}\n\t//3: triangulate l-----l+1 l+2 ... 
frame_num - 2\n\tfor (int i = l + 1; i < frame_num - 1; i++)\n\t\ttriangulateTwoFrames(l, Pose[l], i, Pose[i], sfm_f);\n\t//4: solve pnp l-1; triangulate l-1 ----- l\n\t//             l-2              l-2 ----- l\n\tfor (int i = l - 1; i >= 0; i--)\n\t{\n\t\t//solve pnp\n\t\tMatrix3d R_initial = c_Rotation[i + 1];\n\t\tVector3d P_initial = c_Translation[i + 1];\n\t\tif(!solveFrameByPnP(R_initial, P_initial, i, sfm_f))\n\t\t\treturn false;\n\t\tc_Rotation[i] = R_initial;\n\t\tc_Translation[i] = P_initial;\n\t\tc_Quat[i] = c_Rotation[i];\n\t\tPose[i].block<3, 3>(0, 0) = c_Rotation[i];\n\t\tPose[i].block<3, 1>(0, 3) = c_Translation[i];\n\t\t//triangulate\n\t\ttriangulateTwoFrames(i, Pose[i], l, Pose[l], sfm_f);\n\t}\n\t//5: triangulate all other points\n\tfor (int j = 0; j < feature_num; j++)\n\t{\n\t\tif (sfm_f[j].state == true)\n\t\t\tcontinue;\n\t\tif ((int)sfm_f[j].observation.size() >= 2)\n\t\t{\n\t\t\tVector2d point0, point1;\n\t\t\tint frame_0 = sfm_f[j].observation[0].first;\n\t\t\tpoint0 = sfm_f[j].observation[0].second;\n\t\t\tint frame_1 = sfm_f[j].observation.back().first;\n\t\t\tpoint1 = sfm_f[j].observation.back().second;\n\t\t\tVector3d point_3d;\n\t\t\ttriangulatePoint(Pose[frame_0], Pose[frame_1], point0, point1, point_3d);\n\t\t\tsfm_f[j].state = true;\n\t\t\tsfm_f[j].position[0] = point_3d(0);\n\t\t\tsfm_f[j].position[1] = point_3d(1);\n\t\t\tsfm_f[j].position[2] = point_3d(2);\n            sfm_f[j].image[0] = point1[0];\n            sfm_f[j].image[1] = point1[1];\n            sfm_f[j].image[2] = 1;\n\t\t\t//cout << \"triangulated : \" << frame_0 << \" \" << frame_1 << \"  3d point : \"  << j << \"  \" << point_3d.transpose() << endl;\n\t\t}\t\n\t}\n\n/*\n\tfor (int i = 0; i < frame_num; i++)\n\t{\n\t\tq[i] = c_Rotation[i].transpose(); \n\t\tcout << \"solvePnP  q\" << \" i \" << i <<\"  \" <<q[i].w() << \"  \" << q[i].vec().transpose() << endl;\n\t}\n\tfor (int i = 0; i < frame_num; i++)\n\t{\n\t\tVector3d t_tmp;\n\t\tt_tmp = -1 * (q[i] * 
c_Translation[i]);\n\t\tcout << \"solvePnP  t\" << \" i \" << i <<\"  \" << t_tmp.x() <<\"  \"<< t_tmp.y() <<\"  \"<< t_tmp.z() << endl;\n\t}\n*/\n\t//full BA\n\tceres::Problem problem;\n\tceres::LocalParameterization* local_parameterization = new ceres::QuaternionParameterization();\n\tceres::LocalParameterization* local_n_parameterization = new ceres::HomogeneousVectorParameterization(3);\n\t//cout << \" begin full BA \" << endl;\n\tfor (int i = 0; i < frame_num; i++)\n\t{\n\t\t//double array for ceres\n\t\tc_translation[i][0] = c_Translation[i].x();\n\t\tc_translation[i][1] = c_Translation[i].y();\n\t\tc_translation[i][2] = c_Translation[i].z();\n\t\tc_rotation[i][0] = c_Quat[i].w();\n\t\tc_rotation[i][1] = c_Quat[i].x();\n\t\tc_rotation[i][2] = c_Quat[i].y();\n\t\tc_rotation[i][3] = c_Quat[i].z();\n\t\tproblem.AddParameterBlock(c_rotation[i], 4, local_parameterization);\n\t\tproblem.AddParameterBlock(c_translation[i], 3);\n\t\tif (i == l)\n\t\t{\n\t\t\tproblem.SetParameterBlockConstant(c_rotation[i]);\n\t\t}\n\t\tif (i == l || i == frame_num - 1)\n\t\t{\n\t\t\tproblem.SetParameterBlockConstant(c_translation[i]);\n\t\t}\n\t}\n\n    c_normal[0] = n[0];\n    c_normal[1] = n[1];\n    c_normal[2] = n[2];\n    problem.AddParameterBlock(c_normal, 3, local_n_parameterization);\n\n\tfor (int i = 0; i < feature_num; i++)\n\t{\n\t\tif (sfm_f[i].state != true)\n\t\t\tcontinue;\n\t\tfor (int j = 0; j < int(sfm_f[i].observation.size()); j++)\n\t\t{\n\t\t\tint l = sfm_f[i].observation[j].first;\n            \n\t\t\tceres::CostFunction* b_cost_function = ReprojectionError3D::Create(\n\t\t\t\t\t\t\t\t\t\t\t\tsfm_f[i].observation[j].second.x(),\n\t\t\t\t\t\t\t\t\t\t\t\tsfm_f[i].observation[j].second.y());\n\n    \t\tproblem.AddResidualBlock(b_cost_function, NULL, c_rotation[l], c_translation[l],\n    \t\t\t\t\t\t\t\tsfm_f[i].position);\n\n\t\t\tceres::CostFunction* h_cost_function = 
ReprojectionErrorH::Create(\n\t\t\t\t\t\t\t\t\t\t\t\tsfm_f[i].observation[j].second.x(),\n\t\t\t\t\t\t\t\t\t\t\t\tsfm_f[i].observation[j].second.y());\n\n    \t\tproblem.AddResidualBlock(h_cost_function, NULL, c_rotation[l], c_translation[l], c_normal, sfm_f[i].position);\n\t\t}\n\t}\n\tceres::Solver::Options options;\n\toptions.linear_solver_type = ceres::DENSE_SCHUR;\n\t//options.minimizer_progress_to_stdout = true;\n\toptions.max_solver_time_in_seconds = 0.2;\n\tceres::Solver::Summary summary;\n\tceres::Solve(options, &problem, &summary);\n\t//std::cout << summary.BriefReport() << \"\\n\";\n\tif (summary.termination_type == ceres::CONVERGENCE || summary.final_cost < 5e-03)\n\t{\n\t\tcout << \"vision only BA converge\" << endl;\n\t}\n\telse\n\t{\n\t\tcout << \"vision only BA not converge \" << endl;\n\t\treturn false;\n\t}\n\tfor (int i = 0; i < frame_num; i++)\n\t{\n\t\tq[i].w() = c_rotation[i][0]; \n\t\tq[i].x() = c_rotation[i][1]; \n\t\tq[i].y() = c_rotation[i][2]; \n\t\tq[i].z() = c_rotation[i][3]; \n\t\tq[i] = q[i].inverse();\n\t\t//cout << \"final  q\" << \" i \" << i <<\"  \" <<q[i].w() << \"  \" << q[i].vec().transpose() << endl;\n\t}\n\tfor (int i = 0; i < frame_num; i++)\n\t{\n\n\t\tT[i] = -1 * (q[i] * Vector3d(c_translation[i][0], c_translation[i][1], c_translation[i][2]));\n\t\t//cout << \"final  t\" << \" i \" << i <<\"  \" << T[i](0) <<\"  \"<< T[i](1) <<\"  \"<< T[i](2) << endl;\n\t}\n\tfor (int i = 0; i < (int)sfm_f.size(); i++)\n\t{\n\t\tif(sfm_f[i].state)\n\t\t\tsfm_tracked_points[sfm_f[i].id] = Vector3d(sfm_f[i].position[0], sfm_f[i].position[1], sfm_f[i].position[2]);\n\t}\n\n    n[0] = c_normal[0];\n    n[1] = c_normal[1];\n    n[2] = c_normal[2];\n\n\treturn true;\n\n}\n\n\n"
  },
  {
    "path": "rpvio_estimator/src/initial/initial_sfm.h",
    "content": "#pragma once\n#include <ceres/ceres.h>\n#include <ceres/rotation.h>\n#include <eigen3/Eigen/Dense>\n#include <iostream>\n#include <cstdlib>\n#include <deque>\n#include <map>\n#include <opencv2/core/eigen.hpp>\n#include <opencv2/opencv.hpp>\nusing namespace Eigen;\nusing namespace std;\n\nstruct SFMFeature\n{\n    bool state;\n    int id;\n\tint plane_id;\n    vector<pair<int,Vector2d>> observation;\n    double position[3];\n    double image[3]; // u v 1\n    double depth;\n};\n\nstruct ReprojectionError3D\n{\n\tReprojectionError3D(double observed_u, double observed_v)\n\t\t:observed_u(observed_u), observed_v(observed_v)\n\t\t{}\n\n\ttemplate <typename T>\n\tbool operator()(const T* const camera_R, const T* const camera_T, const T* point, T* residuals) const\n\t{\n\t\tT p[3];\n\t\tceres::QuaternionRotatePoint(camera_R, point, p);\n\t\tp[0] += camera_T[0]; p[1] += camera_T[1]; p[2] += camera_T[2];\n\t\tT xp = p[0] / p[2];\n    \tT yp = p[1] / p[2];\n    \tresiduals[0] = xp - T(observed_u);\n    \tresiduals[1] = yp - T(observed_v);\n    \treturn true;\n\t}\n\n\tstatic ceres::CostFunction* Create(const double observed_x,\n\t                                   const double observed_y) \n\t{\n\t  return (new ceres::AutoDiffCostFunction<\n\t          ReprojectionError3D, 2, 4, 3, 3>(\n\t          \tnew ReprojectionError3D(observed_x,observed_y)));\n\t}\n\n\tdouble observed_u;\n\tdouble observed_v;\n};\n\n\nstruct ReprojectionErrorH\n{\n\tReprojectionErrorH(double observed_u, double observed_v)\n\t\t:observed_u(observed_u), observed_v(observed_v)\n\t\t{}\n\n\ttemplate <typename T>\n\tbool operator()(const T* const R, const T* const t, const T* const n, const T* point, T* residuals) const\n\t{\n\t\tT rp[3];\n\t\tT norm_point[3] = {point[0]/point[2], point[1]/point[2], point[2]/point[2]};\n        ceres::QuaternionRotatePoint(R, norm_point, rp);\n        T np = n[0] * norm_point[0] + n[1] * norm_point[1] + n[2] * norm_point[2];\n        rp[0] += t[0]*np; rp[1] 
+= t[1]*np; rp[2] += t[2]*np;\n\t\tT xp = rp[0] / rp[2];\n    \tT yp = rp[1] / rp[2];\n    \tresiduals[0] = xp - T(observed_u);\n    \tresiduals[1] = yp - T(observed_v);\n    \treturn true;\n\t}\n\n\tstatic ceres::CostFunction* Create(const double observed_x,\n\t                                   const double observed_y) \n\t{\n\t  return (new ceres::AutoDiffCostFunction<\n\t          ReprojectionErrorH, 2, 4, 3, 3, 3>(\n\t          \tnew ReprojectionErrorH(observed_x,observed_y)));\n\t}\n\n\tdouble observed_u;\n\tdouble observed_v;\n};\n\n\nclass GlobalSFM\n{\npublic:\n\tGlobalSFM();\n\tbool construct(int frame_num, Quaterniond* q, Vector3d* T, int l,\n\t\t\t  const Matrix3d relative_R, const Vector3d relative_T,\n\t\t\t  vector<SFMFeature> &sfm_f, map<int, Vector3d> &sfm_tracked_points);\n\n    bool constructH(int frame_num, Quaterniond* q, Vector3d* T, int l,\n\t\t\t  const Matrix3d relative_R, const Vector3d relative_T, Vector3d &n,\n\t\t\t  vector<SFMFeature> &sfm_f, map<int, Vector3d> &sfm_tracked_points);\n\nprivate:\n\tbool solveFrameByPnP(Matrix3d &R_initial, Vector3d &P_initial, int i, vector<SFMFeature> &sfm_f);\n\n\tvoid triangulatePoint(Eigen::Matrix<double, 3, 4> &Pose0, Eigen::Matrix<double, 3, 4> &Pose1,\n\t\t\t\t\t\t\tVector2d &point0, Vector2d &point1, Vector3d &point_3d);\n\tvoid triangulateTwoFrames(int frame0, Eigen::Matrix<double, 3, 4> &Pose0, \n\t\t\t\t\t\t\t  int frame1, Eigen::Matrix<double, 3, 4> &Pose1,\n\t\t\t\t\t\t\t  vector<SFMFeature> &sfm_f);\n\n\tint feature_num;\n};\n"
  },
  {
    "path": "rpvio_estimator/src/initial/solve_5pts.cpp",
    "content": "#include \"solve_5pts.h\"\n#include <algorithm>\n\n\nnamespace cv {\n    void decomposeEssentialMat( InputArray _E, OutputArray _R1, OutputArray _R2, OutputArray _t )\n    {\n\n        Mat E = _E.getMat().reshape(1, 3);\n        CV_Assert(E.cols == 3 && E.rows == 3);\n\n        Mat D, U, Vt;\n        SVD::compute(E, D, U, Vt);\n\n        if (determinant(U) < 0) U *= -1.;\n        if (determinant(Vt) < 0) Vt *= -1.;\n\n        Mat W = (Mat_<double>(3, 3) << 0, 1, 0, -1, 0, 0, 0, 0, 1);\n        W.convertTo(W, E.type());\n\n        Mat R1, R2, t;\n        R1 = U * W * Vt;\n        R2 = U * W.t() * Vt;\n        t = U.col(2) * 1.0;\n\n        R1.copyTo(_R1);\n        R2.copyTo(_R2);\n        t.copyTo(_t);\n    }\n\n    int recoverPose( InputArray E, InputArray _points1, InputArray _points2, InputArray _cameraMatrix,\n                         OutputArray _R, OutputArray _t, InputOutputArray _mask)\n    {\n\n        Mat points1, points2, cameraMatrix;\n        _points1.getMat().convertTo(points1, CV_64F);\n        _points2.getMat().convertTo(points2, CV_64F);\n        _cameraMatrix.getMat().convertTo(cameraMatrix, CV_64F);\n\n        int npoints = points1.checkVector(2);\n        CV_Assert( npoints >= 0 && points2.checkVector(2) == npoints &&\n                                  points1.type() == points2.type());\n\n        CV_Assert(cameraMatrix.rows == 3 && cameraMatrix.cols == 3 && cameraMatrix.channels() == 1);\n\n        if (points1.channels() > 1)\n        {\n            points1 = points1.reshape(1, npoints);\n            points2 = points2.reshape(1, npoints);\n        }\n\n        double fx = cameraMatrix.at<double>(0,0);\n        double fy = cameraMatrix.at<double>(1,1);\n        double cx = cameraMatrix.at<double>(0,2);\n        double cy = cameraMatrix.at<double>(1,2);\n\n        points1.col(0) = (points1.col(0) - cx) / fx;\n        points2.col(0) = (points2.col(0) - cx) / fx;\n        points1.col(1) = (points1.col(1) - cy) / fy;\n        
points2.col(1) = (points2.col(1) - cy) / fy;\n\n        points1 = points1.t();\n        points2 = points2.t();\n\n        Mat R1, R2, t;\n        decomposeEssentialMat(E, R1, R2, t);\n        Mat P0 = Mat::eye(3, 4, R1.type());\n        Mat P1(3, 4, R1.type()), P2(3, 4, R1.type()), P3(3, 4, R1.type()), P4(3, 4, R1.type());\n        P1(Range::all(), Range(0, 3)) = R1 * 1.0; P1.col(3) = t * 1.0;\n        P2(Range::all(), Range(0, 3)) = R2 * 1.0; P2.col(3) = t * 1.0;\n        P3(Range::all(), Range(0, 3)) = R1 * 1.0; P3.col(3) = -t * 1.0;\n        P4(Range::all(), Range(0, 3)) = R2 * 1.0; P4.col(3) = -t * 1.0;\n\n        // Do the cheirality check.\n        // Notice that a threshold dist is used to filter\n        // out far-away points (i.e. points near infinity), since\n        // their depth may vary between positive and negative.\n        double dist = 50.0;\n        Mat Q;\n        triangulatePoints(P0, P1, points1, points2, Q);\n        Mat mask1 = Q.row(2).mul(Q.row(3)) > 0;\n        Q.row(0) /= Q.row(3);\n        Q.row(1) /= Q.row(3);\n        Q.row(2) /= Q.row(3);\n        Q.row(3) /= Q.row(3);\n        mask1 = (Q.row(2) < dist) & mask1;\n        Q = P1 * Q;\n        mask1 = (Q.row(2) > 0) & mask1;\n        mask1 = (Q.row(2) < dist) & mask1;\n\n        triangulatePoints(P0, P2, points1, points2, Q);\n        Mat mask2 = Q.row(2).mul(Q.row(3)) > 0;\n        Q.row(0) /= Q.row(3);\n        Q.row(1) /= Q.row(3);\n        Q.row(2) /= Q.row(3);\n        Q.row(3) /= Q.row(3);\n        mask2 = (Q.row(2) < dist) & mask2;\n        Q = P2 * Q;\n        mask2 = (Q.row(2) > 0) & mask2;\n        mask2 = (Q.row(2) < dist) & mask2;\n\n        triangulatePoints(P0, P3, points1, points2, Q);\n        Mat mask3 = Q.row(2).mul(Q.row(3)) > 0;\n        Q.row(0) /= Q.row(3);\n        Q.row(1) /= Q.row(3);\n        Q.row(2) /= Q.row(3);\n        Q.row(3) /= Q.row(3);\n        mask3 = (Q.row(2) < dist) & mask3;\n        Q = P3 * Q;\n        mask3 = (Q.row(2) > 0) & mask3;\n        mask3 = 
(Q.row(2) < dist) & mask3;\n\n        triangulatePoints(P0, P4, points1, points2, Q);\n        Mat mask4 = Q.row(2).mul(Q.row(3)) > 0;\n        Q.row(0) /= Q.row(3);\n        Q.row(1) /= Q.row(3);\n        Q.row(2) /= Q.row(3);\n        Q.row(3) /= Q.row(3);\n        mask4 = (Q.row(2) < dist) & mask4;\n        Q = P4 * Q;\n        mask4 = (Q.row(2) > 0) & mask4;\n        mask4 = (Q.row(2) < dist) & mask4;\n\n        mask1 = mask1.t();\n        mask2 = mask2.t();\n        mask3 = mask3.t();\n        mask4 = mask4.t();\n\n        // If _mask is given, then use it to filter outliers.\n        if (!_mask.empty())\n        {\n            Mat mask = _mask.getMat();\n            CV_Assert(mask.size() == mask1.size());\n            bitwise_and(mask, mask1, mask1);\n            bitwise_and(mask, mask2, mask2);\n            bitwise_and(mask, mask3, mask3);\n            bitwise_and(mask, mask4, mask4);\n        }\n        if (_mask.empty() && _mask.needed())\n        {\n            _mask.create(mask1.size(), CV_8U);\n        }\n\n        CV_Assert(_R.needed() && _t.needed());\n        _R.create(3, 3, R1.type());\n        _t.create(3, 1, t.type());\n\n        int good1 = countNonZero(mask1);\n        int good2 = countNonZero(mask2);\n        int good3 = countNonZero(mask3);\n        int good4 = countNonZero(mask4);\n\n        if (good1 >= good2 && good1 >= good3 && good1 >= good4)\n        {\n            R1.copyTo(_R);\n            t.copyTo(_t);\n            if (_mask.needed()) mask1.copyTo(_mask);\n            return good1;\n        }\n        else if (good2 >= good1 && good2 >= good3 && good2 >= good4)\n        {\n            R2.copyTo(_R);\n            t.copyTo(_t);\n            if (_mask.needed()) mask2.copyTo(_mask);\n            return good2;\n        }\n        else if (good3 >= good1 && good3 >= good2 && good3 >= good4)\n        {\n            t = -t;\n            R1.copyTo(_R);\n            t.copyTo(_t);\n            if (_mask.needed()) mask3.copyTo(_mask);\n          
  return good3;\n        }\n        else\n        {\n            t = -t;\n            R2.copyTo(_R);\n            t.copyTo(_t);\n            if (_mask.needed()) mask4.copyTo(_mask);\n            return good4;\n        }\n    }\n\n    int recoverPose( InputArray E, InputArray _points1, InputArray _points2, OutputArray _R,\n                         OutputArray _t, double focal, Point2d pp, InputOutputArray _mask)\n    {\n        Mat cameraMatrix = (Mat_<double>(3,3) << focal, 0, pp.x, 0, focal, pp.y, 0, 0, 1);\n        return cv::recoverPose(E, _points1, _points2, cameraMatrix, _R, _t, _mask);\n    }\n}\n\n\nbool MotionEstimator::solveRelativeRT(const vector<pair<Vector3d, Vector3d>> &corres, Matrix3d &Rotation, Vector3d &Translation)\n{\n    if (corres.size() >= 15)\n    {\n        vector<cv::Point2f> ll, rr;\n        for (int i = 0; i < int(corres.size()); i++)\n        {\n            ll.push_back(cv::Point2f(corres[i].first(0), corres[i].first(1)));\n            rr.push_back(cv::Point2f(corres[i].second(0), corres[i].second(1)));\n        }\n        cv::Mat mask;\n        cv::Mat E = cv::findFundamentalMat(ll, rr, cv::FM_RANSAC, 0.3 / 460, 0.99, mask);\n        cv::Mat cameraMatrix = (cv::Mat_<double>(3, 3) << 1, 0, 0, 0, 1, 0, 0, 0, 1);\n        cv::Mat rot, trans;\n        int inlier_cnt = cv::recoverPose(E, ll, rr, cameraMatrix, rot, trans, mask);\n        //cout << \"inlier_cnt \" << inlier_cnt << endl;\n\n        Eigen::Matrix3d R;\n        Eigen::Vector3d T;\n        for (int i = 0; i < 3; i++)\n        {   \n            T(i) = trans.at<double>(i, 0);\n            for (int j = 0; j < 3; j++)\n                R(i, j) = rot.at<double>(i, j);\n        }\n\n        Rotation = R.transpose();\n        Translation = -R.transpose() * T;\n        if(inlier_cnt > 12)\n            return true;\n        else\n            return false;\n    }\n    return false;\n}\n\n\nbool MotionEstimator::solveRelativeHRT(const vector<pair<Vector3d, Vector3d>> &corres, const Matrix3d 
&R_imu, const Matrix4d &TrIC, Matrix3d &Rotation, Vector3d &Translation, Vector3d &n)\n{\n    if (corres.size() >= 15)\n    {\n        vector<cv::Point2f> ll, rr;\n        for (int i = 0; i < int(corres.size()); i++)\n        {\n            ll.push_back(cv::Point2f(corres[i].first(0), corres[i].first(1)));\n            rr.push_back(cv::Point2f(corres[i].second(0), corres[i].second(1)));\n        }\n        cv::Mat mask;\n        //cv::Mat E = cv::findFundamentalMat(ll, rr, cv::FM_RANSAC, 0.3 / 460, 0.99, mask);\n        cv::Mat H = cv::findHomography(ll, rr, cv::RANSAC, 0.3/460, mask, 2000, 0.99);\n        cv::Mat K = (cv::Mat_<double>(3, 3) << 1, 0, 0, 0, 1, 0, 0, 0, 1);\n\n        // Compute mean point vector\n        Eigen::Vector3d mean_l(0, 0, 1);\n        int inlier_count = 0;\n        for(int i = 0; i < ll.size(); i++)\n        {\n            if(mask.at<uchar>(i,0) != 0)\n            {\n                mean_l(0) += ll[i].x;\n                mean_l(1) += ll[i].y;\n                inlier_count++;\n            }\n        }\n        mean_l(0) /= int(inlier_count);\n        mean_l(1) /= int(inlier_count);\n\n        Eigen::Matrix4d est_Tr;\n        Eigen::Vector3d est_n(0,0,0);\n        decomposeH(H, K, R_imu, TrIC, mean_l, est_Tr, est_n);\n        Rotation = est_Tr.block(0,0,3,3);\n        Translation = est_Tr.block(0,3,3,1);\n        n = est_n;\n        if(n.isZero())\n            return false;\n\n        return true;\n    }\n    return false;\n}\n\n\nvoid MotionEstimator::decomposeH(const cv::Mat &H, const cv::Mat &K, const Matrix3d &R_imu, \n    const Matrix4d &TrIC, const Vector3d &mean_l, Matrix4d &est_Tr, Vector3d &est_n)\n{\n    vector<cv::Mat> cv_Rs, cv_ts, cv_ns;\n    int n_sols = cv::decomposeHomographyMat(H, K, cv_Rs, cv_ts, cv_ns);\n\n    vector<Matrix4d> positive_depth_transforms;\n    vector<Vector3d> positive_depth_normals;\n\n    if(n_sols > 1)\n    {\n        for(int i = 0; i < n_sols; i++)\n        {\n            Matrix4d Tr = 
Matrix4d::Identity();\n            Matrix3d R;\n            cv::cv2eigen(cv_Rs[i], R);\n            Tr.block(0,0,3,3) = R;\n\n            Vector3d t;\n            cv::cv2eigen(cv_ts[i], t);\n            Tr.block(0,3,3,1) = t;\n\n            //Tr = TrIC * Tr * TrIC.inverse();\n            Tr = Tr.inverse().eval();\n\n            Vector3d n;\n            cv::cv2eigen(cv_ns[i], n);\n            n.normalize();\n            if(n.dot(mean_l) > 0)\n            {\n                positive_depth_transforms.push_back(Tr);\n                positive_depth_normals.push_back(n);\n            }\n        }\n\n        if(positive_depth_transforms.size() > 0)\n        {\n            vector<double> rot_diff;\n            for(size_t i = 0; i < positive_depth_transforms.size(); i++)\n            {\n                Eigen::Matrix4d Tr = TrIC * positive_depth_transforms[i] * TrIC.inverse();\n                Eigen::Matrix3d R = Tr.block(0,0,3,3);\n                double f = (R.transpose()*R_imu - MatrixXd::Identity(3,3)).norm();\n                rot_diff.push_back(f);\n            }\n\n            int min_index = std::min_element(rot_diff.begin(), rot_diff.end()) - rot_diff.begin();\n            est_Tr = positive_depth_transforms[min_index];\n            est_n = positive_depth_normals[min_index];\n        }\n    }\n\n    else\n    {\n        Matrix4d Tr = Matrix4d::Identity();\n        Matrix3d R;\n        cv::cv2eigen(cv_Rs[0], R);\n        Tr.block(0,0,3,3) = R;\n\n        Vector3d t;\n        cv::cv2eigen(cv_ts[0], t);\n        Tr.block(0,3,3,1) = t;\n\n        Vector3d n;\n        cv::cv2eigen(cv_ns[0], n);\n\n        //Tr = TrIC * Tr * TrIC.inverse();\n        est_Tr = Tr.inverse().eval();\n        est_n = n.normalized();\n    }\n}\n"
  },
  {
    "path": "rpvio_estimator/src/initial/solve_5pts.h",
    "content": "#pragma once\n\n#include <vector>\nusing namespace std;\n\n#include <opencv2/opencv.hpp>\n#include <eigen3/Eigen/Dense>\n#include <opencv2/core/eigen.hpp>\nusing namespace Eigen;\n\n#include <ros/console.h>\n\nclass MotionEstimator\n{\n  public:\n\n    bool solveRelativeRT(const vector<pair<Vector3d, Vector3d>> &corres, Matrix3d &R, Vector3d &T);\n    bool solveRelativeHRT(const vector<pair<Vector3d, Vector3d>> &corres, const Matrix3d &R_imu, const Matrix4d &TrIC, Matrix3d &R, Vector3d &T, Vector3d &n);\n    void decomposeH(const cv::Mat &H, const cv::Mat &K, const Matrix3d &R_imu, const  Matrix4d &TrIC, const Vector3d &mean_l, Matrix4d &est_Tr, Vector3d &est_n);\n\n  private:\n    double testTriangulation(const vector<cv::Point2f> &l,\n                             const vector<cv::Point2f> &r,\n                             cv::Mat_<double> R, cv::Mat_<double> t);\n    void decomposeE(cv::Mat E,\n                    cv::Mat_<double> &R1, cv::Mat_<double> &R2,\n                    cv::Mat_<double> &t1, cv::Mat_<double> &t2);\n};\n\n\n"
  },
  {
    "path": "rpvio_estimator/src/parameters.cpp",
    "content": "#include \"parameters.h\"\n\ndouble INIT_DEPTH;\ndouble MIN_PARALLAX;\ndouble ACC_N, ACC_W;\ndouble GYR_N, GYR_W;\n\nstd::vector<Eigen::Matrix3d> RIC;\nstd::vector<Eigen::Vector3d> TIC;\n\nEigen::Vector3d G{0.0, 0.0, 9.8};\n\ndouble BIAS_ACC_THRESHOLD;\ndouble BIAS_GYR_THRESHOLD;\ndouble SOLVER_TIME;\nint NUM_ITERATIONS;\nint ESTIMATE_EXTRINSIC;\nint ESTIMATE_TD;\nint ROLLING_SHUTTER;\nstd::string EX_CALIB_RESULT_PATH;\nstd::string RPVIO_RESULT_PATH;\nstd::string IMU_TOPIC;\ndouble ROW, COL;\ndouble TD, TR;\n\ntemplate <typename T>\nT readParam(ros::NodeHandle &n, std::string name)\n{\n    T ans;\n    if (n.getParam(name, ans))\n    {\n        ROS_INFO_STREAM(\"Loaded \" << name << \": \" << ans);\n    }\n    else\n    {\n        ROS_ERROR_STREAM(\"Failed to load \" << name);\n        n.shutdown();\n    }\n    return ans;\n}\n\nvoid readParameters(ros::NodeHandle &n)\n{\n    std::string config_file;\n    config_file = readParam<std::string>(n, \"config_file\");\n    cv::FileStorage fsSettings(config_file, cv::FileStorage::READ);\n    if(!fsSettings.isOpened())\n    {\n        std::cerr << \"ERROR: Wrong path to settings\" << std::endl;\n    }\n\n    fsSettings[\"imu_topic\"] >> IMU_TOPIC;\n\n    SOLVER_TIME = fsSettings[\"max_solver_time\"];\n    NUM_ITERATIONS = fsSettings[\"max_num_iterations\"];\n    MIN_PARALLAX = fsSettings[\"keyframe_parallax\"];\n    MIN_PARALLAX = MIN_PARALLAX / FOCAL_LENGTH;\n\n    std::string OUTPUT_PATH;\n    fsSettings[\"output_path\"] >> OUTPUT_PATH;\n    RPVIO_RESULT_PATH = OUTPUT_PATH + \"/rpvio_result_no_loop.csv\";\n    std::cout << \"result path \" << RPVIO_RESULT_PATH << std::endl;\n\n    // create folder if not exists\n    FileSystemHelper::createDirectoryIfNotExists(OUTPUT_PATH.c_str());\n\n    std::ofstream fout(RPVIO_RESULT_PATH, std::ios::out);\n    fout.close();\n\n    ACC_N = fsSettings[\"acc_n\"];\n    ACC_W = fsSettings[\"acc_w\"];\n    GYR_N = fsSettings[\"gyr_n\"];\n    GYR_W = fsSettings[\"gyr_w\"];\n  
  G.z() = fsSettings[\"g_norm\"];\n    ROW = fsSettings[\"image_height\"];\n    COL = fsSettings[\"image_width\"];\n    ROS_INFO(\"ROW: %f COL: %f \", ROW, COL);\n\n    ESTIMATE_EXTRINSIC = fsSettings[\"estimate_extrinsic\"];\n    if (ESTIMATE_EXTRINSIC == 2)\n    {\n        ROS_WARN(\"have no prior about extrinsic param, calibrate extrinsic param\");\n        RIC.push_back(Eigen::Matrix3d::Identity());\n        TIC.push_back(Eigen::Vector3d::Zero());\n        EX_CALIB_RESULT_PATH = OUTPUT_PATH + \"/extrinsic_parameter.csv\";\n\n    }\n    else \n    {\n        if ( ESTIMATE_EXTRINSIC == 1)\n        {\n            ROS_WARN(\" Optimize extrinsic param around initial guess!\");\n            EX_CALIB_RESULT_PATH = OUTPUT_PATH + \"/extrinsic_parameter.csv\";\n        }\n        if (ESTIMATE_EXTRINSIC == 0)\n            ROS_WARN(\" fix extrinsic param \");\n\n        cv::Mat cv_R, cv_T;\n        fsSettings[\"extrinsicRotation\"] >> cv_R;\n        fsSettings[\"extrinsicTranslation\"] >> cv_T;\n        Eigen::Matrix3d eigen_R;\n        Eigen::Vector3d eigen_T;\n        cv::cv2eigen(cv_R, eigen_R);\n        cv::cv2eigen(cv_T, eigen_T);\n        Eigen::Quaterniond Q(eigen_R);\n        eigen_R = Q.normalized();\n        RIC.push_back(eigen_R);\n        TIC.push_back(eigen_T);\n        ROS_INFO_STREAM(\"Extrinsic_R : \" << std::endl << RIC[0]);\n        ROS_INFO_STREAM(\"Extrinsic_T : \" << std::endl << TIC[0].transpose());\n        \n    } \n\n    INIT_DEPTH = 5.0;\n    BIAS_ACC_THRESHOLD = 0.1;\n    BIAS_GYR_THRESHOLD = 0.1;\n\n    TD = fsSettings[\"td\"];\n    ESTIMATE_TD = fsSettings[\"estimate_td\"];\n    if (ESTIMATE_TD)\n        ROS_INFO_STREAM(\"Unsynchronized sensors, online estimate time offset, initial td: \" << TD);\n    else\n        ROS_INFO_STREAM(\"Synchronized sensors, fix time offset: \" << TD);\n\n    ROLLING_SHUTTER = fsSettings[\"rolling_shutter\"];\n    if (ROLLING_SHUTTER)\n    {\n        TR = fsSettings[\"rolling_shutter_tr\"];\n        
ROS_INFO_STREAM(\"rolling shutter camera, read out time per line: \" << TR);\n    }\n    else\n    {\n        TR = 0;\n    }\n    \n    fsSettings.release();\n}\n"
  },
  {
    "path": "rpvio_estimator/src/parameters.h",
    "content": "#pragma once\n\n#include <ros/ros.h>\n#include <vector>\n#include <eigen3/Eigen/Dense>\n#include \"utility/utility.h\"\n#include <opencv2/opencv.hpp>\n#include <opencv2/core/eigen.hpp>\n#include <fstream>\n\nconst double FOCAL_LENGTH = 460.0;\nconst int WINDOW_SIZE = 10;\nconst int NUM_OF_CAM = 1;\nconst int NUM_OF_F = 1000;\n//#define UNIT_SPHERE_ERROR\n\nextern double INIT_DEPTH;\nextern double MIN_PARALLAX;\nextern int ESTIMATE_EXTRINSIC;\n\nextern double ACC_N, ACC_W;\nextern double GYR_N, GYR_W;\n\nextern std::vector<Eigen::Matrix3d> RIC;\nextern std::vector<Eigen::Vector3d> TIC;\nextern Eigen::Vector3d G;\n\nextern double BIAS_ACC_THRESHOLD;\nextern double BIAS_GYR_THRESHOLD;\nextern double SOLVER_TIME;\nextern int NUM_ITERATIONS;\nextern std::string EX_CALIB_RESULT_PATH;\nextern std::string RPVIO_RESULT_PATH;\nextern std::string IMU_TOPIC;\nextern double TD;\nextern double TR;\nextern int ESTIMATE_TD;\nextern int ROLLING_SHUTTER;\nextern double ROW, COL;\n\n\nvoid readParameters(ros::NodeHandle &n);\n\nenum SIZE_PARAMETERIZATION\n{\n    SIZE_POSE = 7,\n    SIZE_SPEEDBIAS = 9,\n    SIZE_FEATURE = 1\n};\n\nenum StateOrder\n{\n    O_P = 0,\n    O_R = 3,\n    O_V = 6,\n    O_BA = 9,\n    O_BG = 12\n};\n\nenum NoiseOrder\n{\n    O_AN = 0,\n    O_GN = 3,\n    O_AW = 6,\n    O_GW = 9\n};\n"
  },
  {
    "path": "rpvio_estimator/src/utility/CameraPoseVisualization.cpp",
    "content": "#include \"CameraPoseVisualization.h\"\n\nconst Eigen::Vector3d CameraPoseVisualization::imlt = Eigen::Vector3d(-1.0, -0.5, 1.0);\nconst Eigen::Vector3d CameraPoseVisualization::imrt = Eigen::Vector3d( 1.0, -0.5, 1.0);\nconst Eigen::Vector3d CameraPoseVisualization::imlb = Eigen::Vector3d(-1.0,  0.5, 1.0);\nconst Eigen::Vector3d CameraPoseVisualization::imrb = Eigen::Vector3d( 1.0,  0.5, 1.0);\nconst Eigen::Vector3d CameraPoseVisualization::lt0 = Eigen::Vector3d(-0.7, -0.5, 1.0);\nconst Eigen::Vector3d CameraPoseVisualization::lt1 = Eigen::Vector3d(-0.7, -0.2, 1.0);\nconst Eigen::Vector3d CameraPoseVisualization::lt2 = Eigen::Vector3d(-1.0, -0.2, 1.0);\nconst Eigen::Vector3d CameraPoseVisualization::oc = Eigen::Vector3d(0.0, 0.0, 0.0);\n\nvoid Eigen2Point(const Eigen::Vector3d& v, geometry_msgs::Point& p) {\n    p.x = v.x();\n    p.y = v.y();\n    p.z = v.z();\n}\n\nCameraPoseVisualization::CameraPoseVisualization(float r, float g, float b, float a)\n    : m_marker_ns(\"CameraPoseVisualization\"), m_scale(0.2), m_line_width(0.01) {\n    m_image_boundary_color.r = r;\n    m_image_boundary_color.g = g;\n    m_image_boundary_color.b = b;\n    m_image_boundary_color.a = a;\n    m_optical_center_connector_color.r = r;\n    m_optical_center_connector_color.g = g;\n    m_optical_center_connector_color.b = b;\n    m_optical_center_connector_color.a = a;\n}\n\nvoid CameraPoseVisualization::setImageBoundaryColor(float r, float g, float b, float a) {\n    m_image_boundary_color.r = r;\n    m_image_boundary_color.g = g;\n    m_image_boundary_color.b = b;\n    m_image_boundary_color.a = a;\n}\n\nvoid CameraPoseVisualization::setOpticalCenterConnectorColor(float r, float g, float b, float a) {\n    m_optical_center_connector_color.r = r;\n    m_optical_center_connector_color.g = g;\n    m_optical_center_connector_color.b = b;\n    m_optical_center_connector_color.a = a;\n}\n\nvoid CameraPoseVisualization::setScale(double s) {\n    m_scale = s;\n}\nvoid 
CameraPoseVisualization::setLineWidth(double width) {\n    m_line_width = width;\n}\nvoid CameraPoseVisualization::add_edge(const Eigen::Vector3d& p0, const Eigen::Vector3d& p1){\n    visualization_msgs::Marker marker;\n\n    marker.ns = m_marker_ns;\n    marker.id = m_markers.size() + 1;\n    marker.type = visualization_msgs::Marker::LINE_LIST;\n    marker.action = visualization_msgs::Marker::ADD;\n    marker.scale.x = 0.005;\n\n    marker.color.g = 1.0f;\n    marker.color.a = 1.0;\n\n    geometry_msgs::Point point0, point1;\n\n    Eigen2Point(p0, point0);\n    Eigen2Point(p1, point1);\n\n    marker.points.push_back(point0);\n    marker.points.push_back(point1);\n\n    m_markers.push_back(marker);\n}\n\nvoid CameraPoseVisualization::add_loopedge(const Eigen::Vector3d& p0, const Eigen::Vector3d& p1){\n    visualization_msgs::Marker marker;\n\n    marker.ns = m_marker_ns;\n    marker.id = m_markers.size() + 1;\n    marker.type = visualization_msgs::Marker::LINE_LIST;\n    marker.action = visualization_msgs::Marker::ADD;\n    marker.scale.x = 0.04;\n    //marker.scale.x = 0.3;\n\n    marker.color.r = 1.0f;\n    marker.color.b = 1.0f;\n    marker.color.a = 1.0;\n\n    geometry_msgs::Point point0, point1;\n\n    Eigen2Point(p0, point0);\n    Eigen2Point(p1, point1);\n\n    marker.points.push_back(point0);\n    marker.points.push_back(point1);\n\n    m_markers.push_back(marker);\n}\n\n\nvoid CameraPoseVisualization::add_pose(const Eigen::Vector3d& p, const Eigen::Quaterniond& q) {\n    visualization_msgs::Marker marker;\n\n    marker.ns = m_marker_ns;\n    marker.id = m_markers.size() + 1;\n    marker.type = visualization_msgs::Marker::LINE_STRIP;\n    marker.action = visualization_msgs::Marker::ADD;\n    marker.scale.x = m_line_width;\n\n    marker.pose.position.x = 0.0;\n    marker.pose.position.y = 0.0;\n    marker.pose.position.z = 0.0;\n    marker.pose.orientation.w = 1.0;\n    marker.pose.orientation.x = 0.0;\n    marker.pose.orientation.y = 0.0;\n    
marker.pose.orientation.z = 0.0;\n\n\n    geometry_msgs::Point pt_lt, pt_lb, pt_rt, pt_rb, pt_oc, pt_lt0, pt_lt1, pt_lt2;\n\n    Eigen2Point(q * (m_scale *imlt) + p, pt_lt);\n    Eigen2Point(q * (m_scale *imlb) + p, pt_lb);\n    Eigen2Point(q * (m_scale *imrt) + p, pt_rt);\n    Eigen2Point(q * (m_scale *imrb) + p, pt_rb);\n    Eigen2Point(q * (m_scale *lt0 ) + p, pt_lt0);\n    Eigen2Point(q * (m_scale *lt1 ) + p, pt_lt1);\n    Eigen2Point(q * (m_scale *lt2 ) + p, pt_lt2);\n    Eigen2Point(q * (m_scale *oc  ) + p, pt_oc);\n\n    // image boundaries\n    marker.points.push_back(pt_lt);\n    marker.points.push_back(pt_lb);\n    marker.colors.push_back(m_image_boundary_color);\n    marker.colors.push_back(m_image_boundary_color);\n\n    marker.points.push_back(pt_lb);\n    marker.points.push_back(pt_rb);\n    marker.colors.push_back(m_image_boundary_color);\n    marker.colors.push_back(m_image_boundary_color);\n\n    marker.points.push_back(pt_rb);\n    marker.points.push_back(pt_rt);\n    marker.colors.push_back(m_image_boundary_color);\n    marker.colors.push_back(m_image_boundary_color);\n\n    marker.points.push_back(pt_rt);\n    marker.points.push_back(pt_lt);\n    marker.colors.push_back(m_image_boundary_color);\n    marker.colors.push_back(m_image_boundary_color);\n\n    // top-left indicator\n    marker.points.push_back(pt_lt0);\n    marker.points.push_back(pt_lt1);\n    marker.colors.push_back(m_image_boundary_color);\n    marker.colors.push_back(m_image_boundary_color);\n\n    marker.points.push_back(pt_lt1);\n    marker.points.push_back(pt_lt2);\n    marker.colors.push_back(m_image_boundary_color);\n    marker.colors.push_back(m_image_boundary_color);\n\n    // optical center connector\n    marker.points.push_back(pt_lt);\n    marker.points.push_back(pt_oc);\n    marker.colors.push_back(m_optical_center_connector_color);\n    marker.colors.push_back(m_optical_center_connector_color);\n\n\n    marker.points.push_back(pt_lb);\n    
marker.points.push_back(pt_oc);\n    marker.colors.push_back(m_optical_center_connector_color);\n    marker.colors.push_back(m_optical_center_connector_color);\n\n    marker.points.push_back(pt_rt);\n    marker.points.push_back(pt_oc);\n    marker.colors.push_back(m_optical_center_connector_color);\n    marker.colors.push_back(m_optical_center_connector_color);\n\n    marker.points.push_back(pt_rb);\n    marker.points.push_back(pt_oc);\n    marker.colors.push_back(m_optical_center_connector_color);\n    marker.colors.push_back(m_optical_center_connector_color);\n\n    m_markers.push_back(marker);\n}\n\nvoid CameraPoseVisualization::reset() {\n\tm_markers.clear();\n}\n\nvoid CameraPoseVisualization::publish_by( ros::Publisher &pub, const std_msgs::Header &header ) {\n\tvisualization_msgs::MarkerArray markerArray_msg;\n\t\n\tfor(auto& marker : m_markers) {\n\t\tmarker.header = header;\n\t\tmarkerArray_msg.markers.push_back(marker);\n\t}\n\n\tpub.publish(markerArray_msg);\n}"
  },
  {
    "path": "rpvio_estimator/src/utility/CameraPoseVisualization.h",
    "content": "#pragma once\n\n#include <ros/ros.h>\n#include <std_msgs/ColorRGBA.h>\n#include <visualization_msgs/Marker.h>\n#include <visualization_msgs/MarkerArray.h>\n#include <Eigen/Dense>\n#include <Eigen/Geometry>\n\nclass CameraPoseVisualization {\npublic:\n\tstd::string m_marker_ns;\n\n\tCameraPoseVisualization(float r, float g, float b, float a);\n\t\n\tvoid setImageBoundaryColor(float r, float g, float b, float a=1.0);\n\tvoid setOpticalCenterConnectorColor(float r, float g, float b, float a=1.0);\n\tvoid setScale(double s);\n\tvoid setLineWidth(double width);\n\n\tvoid add_pose(const Eigen::Vector3d& p, const Eigen::Quaterniond& q);\n\tvoid reset();\n\n\tvoid publish_by(ros::Publisher& pub, const std_msgs::Header& header);\n\tvoid add_edge(const Eigen::Vector3d& p0, const Eigen::Vector3d& p1);\n\tvoid add_loopedge(const Eigen::Vector3d& p0, const Eigen::Vector3d& p1);\nprivate:\n\tstd::vector<visualization_msgs::Marker> m_markers;\n\tstd_msgs::ColorRGBA m_image_boundary_color;\n\tstd_msgs::ColorRGBA m_optical_center_connector_color;\n\tdouble m_scale;\n\tdouble m_line_width;\n\n\tstatic const Eigen::Vector3d imlt;\n\tstatic const Eigen::Vector3d imlb;\n\tstatic const Eigen::Vector3d imrt;\n\tstatic const Eigen::Vector3d imrb;\n\tstatic const Eigen::Vector3d oc  ;\n\tstatic const Eigen::Vector3d lt0 ;\n\tstatic const Eigen::Vector3d lt1 ;\n\tstatic const Eigen::Vector3d lt2 ;\n};\n"
  },
  {
    "path": "rpvio_estimator/src/utility/tic_toc.h",
    "content": "#pragma once\n\n#include <ctime>\n#include <cstdlib>\n#include <chrono>\n\n// Simple stopwatch: tic() resets the timer, toc() returns elapsed milliseconds.\nclass TicToc\n{\n  public:\n    TicToc()\n    {\n        tic();\n    }\n\n    void tic()\n    {\n        start = std::chrono::steady_clock::now();\n    }\n\n    double toc()\n    {\n        end = std::chrono::steady_clock::now();\n        std::chrono::duration<double> elapsed_seconds = end - start;\n        return elapsed_seconds.count() * 1000;\n    }\n\n  private:\n    // steady_clock is monotonic, so elapsed times are immune to system clock adjustments.\n    std::chrono::time_point<std::chrono::steady_clock> start, end;\n};\n"
  },
  {
    "path": "rpvio_estimator/src/utility/utility.cpp",
    "content": "#include \"utility.h\"\n\nEigen::Matrix3d Utility::g2R(const Eigen::Vector3d &g)\n{\n    Eigen::Matrix3d R0;\n    Eigen::Vector3d ng1 = g.normalized();\n    Eigen::Vector3d ng2{0, 0, 1.0};\n    R0 = Eigen::Quaterniond::FromTwoVectors(ng1, ng2).toRotationMatrix();\n    double yaw = Utility::R2ypr(R0).x();\n    R0 = Utility::ypr2R(Eigen::Vector3d{-yaw, 0, 0}) * R0;\n    // R0 = Utility::ypr2R(Eigen::Vector3d{-90, 0, 0}) * R0;\n    return R0;\n}\n"
  },
  {
    "path": "rpvio_estimator/src/utility/utility.h",
    "content": "#pragma once\n#include <sys/types.h>\n#include <sys/stat.h>\n#include <libgen.h>\n#include <stdio.h>\n#include <stdlib.h>\n#include <cerrno>\n#include <cmath>\n#include <cassert>\n#include <cstring>\n#include <eigen3/Eigen/Dense>\n\nclass Utility\n{\n  public:\n    template <typename Derived>\n    static Eigen::Quaternion<typename Derived::Scalar> deltaQ(const Eigen::MatrixBase<Derived> &theta)\n    {\n        typedef typename Derived::Scalar Scalar_t;\n\n        Eigen::Quaternion<Scalar_t> dq;\n        Eigen::Matrix<Scalar_t, 3, 1> half_theta = theta;\n        half_theta /= static_cast<Scalar_t>(2.0);\n        dq.w() = static_cast<Scalar_t>(1.0);\n        dq.x() = half_theta.x();\n        dq.y() = half_theta.y();\n        dq.z() = half_theta.z();\n        return dq;\n    }\n\n    template <typename Derived>\n    static Eigen::Matrix<typename Derived::Scalar, 3, 3> skewSymmetric(const Eigen::MatrixBase<Derived> &q)\n    {\n        Eigen::Matrix<typename Derived::Scalar, 3, 3> ans;\n        ans << typename Derived::Scalar(0), -q(2), q(1),\n            q(2), typename Derived::Scalar(0), -q(0),\n            -q(1), q(0), typename Derived::Scalar(0);\n        return ans;\n    }\n\n    template <typename Derived>\n    static Eigen::Quaternion<typename Derived::Scalar> positify(const Eigen::QuaternionBase<Derived> &q)\n    {\n        //printf(\"a: %f %f %f %f\", q.w(), q.x(), q.y(), q.z());\n        //Eigen::Quaternion<typename Derived::Scalar> p(-q.w(), -q.x(), -q.y(), -q.z());\n        //printf(\"b: %f %f %f %f\", p.w(), p.x(), p.y(), p.z());\n        //return q.template w() >= (typename Derived::Scalar)(0.0) ? 
q : Eigen::Quaternion<typename Derived::Scalar>(-q.w(), -q.x(), -q.y(), -q.z());\n        return q;\n    }\n\n    template <typename Derived>\n    static Eigen::Matrix<typename Derived::Scalar, 4, 4> Qleft(const Eigen::QuaternionBase<Derived> &q)\n    {\n        Eigen::Quaternion<typename Derived::Scalar> qq = positify(q);\n        Eigen::Matrix<typename Derived::Scalar, 4, 4> ans;\n        ans(0, 0) = qq.w(), ans.template block<1, 3>(0, 1) = -qq.vec().transpose();\n        ans.template block<3, 1>(1, 0) = qq.vec(), ans.template block<3, 3>(1, 1) = qq.w() * Eigen::Matrix<typename Derived::Scalar, 3, 3>::Identity() + skewSymmetric(qq.vec());\n        return ans;\n    }\n\n    template <typename Derived>\n    static Eigen::Matrix<typename Derived::Scalar, 4, 4> Qright(const Eigen::QuaternionBase<Derived> &p)\n    {\n        Eigen::Quaternion<typename Derived::Scalar> pp = positify(p);\n        Eigen::Matrix<typename Derived::Scalar, 4, 4> ans;\n        ans(0, 0) = pp.w(), ans.template block<1, 3>(0, 1) = -pp.vec().transpose();\n        ans.template block<3, 1>(1, 0) = pp.vec(), ans.template block<3, 3>(1, 1) = pp.w() * Eigen::Matrix<typename Derived::Scalar, 3, 3>::Identity() - skewSymmetric(pp.vec());\n        return ans;\n    }\n\n    static Eigen::Vector3d R2ypr(const Eigen::Matrix3d &R)\n    {\n        Eigen::Vector3d n = R.col(0);\n        Eigen::Vector3d o = R.col(1);\n        Eigen::Vector3d a = R.col(2);\n\n        Eigen::Vector3d ypr(3);\n        double y = atan2(n(1), n(0));\n        double p = atan2(-n(2), n(0) * cos(y) + n(1) * sin(y));\n        double r = atan2(a(0) * sin(y) - a(1) * cos(y), -o(0) * sin(y) + o(1) * cos(y));\n        ypr(0) = y;\n        ypr(1) = p;\n        ypr(2) = r;\n\n        return ypr / M_PI * 180.0;\n    }\n\n    template <typename Derived>\n    static Eigen::Matrix<typename Derived::Scalar, 3, 3> ypr2R(const Eigen::MatrixBase<Derived> &ypr)\n    {\n        typedef typename Derived::Scalar Scalar_t;\n\n        Scalar_t y = ypr(0) 
/ 180.0 * M_PI;\n        Scalar_t p = ypr(1) / 180.0 * M_PI;\n        Scalar_t r = ypr(2) / 180.0 * M_PI;\n\n        Eigen::Matrix<Scalar_t, 3, 3> Rz;\n        Rz << cos(y), -sin(y), 0,\n            sin(y), cos(y), 0,\n            0, 0, 1;\n\n        Eigen::Matrix<Scalar_t, 3, 3> Ry;\n        Ry << cos(p), 0., sin(p),\n            0., 1., 0.,\n            -sin(p), 0., cos(p);\n\n        Eigen::Matrix<Scalar_t, 3, 3> Rx;\n        Rx << 1., 0., 0.,\n            0., cos(r), -sin(r),\n            0., sin(r), cos(r);\n\n        return Rz * Ry * Rx;\n    }\n\n    static Eigen::Matrix3d g2R(const Eigen::Vector3d &g);\n\n    template <size_t N>\n    struct uint_\n    {\n    };\n\n    template <size_t N, typename Lambda, typename IterT>\n    void unroller(const Lambda &f, const IterT &iter, uint_<N>)\n    {\n        unroller(f, iter, uint_<N - 1>());\n        f(iter + N);\n    }\n\n    template <typename Lambda, typename IterT>\n    void unroller(const Lambda &f, const IterT &iter, uint_<0>)\n    {\n        f(iter);\n    }\n\n    template <typename T>\n    static T normalizeAngle(const T& angle_degrees) {\n      T two_pi(2.0 * 180);\n      if (angle_degrees > 0)\n      return angle_degrees -\n          two_pi * std::floor((angle_degrees + T(180)) / two_pi);\n      else\n        return angle_degrees +\n            two_pi * std::floor((-angle_degrees + T(180)) / two_pi);\n    };\n};\n\nclass FileSystemHelper\n{\n  public:\n\n    /******************************************************************************\n     * Recursively create directory if `path` not exists.\n     * Return 0 if success.\n     *****************************************************************************/\n    static int createDirectoryIfNotExists(const char *path)\n    {\n        struct stat info;\n        int statRC = stat(path, &info);\n        if( statRC != 0 )\n        {\n            if (errno == ENOENT)  \n            {\n                printf(\"%s not exists, trying to create it \\n\", path);\n    
            if (! createDirectoryIfNotExists(dirname(strdupa(path))))\n                {\n                    if (mkdir(path, S_IRWXU | S_IRWXG | S_IROTH | S_IXOTH) != 0)\n                    {\n                        fprintf(stderr, \"Failed to create folder %s \\n\", path);\n                        return 1;\n                    }\n                    else\n                        return 0;\n                }\n                else \n                    return 1;\n            } // directory not exists\n            if (errno == ENOTDIR) \n            { \n                fprintf(stderr, \"%s is not a directory path \\n\", path);\n                return 1; \n            } // something in path prefix is not a dir\n            return 1;\n        }\n        return ( info.st_mode & S_IFDIR ) ? 0 : 1;\n    }\n};\n"
  },
  {
    "path": "rpvio_estimator/src/utility/visualization.cpp",
    "content": "#include \"visualization.h\"\n\nusing namespace ros;\nusing namespace Eigen;\nros::Publisher pub_odometry, pub_latest_odometry;\nros::Publisher pub_path, pub_relo_path;\nros::Publisher pub_point_cloud, pub_margin_cloud;\nros::Publisher pub_key_poses;\nros::Publisher pub_relo_relative_pose;\nros::Publisher pub_camera_pose;\nros::Publisher pub_camera_pose_visual;\nnav_msgs::Path path, relo_path;\n\nros::Publisher pub_keyframe_pose;\nros::Publisher pub_keyframe_point;\nros::Publisher pub_extrinsic;\n\nCameraPoseVisualization cameraposevisual(0, 1, 0, 1);\nCameraPoseVisualization keyframebasevisual(0.0, 0.0, 1.0, 1.0);\nstatic double sum_of_path = 0;\nstatic Vector3d last_path(0.0, 0.0, 0.0);\n\nvoid registerPub(ros::NodeHandle &n)\n{\n    pub_latest_odometry = n.advertise<nav_msgs::Odometry>(\"imu_propagate\", 1000);\n    pub_path = n.advertise<nav_msgs::Path>(\"path\", 1000);\n    pub_relo_path = n.advertise<nav_msgs::Path>(\"relocalization_path\", 1000);\n    pub_odometry = n.advertise<nav_msgs::Odometry>(\"odometry\", 1000);\n    pub_point_cloud = n.advertise<sensor_msgs::PointCloud>(\"point_cloud\", 1000);\n    pub_margin_cloud = n.advertise<sensor_msgs::PointCloud>(\"history_cloud\", 1000);\n    pub_key_poses = n.advertise<visualization_msgs::Marker>(\"key_poses\", 1000);\n    pub_camera_pose = n.advertise<nav_msgs::Odometry>(\"camera_pose\", 1000);\n    pub_camera_pose_visual = n.advertise<visualization_msgs::MarkerArray>(\"camera_pose_visual\", 1000);\n    pub_keyframe_pose = n.advertise<nav_msgs::Odometry>(\"keyframe_pose\", 1000);\n    pub_keyframe_point = n.advertise<sensor_msgs::PointCloud>(\"keyframe_point\", 1000);\n    pub_extrinsic = n.advertise<nav_msgs::Odometry>(\"extrinsic\", 1000);\n    pub_relo_relative_pose = n.advertise<nav_msgs::Odometry>(\"relo_relative_pose\", 1000);\n\n    cameraposevisual.setScale(1);\n    cameraposevisual.setLineWidth(0.05);\n    keyframebasevisual.setScale(0.1);\n    
keyframebasevisual.setLineWidth(0.01);\n}\n\nvoid pubLatestOdometry(const Eigen::Vector3d &P, const Eigen::Quaterniond &Q, const Eigen::Vector3d &V, const std_msgs::Header &header)\n{\n    Eigen::Quaterniond quadrotor_Q = Q;\n\n    nav_msgs::Odometry odometry;\n    odometry.header = header;\n    odometry.header.frame_id = \"world\";\n    odometry.pose.pose.position.x = P.x();\n    odometry.pose.pose.position.y = P.y();\n    odometry.pose.pose.position.z = P.z();\n    odometry.pose.pose.orientation.x = quadrotor_Q.x();\n    odometry.pose.pose.orientation.y = quadrotor_Q.y();\n    odometry.pose.pose.orientation.z = quadrotor_Q.z();\n    odometry.pose.pose.orientation.w = quadrotor_Q.w();\n    odometry.twist.twist.linear.x = V.x();\n    odometry.twist.twist.linear.y = V.y();\n    odometry.twist.twist.linear.z = V.z();\n    pub_latest_odometry.publish(odometry);\n}\n\nvoid printStatistics(const Estimator &estimator, double t)\n{\n    if (estimator.solver_flag != Estimator::SolverFlag::NON_LINEAR)\n        return;\n    printf(\"position: %f, %f, %f\\r\", estimator.Ps[WINDOW_SIZE].x(), estimator.Ps[WINDOW_SIZE].y(), estimator.Ps[WINDOW_SIZE].z());\n    ROS_DEBUG_STREAM(\"position: \" << estimator.Ps[WINDOW_SIZE].transpose());\n    ROS_DEBUG_STREAM(\"velocity: \" << estimator.Vs[WINDOW_SIZE].transpose());\n    for (int i = 0; i < NUM_OF_CAM; i++)\n    {\n        //ROS_DEBUG(\"calibration result for camera %d\", i);\n        ROS_DEBUG_STREAM(\"extrinsic tic: \" << estimator.tic[i].transpose());\n        ROS_DEBUG_STREAM(\"extrinsic ric: \" << Utility::R2ypr(estimator.ric[i]).transpose());\n        if (ESTIMATE_EXTRINSIC)\n        {\n            cv::FileStorage fs(EX_CALIB_RESULT_PATH, cv::FileStorage::WRITE);\n            Eigen::Matrix3d eigen_R;\n            Eigen::Vector3d eigen_T;\n            eigen_R = estimator.ric[i];\n            eigen_T = estimator.tic[i];\n            cv::Mat cv_R, cv_T;\n            cv::eigen2cv(eigen_R, cv_R);\n            
cv::eigen2cv(eigen_T, cv_T);\n            fs << \"extrinsicRotation\" << cv_R << \"extrinsicTranslation\" << cv_T;\n            fs.release();\n        }\n    }\n\n    static double sum_of_time = 0;\n    static int sum_of_calculation = 0;\n    sum_of_time += t;\n    sum_of_calculation++;\n    ROS_DEBUG(\"vo solver costs: %f ms\", t);\n    ROS_DEBUG(\"average of time %f ms\", sum_of_time / sum_of_calculation);\n\n    sum_of_path += (estimator.Ps[WINDOW_SIZE] - last_path).norm();\n    last_path = estimator.Ps[WINDOW_SIZE];\n    ROS_DEBUG(\"sum of path %f\", sum_of_path);\n    if (ESTIMATE_TD)\n        ROS_INFO(\"td %f\", estimator.td);\n}\n\nvoid pubOdometry(const Estimator &estimator, const std_msgs::Header &header)\n{\n    if (estimator.solver_flag == Estimator::SolverFlag::NON_LINEAR)\n    {\n        nav_msgs::Odometry odometry;\n        odometry.header = header;\n        odometry.header.frame_id = \"world\";\n        odometry.child_frame_id = \"world\";\n        Quaterniond tmp_Q;\n        tmp_Q = Quaterniond(estimator.Rs[WINDOW_SIZE]);\n        odometry.pose.pose.position.x = estimator.Ps[WINDOW_SIZE].x();\n        odometry.pose.pose.position.y = estimator.Ps[WINDOW_SIZE].y();\n        odometry.pose.pose.position.z = estimator.Ps[WINDOW_SIZE].z();\n        odometry.pose.pose.orientation.x = tmp_Q.x();\n        odometry.pose.pose.orientation.y = tmp_Q.y();\n        odometry.pose.pose.orientation.z = tmp_Q.z();\n        odometry.pose.pose.orientation.w = tmp_Q.w();\n        odometry.twist.twist.linear.x = estimator.Vs[WINDOW_SIZE].x();\n        odometry.twist.twist.linear.y = estimator.Vs[WINDOW_SIZE].y();\n        odometry.twist.twist.linear.z = estimator.Vs[WINDOW_SIZE].z();\n        pub_odometry.publish(odometry);\n\n        geometry_msgs::PoseStamped pose_stamped;\n        pose_stamped.header = header;\n        pose_stamped.header.frame_id = \"world\";\n        pose_stamped.pose = odometry.pose.pose;\n        path.header = header;\n        path.header.frame_id 
= \"world\";\n        path.poses.push_back(pose_stamped);\n        pub_path.publish(path);\n\n        Vector3d correct_t;\n        Vector3d correct_v;\n        Quaterniond correct_q;\n        correct_t = estimator.drift_correct_r * estimator.Ps[WINDOW_SIZE] + estimator.drift_correct_t;\n        correct_q = estimator.drift_correct_r * estimator.Rs[WINDOW_SIZE];\n        odometry.pose.pose.position.x = correct_t.x();\n        odometry.pose.pose.position.y = correct_t.y();\n        odometry.pose.pose.position.z = correct_t.z();\n        odometry.pose.pose.orientation.x = correct_q.x();\n        odometry.pose.pose.orientation.y = correct_q.y();\n        odometry.pose.pose.orientation.z = correct_q.z();\n        odometry.pose.pose.orientation.w = correct_q.w();\n\n        pose_stamped.pose = odometry.pose.pose;\n        relo_path.header = header;\n        relo_path.header.frame_id = \"world\";\n        relo_path.poses.push_back(pose_stamped);\n        pub_relo_path.publish(relo_path);\n\n        // write result to file\n        ofstream foutC(RPVIO_RESULT_PATH, ios::app);\n        foutC.setf(ios::fixed, ios::floatfield);\n        foutC.precision(0);\n        foutC << header.stamp.toSec() * 1e9 << \",\";\n        foutC.precision(5);\n        foutC << estimator.Ps[WINDOW_SIZE].x() << \",\"\n              << estimator.Ps[WINDOW_SIZE].y() << \",\"\n              << estimator.Ps[WINDOW_SIZE].z() << \",\"\n              << tmp_Q.w() << \",\"\n              << tmp_Q.x() << \",\"\n              << tmp_Q.y() << \",\"\n              << tmp_Q.z() << \",\"\n              << estimator.Vs[WINDOW_SIZE].x() << \",\"\n              << estimator.Vs[WINDOW_SIZE].y() << \",\"\n              << estimator.Vs[WINDOW_SIZE].z() << \",\" << endl;\n        foutC.close();\n    }\n}\n\nvoid pubKeyPoses(const Estimator &estimator, const std_msgs::Header &header)\n{\n    if (estimator.key_poses.size() == 0)\n        return;\n    visualization_msgs::Marker key_poses;\n    key_poses.header = header;\n  
  key_poses.header.frame_id = \"world\";\n    key_poses.ns = \"key_poses\";\n    key_poses.type = visualization_msgs::Marker::SPHERE_LIST;\n    key_poses.action = visualization_msgs::Marker::ADD;\n    key_poses.pose.orientation.w = 1.0;\n    key_poses.lifetime = ros::Duration();\n\n    //static int key_poses_id = 0;\n    key_poses.id = 0; //key_poses_id++;\n    key_poses.scale.x = 0.05;\n    key_poses.scale.y = 0.05;\n    key_poses.scale.z = 0.05;\n    key_poses.color.r = 1.0;\n    key_poses.color.a = 1.0;\n\n    for (int i = 0; i <= WINDOW_SIZE; i++)\n    {\n        geometry_msgs::Point pose_marker;\n        Vector3d correct_pose;\n        correct_pose = estimator.key_poses[i];\n        pose_marker.x = correct_pose.x();\n        pose_marker.y = correct_pose.y();\n        pose_marker.z = correct_pose.z();\n        key_poses.points.push_back(pose_marker);\n    }\n    pub_key_poses.publish(key_poses);\n}\n\nvoid pubCameraPose(const Estimator &estimator, const std_msgs::Header &header)\n{\n    int idx2 = WINDOW_SIZE - 1;\n\n    if (estimator.solver_flag == Estimator::SolverFlag::NON_LINEAR)\n    {\n        int i = idx2;\n        Vector3d P = estimator.Ps[i] + estimator.Rs[i] * estimator.tic[0];\n        Quaterniond R = Quaterniond(estimator.Rs[i] * estimator.ric[0]);\n\n        nav_msgs::Odometry odometry;\n        odometry.header = header;\n        odometry.header.frame_id = \"world\";\n        odometry.pose.pose.position.x = P.x();\n        odometry.pose.pose.position.y = P.y();\n        odometry.pose.pose.position.z = P.z();\n        odometry.pose.pose.orientation.x = R.x();\n        odometry.pose.pose.orientation.y = R.y();\n        odometry.pose.pose.orientation.z = R.z();\n        odometry.pose.pose.orientation.w = R.w();\n\n        pub_camera_pose.publish(odometry);\n\n        cameraposevisual.reset();\n        cameraposevisual.add_pose(P, R);\n        cameraposevisual.publish_by(pub_camera_pose_visual, odometry.header);\n    }\n}\n\n\nvoid pubPointCloud(const 
Estimator &estimator, const std_msgs::Header &header)\n{\n    sensor_msgs::PointCloud point_cloud, loop_point_cloud;\n    point_cloud.header = header;\n    loop_point_cloud.header = header;\n\n\n    for (auto &it_per_id : estimator.f_manager.feature)\n    {\n        int used_num;\n        used_num = it_per_id.feature_per_frame.size();\n        if (!(used_num >= 2 && it_per_id.start_frame < WINDOW_SIZE - 2))\n            continue;\n        if (it_per_id.start_frame > WINDOW_SIZE * 3.0 / 4.0 || it_per_id.solve_flag != 1)\n            continue;\n\n        int imu_i = it_per_id.start_frame;\n        Vector3d pts_i = it_per_id.feature_per_frame[0].point * it_per_id.estimated_depth;\n        Vector3d w_pts_i = estimator.Rs[imu_i] * (estimator.ric[0] * pts_i + estimator.tic[0]) + estimator.Ps[imu_i];\n\n        geometry_msgs::Point32 p;\n        p.x = w_pts_i(0);\n        p.y = w_pts_i(1);\n        p.z = w_pts_i(2);\n        point_cloud.points.push_back(p);\n    }\n    pub_point_cloud.publish(point_cloud);\n\n\n    // publish marginalized points\n    sensor_msgs::PointCloud margin_cloud;\n    margin_cloud.header = header;\n\n    for (auto &it_per_id : estimator.f_manager.feature)\n    { \n        int used_num;\n        used_num = it_per_id.feature_per_frame.size();\n        if (!(used_num >= 2 && it_per_id.start_frame < WINDOW_SIZE - 2))\n            continue;\n        //if (it_per_id->start_frame > WINDOW_SIZE * 3.0 / 4.0 || it_per_id->solve_flag != 1)\n        //        continue;\n\n        if (it_per_id.start_frame == 0 && it_per_id.feature_per_frame.size() <= 2 \n            && it_per_id.solve_flag == 1 )\n        {\n            int imu_i = it_per_id.start_frame;\n            Vector3d pts_i = it_per_id.feature_per_frame[0].point * it_per_id.estimated_depth;\n            Vector3d w_pts_i = estimator.Rs[imu_i] * (estimator.ric[0] * pts_i + estimator.tic[0]) + estimator.Ps[imu_i];\n\n            geometry_msgs::Point32 p;\n            p.x = w_pts_i(0);\n            p.y = 
w_pts_i(1);\n            p.z = w_pts_i(2);\n            margin_cloud.points.push_back(p);\n        }\n    }\n    pub_margin_cloud.publish(margin_cloud);\n}\n\n\nvoid pubTF(const Estimator &estimator, const std_msgs::Header &header)\n{\n    if( estimator.solver_flag != Estimator::SolverFlag::NON_LINEAR)\n        return;\n    static tf::TransformBroadcaster br;\n    tf::Transform transform;\n    tf::Quaternion q;\n    // body frame\n    Vector3d correct_t;\n    Quaterniond correct_q;\n    correct_t = estimator.Ps[WINDOW_SIZE];\n    correct_q = estimator.Rs[WINDOW_SIZE];\n\n    transform.setOrigin(tf::Vector3(correct_t(0),\n                                    correct_t(1),\n                                    correct_t(2)));\n    q.setW(correct_q.w());\n    q.setX(correct_q.x());\n    q.setY(correct_q.y());\n    q.setZ(correct_q.z());\n    transform.setRotation(q);\n    br.sendTransform(tf::StampedTransform(transform, header.stamp, \"world\", \"body\"));\n\n    // camera frame\n    transform.setOrigin(tf::Vector3(estimator.tic[0].x(),\n                                    estimator.tic[0].y(),\n                                    estimator.tic[0].z()));\n    q.setW(Quaterniond(estimator.ric[0]).w());\n    q.setX(Quaterniond(estimator.ric[0]).x());\n    q.setY(Quaterniond(estimator.ric[0]).y());\n    q.setZ(Quaterniond(estimator.ric[0]).z());\n    transform.setRotation(q);\n    br.sendTransform(tf::StampedTransform(transform, header.stamp, \"body\", \"camera\"));\n\n    nav_msgs::Odometry odometry;\n    odometry.header = header;\n    odometry.header.frame_id = \"world\";\n    odometry.pose.pose.position.x = estimator.tic[0].x();\n    odometry.pose.pose.position.y = estimator.tic[0].y();\n    odometry.pose.pose.position.z = estimator.tic[0].z();\n    Quaterniond tmp_q{estimator.ric[0]};\n    odometry.pose.pose.orientation.x = tmp_q.x();\n    odometry.pose.pose.orientation.y = tmp_q.y();\n    odometry.pose.pose.orientation.z = tmp_q.z();\n    
odometry.pose.pose.orientation.w = tmp_q.w();\n    pub_extrinsic.publish(odometry);\n\n}\n\nvoid pubKeyframe(const Estimator &estimator)\n{\n    // pub camera pose, 2D-3D points of keyframe\n    if (estimator.solver_flag == Estimator::SolverFlag::NON_LINEAR && estimator.marginalization_flag == 0)\n    {\n        int i = WINDOW_SIZE - 2;\n        //Vector3d P = estimator.Ps[i] + estimator.Rs[i] * estimator.tic[0];\n        Vector3d P = estimator.Ps[i];\n        Quaterniond R = Quaterniond(estimator.Rs[i]);\n\n        nav_msgs::Odometry odometry;\n        odometry.header = estimator.Headers[WINDOW_SIZE - 2];\n        odometry.header.frame_id = \"world\";\n        odometry.pose.pose.position.x = P.x();\n        odometry.pose.pose.position.y = P.y();\n        odometry.pose.pose.position.z = P.z();\n        odometry.pose.pose.orientation.x = R.x();\n        odometry.pose.pose.orientation.y = R.y();\n        odometry.pose.pose.orientation.z = R.z();\n        odometry.pose.pose.orientation.w = R.w();\n        //printf(\"time: %f t: %f %f %f r: %f %f %f %f\\n\", odometry.header.stamp.toSec(), P.x(), P.y(), P.z(), R.w(), R.x(), R.y(), R.z());\n\n        pub_keyframe_pose.publish(odometry);\n\n\n        sensor_msgs::PointCloud point_cloud;\n        point_cloud.header = estimator.Headers[WINDOW_SIZE - 2];\n        for (auto &it_per_id : estimator.f_manager.feature)\n        {\n            int frame_size = it_per_id.feature_per_frame.size();\n            if(it_per_id.start_frame < WINDOW_SIZE - 2 && it_per_id.start_frame + frame_size - 1 >= WINDOW_SIZE - 2 && it_per_id.solve_flag == 1)\n            {\n\n                int imu_i = it_per_id.start_frame;\n                Vector3d pts_i = it_per_id.feature_per_frame[0].point * it_per_id.estimated_depth;\n                Vector3d w_pts_i = estimator.Rs[imu_i] * (estimator.ric[0] * pts_i + estimator.tic[0])\n                                      + estimator.Ps[imu_i];\n                geometry_msgs::Point32 p;\n                p.x 
= w_pts_i(0);\n                p.y = w_pts_i(1);\n                p.z = w_pts_i(2);\n                point_cloud.points.push_back(p);\n\n                int imu_j = WINDOW_SIZE - 2 - it_per_id.start_frame;\n                sensor_msgs::ChannelFloat32 p_2d;\n                p_2d.values.push_back(it_per_id.feature_per_frame[imu_j].point.x());\n                p_2d.values.push_back(it_per_id.feature_per_frame[imu_j].point.y());\n                p_2d.values.push_back(it_per_id.feature_per_frame[imu_j].uv.x());\n                p_2d.values.push_back(it_per_id.feature_per_frame[imu_j].uv.y());\n                p_2d.values.push_back(it_per_id.feature_id);\n                point_cloud.channels.push_back(p_2d);\n            }\n\n        }\n        pub_keyframe_point.publish(point_cloud);\n    }\n}\n\nvoid pubRelocalization(const Estimator &estimator)\n{\n    nav_msgs::Odometry odometry;\n    odometry.header.stamp = ros::Time(estimator.relo_frame_stamp);\n    odometry.header.frame_id = \"world\";\n    odometry.pose.pose.position.x = estimator.relo_relative_t.x();\n    odometry.pose.pose.position.y = estimator.relo_relative_t.y();\n    odometry.pose.pose.position.z = estimator.relo_relative_t.z();\n    odometry.pose.pose.orientation.x = estimator.relo_relative_q.x();\n    odometry.pose.pose.orientation.y = estimator.relo_relative_q.y();\n    odometry.pose.pose.orientation.z = estimator.relo_relative_q.z();\n    odometry.pose.pose.orientation.w = estimator.relo_relative_q.w();\n    odometry.twist.twist.linear.x = estimator.relo_relative_yaw;\n    odometry.twist.twist.linear.y = estimator.relo_frame_index;\n\n    pub_relo_relative_pose.publish(odometry);\n}\n"
  },
  {
    "path": "rpvio_estimator/src/utility/visualization.h",
    "content": "#pragma once\n\n#include <ros/ros.h>\n#include <std_msgs/Header.h>\n#include <std_msgs/Float32.h>\n#include <std_msgs/Bool.h>\n#include <sensor_msgs/Imu.h>\n#include <sensor_msgs/PointCloud.h>\n#include <sensor_msgs/Image.h>\n#include <sensor_msgs/image_encodings.h>\n#include <nav_msgs/Path.h>\n#include <nav_msgs/Odometry.h>\n#include <geometry_msgs/PointStamped.h>\n#include <visualization_msgs/Marker.h>\n#include <tf/transform_broadcaster.h>\n#include \"CameraPoseVisualization.h\"\n#include <eigen3/Eigen/Dense>\n#include \"../estimator.h\"\n#include \"../parameters.h\"\n#include <fstream>\n\nextern ros::Publisher pub_odometry;\nextern ros::Publisher pub_path, pub_pose;\nextern ros::Publisher pub_cloud, pub_map;\nextern ros::Publisher pub_key_poses;\nextern ros::Publisher pub_ref_pose, pub_cur_pose;\nextern ros::Publisher pub_key;\nextern nav_msgs::Path path;\nextern ros::Publisher pub_pose_graph;\nextern int IMAGE_ROW, IMAGE_COL;\n\nvoid registerPub(ros::NodeHandle &n);\n\nvoid pubLatestOdometry(const Eigen::Vector3d &P, const Eigen::Quaterniond &Q, const Eigen::Vector3d &V, const std_msgs::Header &header);\n\nvoid printStatistics(const Estimator &estimator, double t);\n\nvoid pubOdometry(const Estimator &estimator, const std_msgs::Header &header);\n\nvoid pubInitialGuess(const Estimator &estimator, const std_msgs::Header &header);\n\nvoid pubKeyPoses(const Estimator &estimator, const std_msgs::Header &header);\n\nvoid pubCameraPose(const Estimator &estimator, const std_msgs::Header &header);\n\nvoid pubPointCloud(const Estimator &estimator, const std_msgs::Header &header);\n\nvoid pubTF(const Estimator &estimator, const std_msgs::Header &header);\n\nvoid pubKeyframe(const Estimator &estimator);\n\nvoid pubRelocalization(const Estimator &estimator);"
  },
  {
    "path": "rpvio_feature_tracker/CMakeLists.txt",
    "content": "cmake_minimum_required(VERSION 2.8.3)\nproject(rpvio_feature_tracker)\n\nset(CMAKE_BUILD_TYPE \"Release\")\nset(CMAKE_CXX_FLAGS \"-std=c++11\")\nset(CMAKE_CXX_FLAGS_RELEASE \"-O3 -Wall -g\")\n\nfind_package(catkin REQUIRED COMPONENTS\n    roscpp\n    std_msgs\n    sensor_msgs\n    cv_bridge\n    camera_model\n    message_filters\n    )\n\nfind_package(OpenCV REQUIRED)\n\ncatkin_package()\n\ninclude_directories(\n    ${catkin_INCLUDE_DIRS}\n    )\n\nset(CMAKE_MODULE_PATH ${PROJECT_SOURCE_DIR}/cmake)\nfind_package(Eigen3)\ninclude_directories(\n  ${catkin_INCLUDE_DIRS}\n  ${EIGEN3_INCLUDE_DIR}\n)\n\nadd_executable(rpvio_feature_tracker\n    src/feature_tracker_node.cpp\n    src/parameters.cpp\n    src/feature_tracker.cpp\n    )\n\ntarget_link_libraries(rpvio_feature_tracker ${catkin_LIBRARIES} ${OpenCV_LIBS})\n"
  },
  {
    "path": "rpvio_feature_tracker/cmake/FindEigen.cmake",
    "content": "# Ceres Solver - A fast non-linear least squares minimizer\n# Copyright 2015 Google Inc. All rights reserved.\n# http://ceres-solver.org/\n#\n# Redistribution and use in source and binary forms, with or without\n# modification, are permitted provided that the following conditions are met:\n#\n# * Redistributions of source code must retain the above copyright notice,\n#   this list of conditions and the following disclaimer.\n# * Redistributions in binary form must reproduce the above copyright notice,\n#   this list of conditions and the following disclaimer in the documentation\n#   and/or other materials provided with the distribution.\n# * Neither the name of Google Inc. nor the names of its contributors may be\n#   used to endorse or promote products derived from this software without\n#   specific prior written permission.\n#\n# THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS \"AS IS\"\n# AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE\n# IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE\n# ARE DISCLAIMED. 
IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE\n# LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR\n# CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF\n# SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS\n# INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN\n# CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE)\n# ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE\n# POSSIBILITY OF SUCH DAMAGE.\n#\n# Author: alexs.mac@gmail.com (Alex Stewart)\n#\n\n# FindEigen.cmake - Find Eigen library, version >= 3.\n#\n# This module defines the following variables:\n#\n# EIGEN_FOUND: TRUE iff Eigen is found.\n# EIGEN_INCLUDE_DIRS: Include directories for Eigen.\n#\n# EIGEN_VERSION: Extracted from Eigen/src/Core/util/Macros.h\n# EIGEN_WORLD_VERSION: Equal to 3 if EIGEN_VERSION = 3.2.0\n# EIGEN_MAJOR_VERSION: Equal to 2 if EIGEN_VERSION = 3.2.0\n# EIGEN_MINOR_VERSION: Equal to 0 if EIGEN_VERSION = 3.2.0\n#\n# The following variables control the behaviour of this module:\n#\n# EIGEN_INCLUDE_DIR_HINTS: List of additional directories in which to\n#                          search for eigen includes, e.g: /timbuktu/eigen3.\n#\n# The following variables are also defined by this module, but in line with\n# CMake recommended FindPackage() module style should NOT be referenced directly\n# by callers (use the plural variables detailed above instead).  These variables\n# do however affect the behaviour of the module via FIND_[PATH/LIBRARY]() which\n# are NOT re-called (i.e. search for library is not repeated) if these variables\n# are set with valid values _in the CMake cache_. 
This means that if these\n# variables are set directly in the cache, either by the user in the CMake GUI,\n# or by the user passing -DVAR=VALUE directives to CMake when called (which\n# explicitly defines a cache variable), then they will be used verbatim,\n# bypassing the HINTS variables and other hard-coded search locations.\n#\n# EIGEN_INCLUDE_DIR: Include directory for Eigen, not including the\n#                    include directory of any dependencies.\n\n# Called if we failed to find Eigen or any of its required dependencies,\n# unsets all public (designed to be used externally) variables and reports\n# error message at priority depending upon [REQUIRED/QUIET/<NONE>] argument.\nmacro(EIGEN_REPORT_NOT_FOUND REASON_MSG)\n  unset(EIGEN_FOUND)\n  unset(EIGEN_INCLUDE_DIRS)\n  # Make results of search visible in the CMake GUI if Eigen has not\n  # been found so that user does not have to toggle to advanced view.\n  mark_as_advanced(CLEAR EIGEN_INCLUDE_DIR)\n  # Note <package>_FIND_[REQUIRED/QUIETLY] variables defined by FindPackage()\n  # use the camelcase library name, not uppercase.\n  if (Eigen_FIND_QUIETLY)\n    message(STATUS \"Failed to find Eigen - \" ${REASON_MSG} ${ARGN})\n  elseif (Eigen_FIND_REQUIRED)\n    message(FATAL_ERROR \"Failed to find Eigen - \" ${REASON_MSG} ${ARGN})\n  else()\n    # Neither QUIETLY nor REQUIRED, use no priority which emits a message\n    # but continues configuration and allows generation.\n    message(\"-- Failed to find Eigen - \" ${REASON_MSG} ${ARGN})\n  endif ()\n  return()\nendmacro(EIGEN_REPORT_NOT_FOUND)\n\n# Protect against any alternative find_package scripts for this library having\n# been called previously (in a client project) which set EIGEN_FOUND, but not\n# the other variables we require / set here which could cause the search logic\n# here to fail.\nunset(EIGEN_FOUND)\n\n# Search user-installed locations first, so that we prefer user installs\n# to system installs where both exist.\nlist(APPEND 
EIGEN_CHECK_INCLUDE_DIRS\n  /usr/local/include\n  /usr/local/homebrew/include # Mac OS X\n  /opt/local/var/macports/software # Mac OS X.\n  /opt/local/include\n  /usr/include)\n# Additional suffixes to try appending to each search path.\nlist(APPEND EIGEN_CHECK_PATH_SUFFIXES\n  eigen3 # Default root directory for Eigen.\n  Eigen/include/eigen3 # Windows (for C:/Program Files prefix) < 3.3\n  Eigen3/include/eigen3 ) # Windows (for C:/Program Files prefix) >= 3.3\n\n# Search supplied hint directories first if supplied.\nfind_path(EIGEN_INCLUDE_DIR\n  NAMES Eigen/Core\n  PATHS ${EIGEN_INCLUDE_DIR_HINTS}\n  ${EIGEN_CHECK_INCLUDE_DIRS}\n  PATH_SUFFIXES ${EIGEN_CHECK_PATH_SUFFIXES})\n\nif (NOT EIGEN_INCLUDE_DIR OR\n    NOT EXISTS ${EIGEN_INCLUDE_DIR})\n  eigen_report_not_found(\n    \"Could not find eigen3 include directory, set EIGEN_INCLUDE_DIR to \"\n    \"path to eigen3 include directory, e.g. /usr/local/include/eigen3.\")\nendif (NOT EIGEN_INCLUDE_DIR OR\n       NOT EXISTS ${EIGEN_INCLUDE_DIR})\n\n# Mark internally as found, then verify. 
EIGEN_REPORT_NOT_FOUND() unsets\n# if called.\nset(EIGEN_FOUND TRUE)\n\n# Extract Eigen version from Eigen/src/Core/util/Macros.h\nif (EIGEN_INCLUDE_DIR)\n  set(EIGEN_VERSION_FILE ${EIGEN_INCLUDE_DIR}/Eigen/src/Core/util/Macros.h)\n  if (NOT EXISTS ${EIGEN_VERSION_FILE})\n    eigen_report_not_found(\n      \"Could not find file: ${EIGEN_VERSION_FILE} \"\n      \"containing version information in Eigen install located at: \"\n      \"${EIGEN_INCLUDE_DIR}.\")\n  else (NOT EXISTS ${EIGEN_VERSION_FILE})\n    file(READ ${EIGEN_VERSION_FILE} EIGEN_VERSION_FILE_CONTENTS)\n\n    string(REGEX MATCH \"#define EIGEN_WORLD_VERSION [0-9]+\"\n      EIGEN_WORLD_VERSION \"${EIGEN_VERSION_FILE_CONTENTS}\")\n    string(REGEX REPLACE \"#define EIGEN_WORLD_VERSION ([0-9]+)\" \"\\\\1\"\n      EIGEN_WORLD_VERSION \"${EIGEN_WORLD_VERSION}\")\n\n    string(REGEX MATCH \"#define EIGEN_MAJOR_VERSION [0-9]+\"\n      EIGEN_MAJOR_VERSION \"${EIGEN_VERSION_FILE_CONTENTS}\")\n    string(REGEX REPLACE \"#define EIGEN_MAJOR_VERSION ([0-9]+)\" \"\\\\1\"\n      EIGEN_MAJOR_VERSION \"${EIGEN_MAJOR_VERSION}\")\n\n    string(REGEX MATCH \"#define EIGEN_MINOR_VERSION [0-9]+\"\n      EIGEN_MINOR_VERSION \"${EIGEN_VERSION_FILE_CONTENTS}\")\n    string(REGEX REPLACE \"#define EIGEN_MINOR_VERSION ([0-9]+)\" \"\\\\1\"\n      EIGEN_MINOR_VERSION \"${EIGEN_MINOR_VERSION}\")\n\n    # This is on a single line s/t CMake does not interpret it as a list of\n    # elements and insert ';' separators which would result in 3.;2.;0 nonsense.\n    set(EIGEN_VERSION \"${EIGEN_WORLD_VERSION}.${EIGEN_MAJOR_VERSION}.${EIGEN_MINOR_VERSION}\")\n  endif (NOT EXISTS ${EIGEN_VERSION_FILE})\nendif (EIGEN_INCLUDE_DIR)\n\n# Set standard CMake FindPackage variables if found.\nif (EIGEN_FOUND)\n  set(EIGEN_INCLUDE_DIRS ${EIGEN_INCLUDE_DIR})\nendif (EIGEN_FOUND)\n\n# Handle REQUIRED / QUIET optional arguments and version.\ninclude(FindPackageHandleStandardArgs)\nfind_package_handle_standard_args(Eigen\n  REQUIRED_VARS 
EIGEN_INCLUDE_DIRS\n  VERSION_VAR EIGEN_VERSION)\n\n# Only mark internal variables as advanced if we found Eigen, otherwise\n# leave it visible in the standard GUI for the user to set manually.\nif (EIGEN_FOUND)\n  mark_as_advanced(FORCE EIGEN_INCLUDE_DIR)\nendif (EIGEN_FOUND)\n"
  },
  {
    "path": "rpvio_feature_tracker/package.xml",
    "content": "<?xml version=\"1.0\"?>\n<package>\n  <name>rpvio_feature_tracker</name>\n  <version>0.0.0</version>\n  <description>The rpvio_feature_tracker package</description>\n\n  <maintainer email=\"karnikram@gmail.com\">karnikram</maintainer>\n  <license>GPLv3</license>\n\n  <buildtool_depend>catkin</buildtool_depend>\n  <build_depend>roscpp</build_depend>\n  <build_depend>camera_model</build_depend>\n  <build_depend>message_generation</build_depend>\n  <run_depend>roscpp</run_depend>\n  <run_depend>camera_model</run_depend>\n  <run_depend>message_runtime</run_depend>\n\n</package>\n"
  },
  {
    "path": "rpvio_feature_tracker/src/feature_tracker.cpp",
    "content": "#include \"feature_tracker.h\"\n\nint FeatureTracker::n_id = 0;\n\nbool inBorder(const cv::Point2f &pt)\n{\n    const int BORDER_SIZE = 1;\n    int img_x = cvRound(pt.x);\n    int img_y = cvRound(pt.y);\n    return BORDER_SIZE <= img_x && img_x < COL - BORDER_SIZE && BORDER_SIZE <= img_y && img_y < ROW - BORDER_SIZE;\n}\n\nvoid reduceVector(vector<cv::Point2f> &v, vector<uchar> status)\n{\n    int j = 0;\n    for (int i = 0; i < int(v.size()); i++)\n        if (status[i])\n            v[j++] = v[i];\n    v.resize(j);\n}\n\nvoid reduceVector(vector<int> &v, vector<uchar> status)\n{\n    int j = 0;\n    for (int i = 0; i < int(v.size()); i++)\n        if (status[i])\n            v[j++] = v[i];\n    v.resize(j);\n}\n\nFeatureTracker::FeatureTracker()\n{\n}\n\nvoid FeatureTracker::setMask(const cv::Mat &_mask)\n{\n    //mask = cv::Mat(ROW, COL, CV_8UC1, cv::Scalar(255));\n    mask = _mask.clone();\n    //cv::Mat erosion_mat = cv::getStructuringElement(cv::MORPH_RECT, cv::Size(21,21));\n    cv::erode(mask, mask, cv::Mat(), cv::Point(-1, -1), 2, 1, 1);\n\n    // prefer to keep features that are tracked for long time\n    vector<pair<int, pair<cv::Point2f, pair<int,int>>>> cnt_pts_id_pid;\n\n    for (unsigned int i = 0; i < forw_pts.size(); i++)\n        cnt_pts_id_pid.push_back(make_pair(track_cnt[i], \n            make_pair(forw_pts[i], make_pair(ids[i], plane_ids[i]))));\n\n    sort(cnt_pts_id_pid.begin(), cnt_pts_id_pid.end(), \n        [](const pair<int, pair<cv::Point2f, pair<int,int>>> &a, \n            const pair<int, pair<cv::Point2f, pair<int,int>>> &b)\n         {\n            return a.first > b.first;\n         });\n\n    forw_pts.clear();\n    ids.clear();\n    plane_ids.clear();\n    track_cnt.clear();\n\n    for (auto &it : cnt_pts_id_pid)\n    {\n        if (mask.at<uchar>(it.second.first) != 0)\n        {\n            forw_pts.push_back(it.second.first);\n            ids.push_back(it.second.second.first);\n            
plane_ids.push_back(it.second.second.second);\n            track_cnt.push_back(it.first);\n            cv::circle(mask, it.second.first, MIN_DIST, 0, -1);\n        }\n    }\n}\n\nvoid FeatureTracker::addPoints(const cv::Mat &_mask)\n{\n    // Find unique labels\n    cv::Mat mask_copy = _mask.clone();\n    std::sort(mask_copy.begin<uchar>(), mask_copy.end<uchar>());\n    auto last = std::unique(mask_copy.begin<uchar>(), mask_copy.end<uchar>());\n    std::vector<uchar> unique_labels(mask_copy.begin<uchar>(), last);\n\n    // Create mask for each label\n    std::map<uchar, cv::Mat> unique_masks;\n    for(auto &label : unique_labels)\n        unique_masks.emplace(label, cv::Mat::zeros(_mask.size(), CV_8UC1));\n\n    for(int i = 0; i < _mask.rows; i++)\n        for(int j = 0; j < _mask.cols; j++)\n            unique_masks[_mask.at<uchar>(i,j)].at<uchar>(i,j) = 255;\n\n    // Erode each mask to avoid dynamic edges\n    for(auto &unique_mask : unique_masks)\n        cv::erode(unique_mask.second, unique_mask.second, cv::Mat(), cv::Point(-1, -1), 2, 1, 1);\n\n    // Store plane id for each feature\n    for (auto &p : n_pts)\n    {\n        for(auto &unique_mask : unique_masks)\n        {\n            if(unique_mask.first == 0)\n                continue;\n                \n            if(unique_mask.second.at<uchar>(p) == 255)\n            {\n                forw_pts.push_back(p);\n                ids.push_back(-1);\n                track_cnt.push_back(1);\n                plane_ids.push_back(unique_mask.first);\n                break;\n            }\n        }\n    }\n}\n\nvoid FeatureTracker::readImage(const cv::Mat &_img, const cv::Mat &_mask, double _cur_time)\n{\n    cv::Mat img;\n    TicToc t_r;\n    cur_time = _cur_time;\n\n    if (EQUALIZE)\n    {\n        cv::Ptr<cv::CLAHE> clahe = cv::createCLAHE(3.0, cv::Size(8, 8));\n        TicToc t_c;\n        clahe->apply(_img, img);\n        ROS_DEBUG(\"CLAHE costs: %fms\", t_c.toc());\n    }\n    else\n        img = 
_img;\n\n    if (forw_img.empty())\n    {\n        prev_img = cur_img = forw_img = img;\n    }\n    else\n    {\n        forw_img = img;\n    }\n\n    forw_pts.clear();\n\n    if (cur_pts.size() > 0)\n    {\n        TicToc t_o;\n        vector<uchar> status;\n        vector<float> err;\n        cv::calcOpticalFlowPyrLK(cur_img, forw_img, cur_pts, forw_pts, status, err, cv::Size(21, 21), 3);\n\n        for (int i = 0; i < int(forw_pts.size()); i++)\n            if (status[i] && !inBorder(forw_pts[i]))\n                status[i] = 0;\n        reduceVector(prev_pts, status);\n        reduceVector(cur_pts, status);\n        reduceVector(forw_pts, status);\n        reduceVector(ids, status);\n        reduceVector(plane_ids, status);\n        reduceVector(cur_un_pts, status);\n        reduceVector(track_cnt, status);\n        ROS_DEBUG(\"temporal optical flow costs: %fms\", t_o.toc());\n    }\n\n    for (auto &n : track_cnt)\n        n++;\n\n    if (PUB_THIS_FRAME)\n    {\n        rejectWithH();\n        ROS_DEBUG(\"set mask begins\");\n        TicToc t_m;\n        setMask(_mask);\n        ROS_DEBUG(\"set mask costs %fms\", t_m.toc());\n\n        ROS_DEBUG(\"detect feature begins\");\n        TicToc t_t;\n        int n_max_cnt = MAX_CNT - static_cast<int>(forw_pts.size());\n        if (n_max_cnt > 0)\n        {\n            if(mask.empty())\n                cout << \"mask is empty \" << endl;\n            if (mask.type() != CV_8UC1)\n                cout << \"mask type wrong \" << endl;\n            if (mask.size() != forw_img.size())\n                cout << \"wrong size \" << endl;\n            cv::goodFeaturesToTrack(forw_img, n_pts, MAX_CNT - forw_pts.size(), 0.01, MIN_DIST, mask);\n            ROS_DEBUG(\"add feature begins\");\n            TicToc t_a;\n            addPoints(_mask);\n            ROS_DEBUG(\"selectFeature costs: %fms\", t_a.toc());\n        }\n\n        else\n            n_pts.clear();\n        ROS_DEBUG(\"detect feature costs: %fms\", t_t.toc());\n  
  }\n    prev_img = cur_img;\n    prev_pts = cur_pts;\n    prev_un_pts = cur_un_pts;\n    cur_img = forw_img;\n    cur_pts = forw_pts;\n    undistortedPoints();\n    prev_time = cur_time;\n}\n\nvoid FeatureTracker::rejectWithH()\n{\n    if (forw_pts.size() >= 4)\n    {\n        ROS_DEBUG(\"HM ransac begins\");\n        TicToc t_f;\n        vector<cv::Point2f> un_cur_pts(cur_pts.size()), un_forw_pts(forw_pts.size());\n        for (unsigned int i = 0; i < cur_pts.size(); i++)\n        {\n            Eigen::Vector3d tmp_p;\n            m_camera->liftProjective(Eigen::Vector2d(cur_pts[i].x, cur_pts[i].y), tmp_p);\n            tmp_p.x() = FOCAL_LENGTH * tmp_p.x() / tmp_p.z() + COL / 2.0;\n            tmp_p.y() = FOCAL_LENGTH * tmp_p.y() / tmp_p.z() + ROW / 2.0;\n            un_cur_pts[i] = cv::Point2f(tmp_p.x(), tmp_p.y());\n\n            m_camera->liftProjective(Eigen::Vector2d(forw_pts[i].x, forw_pts[i].y), tmp_p);\n            tmp_p.x() = FOCAL_LENGTH * tmp_p.x() / tmp_p.z() + COL / 2.0;\n            tmp_p.y() = FOCAL_LENGTH * tmp_p.y() / tmp_p.z() + ROW / 2.0;\n            un_forw_pts[i] = cv::Point2f(tmp_p.x(), tmp_p.y());\n        }\n\n        vector<uchar> status(un_cur_pts.size());\n        //cv::findFundamentalMat(un_cur_pts, un_forw_pts, cv::FM_RANSAC, F_THRESHOLD, 0.99, status);\n        //cv::findHomography(un_cur_pts, un_forw_pts, cv::RANSAC, 3, status);\n\n        map<int, vector<cv::Point2f>> pid_un_cur_pts, pid_un_forw_pts;\n        map<pair<int,int>, int> new_ids_to_old;\n        for(uint i = 0; i < un_cur_pts.size(); i++)\n        {\n            pid_un_cur_pts[plane_ids[i]].push_back(un_cur_pts[i]);\n            pid_un_forw_pts[plane_ids[i]].push_back(un_forw_pts[i]);\n            new_ids_to_old[make_pair(plane_ids[i], pid_un_cur_pts[plane_ids[i]].size()-1)] = i;\n        }\n\n        for(auto &imap : pid_un_cur_pts)\n        {\n            vector<uchar> pid_status;\n            cv::findHomography(pid_un_cur_pts[imap.first], 
pid_un_forw_pts[imap.first], cv::RANSAC, H_THRESHOLD, pid_status, 2000, 0.99);\n            for(int i = 0; i < int(pid_status.size()); i++)\n                status[new_ids_to_old[make_pair(imap.first, i)]] = pid_status[i];\n        }\n\n        int size_a = cur_pts.size();\n        reduceVector(prev_pts, status);\n        reduceVector(cur_pts, status);\n        reduceVector(forw_pts, status);\n        reduceVector(cur_un_pts, status);\n        reduceVector(ids, status);\n        reduceVector(plane_ids, status);\n        reduceVector(track_cnt, status);\n        ROS_DEBUG(\"HM ransac: %d -> %lu: %f\", size_a, forw_pts.size(), 1.0 * forw_pts.size() / size_a);\n        ROS_DEBUG(\"HM ransac costs: %fms\", t_f.toc());\n    }\n}\n\nbool FeatureTracker::updateID(unsigned int i)\n{\n    if (i < ids.size())\n    {\n        if (ids[i] == -1)\n            ids[i] = n_id++;\n        return true;\n    }\n    else\n        return false;\n}\n\nvoid FeatureTracker::readIntrinsicParameter(const string &calib_file)\n{\n    ROS_INFO(\"reading parameter of camera %s\", calib_file.c_str());\n    m_camera = CameraFactory::instance()->generateCameraFromYamlFile(calib_file);\n}\n\nvoid FeatureTracker::showUndistortion(const string &name)\n{\n    cv::Mat undistortedImg(ROW + 600, COL + 600, CV_8UC1, cv::Scalar(0));\n    vector<Eigen::Vector2d> distortedp, undistortedp;\n    for (int i = 0; i < COL; i++)\n        for (int j = 0; j < ROW; j++)\n        {\n            Eigen::Vector2d a(i, j);\n            Eigen::Vector3d b;\n            m_camera->liftProjective(a, b);\n            distortedp.push_back(a);\n            undistortedp.push_back(Eigen::Vector2d(b.x() / b.z(), b.y() / b.z()));\n            //printf(\"%f,%f->%f,%f,%f\\n)\\n\", a.x(), a.y(), b.x(), b.y(), b.z());\n        }\n    for (int i = 0; i < int(undistortedp.size()); i++)\n    {\n        cv::Mat pp(3, 1, CV_32FC1);\n        pp.at<float>(0, 0) = undistortedp[i].x() * FOCAL_LENGTH + COL / 2;\n        pp.at<float>(1, 0) = 
undistortedp[i].y() * FOCAL_LENGTH + ROW / 2;\n        pp.at<float>(2, 0) = 1.0;\n        //cout << trackerData[0].K << endl;\n        //printf(\"%lf %lf\\n\", p.at<float>(1, 0), p.at<float>(0, 0));\n        //printf(\"%lf %lf\\n\", pp.at<float>(1, 0), pp.at<float>(0, 0));\n        if (pp.at<float>(1, 0) + 300 >= 0 && pp.at<float>(1, 0) + 300 < ROW + 600 && pp.at<float>(0, 0) + 300 >= 0 && pp.at<float>(0, 0) + 300 < COL + 600)\n        {\n            undistortedImg.at<uchar>(pp.at<float>(1, 0) + 300, pp.at<float>(0, 0) + 300) = cur_img.at<uchar>(distortedp[i].y(), distortedp[i].x());\n        }\n        else\n        {\n            //ROS_ERROR(\"(%f %f) -> (%f %f)\", distortedp[i].y, distortedp[i].x, pp.at<float>(1, 0), pp.at<float>(0, 0));\n        }\n    }\n    cv::imshow(name, undistortedImg);\n    cv::waitKey(0);\n}\n\nvoid FeatureTracker::undistortedPoints()\n{\n    cur_un_pts.clear();\n    cur_un_pts_map.clear();\n    //cv::undistortPoints(cur_pts, un_pts, K, cv::Mat());\n    for (unsigned int i = 0; i < cur_pts.size(); i++)\n    {\n        Eigen::Vector2d a(cur_pts[i].x, cur_pts[i].y);\n        Eigen::Vector3d b;\n        m_camera->liftProjective(a, b);\n        cur_un_pts.push_back(cv::Point2f(b.x() / b.z(), b.y() / b.z()));\n        cur_un_pts_map.insert(make_pair(ids[i], cv::Point2f(b.x() / b.z(), b.y() / b.z())));\n        //printf(\"cur pts id %d %f %f\", ids[i], cur_un_pts[i].x, cur_un_pts[i].y);\n    }\n    // calculate point velocities\n    if (!prev_un_pts_map.empty())\n    {\n        double dt = cur_time - prev_time;\n        pts_velocity.clear();\n        for (unsigned int i = 0; i < cur_un_pts.size(); i++)\n        {\n            if (ids[i] != -1)\n            {\n                std::map<int, cv::Point2f>::iterator it;\n                it = prev_un_pts_map.find(ids[i]);\n                if (it != prev_un_pts_map.end())\n                {\n                    double v_x = (cur_un_pts[i].x - it->second.x) / dt;\n                    double v_y = 
(cur_un_pts[i].y - it->second.y) / dt;\n                    pts_velocity.push_back(cv::Point2f(v_x, v_y));\n                }\n                else\n                    pts_velocity.push_back(cv::Point2f(0, 0));\n            }\n            else\n            {\n                pts_velocity.push_back(cv::Point2f(0, 0));\n            }\n        }\n    }\n    else\n    {\n        for (unsigned int i = 0; i < cur_pts.size(); i++)\n        {\n            pts_velocity.push_back(cv::Point2f(0, 0));\n        }\n    }\n    prev_un_pts_map = cur_un_pts_map;\n}\n"
  },
  {
    "path": "rpvio_feature_tracker/src/feature_tracker.h",
    "content": "#pragma once\n\n#include <cstdio>\n#include <iostream>\n#include <queue>\n#include <execinfo.h>\n#include <csignal>\n\n#include <opencv2/opencv.hpp>\n#include <eigen3/Eigen/Dense>\n\n#include \"camodocal/camera_models/CameraFactory.h\"\n#include \"camodocal/camera_models/CataCamera.h\"\n#include \"camodocal/camera_models/PinholeCamera.h\"\n\n#include \"parameters.h\"\n#include \"tic_toc.h\"\n\nusing namespace std;\nusing namespace camodocal;\nusing namespace Eigen;\n\nbool inBorder(const cv::Point2f &pt);\n\nvoid reduceVector(vector<cv::Point2f> &v, vector<uchar> status);\nvoid reduceVector(vector<int> &v, vector<uchar> status);\n\nclass FeatureTracker\n{\n  public:\n    FeatureTracker();\n\n    void readImage(const cv::Mat &_img, const cv::Mat &_mask, double _cur_time);\n\n    void setMask(const cv::Mat &_mask);\n\n    void addPoints(const cv::Mat &_mask);\n\n    bool updateID(unsigned int i);\n\n    void readIntrinsicParameter(const string &calib_file);\n\n    void showUndistortion(const string &name);\n\n    void rejectWithH();\n\n    void undistortedPoints();\n\n    cv::Mat mask;\n    cv::Mat fisheye_mask;\n    cv::Mat prev_img, cur_img, forw_img;\n    vector<cv::Point2f> n_pts;\n    vector<cv::Point2f> prev_pts, cur_pts, forw_pts;\n    vector<cv::Point2f> prev_un_pts, cur_un_pts;\n    vector<cv::Point2f> pts_velocity;\n    vector<int> ids;\n    vector<int> plane_ids;\n    vector<int> track_cnt;\n    map<int, cv::Point2f> cur_un_pts_map;\n    map<int, cv::Point2f> prev_un_pts_map;\n    camodocal::CameraPtr m_camera;\n    double cur_time;\n    double prev_time;\n\n    static int n_id;\n};\n"
  },
  {
    "path": "rpvio_feature_tracker/src/feature_tracker_node.cpp",
"content": "#include <ros/ros.h>\n#include <sensor_msgs/Image.h>\n#include <sensor_msgs/image_encodings.h>\n#include <sensor_msgs/PointCloud.h>\n#include <sensor_msgs/Imu.h>\n#include <std_msgs/Bool.h>\n#include <cv_bridge/cv_bridge.h>\n#include <message_filters/subscriber.h>\n#include <message_filters/time_synchronizer.h>\n\n#include \"feature_tracker.h\"\n\n#define SHOW_UNDISTORTION 0\n\nvector<uchar> r_status;\nvector<float> r_err;\nqueue<sensor_msgs::ImageConstPtr> img_buf;\n\nros::Publisher pub_img, pub_match;\nros::Publisher pub_restart;\n\nFeatureTracker trackerData[NUM_OF_CAM];\ndouble first_image_time;\nint pub_count = 1;\nbool first_image_flag = true;\ndouble last_image_time = 0;\nbool init_pub = false;\n\nvoid callback(const sensor_msgs::ImageConstPtr &img_msg, const sensor_msgs::ImageConstPtr &mask_msg)\n{\n    if(first_image_flag)\n    {\n        first_image_flag = false;\n        first_image_time = img_msg->header.stamp.toSec();\n        last_image_time = img_msg->header.stamp.toSec();\n        return;\n    }\n    // detect unstable camera stream\n    if (img_msg->header.stamp.toSec() - last_image_time > 1.0 || img_msg->header.stamp.toSec() < last_image_time)\n    {\n        ROS_WARN(\"image discontinuity! 
reset the feature tracker!\");\n        first_image_flag = true; \n        last_image_time = 0;\n        pub_count = 1;\n        std_msgs::Bool restart_flag;\n        restart_flag.data = true;\n        pub_restart.publish(restart_flag);\n        return;\n    }\n    last_image_time = img_msg->header.stamp.toSec();\n    // frequency control\n    if (round(1.0 * pub_count / (img_msg->header.stamp.toSec() - first_image_time)) <= FREQ)\n    {\n        PUB_THIS_FRAME = true;\n        // reset the frequency control\n        if (abs(1.0 * pub_count / (img_msg->header.stamp.toSec() - first_image_time) - FREQ) < 0.01 * FREQ)\n        {\n            first_image_time = img_msg->header.stamp.toSec();\n            pub_count = 0;\n        }\n    }\n    else\n        PUB_THIS_FRAME = false;\n\n    cv_bridge::CvImageConstPtr ptr, mask_ptr;\n    if (img_msg->encoding == \"8UC1\")\n    {\n        sensor_msgs::Image img;\n        img.header = img_msg->header;\n        img.height = img_msg->height;\n        img.width = img_msg->width;\n        img.is_bigendian = img_msg->is_bigendian;\n        img.step = img_msg->step;\n        img.data = img_msg->data;\n        img.encoding = \"mono8\";\n        ptr = cv_bridge::toCvCopy(img, sensor_msgs::image_encodings::MONO8);\n    }\n    else\n        ptr = cv_bridge::toCvCopy(img_msg, sensor_msgs::image_encodings::MONO8);\n\n    mask_ptr = cv_bridge::toCvCopy(mask_msg, sensor_msgs::image_encodings::MONO8);\n    cv::Mat show_img = ptr->image;\n\n    TicToc t_r;\n    for (int i = 0; i < NUM_OF_CAM; i++)\n    {\n        ROS_DEBUG(\"processing camera %d\", i);\n        if (i != 1 || !STEREO_TRACK)\n            trackerData[i].readImage(ptr->image.rowRange(ROW * i, ROW * (i + 1)), mask_ptr->image.rowRange(ROW * i,ROW * (i + 1)), img_msg->header.stamp.toSec());\n        else\n        {\n            if (EQUALIZE)\n            {\n                cv::Ptr<cv::CLAHE> clahe = cv::createCLAHE();\n                clahe->apply(ptr->image.rowRange(ROW * i, ROW * 
(i + 1)), trackerData[i].cur_img);\n            }\n            else\n                trackerData[i].cur_img = ptr->image.rowRange(ROW * i, ROW * (i + 1));\n        }\n\n#if SHOW_UNDISTORTION\n        trackerData[i].showUndistortion(\"undistortion_\" + std::to_string(i));\n#endif\n    }\n\n    for (unsigned int i = 0;; i++)\n    {\n        bool completed = false;\n        for (int j = 0; j < NUM_OF_CAM; j++)\n            if (j != 1 || !STEREO_TRACK)\n                completed |= trackerData[j].updateID(i);\n        if (!completed)\n            break;\n    }\n\n    if (PUB_THIS_FRAME)\n    {\n        pub_count++;\n        sensor_msgs::PointCloudPtr feature_points(new sensor_msgs::PointCloud);\n        sensor_msgs::ChannelFloat32 id_of_point;\n        sensor_msgs::ChannelFloat32 plane_id_of_point;\n        sensor_msgs::ChannelFloat32 u_of_point;\n        sensor_msgs::ChannelFloat32 v_of_point;\n        sensor_msgs::ChannelFloat32 velocity_x_of_point;\n        sensor_msgs::ChannelFloat32 velocity_y_of_point;\n\n        feature_points->header = img_msg->header;\n        feature_points->header.frame_id = \"world\";\n\n        vector<set<int>> hash_ids(NUM_OF_CAM);\n        for (int i = 0; i < NUM_OF_CAM; i++)\n        {\n            auto &un_pts = trackerData[i].cur_un_pts;\n            auto &cur_pts = trackerData[i].cur_pts;\n            auto &ids = trackerData[i].ids;\n            auto &plane_ids = trackerData[i].plane_ids;\n            auto &pts_velocity = trackerData[i].pts_velocity;\n            for (unsigned int j = 0; j < ids.size(); j++)\n            {\n                if (trackerData[i].track_cnt[j] > 1)\n                {\n                    int p_id = ids[j];\n                    hash_ids[i].insert(p_id);\n                    geometry_msgs::Point32 p;\n                    p.x = un_pts[j].x;\n                    p.y = un_pts[j].y;\n                    p.z = 1;\n\n                    feature_points->points.push_back(p);\n                    
id_of_point.values.push_back(p_id * NUM_OF_CAM + i);\n                    plane_id_of_point.values.push_back(plane_ids[j]);\n                    u_of_point.values.push_back(cur_pts[j].x);\n                    v_of_point.values.push_back(cur_pts[j].y);\n                    velocity_x_of_point.values.push_back(pts_velocity[j].x);\n                    velocity_y_of_point.values.push_back(pts_velocity[j].y);\n                }\n            }\n        }\n        feature_points->channels.push_back(id_of_point);\n        feature_points->channels.push_back(plane_id_of_point);\n        feature_points->channels.push_back(u_of_point);\n        feature_points->channels.push_back(v_of_point);\n        feature_points->channels.push_back(velocity_x_of_point);\n        feature_points->channels.push_back(velocity_y_of_point);\n        ROS_DEBUG(\"publish %f, at %f\", feature_points->header.stamp.toSec(), ros::Time::now().toSec());\n        // skip publishing the first image, since no optical flow velocity is available for it\n        if (!init_pub)\n        {\n            init_pub = true;\n        }\n        else\n            pub_img.publish(feature_points);\n\n        if (SHOW_TRACK)\n        {\n            ptr = cv_bridge::cvtColor(ptr, sensor_msgs::image_encodings::BGR8);\n            //cv::Mat stereo_img(ROW * NUM_OF_CAM, COL, CV_8UC3);\n            cv::Mat stereo_img = ptr->image;\n            cv::Mat mask_img = mask_ptr->image;\n            std::vector<int> pids = trackerData[0].plane_ids;\n\n            std::set<int> unique_pids(pids.begin(), pids.end());\n            std::map<int, int> plane_counts;\n            for(auto &pid : unique_pids)\n                plane_counts[pid] = 0;\n\n            for(auto &pid : pids)\n                plane_counts[pid]++;\n\n            int largest_count = 0, largest_pid = 0;\n            for(auto &plane_count : plane_counts)\n            {\n                if(plane_count.second > largest_count)\n                {\n                    largest_count = 
plane_count.second;\n                    largest_pid = plane_count.first;\n                }\n            }\n    \n            cv::Mat mask = mask_img == largest_pid;\n            cv::Mat mask_viz(ROW, COL, CV_8UC3, cv::Scalar(0,0,0));\n            mask_viz.setTo(cv::Scalar(0,255,0), mask);\n\n            for (int i = 0; i < NUM_OF_CAM; i++)\n            {\n                cv::Mat tmp_img = stereo_img.rowRange(i * ROW, (i + 1) * ROW);\n                cv::cvtColor(show_img, tmp_img, CV_GRAY2RGB);\n                cv::addWeighted(tmp_img, 1.0, mask_viz, 0.3, 0.0, tmp_img);\n\n                for (unsigned int j = 0; j < trackerData[i].cur_pts.size(); j++)\n                {\n                    double len = std::min(1.0, 1.0 * trackerData[i].track_cnt[j] / WINDOW_SIZE);\n                    cv::circle(tmp_img, trackerData[i].cur_pts[j], 2, cv::Scalar(255 * (1 - len), 0, 255 * len), 2);\n                    //draw speed line\n                    /*\n                    Vector2d tmp_cur_un_pts (trackerData[i].cur_un_pts[j].x, trackerData[i].cur_un_pts[j].y);\n                    Vector2d tmp_pts_velocity (trackerData[i].pts_velocity[j].x, trackerData[i].pts_velocity[j].y);\n                    Vector3d tmp_prev_un_pts;\n                    tmp_prev_un_pts.head(2) = tmp_cur_un_pts - 0.10 * tmp_pts_velocity;\n                    tmp_prev_un_pts.z() = 1;\n                    Vector2d tmp_prev_uv;\n                    trackerData[i].m_camera->spaceToPlane(tmp_prev_un_pts, tmp_prev_uv);\n                    cv::line(tmp_img, trackerData[i].cur_pts[j], cv::Point2f(tmp_prev_uv.x(), tmp_prev_uv.y()), cv::Scalar(255 , 0, 0), 1 , 8, 0);\n                    */\n                    //char name[10];\n                    //sprintf(name, \"%d\", trackerData[i].plane_ids[j]);\n                    //cv::putText(tmp_img, name, trackerData[i].cur_pts[j], cv::FONT_HERSHEY_SIMPLEX, 0.5, cv::Scalar(0, 0, 0));\n                }\n            }\n            //cv::imshow(\"vis\", 
stereo_img);\n            //cv::waitKey(5);\n            pub_match.publish(ptr->toImageMsg());\n        }\n    }\n    ROS_INFO(\"whole feature tracker processing costs: %f ms\", t_r.toc());\n\n}\n\nint main(int argc, char **argv)\n{\n    ros::init(argc, argv, \"rpvio_feature_tracker\");\n    ros::NodeHandle n(\"~\");\n    ros::console::set_logger_level(ROSCONSOLE_DEFAULT_NAME, ros::console::levels::Info);\n    readParameters(n);\n\n    for (int i = 0; i < NUM_OF_CAM; i++)\n        trackerData[i].readIntrinsicParameter(CAM_NAMES[i]);\n\n    if(FISHEYE)\n    {\n        for (int i = 0; i < NUM_OF_CAM; i++)\n        {\n            trackerData[i].fisheye_mask = cv::imread(FISHEYE_MASK, 0);\n            if(!trackerData[i].fisheye_mask.data)\n            {\n                ROS_ERROR(\"failed to load fisheye mask\");\n                ROS_BREAK();\n            }\n            else\n                ROS_INFO(\"loaded fisheye mask\");\n        }\n    }\n\n    //ros::Subscriber sub_img = n.subscribe(IMAGE_TOPIC, 100, img_callback);\n    message_filters::Subscriber<sensor_msgs::Image> image_sub(n, IMAGE_TOPIC, 100);\n    message_filters::Subscriber<sensor_msgs::Image> mask_image_sub(n, MASK_TOPIC, 100);\n\n    message_filters::TimeSynchronizer<sensor_msgs::Image, sensor_msgs::Image> sync(image_sub, mask_image_sub, 10);\n    sync.registerCallback(boost::bind(&callback, _1, _2));\n\n    pub_img = n.advertise<sensor_msgs::PointCloud>(\"feature\", 1000);\n    pub_match = n.advertise<sensor_msgs::Image>(\"feature_img\", 1000);\n    pub_restart = n.advertise<std_msgs::Bool>(\"restart\", 1000);\n    /*\n    if (SHOW_TRACK)\n        cv::namedWindow(\"vis\", cv::WINDOW_NORMAL);\n    */\n    ros::spin();\n    return 0;\n}\n\n"
  },
  {
    "path": "rpvio_feature_tracker/src/parameters.cpp",
    "content": "#include \"parameters.h\"\n\nstd::string IMAGE_TOPIC;\nstd::string MASK_TOPIC;\nstd::string IMU_TOPIC;\nstd::vector<std::string> CAM_NAMES;\nstd::string FISHEYE_MASK;\nint MAX_CNT;\nint MIN_DIST;\nint WINDOW_SIZE;\nint FREQ;\ndouble H_THRESHOLD;\nint SHOW_TRACK;\nint STEREO_TRACK;\nint EQUALIZE;\nint ROW;\nint COL;\nint FOCAL_LENGTH;\nint FISHEYE;\nbool PUB_THIS_FRAME;\n\ntemplate <typename T>\nT readParam(ros::NodeHandle &n, std::string name)\n{\n    T ans;\n    if (n.getParam(name, ans))\n    {\n        ROS_INFO_STREAM(\"Loaded \" << name << \": \" << ans);\n    }\n    else\n    {\n        ROS_ERROR_STREAM(\"Failed to load \" << name);\n        n.shutdown();\n    }\n    return ans;\n}\n\nvoid readParameters(ros::NodeHandle &n)\n{\n    std::string config_file;\n    config_file = readParam<std::string>(n, \"config_file\");\n    cv::FileStorage fsSettings(config_file, cv::FileStorage::READ);\n    if(!fsSettings.isOpened())\n    {\n        std::cerr << \"ERROR: Wrong path to settings\" << std::endl;\n    }\n    std::string RPVIO_FOLDER_PATH = readParam<std::string>(n, \"rpvio_folder\");\n\n    fsSettings[\"image_topic\"] >> IMAGE_TOPIC;\n    fsSettings[\"mask_topic\"] >> MASK_TOPIC;\n    fsSettings[\"imu_topic\"] >> IMU_TOPIC;\n    MAX_CNT = fsSettings[\"max_cnt\"];\n    MIN_DIST = fsSettings[\"min_dist\"];\n    ROW = fsSettings[\"image_height\"];\n    COL = fsSettings[\"image_width\"];\n    FREQ = fsSettings[\"freq\"];\n    H_THRESHOLD = fsSettings[\"H_threshold\"];\n    SHOW_TRACK = fsSettings[\"show_track\"];\n    EQUALIZE = fsSettings[\"equalize\"];\n    FISHEYE = fsSettings[\"fisheye\"];\n    if (FISHEYE == 1)\n        FISHEYE_MASK = RPVIO_FOLDER_PATH + \"config/fisheye_mask.jpg\";\n    CAM_NAMES.push_back(config_file);\n\n    WINDOW_SIZE = 20;\n    STEREO_TRACK = false;\n    FOCAL_LENGTH = 460;\n    PUB_THIS_FRAME = false;\n\n    if (FREQ == 0)\n        FREQ = 100;\n\n    fsSettings.release();\n\n\n}\n"
  },
  {
    "path": "rpvio_feature_tracker/src/parameters.h",
    "content": "#pragma once\n#include <ros/ros.h>\n#include <opencv2/highgui/highgui.hpp>\n\nextern int ROW;\nextern int COL;\nextern int FOCAL_LENGTH;\nconst int NUM_OF_CAM = 1;\n\n\nextern std::string IMAGE_TOPIC;\nextern std::string MASK_TOPIC;\nextern std::string IMU_TOPIC;\nextern std::string FISHEYE_MASK;\nextern std::vector<std::string> CAM_NAMES;\nextern int MAX_CNT;\nextern int MIN_DIST;\nextern int WINDOW_SIZE;\nextern int FREQ;\nextern double H_THRESHOLD;\nextern int SHOW_TRACK;\nextern int STEREO_TRACK;\nextern int EQUALIZE;\nextern int FISHEYE;\nextern bool PUB_THIS_FRAME;\n\nvoid readParameters(ros::NodeHandle &n);\n"
  },
  {
    "path": "rpvio_feature_tracker/src/tic_toc.h",
    "content": "#pragma once\n\n#include <ctime>\n#include <cstdlib>\n#include <chrono>\n\nclass TicToc\n{\n  public:\n    TicToc()\n    {\n        tic();\n    }\n\n    void tic()\n    {\n        start = std::chrono::system_clock::now();\n    }\n\n    double toc()\n    {\n        end = std::chrono::system_clock::now();\n        std::chrono::duration<double> elapsed_seconds = end - start;\n        return elapsed_seconds.count() * 1000;\n    }\n\n  private:\n    std::chrono::time_point<std::chrono::system_clock> start, end;\n};\n"
  },
  {
    "path": "scripts/convert_vins_to_tum.py",
    "content": "import pandas as pd\nimport argparse\n\ndef main():\n    parser = argparse.ArgumentParser(description='Converts the csv output of VINS-Mono to TUM RGB-D format txt')\n    parser.add_argument('input_file_path', help='Path to the csv file')\n    parser.add_argument('output_file_path', help='Path to the output txt file')\n    args = parser.parse_args()\n\n    data = pd.read_csv(args.input_file_path, usecols=range(8), names=list('txyzwijk'))\n\n    with open(args.output_file_path,'w') as f:\n        for i in range(len(data['t'])):\n            f.write(f\"{data['t'][i]*1e-9} {data['x'][i]} {data['y'][i]} {data['z'][i]} {data['i'][i]} {data['j'][i]} {data['k'][i]} {data['w'][i]}\\n\")\n\nif __name__ == '__main__':\n    main()\n"
  },
  {
    "path": "scripts/run_advio_12.sh",
    "content": "#! /bin/bash\n\ndataset=$1\n\nrm -r $dataset/results/advio\nmkdir -p $dataset/results/advio\ncd $dataset/results/advio\nsed -i \"s@~@$HOME@g\" ~/catkin_ws/src/rp-vio/config/advio_12_config.yaml\n\necho -e \"######advio-12######\\n\" > report.txt\n\nfor((i = 1; i <=1; i++))\ndo\n    roslaunch rpvio_estimator advio_12.launch bagfile_path:=$dataset/12-mask.bag |& tee -a run_log.txt\n    cp ~/output/rpvio_result_no_loop.csv ./rpvio_est.csv\n    python ~/catkin_ws/src/rp-vio/scripts/convert_vins_to_tum.py rpvio_est.csv est_traj_$i.txt\n    rm rpvio_est.csv\n    evo_ape tum $dataset/groundtruth.txt est_traj_$i.txt --align --save_plot est_traj_$i.pdf |& tee -a report.txt\ndone\n"
  },
  {
    "path": "scripts/run_ol_market1.sh",
    "content": "#! /bin/bash\n\ndataset=$1\n\nrm -r $dataset/results/ol\nmkdir -p $dataset/results/ol\ncd $dataset/results/ol\nsed -i \"s@~@$HOME@g\" ~/catkin_ws/src/rp-vio/config/ol_market1_config.yaml\n\necho -e \"######ol-market1######\\n\" > report.txt\n\nfor((i = 1; i <=1; i++))\ndo\n    roslaunch rpvio_estimator ol_market1.launch bagfile_path:=$dataset/market1-mask.bag\n    cp ~/output/rpvio_result_no_loop.csv ./rpvio_est.csv\n    python ~/catkin_ws/src/rp-vio/scripts/convert_vins_to_tum.py rpvio_est.csv est_traj_$i.txt\n    rm rpvio_est.csv\n    evo_ape tum $dataset/groundtruth.txt est_traj_$i.txt --align --save_plot est_traj_$i.pdf |& tee -a report.txt\ndone\n"
  },
  {
    "path": "scripts/run_rpvio_sim.sh",
    "content": "dataset=$1\n\nrm -r $dataset/results/rpvio-sim/\nmkdir -p $dataset/results/rpvio-sim/\ncd $dataset/results/rpvio-sim/\nsed -i \"s@~@$HOME@g\" ~/catkin_ws/src/rp-vio/config/rpvio_sim_config.yaml\n\n########## static run #############\n\nmkdir static\necho -e \"######static######\\n\" > report.txt\n\nfor((i = 1; i <=1; i++))\ndo\n    roslaunch rpvio_estimator rpvio_sim.launch bagfile_path:=$dataset/static/static.bag\n    cp ~/output/rpvio_result_no_loop.csv ./rpvio_est.csv\n    python ~/catkin_ws/src/rp-vio/scripts/convert_vins_to_tum.py rpvio_est.csv static/est_traj_$i.txt\n    rm rpvio_est.csv\n    evo_ape tum $dataset/static/groundtruth.txt static/est_traj_$i.txt --align --save_plot static/est_traj_$i.pdf |& tee -a report.txt\ndone\n\n########## c1 run #############\n\nmkdir c1\necho -e \"######c1######\\n\" > report.txt\n\nfor((i = 1; i <=1; i++))\ndo\n    roslaunch rpvio_estimator rpvio_sim.launch bagfile_path:=$dataset/c1/c1.bag\n    cp ~/output/rpvio_result_no_loop.csv ./rpvio_est.csv\n    python ~/catkin_ws/src/rp-vio/scripts/convert_vins_to_tum.py rpvio_est.csv c1/est_traj_$i.txt\n    rm rpvio_est.csv\n    evo_ape tum $dataset/c1/groundtruth.txt c1/est_traj_$i.txt --align --save_plot c1/est_traj_$i.pdf |& tee -a report.txt\ndone\n\n########### c2 run #############\n\nmkdir c2\necho -e \"######c2######\\n\" > report.txt\n\nfor((i = 1; i <=1; i++))\ndo\n    roslaunch rpvio_estimator rpvio_sim.launch bagfile_path:=$dataset/c2/c2.bag\n    cp ~/output/rpvio_result_no_loop.csv ./rpvio_est.csv\n    python ~/catkin_ws/src/rp-vio/scripts/convert_vins_to_tum.py rpvio_est.csv c2/est_traj_$i.txt\n    rm rpvio_est.csv\n    evo_ape tum $dataset/c2/groundtruth.txt c2/est_traj_$i.txt --align --save_plot c2/est_traj_$i.pdf |& tee -a report.txt\ndone\n\n########### c4 run #############\n\nmkdir c4\necho -e \"######c4######\\n\" > report.txt\n\nfor((i = 1; i <=1; i++))\ndo\n    roslaunch rpvio_estimator rpvio_sim.launch bagfile_path:=$dataset/c4/c4.bag\n    cp 
~/output/rpvio_result_no_loop.csv ./rpvio_est.csv\n    python ~/catkin_ws/src/rp-vio/scripts/convert_vins_to_tum.py rpvio_est.csv c4/est_traj_$i.txt\n    rm rpvio_est.csv\n    evo_ape tum $dataset/c4/groundtruth.txt c4/est_traj_$i.txt --align --save_plot c4/est_traj_$i.pdf |& tee -a report.txt\ndone\n\n############ c6 run #############\n\nmkdir c6\necho -e \"######c6######\\n\" > report.txt\n\nfor((i = 1; i <=1; i++))\ndo\n    roslaunch rpvio_estimator rpvio_sim.launch bagfile_path:=$dataset/c6/c6.bag\n    cp ~/output/rpvio_result_no_loop.csv ./rpvio_est.csv\n    python ~/catkin_ws/src/rp-vio/scripts/convert_vins_to_tum.py rpvio_est.csv c6/est_traj_$i.txt\n    rm rpvio_est.csv\n    evo_ape tum $dataset/c6/groundtruth.txt c6/est_traj_$i.txt --align --save_plot c6/est_traj_$i.pdf |& tee -a report.txt\ndone\n\n############ c8 run #############\n\nmkdir c8\necho -e \"######c8######\\n\" > report.txt\n\nfor((i = 1; i <=1; i++))\ndo\n    roslaunch rpvio_estimator rpvio_sim.launch bagfile_path:=$dataset/c8/c8.bag\n    cp ~/output/rpvio_result_no_loop.csv ./rpvio_est.csv\n    python ~/catkin_ws/src/rp-vio/scripts/convert_vins_to_tum.py rpvio_est.csv c8/est_traj_$i.txt\n    rm rpvio_est.csv\n    evo_ape tum $dataset/c8/groundtruth.txt c8/est_traj_$i.txt --align --save_plot c8/est_traj_$i.pdf |& tee -a report.txt\ndone\n"
  }
]