[
  {
    "path": ".gitignore",
    "content": ".VSCodeCounter/\nbeacon_state.txt\nbuild/\nlib/\n.vscode/\nscratch\nlog/\n__pycache__\n*.so\nmarvelmind_SW*\n*previous_plan_state.json\nvenv/\n"
  },
  {
    "path": "LICENSE",
    "content": "                    GNU GENERAL PUBLIC LICENSE\n                       Version 3, 29 June 2007\n\n Copyright (C) 2007 Free Software Foundation, Inc. <https://fsf.org/>\n Everyone is permitted to copy and distribute verbatim copies\n of this license document, but changing it is not allowed.\n\n                            Preamble\n\n  The GNU General Public License is a free, copyleft license for\nsoftware and other kinds of works.\n\n  The licenses for most software and other practical works are designed\nto take away your freedom to share and change the works.  By contrast,\nthe GNU General Public License is intended to guarantee your freedom to\nshare and change all versions of a program--to make sure it remains free\nsoftware for all its users.  We, the Free Software Foundation, use the\nGNU General Public License for most of our software; it applies also to\nany other work released this way by its authors.  You can apply it to\nyour programs, too.\n\n  When we speak of free software, we are referring to freedom, not\nprice.  Our General Public Licenses are designed to make sure that you\nhave the freedom to distribute copies of free software (and charge for\nthem if you wish), that you receive source code or can get it if you\nwant it, that you can change the software or use pieces of it in new\nfree programs, and that you know you can do these things.\n\n  To protect your rights, we need to prevent others from denying you\nthese rights or asking you to surrender the rights.  Therefore, you have\ncertain responsibilities if you distribute copies of the software, or if\nyou modify it: responsibilities to respect the freedom of others.\n\n  For example, if you distribute copies of such a program, whether\ngratis or for a fee, you must pass on to the recipients the same\nfreedoms that you received.  You must make sure that they, too, receive\nor can get the source code.  
And you must show them these terms so they\nknow their rights.\n\n  Developers that use the GNU GPL protect your rights with two steps:\n(1) assert copyright on the software, and (2) offer you this License\ngiving you legal permission to copy, distribute and/or modify it.\n\n  For the developers' and authors' protection, the GPL clearly explains\nthat there is no warranty for this free software.  For both users' and\nauthors' sake, the GPL requires that modified versions be marked as\nchanged, so that their problems will not be attributed erroneously to\nauthors of previous versions.\n\n  Some devices are designed to deny users access to install or run\nmodified versions of the software inside them, although the manufacturer\ncan do so.  This is fundamentally incompatible with the aim of\nprotecting users' freedom to change the software.  The systematic\npattern of such abuse occurs in the area of products for individuals to\nuse, which is precisely where it is most unacceptable.  Therefore, we\nhave designed this version of the GPL to prohibit the practice for those\nproducts.  If such problems arise substantially in other domains, we\nstand ready to extend this provision to those domains in future versions\nof the GPL, as needed to protect the freedom of users.\n\n  Finally, every program is threatened constantly by software patents.\nStates should not allow patents to restrict development and use of\nsoftware on general-purpose computers, but in those that do, we wish to\navoid the special danger that patents applied to a free program could\nmake it effectively proprietary.  To prevent this, the GPL assures that\npatents cannot be used to render the program non-free.\n\n  The precise terms and conditions for copying, distribution and\nmodification follow.\n\n                       TERMS AND CONDITIONS\n\n  0. 
Definitions.\n\n  \"This License\" refers to version 3 of the GNU General Public License.\n\n  \"Copyright\" also means copyright-like laws that apply to other kinds of\nworks, such as semiconductor masks.\n\n  \"The Program\" refers to any copyrightable work licensed under this\nLicense.  Each licensee is addressed as \"you\".  \"Licensees\" and\n\"recipients\" may be individuals or organizations.\n\n  To \"modify\" a work means to copy from or adapt all or part of the work\nin a fashion requiring copyright permission, other than the making of an\nexact copy.  The resulting work is called a \"modified version\" of the\nearlier work or a work \"based on\" the earlier work.\n\n  A \"covered work\" means either the unmodified Program or a work based\non the Program.\n\n  To \"propagate\" a work means to do anything with it that, without\npermission, would make you directly or secondarily liable for\ninfringement under applicable copyright law, except executing it on a\ncomputer or modifying a private copy.  Propagation includes copying,\ndistribution (with or without modification), making available to the\npublic, and in some countries other activities as well.\n\n  To \"convey\" a work means any kind of propagation that enables other\nparties to make or receive copies.  Mere interaction with a user through\na computer network, with no transfer of a copy, is not conveying.\n\n  An interactive user interface displays \"Appropriate Legal Notices\"\nto the extent that it includes a convenient and prominently visible\nfeature that (1) displays an appropriate copyright notice, and (2)\ntells the user that there is no warranty for the work (except to the\nextent that warranties are provided), that licensees may convey the\nwork under this License, and how to view a copy of this License.  If\nthe interface presents a list of user commands or options, such as a\nmenu, a prominent item in the list meets this criterion.\n\n  1. 
Source Code.\n\n  The \"source code\" for a work means the preferred form of the work\nfor making modifications to it.  \"Object code\" means any non-source\nform of a work.\n\n  A \"Standard Interface\" means an interface that either is an official\nstandard defined by a recognized standards body, or, in the case of\ninterfaces specified for a particular programming language, one that\nis widely used among developers working in that language.\n\n  The \"System Libraries\" of an executable work include anything, other\nthan the work as a whole, that (a) is included in the normal form of\npackaging a Major Component, but which is not part of that Major\nComponent, and (b) serves only to enable use of the work with that\nMajor Component, or to implement a Standard Interface for which an\nimplementation is available to the public in source code form.  A\n\"Major Component\", in this context, means a major essential component\n(kernel, window system, and so on) of the specific operating system\n(if any) on which the executable work runs, or a compiler used to\nproduce the work, or an object code interpreter used to run it.\n\n  The \"Corresponding Source\" for a work in object code form means all\nthe source code needed to generate, install, and (for an executable\nwork) run the object code and to modify the work, including scripts to\ncontrol those activities.  However, it does not include the work's\nSystem Libraries, or general-purpose tools or generally available free\nprograms which are used unmodified in performing those activities but\nwhich are not part of the work.  
For example, Corresponding Source\nincludes interface definition files associated with source files for\nthe work, and the source code for shared libraries and dynamically\nlinked subprograms that the work is specifically designed to require,\nsuch as by intimate data communication or control flow between those\nsubprograms and other parts of the work.\n\n  The Corresponding Source need not include anything that users\ncan regenerate automatically from other parts of the Corresponding\nSource.\n\n  The Corresponding Source for a work in source code form is that\nsame work.\n\n  2. Basic Permissions.\n\n  All rights granted under this License are granted for the term of\ncopyright on the Program, and are irrevocable provided the stated\nconditions are met.  This License explicitly affirms your unlimited\npermission to run the unmodified Program.  The output from running a\ncovered work is covered by this License only if the output, given its\ncontent, constitutes a covered work.  This License acknowledges your\nrights of fair use or other equivalent, as provided by copyright law.\n\n  You may make, run and propagate covered works that you do not\nconvey, without conditions so long as your license otherwise remains\nin force.  You may convey covered works to others for the sole purpose\nof having them make modifications exclusively for you, or provide you\nwith facilities for running those works, provided that you comply with\nthe terms of this License in conveying all material for which you do\nnot control copyright.  Those thus making or running the covered works\nfor you must do so exclusively on your behalf, under your direction\nand control, on terms that prohibit them from making any copies of\nyour copyrighted material outside their relationship with you.\n\n  Conveying under any other circumstances is permitted solely under\nthe conditions stated below.  Sublicensing is not allowed; section 10\nmakes it unnecessary.\n\n  3. 
Protecting Users' Legal Rights From Anti-Circumvention Law.\n\n  No covered work shall be deemed part of an effective technological\nmeasure under any applicable law fulfilling obligations under article\n11 of the WIPO copyright treaty adopted on 20 December 1996, or\nsimilar laws prohibiting or restricting circumvention of such\nmeasures.\n\n  When you convey a covered work, you waive any legal power to forbid\ncircumvention of technological measures to the extent such circumvention\nis effected by exercising rights under this License with respect to\nthe covered work, and you disclaim any intention to limit operation or\nmodification of the work as a means of enforcing, against the work's\nusers, your or third parties' legal rights to forbid circumvention of\ntechnological measures.\n\n  4. Conveying Verbatim Copies.\n\n  You may convey verbatim copies of the Program's source code as you\nreceive it, in any medium, provided that you conspicuously and\nappropriately publish on each copy an appropriate copyright notice;\nkeep intact all notices stating that this License and any\nnon-permissive terms added in accord with section 7 apply to the code;\nkeep intact all notices of the absence of any warranty; and give all\nrecipients a copy of this License along with the Program.\n\n  You may charge any price or no price for each copy that you convey,\nand you may offer support or warranty protection for a fee.\n\n  5. Conveying Modified Source Versions.\n\n  You may convey a work based on the Program, or the modifications to\nproduce it from the Program, in the form of source code under the\nterms of section 4, provided that you also meet all of these conditions:\n\n    a) The work must carry prominent notices stating that you modified\n    it, and giving a relevant date.\n\n    b) The work must carry prominent notices stating that it is\n    released under this License and any conditions added under section\n    7.  
This requirement modifies the requirement in section 4 to\n    \"keep intact all notices\".\n\n    c) You must license the entire work, as a whole, under this\n    License to anyone who comes into possession of a copy.  This\n    License will therefore apply, along with any applicable section 7\n    additional terms, to the whole of the work, and all its parts,\n    regardless of how they are packaged.  This License gives no\n    permission to license the work in any other way, but it does not\n    invalidate such permission if you have separately received it.\n\n    d) If the work has interactive user interfaces, each must display\n    Appropriate Legal Notices; however, if the Program has interactive\n    interfaces that do not display Appropriate Legal Notices, your\n    work need not make them do so.\n\n  A compilation of a covered work with other separate and independent\nworks, which are not by their nature extensions of the covered work,\nand which are not combined with it such as to form a larger program,\nin or on a volume of a storage or distribution medium, is called an\n\"aggregate\" if the compilation and its resulting copyright are not\nused to limit the access or legal rights of the compilation's users\nbeyond what the individual works permit.  Inclusion of a covered work\nin an aggregate does not cause this License to apply to the other\nparts of the aggregate.\n\n  6. 
Conveying Non-Source Forms.\n\n  You may convey a covered work in object code form under the terms\nof sections 4 and 5, provided that you also convey the\nmachine-readable Corresponding Source under the terms of this License,\nin one of these ways:\n\n    a) Convey the object code in, or embodied in, a physical product\n    (including a physical distribution medium), accompanied by the\n    Corresponding Source fixed on a durable physical medium\n    customarily used for software interchange.\n\n    b) Convey the object code in, or embodied in, a physical product\n    (including a physical distribution medium), accompanied by a\n    written offer, valid for at least three years and valid for as\n    long as you offer spare parts or customer support for that product\n    model, to give anyone who possesses the object code either (1) a\n    copy of the Corresponding Source for all the software in the\n    product that is covered by this License, on a durable physical\n    medium customarily used for software interchange, for a price no\n    more than your reasonable cost of physically performing this\n    conveying of source, or (2) access to copy the\n    Corresponding Source from a network server at no charge.\n\n    c) Convey individual copies of the object code with a copy of the\n    written offer to provide the Corresponding Source.  This\n    alternative is allowed only occasionally and noncommercially, and\n    only if you received the object code with such an offer, in accord\n    with subsection 6b.\n\n    d) Convey the object code by offering access from a designated\n    place (gratis or for a charge), and offer equivalent access to the\n    Corresponding Source in the same way through the same place at no\n    further charge.  You need not require recipients to copy the\n    Corresponding Source along with the object code.  
If the place to\n    copy the object code is a network server, the Corresponding Source\n    may be on a different server (operated by you or a third party)\n    that supports equivalent copying facilities, provided you maintain\n    clear directions next to the object code saying where to find the\n    Corresponding Source.  Regardless of what server hosts the\n    Corresponding Source, you remain obligated to ensure that it is\n    available for as long as needed to satisfy these requirements.\n\n    e) Convey the object code using peer-to-peer transmission, provided\n    you inform other peers where the object code and Corresponding\n    Source of the work are being offered to the general public at no\n    charge under subsection 6d.\n\n  A separable portion of the object code, whose source code is excluded\nfrom the Corresponding Source as a System Library, need not be\nincluded in conveying the object code work.\n\n  A \"User Product\" is either (1) a \"consumer product\", which means any\ntangible personal property which is normally used for personal, family,\nor household purposes, or (2) anything designed or sold for incorporation\ninto a dwelling.  In determining whether a product is a consumer product,\ndoubtful cases shall be resolved in favor of coverage.  For a particular\nproduct received by a particular user, \"normally used\" refers to a\ntypical or common use of that class of product, regardless of the status\nof the particular user or of the way in which the particular user\nactually uses, or expects or is expected to use, the product.  
A product\nis a consumer product regardless of whether the product has substantial\ncommercial, industrial or non-consumer uses, unless such uses represent\nthe only significant mode of use of the product.\n\n  \"Installation Information\" for a User Product means any methods,\nprocedures, authorization keys, or other information required to install\nand execute modified versions of a covered work in that User Product from\na modified version of its Corresponding Source.  The information must\nsuffice to ensure that the continued functioning of the modified object\ncode is in no case prevented or interfered with solely because\nmodification has been made.\n\n  If you convey an object code work under this section in, or with, or\nspecifically for use in, a User Product, and the conveying occurs as\npart of a transaction in which the right of possession and use of the\nUser Product is transferred to the recipient in perpetuity or for a\nfixed term (regardless of how the transaction is characterized), the\nCorresponding Source conveyed under this section must be accompanied\nby the Installation Information.  But this requirement does not apply\nif neither you nor any third party retains the ability to install\nmodified object code on the User Product (for example, the work has\nbeen installed in ROM).\n\n  The requirement to provide Installation Information does not include a\nrequirement to continue to provide support service, warranty, or updates\nfor a work that has been modified or installed by the recipient, or for\nthe User Product in which it has been modified or installed.  
Access to a\nnetwork may be denied when the modification itself materially and\nadversely affects the operation of the network or violates the rules and\nprotocols for communication across the network.\n\n  Corresponding Source conveyed, and Installation Information provided,\nin accord with this section must be in a format that is publicly\ndocumented (and with an implementation available to the public in\nsource code form), and must require no special password or key for\nunpacking, reading or copying.\n\n  7. Additional Terms.\n\n  \"Additional permissions\" are terms that supplement the terms of this\nLicense by making exceptions from one or more of its conditions.\nAdditional permissions that are applicable to the entire Program shall\nbe treated as though they were included in this License, to the extent\nthat they are valid under applicable law.  If additional permissions\napply only to part of the Program, that part may be used separately\nunder those permissions, but the entire Program remains governed by\nthis License without regard to the additional permissions.\n\n  When you convey a copy of a covered work, you may at your option\nremove any additional permissions from that copy, or from any part of\nit.  (Additional permissions may be written to require their own\nremoval in certain cases when you modify the work.)  
You may place\nadditional permissions on material, added by you to a covered work,\nfor which you have or can give appropriate copyright permission.\n\n  Notwithstanding any other provision of this License, for material you\nadd to a covered work, you may (if authorized by the copyright holders of\nthat material) supplement the terms of this License with terms:\n\n    a) Disclaiming warranty or limiting liability differently from the\n    terms of sections 15 and 16 of this License; or\n\n    b) Requiring preservation of specified reasonable legal notices or\n    author attributions in that material or in the Appropriate Legal\n    Notices displayed by works containing it; or\n\n    c) Prohibiting misrepresentation of the origin of that material, or\n    requiring that modified versions of such material be marked in\n    reasonable ways as different from the original version; or\n\n    d) Limiting the use for publicity purposes of names of licensors or\n    authors of the material; or\n\n    e) Declining to grant rights under trademark law for use of some\n    trade names, trademarks, or service marks; or\n\n    f) Requiring indemnification of licensors and authors of that\n    material by anyone who conveys the material (or modified versions of\n    it) with contractual assumptions of liability to the recipient, for\n    any liability that these contractual assumptions directly impose on\n    those licensors and authors.\n\n  All other non-permissive additional terms are considered \"further\nrestrictions\" within the meaning of section 10.  If the Program as you\nreceived it, or any part of it, contains a notice stating that it is\ngoverned by this License along with a term that is a further\nrestriction, you may remove that term.  
If a license document contains\na further restriction but permits relicensing or conveying under this\nLicense, you may add to a covered work material governed by the terms\nof that license document, provided that the further restriction does\nnot survive such relicensing or conveying.\n\n  If you add terms to a covered work in accord with this section, you\nmust place, in the relevant source files, a statement of the\nadditional terms that apply to those files, or a notice indicating\nwhere to find the applicable terms.\n\n  Additional terms, permissive or non-permissive, may be stated in the\nform of a separately written license, or stated as exceptions;\nthe above requirements apply either way.\n\n  8. Termination.\n\n  You may not propagate or modify a covered work except as expressly\nprovided under this License.  Any attempt otherwise to propagate or\nmodify it is void, and will automatically terminate your rights under\nthis License (including any patent licenses granted under the third\nparagraph of section 11).\n\n  However, if you cease all violation of this License, then your\nlicense from a particular copyright holder is reinstated (a)\nprovisionally, unless and until the copyright holder explicitly and\nfinally terminates your license, and (b) permanently, if the copyright\nholder fails to notify you of the violation by some reasonable means\nprior to 60 days after the cessation.\n\n  Moreover, your license from a particular copyright holder is\nreinstated permanently if the copyright holder notifies you of the\nviolation by some reasonable means, this is the first time you have\nreceived notice of violation of this License (for any work) from that\ncopyright holder, and you cure the violation prior to 30 days after\nyour receipt of the notice.\n\n  Termination of your rights under this section does not terminate the\nlicenses of parties who have received copies or rights from you under\nthis License.  
If your rights have been terminated and not permanently\nreinstated, you do not qualify to receive new licenses for the same\nmaterial under section 10.\n\n  9. Acceptance Not Required for Having Copies.\n\n  You are not required to accept this License in order to receive or\nrun a copy of the Program.  Ancillary propagation of a covered work\noccurring solely as a consequence of using peer-to-peer transmission\nto receive a copy likewise does not require acceptance.  However,\nnothing other than this License grants you permission to propagate or\nmodify any covered work.  These actions infringe copyright if you do\nnot accept this License.  Therefore, by modifying or propagating a\ncovered work, you indicate your acceptance of this License to do so.\n\n  10. Automatic Licensing of Downstream Recipients.\n\n  Each time you convey a covered work, the recipient automatically\nreceives a license from the original licensors, to run, modify and\npropagate that work, subject to this License.  You are not responsible\nfor enforcing compliance by third parties with this License.\n\n  An \"entity transaction\" is a transaction transferring control of an\norganization, or substantially all assets of one, or subdividing an\norganization, or merging organizations.  If propagation of a covered\nwork results from an entity transaction, each party to that\ntransaction who receives a copy of the work also receives whatever\nlicenses to the work the party's predecessor in interest had or could\ngive under the previous paragraph, plus a right to possession of the\nCorresponding Source of the work from the predecessor in interest, if\nthe predecessor has it or can get it with reasonable efforts.\n\n  You may not impose any further restrictions on the exercise of the\nrights granted or affirmed under this License.  
For example, you may\nnot impose a license fee, royalty, or other charge for exercise of\nrights granted under this License, and you may not initiate litigation\n(including a cross-claim or counterclaim in a lawsuit) alleging that\nany patent claim is infringed by making, using, selling, offering for\nsale, or importing the Program or any portion of it.\n\n  11. Patents.\n\n  A \"contributor\" is a copyright holder who authorizes use under this\nLicense of the Program or a work on which the Program is based.  The\nwork thus licensed is called the contributor's \"contributor version\".\n\n  A contributor's \"essential patent claims\" are all patent claims\nowned or controlled by the contributor, whether already acquired or\nhereafter acquired, that would be infringed by some manner, permitted\nby this License, of making, using, or selling its contributor version,\nbut do not include claims that would be infringed only as a\nconsequence of further modification of the contributor version.  For\npurposes of this definition, \"control\" includes the right to grant\npatent sublicenses in a manner consistent with the requirements of\nthis License.\n\n  Each contributor grants you a non-exclusive, worldwide, royalty-free\npatent license under the contributor's essential patent claims, to\nmake, use, sell, offer for sale, import and otherwise run, modify and\npropagate the contents of its contributor version.\n\n  In the following three paragraphs, a \"patent license\" is any express\nagreement or commitment, however denominated, not to enforce a patent\n(such as an express permission to practice a patent or covenant not to\nsue for patent infringement).  
To \"grant\" such a patent license to a\nparty means to make such an agreement or commitment not to enforce a\npatent against the party.\n\n  If you convey a covered work, knowingly relying on a patent license,\nand the Corresponding Source of the work is not available for anyone\nto copy, free of charge and under the terms of this License, through a\npublicly available network server or other readily accessible means,\nthen you must either (1) cause the Corresponding Source to be so\navailable, or (2) arrange to deprive yourself of the benefit of the\npatent license for this particular work, or (3) arrange, in a manner\nconsistent with the requirements of this License, to extend the patent\nlicense to downstream recipients.  \"Knowingly relying\" means you have\nactual knowledge that, but for the patent license, your conveying the\ncovered work in a country, or your recipient's use of the covered work\nin a country, would infringe one or more identifiable patents in that\ncountry that you have reason to believe are valid.\n\n  If, pursuant to or in connection with a single transaction or\narrangement, you convey, or propagate by procuring conveyance of, a\ncovered work, and grant a patent license to some of the parties\nreceiving the covered work authorizing them to use, propagate, modify\nor convey a specific copy of the covered work, then the patent license\nyou grant is automatically extended to all recipients of the covered\nwork and works based on it.\n\n  A patent license is \"discriminatory\" if it does not include within\nthe scope of its coverage, prohibits the exercise of, or is\nconditioned on the non-exercise of one or more of the rights that are\nspecifically granted under this License.  
You may not convey a covered\nwork if you are a party to an arrangement with a third party that is\nin the business of distributing software, under which you make payment\nto the third party based on the extent of your activity of conveying\nthe work, and under which the third party grants, to any of the\nparties who would receive the covered work from you, a discriminatory\npatent license (a) in connection with copies of the covered work\nconveyed by you (or copies made from those copies), or (b) primarily\nfor and in connection with specific products or compilations that\ncontain the covered work, unless you entered into that arrangement,\nor that patent license was granted, prior to 28 March 2007.\n\n  Nothing in this License shall be construed as excluding or limiting\nany implied license or other defenses to infringement that may\notherwise be available to you under applicable patent law.\n\n  12. No Surrender of Others' Freedom.\n\n  If conditions are imposed on you (whether by court order, agreement or\notherwise) that contradict the conditions of this License, they do not\nexcuse you from the conditions of this License.  If you cannot convey a\ncovered work so as to satisfy simultaneously your obligations under this\nLicense and any other pertinent obligations, then as a consequence you may\nnot convey it at all.  For example, if you agree to terms that obligate you\nto collect a royalty for further conveying from those to whom you convey\nthe Program, the only way you could satisfy both those terms and this\nLicense would be to refrain entirely from conveying the Program.\n\n  13. Use with the GNU Affero General Public License.\n\n  Notwithstanding any other provision of this License, you have\npermission to link or combine any covered work with a work licensed\nunder version 3 of the GNU Affero General Public License into a single\ncombined work, and to convey the resulting work.  
The terms of this\nLicense will continue to apply to the part which is the covered work,\nbut the special requirements of the GNU Affero General Public License,\nsection 13, concerning interaction through a network will apply to the\ncombination as such.\n\n  14. Revised Versions of this License.\n\n  The Free Software Foundation may publish revised and/or new versions of\nthe GNU General Public License from time to time.  Such new versions will\nbe similar in spirit to the present version, but may differ in detail to\naddress new problems or concerns.\n\n  Each version is given a distinguishing version number.  If the\nProgram specifies that a certain numbered version of the GNU General\nPublic License \"or any later version\" applies to it, you have the\noption of following the terms and conditions either of that numbered\nversion or of any later version published by the Free Software\nFoundation.  If the Program does not specify a version number of the\nGNU General Public License, you may choose any version ever published\nby the Free Software Foundation.\n\n  If the Program specifies that a proxy can decide which future\nversions of the GNU General Public License can be used, that proxy's\npublic statement of acceptance of a version permanently authorizes you\nto choose that version for the Program.\n\n  Later license versions may give you additional or different\npermissions.  However, no additional obligations are imposed on any\nauthor or copyright holder as a result of your choosing to follow a\nlater version.\n\n  15. Disclaimer of Warranty.\n\n  THERE IS NO WARRANTY FOR THE PROGRAM, TO THE EXTENT PERMITTED BY\nAPPLICABLE LAW.  EXCEPT WHEN OTHERWISE STATED IN WRITING THE COPYRIGHT\nHOLDERS AND/OR OTHER PARTIES PROVIDE THE PROGRAM \"AS IS\" WITHOUT WARRANTY\nOF ANY KIND, EITHER EXPRESSED OR IMPLIED, INCLUDING, BUT NOT LIMITED TO,\nTHE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR\nPURPOSE.  
THE ENTIRE RISK AS TO THE QUALITY AND PERFORMANCE OF THE PROGRAM\nIS WITH YOU.  SHOULD THE PROGRAM PROVE DEFECTIVE, YOU ASSUME THE COST OF\nALL NECESSARY SERVICING, REPAIR OR CORRECTION.\n\n  16. Limitation of Liability.\n\n  IN NO EVENT UNLESS REQUIRED BY APPLICABLE LAW OR AGREED TO IN WRITING\nWILL ANY COPYRIGHT HOLDER, OR ANY OTHER PARTY WHO MODIFIES AND/OR CONVEYS\nTHE PROGRAM AS PERMITTED ABOVE, BE LIABLE TO YOU FOR DAMAGES, INCLUDING ANY\nGENERAL, SPECIAL, INCIDENTAL OR CONSEQUENTIAL DAMAGES ARISING OUT OF THE\nUSE OR INABILITY TO USE THE PROGRAM (INCLUDING BUT NOT LIMITED TO LOSS OF\nDATA OR DATA BEING RENDERED INACCURATE OR LOSSES SUSTAINED BY YOU OR THIRD\nPARTIES OR A FAILURE OF THE PROGRAM TO OPERATE WITH ANY OTHER PROGRAMS),\nEVEN IF SUCH HOLDER OR OTHER PARTY HAS BEEN ADVISED OF THE POSSIBILITY OF\nSUCH DAMAGES.\n\n  17. Interpretation of Sections 15 and 16.\n\n  If the disclaimer of warranty and limitation of liability provided\nabove cannot be given local legal effect according to their terms,\nreviewing courts shall apply local law that most closely approximates\nan absolute waiver of all civil liability in connection with the\nProgram, unless a warranty or assumption of liability accompanies a\ncopy of the Program in return for a fee.\n\n                     END OF TERMS AND CONDITIONS\n\n            How to Apply These Terms to Your New Programs\n\n  If you develop a new program, and you want it to be of the greatest\npossible use to the public, the best way to achieve this is to make it\nfree software which everyone can redistribute and change under these terms.\n\n  To do so, attach the following notices to the program.  
It is safest\nto attach them to the start of each source file to most effectively\nstate the exclusion of warranty; and each file should have at least\nthe \"copyright\" line and a pointer to where the full notice is found.\n\n    <one line to give the program's name and a brief idea of what it does.>\n    Copyright (C) <year>  <name of author>\n\n    This program is free software: you can redistribute it and/or modify\n    it under the terms of the GNU General Public License as published by\n    the Free Software Foundation, either version 3 of the License, or\n    (at your option) any later version.\n\n    This program is distributed in the hope that it will be useful,\n    but WITHOUT ANY WARRANTY; without even the implied warranty of\n    MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the\n    GNU General Public License for more details.\n\n    You should have received a copy of the GNU General Public License\n    along with this program.  If not, see <https://www.gnu.org/licenses/>.\n\nAlso add information on how to contact you by electronic and paper mail.\n\n  If the program does terminal interaction, make it output a short\nnotice like this when it starts in an interactive mode:\n\n    <program>  Copyright (C) <year>  <name of author>\n    This program comes with ABSOLUTELY NO WARRANTY; for details type `show w'.\n    This is free software, and you are welcome to redistribute it\n    under certain conditions; type `show c' for details.\n\nThe hypothetical commands `show w' and `show c' should show the appropriate\nparts of the General Public License.  
Of course, your program's commands\nmight be different; for a GUI interface, you would use an \"about box\".\n\n  You should also get your employer (if you work as a programmer) or school,\nif any, to sign a \"copyright disclaimer\" for the program, if necessary.\nFor more information on this, and how to apply and follow the GNU GPL, see\n<https://www.gnu.org/licenses/>.\n\n  The GNU General Public License does not permit incorporating your program\ninto proprietary programs.  If your program is a subroutine library, you\nmay consider it more useful to permit linking proprietary applications with\nthe library.  If this is what you want to do, use the GNU Lesser General\nPublic License instead of this License.  But first, please read\n<https://www.gnu.org/licenses/why-not-lgpl.html>.\n"
  },
  {
    "path": "README.md",
    "content": "# Domino Robot\n\nThis is the code for Mark Rober's Domino Robot project. You can find a bunch more info about the project, including details about the software architecture [on my website](https://www.baucomrobotics.com/domino-robot).\n\nHere is the video that Mark made about the project:\n[![Marks video](https://img.youtube.com/vi/8HEfIJlcFbs/0.jpg)](https://www.youtube.com/watch?v=8HEfIJlcFbs)\n\n## Folder structure\nIf you are interested in browsing the files in this repo, here is the general structure to get you oriented.\n- `doc`: Some high level documentation stuff, much of it is probably out of date\n- `experimental_testing`: Various experimental files\n- `src`: The real stuff\n    - `master`: Code for running the master controller and GUI\n    - `robot`: Code that runs on the Raspberry Pi of the robot\n    - `robot_motor_driver`: Code that runs on the ClearCore of the robot\n    - `tools`: Some various utility scripts\n\n## Usage (Not Recommended)\nThis repository exists mostly for those who are interested in browsing the code to see how it works. It is almost certainly a bad idea to try and download this code and use it unless you have identical hardware to the robot in the video and/or really know what you are doing. You should consider this code completely unsupported as I do not plan to make future updates or fix any bugs you may find.\n\nIf for some reason you really want to try and run this code, there are some (out of date) instructions on how to run the [the master code](src/master/README.md) and also some (possibly also out of date) info on which libraries are used for compiling [the robot code](src/robot/README.md). Good luck.\n"
  },
  {
    "path": "doc/DominoRobotControllerFlowchart.drawio",
    "content": "<mxfile host=\"app.diagrams.net\" modified=\"2021-06-06T17:16:56.287Z\" agent=\"5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/90.0.4430.212 Safari/537.36\" etag=\"0eKxN2Gj-2nmRRTKrcC-\" version=\"14.7.6\" type=\"device\"><diagram id=\"NM_vhxp-IM1YrRFfsY-A\" name=\"Page-1\">7VxZc6M4EP41rpp5SIob/Bg7x0zNzFZ2kp1j32SQbRJAXiEn9vz6FSDMIYGxjQ05kqoEGiFA/XV/3S3BQB37qxsMFvNvyIHeQJGc1UC9HCiKLJsW/RdJ1onE0OREMMOuwxplgjv3D2RCiUmXrgPDQkOCkEfcRVFooyCANinIAMboudhsirziVRdgBjnBnQ08XvrTdcg8fS5Jyg58gu5szi5t6eyAD9LGTBDOgYOecyL1aqCOMUIk2fJXY+hFg5eOS3LedcXRzY1hGJAmJ9z8a3sPo0/Sw83vv3788C2HwPBMS3p5At6SPTC7WbJORwCjZeDAqBNpoI6e5y6BdwtgR0efqc6pbE58j+7JdNMB4TxuG+1MXc8bIw/huCPVAdCa2lQeEoweYe6IYVtwMqVHZhg4Ln2e3DETAgNGV2Z3CjGBq8ohkDcDSxEJkQ8JXtMm7IRUFQyLisr2nzPNGqn+5jmlqumJgKFptuk6G3C6wcZ8h/FXTjn+OrQcTTT+ljJRDUM8/kPTkUyznfE3zO0KUM1jKSB4Ivc3n9XV31c/F2Pv/hH/M70/UzkFrAaK4ZFomJYTujmLNgkGD6mYXiZ35AP5yGkMOtSFsF2EyRzNUAC8q0w6Kuo0a/MVoQVT3gMkZM38IVgSVNSzBybQu0WhS1wUUJlNNQCpxkaRflzqvr6WGpCo4xHw3Jmw+QU7MEGEID/CDiqgQIp/qJyeiNe/2I3HO7+jnXM93b1c5Q9ertlefMMjYD/O4mdPuw5QADfY4oCkSNFQFp0/wDNIahSqiBGIoQeI+1TsTIQmduotcumtbJAr62YBuRwgQ7TENmRn5b1wqSOK7/qOkges6ShtiKbTEBLOADbPvL9NpMOTM4qnl2AUcOWSX7ntBJmmznYzZEY76zxMBYA2GyG6CxPc05QShNaofScL40zzAJM7yIErfAhDBGC1lzgeESFeOwFrBe6U14a71vDEPGiKU+ZA00ThUAc/LIUmejMHf4ExWOeaLaIGYfPrqJJUMoGkx9M7faM6EooQERlImp0Z/y2jrGUkZ5uZKXGmR/M3F1NTqTC9jTS5SocWyegjpYi8PVbSR0jhTS6iRDOyLQ+EoWun4mvXy6zdSRsxe6ESdvwlGnVTMmlq1EWbPmvJqNUtNl0Ra+1q02qtSW+9K6MnHoCP+u5phAdtgmJN3cAAYkB3ONvcKUMVpJg0w5/E6WcpYZ1aNrSFBYOJpWu6VAdQLk+tTkpL6rD4nFSWBTmpdayUVG8Ufdt+pUd9Q+nosdyX0jB20foVC3PI+Y7oEFLRmI4fRp5HR5mBZoILKEl5PBros4ToL2gDWVqsqkl+h7jgw+1n2oCaDJVI19fR9iKFQ1y9Zbf3sWlA8LKdjlKuhHXudUwOO104kio739+StYaWbPTKklU+Gq8kgbGHqMtsSAYUn6Q45M2dMKaB/B8wibuKlMLiItqvPhrol1vdca0T3509eEzUApsvXLM5I/ZEg/y0jDDxlM5VzTQKdiu3Eqdaoj6PH/DxlZMxQthxA0AgldPoLwinCPtdUcbMQ5MI2BL1EooUo/xk9DCdAj0GZ5keprZD9SWgB8ccTup9VXN60HoXlA5fKT207vbFFj7UhgWFqkdKRbnrKDXJZVt+ROVjh2qy8lGcQr6TVa2Z9ZisZPXckkX9Hp+w+KDo
c0CHKozY6osbQJ8+lx0egQucoTBVmE4Vcarg0NxCN9rhArNcL+qcC2S+QtEnMtjMrGymWZpNUW/qv+dpyff3IKsMi8u/+xOP3LTEILdeY6igjhLONPM4MyBDTS9exzoFRfHxS0ZRCxAcHtjmeAyGZBvRZdL44u/8V3YtbRCgLBvFxRZn7TDgmSzk1RNU6PkC3z18DFybyr4lcCtU+lrmwUn8K+BBPfoV8aAR/7TDg1Z55lTvnAdfa81Mblo0k/tVNUvXFm9JRBL//F41q9Vsm2UzpVUvfAI/y1fGrhF+Bth5S4mGpnXuYF9r1Sn1m9sdrNUvB8svl650sO+OtR7TrUS4hqm2EtKa54ZyqihWs7owYvHqrqZrg3PLgTdFgZpSQmFVGINbfklYcobDrxvLVoV17HiUtv3OXrUCrRTzl2oF29uX34kqL8XS1YPaq3Vrt1qLRviyZ7/m6VgknUzTJZN2b2SeTtf6FjIpL9G5vhLfajUN6noV06VR/4tCjNIFZDpb5XgCTLbO94clfhZHeV+A58d1eqoQcoTiZrfrAdXeEYlm9MctbHkHo2rCb+srfL2lEq1xAbZn9QG+Ii56nZpNkPkHFwk8OCUCh+q7jpPg8UQlApwYZCURbG6Ih0St8bVTJDAlVnPYt0iQFm7TVaknqBHwFdjbbLk4xY8blWDjnQ828CGmt6RI8QTYN4CfoOe7gRPybyG/7Cmwcrqhd74UROWXglQa/LEKgjtYe0Q4LKXdy3HvQOhSidANXlWqQFNH+9CFxtfS3zUVHzV6pql0Xvu9yN6kyF6l752ZUpP1c0lSVEsa6vRvcYmjVqftk89P6vw0TLlut636p2qi6t8lorwZVfGq3xvbVtDrhVuI5CzHUJWWHHrpjS3ZFLiJocBPaC34CeG3ow4HgSwEwdiDAI8RhvurvBgFpa4gFzIxUXNoiEK1YjBX+PRVpVNo/rEqSy4qXPSxKpG+j/atMNGq2zb0/R2EiwnE8bPfum9b6dx7mcdTOt3NPv6XMEP2CUX16n8=</diagram></mxfile>"
  },
  {
    "path": "doc/DominoRobotSoftwareArchitecture.drawio",
    "content": "<mxfile host=\"app.diagrams.net\" modified=\"2021-06-24T01:50:07.329Z\" agent=\"5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/91.0.4472.114 Safari/537.36\" etag=\"4WDjcxLQaKL4s1FWm-vA\" version=\"14.7.9\" type=\"device\"><diagram id=\"HjjDRPqPhWfrdKk6NoQq\" name=\"Page-1\">7V1bc9o4FP41mek+pCNLvj4mtElv2WabXvdlR2ABao3FCNEk/fUrgwW2JYwBG5s2yUOsY9kYfed+jpwz1Js8XHM8Hd+wkERnEIQPZ+jFGYSW4/nyT0J5TCnAtpaUEadhSlsT7ugvoiam1DkNySw3UTAWCTrNEwcsjslA5GiYc3afnzZkUf5Tp3iUfiJYE+4GOCLatC80FOMl1Xcys18ROhqrT7ZAemaC1eSUMBvjkN1nSOjlGepxxsTyaPLQI1GyempdltddbTi7ejBOYlHlgv7s/T90dh1dzV5fvjl/IaZ/2/A8vctPHM3TL3yDZ4JwSXv2Dk8Fm/6VPr14VEvC2TwOSXJXcIYu78dUkLspHiRn7yUXSNpYTCI5suThT8IFlct5EdFRLGnyjpI6ZLFIsbZcOR5xHFL5PXosYnzxKai/+E3m0ijK0IdO8ivpM8HZD5I54y5+VmcUXHKhL/W1Ul9cPh55yJDStbsmbEIEf5RT0rPnloOW1zzmOfR+zRaeoo0zLOEoIk5ZcbS69xoteZACtgN40NGwIaHk3nTIuBizEYtx9HJNvcyjt57zjiXILDD7ToR4TOHBc8HyiJIHKr5mjr8lt3rupKMXD+mdF4NHNYjl9/2aHWSuSobryxYjdV2RTzbiOGNzPiAla5UulcB8RESZQKRgJQtZyhacRFjQn3k1UTvEjiafV5RIFQbBbYSlxuOHSadB7jyCXQJ0uQsx8YcDo9wNfNIfanKH6pI7kJc7V5c7GBjkLmhK7BSLZEC5/vT6twcCAq9jOEANhw/zWNAJkcQbHEvTfKB4mI3XCWClCQ0EOlouMKDVIFyBBleiw+pFaELDcGHmxozTX9J64BIJk6D1Fw5D0cXwB2RgRK3vO7YDGkMtKIDmGETM5GKsfMz6XQxd131gfemyQvA6ln7iMAHmODLWNbQs6G1FCznHRUt35xVavShZziNBFXgh8DyDOnSIH9omqHzYRwbfvS6onAJUQUXbZXmNQaUbr4+924aAWmlFA1ah46Eg1LEibghCYsIqCFzLd5rCquhmIKRjZRmxai7O8jQ4jhhnrWOrb5kz2+KsdWj1LXuu8TgLumdtBVqLSy84x4+ZCVNGYzHL3Pk2IazZDTkwx27nDijwy/KOa+5ZPdoBDOW3ylA7BO4tMIZTN18cJvuupqc/zRoIu4dD7ABDBDEcDkLJLgY1HHpBHzTm3awCBiUXlt12vAf1AEIXozi8SLK+cjSI8GxGl44h5kInZwApsvla93rQzYjLOXgOwJZUlxzcEi6j0CSNqqllG9lZxWzJ+6EtynkxKt5yf0FUYXN9GjrDESZ/V9F2U+SapoZBwYkrutHLb55elU2EF1W+nb+R5uQtV0a7UV3aHxnYtvPuhGu7+dQtAPY+fFuUtA2yuSBf0Ug9/f7sXjnxG3TK7iCkKbsbJqTuh+AFl5+eHDzrRQTzHuPkBIs0NdgoaZMKJRlTDO6ajBRqLAZXZrIt6Yb7RQuwKN3eSUg3qireyO2UeKsqeFtc4u3FJbafiyoTb0iqzd+LT7xO8QnSC4B3ci1xcmGPTSaz2gORzpcyLFS1ltFYJIL0oFDlWT+TiA2oSL7AKxyH0Z9YoZVmue1YUYlxBqF3dLhsbLkYCMriPwigYkr1uPhga353MfjPfRO+dT7h96N/73u+oe9I
CdCzD3g27RO++AK39E/1bB0nB5ljqLYbu41cpynIdEu0qi3J5eQsOliUzGDtIiybBK5MSOvRgHm4fFMcYjJRqI6ihREuFUB2P8eccTDhTlWLSu6gcXGq5qJR7cnovYoU5xbKZ2MtEBSaPbdeAVH5FcWkWGH+wYWQMhwa8m6rN4fs0jqwqf2grGWhDh0TwOdOgQkMKXnLc3Qts2pKrl3L6G5W1hiApCH8KDahe3hpJsE3tIeYwFo11tUOlq+BdctmNHWGjwZVSavHpnaRshaTOqCCMAdVYGjttlyD9a6lk8cIVdv5oeZbDvY33oFuvM0xRmupnLLHzjb8cPydDARbPMQ1iQnHiwx/IzZvlzaeTa1AZe1DdcihXZBDk32DBjlsLOYxtBi/Y3J16S+8VJv168vT9E3yzqbJ1pkdk8aQUym6090bA7dtjtlfg1qoogr169age4U/bt6TCtJxo5GJpddf3+JokvSrgysaLTOJZ9CN5PJd9uXAHSVHz64j1l/EL9K1qrssu0P80j0n69zL64gABLqOQCYd0ViKxNLDz8909se7wzAoyJvJHTZGLo25wyho1x3ep6hu5dU5Qtv0+f59XqXu71Yl31plvPSxM0J58UCldUi2eKeZhabyQifoI1sAwJadZLvdTHOnY1XLryiE7bWflT73U7S6URIR1BOy7ctiu91nO4Q9B8hU1QRQxwybngB6Mmzl4qQCv/bECT6Ztk1ro4DYbtpqzyIchKlhS3OlyH4VGjYZ2Z+koAbIEMofVU4NG5/vBBbzRLt+moZYNNOasj0PcwK9Kq4hvIemXpXmqshQT7Or3qI7wn8+gbepqGzCzrw5ujns7I2i11Lus4P1kUIG261a2GqsFQAa+srZ4AdJRO5mHgkq/34cc4JDefBFojOtWwhPGtBVoroMUHRUQPVc9sdllaXZ7swq79npWj806p4WbXdP6H7lynQT0REiDUMmu8vtmg7Ml7pcr7z1sjh/a6+mtoP5KM2aUM/MP3VrblIxnTMQ6oFOqYQGnltW7nUJycbUvTYcHqB8KmfwUd3K5zCHQM/g32AZS0UTGnfBqetecKVFxp5pm9RxI2PU7puC9twnHFgoK7XJPuHmXnJi5v6qNQJlelt2GWzlAKoOuqMYdL0i0cMTwnES/HEsw8GnxMuGxItJNaAjhwx6HvuOJk1LQIH45I1tCNc9w25KY4N/c96YYQMswfwJPFO9AWwHz/TeqebAs9s1ynvZZN9dvdFl9e6OI795DMKqRrmZOF5/pZhdiNJQgWU2vAls55dMqr6Q1avTy+N7aPtl83f1BuRw/W9GltPX/60Fvfwf</diagram></mxfile>"
  },
  {
    "path": "doc/KukaPsuedoCode.txt",
    "content": "KukaInit();\n\nint tile_count = 0;\nint num_tiles = 333;\nint num_rows = 20;\nint num_cols = 15;\n\nwhile (tile_count < num_tiles) {\n    string line = ReadLineFromFile(tile_count);\n    vector<char> colors = ParseColors(line);\n    int color_count = 0;\n    for (int row = 0; row < num_rows; row++) {\n        for (int col = 0; col < num_cols; col++) {\n            Pose pickup_pose = PickupPoseFromColor(colors[color_count]);\n            Pose dropoff_pose = DropoffPoseFromRowCol(row, col);\n            MoveToPickup(pickup_pose);\n            CloseGripper();\n            MoveToDropoff(dropoff_pose);\n            OpenGripper();\n            color_count++;\n        }\n    }\n    tile_count++;\n}\n\n"
  },
  {
    "path": "doc/NetworkPrototcols.txt",
    "content": "\nMessage spec\n- All messages are sent as json encoded byte strings\n- All messages will contain a 'type' field that will be a string denoting the command type\n- All messages will contain a 'sn' field that will be an int denoting the message serial number\n- All messages will contain a 'data' field that will house any additional data required by the message type. This field may be empty\n\nMaster to robot commands:\n\nMove:\n // Move to a waypoint in a straight line\n float x\n float y\n float a\n \nPlace:\n // Place dominoes on the floor\n null\n \nDock:\n // Dock at charging station\n \nUndock:\n // Undock from charging station\n \nDropoff:\n // Drop off pallete on conveyor\n \nPickup:\n // Pick up pallete from conveyor\n\nPosition:\n // Fallback for sending position data if usb controller for marvelmind sensor doesn't work\n float x\n float y\n float a\n float time\n\n\nRobot to master commands:\n\nRobotStatus:\n // Give a status update to server\n float est_pos_x\n float est_pos_y\n float est_pos_a\n float est_cur_vel_x\n float est_cur_vel_y\n float est_cur_vel_a\n string cur_task\n int cur_task_msg_sn\n int bat_percent //Can we even get this?\n string error_msg\n bool error_present\n //Add other fields/sensor statuses as needed\n \n\n\nMaster to base station commands:\n\nNextTile:\n // The next tile to fill\n int[tile_x*tile_y] dominoes\n int tile_id\n\n\n\nBase Station to master commands:\n\nBaseStatus:\n // Status update for server\n int cur_tile_id\n float est_time_to_tile_full\n // Other sensor statuses\n\n\nGeneral commands:\n\nStatusRequest:\n // Request status from robot\n\nKeepAlive:\n  // Keep the connection alive\n\nAck:\n //Acknowledge that command was recieved\n int sn\n\nDone:\n //Let the server know the command was done\n int sn\n\n\n\n\n\n"
  },
  {
    "path": "doc/SubsystemDiagram.drawio",
    "content": "<mxfile modified=\"2020-01-26T19:53:17.764Z\" host=\"www.draw.io\" agent=\"Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/79.0.3945.130 Safari/537.36\" etag=\"FmhtWn5DU_qlXuZeJW7k\" version=\"12.5.8\" type=\"device\"><diagram id=\"cr8PqfeKhw-YyZYMWl9d\" name=\"Page-1\">7V1dk5o6GP41zvRc2OFbvNx1t+10tmec2jltrzoIUWmReCDu6vn1J0AikGQVFQjSzuyFhBDgeZ+8eb/CDvTJevc+cjarT9ADwUBTvN1Afxhommpr2iD5U7x91jIaGVnDMvI90ilvmPn/AdKokNat74G41BFBGCB/U250YRgCF5XanCiCL+VuCxiU77pxluSOSt4wc50AcN2++h5aZa22Wej9AfjLFb2zqpAza4d2Jg3xyvHgS6FJfxzokwhClP1a7yYgSMCjuGTXvXvl7OHBIhCiKhcMnS92bEPraWY8fbPd7ce75aehno3y7ARb8sLkYdGeIgA8DAg5hBFawSUMneAxb72P4Db0QHIbBR/lfZ4g3OBGFTf+BAjtiXSdLYK4aYXWATkbIydCd4m0cIMbOHHsu7T5nR/QbvwbExBiuI1ccOQ1KXOcaAnQkX7k7ZNXLtyA4PkewDVA0R53iEDgIP+5zBGHUG156JdLA/8gAjlDOConnE9OjECE26YTTk4vKx+B2cZJgXjBs7GMsRNvsvmx8HeJrO4XGNgJDGCUXq57JrA9I0U9gr9A4YytzXXLOob/M4gQ2B1FjJyl84ZoA1Unxy+FuUWaVoVpRdvqh1gK38HOR9+Sy9+OTHL4vXDqYUeGTg/25KDiJMGyifbZ4CY9/E7HSw7y0dMjOvzlk0vjJ5cffLHv3v38gb5+/Kz9+PkhiqbKkKr0+mZXeimGxNkXOmygH6K4MPI0acgpqI3KHNQsRm8y/fXx0f74R/YE4qsNlbl6zFA5g41cxbD5gEklgotBt26G4Jcyt/LE8PhOuJFbYBh+NjwpjFuYA5Z9xhyoi7q6KZe6WsvUDWEIGN4qneWtrkvhrX4eb4dqw8Q9hmHBbvsboBcY/UoE8OIjd4WHxURNTbl6rbjFYqG5rsiK86y5ZdZkxWlqx8w4Tb89VaGWFEWuN9pQFRf5UAqvGsQdzbrtvOu4YfzhRne4YXWLGzdoYvSXG3a3uHGLntMV3KjkOREC5dy4LnRQmRta7bGDqtwQe22qXG4UqaFUpEbZM1Frp0YLzomILkdCTdLitgrnAHyGc5j4JfiWygRipGEQJNa/FeDXuJ8nv5bJrzd3kbf1Q/hX3Y5BO+FdnYnvGrIdA2N8ewu85PBXo9Gucd1qXOz3MzTUWX5lL8QFXfn4ATuSoWvlkV4J3zYd2DAkhDUoFBJzg5fbr54Tr9LHOEr0pnk5pIJiBXeKT9UjW6c4fi4zh2fH3E5c0Aw5VYNbc3m2imhSIsb5vKuiPk+yqrA6moLVkbZdSb7Xgvhna0X7QhLXJmuTk/XXFUgLT9YQwSjmJJ9roURQJ0wpNqRqu0AcUp3bpmEedYDOCKlqZUzNUTXLSW/KcqKRmgLIT/4iqz6IEdhs0l8p3j2A26KRIGlw2zzcTpopUGIQPcM+YDyy5GJMn6dUUBM9g2Dthx5uvweOC8NakfYcYC+ESFuuDeaLepA2xqzy0CQjPeaQnsLYR74AXvyOiF16i1iRtbcILGlyAn8ZJos5Rg5rI/0+Qcx3neCOnFj7npdaoSKhlS3TOqTA2D2mxatwQyAFrSkp6LxNNMOam18eeyMCXR2VVY7Kq/V2RcCbKv2fCJwULF7xNyUFYdjhUIIrJQpEfxdi8s1G8jtRSGKTlPzptI+YS+2Eam2+A
LpkErxSpXGV8dVKicaQcfZUU7KBa484pI/AqpwHK2F8LYu4WcJtzK/hh+qXIm6q2RhwvGdA8wYLmLrlOYLWv1tITwzjVDHd4Q4Yk11+kuYansASJCTPxsLPlg2Xnezv6mQYZQkfdmwURDxuc2aMBW7JCbf6zOnRmstnmAy2Bq91RqKV32wKW34PxQyEcc3gtuTlseAKdFO74GocuBO43uAVs15029mZ0jnq8qbJoYy0TsXQijnCoSudu6dXVboXka6bKr+EHhL4VVZhXbgKP0BsakJ8ehpgia1Beu/ZPkZgXVia5+wdf5vleshsVRBUFBgig4zNSdRX+8MRhyYfQOhCD9Scfqi8tnCQCoCvHEFsMx4udtk5EP8U3x0pvmu5vErMpSvzkwabAmPjQVXzk6bCDDQelQdqeM8dv3D/AwLo+mjfaTV9lQJhY35tBr+FQuADrxP8zqjbwe/rdDiTf2gz+C0UAZ9B/gwSGBMDBzlZDPwGKiIvE4qECsiju66L9oq/8AdJ7t7bYiw15c3jbGprlnVrEB+KTsu+ha5V8y1GjUEu2Xjpf3W4GHZBxqGZPdfXsYM3D2h1uNZTXUgL0JiqVfm6kU+Q90w3WkbXdKMo7sIqy2brNMWK8riTV7uaklYEqpQJYdrGWybGVtXNYutJLZUhTcNu1gHxDiyzt7G1ox7+jisus5bUVZavQOuZau+YYpf7CYbLamzI9qzmrN4aI3Zi0LVbmIv0KQtz8cHHcyiM0xrymbz65qsmIPbz5NXsi4GW+zmDy+bg6PI52Hahmxh0o6U5mF567vYtZquOaZsMyxr4cJfGu1W06NUPl4P8I0g1Tvl20mTdm/GjPzO+5SXX4qf7kWyatDWXD75/SCoNbnr6cVnqFncUiVHmoxnTCMQxRpFFt79pJunV/YLNRn0XAlMUIz3bSjlxa/tcrnMAJG74EkuhQxGx3yjxRL8J0vxHU69jB++GC/LxSY9iDkq58aTTkN2GLD3rpPMJwJ6FJoea2rHopGArLK8qe512yma/rLTTUDNryztxn76pL/GUfpeZ/kOWrHv+b230x/8B</diagram></mxfile>"
  },
  {
    "path": "doc/SwClassHeirarchy.txt",
    "content": "- robot\n\t- status updater\n\t\t- status\n\t- robot server\n\t\t- socket multi thread wrapper\n\t- robot controller\n\t\t- serial comms\n\t\t- localization\n\t\t\t- Kalman filter\n\t\t- controller mode\n\t\t\t- position mode\n\t\t\t\t- trajectory generator\n\t\t\t\t- x,y,a controllers\n\t\t\t- vision mode\n\t\t\t\t- trajectory generator\n\t\t\t\t- Camera tracker *\n\t\t\t\t- Kalman filter\n\t\t\t\t- x,y,a controllers\n\t- tray controller\n\t\t- serial comms\n\t- marvelmind wrapper\n\t- camera tracker\n\t\t- side camera\n\t\t- rear camera\n\n\n- motor controller\n\t- serial comms\n\t- robot velocity handler\n\t- lifter action handler\n\t\n- master\n\t- gui\n\t- runtime\n\t\t- plan\n\t\t- robot interface\n\t\t\t- robot client\n\t\t\t\t- tcp client\n\t- field planner"
  },
  {
    "path": "doc/clearcore_api.txt",
    "content": "Tentative thoughts on api with clearcore for motor control over serial comms\n\nThere's a few options here:\n- Clearcore does all trajectory generation, kalman filtering, kinematics, and control\n    - Central processor sends it marvelmind position data and forwards command from master\n    - This requires a more extensive api to handle forwarding commands\n- Clearcore recieves global velocity command from central processor, handles frame transformation and kinematics only\n    - Sends back computed global velocity\n    - Central processor has to send global angle along with command\n    - Cnetral processor handles kalman filtering and trajectory generation\n    - Relatively simple api\n- Clearcore recieves local velocity command from central processors, handles local kinematics only\n    - Sends back computed local velocity\n    - Central processor handles kf, traj gen, frame conversion\n    - Very simple api - seems like the best starting point\n\n\nStarting with option 3\n\nSend from central controller:\n- Local velocity command [vx, vy, va]\n\nResponse from clearcore:\n- Estimated local velocity [vx_est, vy_est, va_est]\n\nClearcore outline:\nwhile true:\n    if new command:\n        decode command\n        do IK\n        send new motor commands\n        do FK\n        encode response\n        send response (est vel)\n    else:\n        pass"
  },
  {
    "path": "experimental_testing/ImageParser.py",
    "content": "import matplotlib.image as mpimg\nimport matplotlib.pyplot as plt\nimport skimage.transform as sktf\nimport numpy as np\n\n# Configuration\nimage_name = 'MR.jpg'\ndesired_width = 300\ndesired_height = 300\ndominoes = np.array(\n            [('black', (0,0,0)),\n             ('red',   (1,0,0)),\n             ('blue',  (0,0,1)),\n             ('green', (0,1,0)),\n             ('white', (1,1,1))\n             ])\n\n# Load original image\nimg = mpimg.imread(image_name)\n\n# Scaled image\nimg_scaled = sktf.resize(img, (desired_height, desired_width),anti_aliasing=False)\n\n# Parse image into domino IDs by choosing 'closest' color\nimg_parsed_color = np.zeros((desired_width*desired_height, 3))\nimg_parsed_ids = np.zeros_like(img_parsed_color,dtype=np.int_)[:,0]\ncount = 0\n\nfor row in img_scaled:\n    for px in row:\n        best_id = -1\n        best_score = 9999999\n        for id,val in enumerate(dominoes):\n            color = val[1]\n            score = np.linalg.norm(px - color)\n            if score < best_score:\n                best_score = score\n                best_id = id\n        img_parsed_ids[count] = best_id\n        img_parsed_color[count] = dominoes[best_id][1]\n        count = count + 1\n\nimg_parsed_ids = img_parsed_ids.reshape((desired_height, desired_width))\nimg_parsed_color = img_parsed_color.reshape((desired_height, desired_width, 3))\n\n# Output some metrics\nprint('Domino usage:')\nprint('Total number of dominoes: ' + str(img_parsed_ids.size))\nprint('Colors:')\nunique_colors, counts = np.unique(img_parsed_ids, return_counts=True)\nfor i,id in enumerate(unique_colors):\n    print('  ' + dominoes[id][0] + ': ' + str(counts[i]))\n\n# Plot images\nfig, axes = plt.subplots(nrows=1, ncols=3)\nax = axes.ravel()\n\nax[0].imshow(img)\nax[0].set_title('Original')\nax[1].imshow(img_scaled)\nax[1].set_title('Scaled')\nax[2].imshow(img_parsed_color)\nax[2].set_title('Dominoes')\n\nplt.tight_layout()\nplt.show()\n\n\n\n\n\n\n"
  },
  {
    "path": "experimental_testing/RobotArduinoTesting/StepperDriverMotion_4Motor_Accel/StepperDriverMotion_4Motor_Accel.ino",
    "content": "#include <StepperDriver.h>\n// https://github.com/DIMRobotics/ArduinoStepperDriver\n\n/* Axis identifiers inside the library; on them then\n   Can be controlled specific. You can do several axles,\n   The maximum number in the native assembly is 3. It is treated by editing\n   headerfile, there is a constant NUM_AXSIS\n*/\naxis_t bl, br, fl, fr;\n\nfloat MaxSpeed = 3.14; // rad/s\n#define PulseRev 3200\n\nconst float SECONDS_TO_MICROSECONDS = 1000000;\nconst float STEPPER_CONVERSION = SECONDS_TO_MICROSECONDS * 2 * PI / PulseRev;\n\n#define bl_pulse 49\n#define bl_dir 48\n#define br_pulse 51\n#define br_dir 50\n#define fl_pulse 47\n#define fl_dir 46\n#define fr_pulse 45\n#define fr_dir 44\n#define PIN_ENABLE_ALL 40\n\nvoid setup()\n{\n\n  Serial.begin(115200);\n  StepperDriver.init();\n  delay(100);\n//  Serial.print(\"rad conversion: \");\n//  Serial.println(RAD_PER_SEC_TO_STEPS_PER_SEC);\n//  Serial.print(\"sec conversion: \");\n//  Serial.println(SECONDS_TO_MICROSECONDS);\n\n  bl = StepperDriver.newAxis(bl_pulse, bl_dir, 255, PulseRev);\n  br = StepperDriver.newAxis(br_pulse, br_dir, 255, PulseRev);\n  fr = StepperDriver.newAxis(fr_pulse, fr_dir, 255, PulseRev);\n  fl = StepperDriver.newAxis(fl_pulse, fl_dir, 255, PulseRev);\n\n\n  StepperDriver.enable(bl);\n  StepperDriver.enable(br);\n  StepperDriver.enable(fl);\n  StepperDriver.enable(fr);\n  digitalWrite(PIN_ENABLE_ALL, LOW);\n\n  Serial.println(\"TCCR2A: \");\n  Serial.println(TCCR2A, BIN);\n  Serial.println(\"TCCR2B: \");\n  Serial.println(TCCR2B, BIN);\n  Serial.println(\"TIMSK2: \");\n  Serial.println(TIMSK2, BIN);\n  \n}\n\nvoid writeAll(float vel)\n{\n  Serial.print(\"Speed: \");\n  Serial.println(vel, 4);\n  Serial.print(\"Delays: [ \");\n  writeSpeed(fl, vel);\n  writeSpeed(fr, vel);\n  writeSpeed(br, vel);\n  writeSpeed(bl, vel);\n  Serial.println(\"]\");\n}\n\nvoid writeSpeed(axis_t motor, float vel)\n{\n  uint16_t delay_us = 0; // This works for if vel is 0\n  \n  // Compute motor 
direction\n  if(vel > 0)\n  {\n      StepperDriver.setDir(motor, FORWARD);\n  }\n  else\n  {\n      vel = -vel;\n      StepperDriver.setDir(motor, BACKWARD); \n  }\n  \n  // Compute delay between steps to achieve desired velocity\n  if(vel != 0)\n  {\n      delay_us = static_cast<uint16_t>(STEPPER_CONVERSION / vel);\n  }\n\n  Serial.print(delay_us);\n  Serial.print(\", \");\n  \n  // Update motor\n  StepperDriver.setDelay(motor, delay_us);\n}\n\nvoid loop()\n{\n   \n   writeAll(MaxSpeed);  \n   delay(2000);\n\n   writeAll(0);\n\n   delay(500);\n\n   writeAll(-1*MaxSpeed);\n\n   delay(2000);\n\n   writeAll(0);\n\n   delay(500);\n\n}\n"
  },
  {
    "path": "experimental_testing/RobotArduinoTesting/TestMotor/TestMotor.ino",
    "content": "#define PIN_ENABLE 52\n#define PIN_BL_PWM 6\n#define PIN_BL_DIR 48\n#define FW 1\n#define BW 0\n#define SPEED_SLOW 50\n#define SPEED_FAST 255\n\n\nvoid setup() {\n  // put your setup code here, to run once:\n    pinMode(PIN_ENABLE,OUTPUT);\n    pinMode(PIN_BL_PWM,OUTPUT);\n    pinMode(PIN_BL_DIR,OUTPUT);\n    Serial.begin(9600);\n}\n\n\nvoid loop() {\n  // put your main code here, to run repeatedly:\n\n  Serial.print(\"Forward\");\n  analogWrite(PIN_BL_PWM, SPEED_SLOW);\n  digitalWrite(PIN_BL_DIR, FW);\n  digitalWrite(PIN_ENABLE, HIGH);\n\n  // Wait for a little bit\n  delay(2000);\n\n  // Stop\n  digitalWrite(PIN_ENABLE, LOW);\n  delay(500);\n  \n  Serial.print(\"Reverse\");\n  analogWrite(PIN_BL_PWM, SPEED_FAST);\n  digitalWrite(PIN_BL_DIR, BW);\n  digitalWrite(PIN_ENABLE, HIGH);\n\n  // Wait for a little bit\n  delay(2000);\n\n  // Stop\n  digitalWrite(PIN_ENABLE, LOW);\n  delay(500);\n\n}\n"
  },
  {
    "path": "experimental_testing/RobotArduinoTesting/WifiServerWithESP/RobotServer.cpp",
    "content": "#include <Arduino.h> // This has to be before ArduinoJson.h to fix compiler issues\n#include \"RobotServer.h\"\n#include <ArduinoJson.h>\n\n\nRobotServer::RobotServer(HardwareSerial& serial, HardwareSerial& debug)\n: serial_(serial),\n  debug_(debug),\n  clientConnected_(false),\n  wifiConnected_(false)\n{\n    serial_.begin(115200);\n}\n\nRobotServer::oneLoop()\n{\n    COMMAND cmd = COMMAND::NONE;\n    String newMsg = getAnyIncomingMessage();\n    \n    if(newMsg.length() != 0)\n    {\n        debug_.print(\"[RobotServer] \");\n        debug_.print(\"RCV: \");\n        debug_.println(newMsg);\n        \n        if(newMsg == \"Client connected\")\n        {\n            clientConnected_ = true;\n            wifiConnected_ = true;\n        }\n        else if(newMsg == \"Client disconnected\")\n        {\n            clientConnected_ = false;\n            wifiConnected_ = true;\n        }\n        else if(newMsg == \"Connecting..\")\n        {\n            wifiConnected_ = false;\n            clientConnected_ = false;\n        }\n        else if(newMsg.startsWith(\"Connected to WiFi.\"))\n        {\n            clientConnected_ = false;\n            wifiConnected_ = true;\n        }\n        else if(newMsg == \"Waiting for client connection\")\n        {\n            clientConnected_ = false;\n            wifiConnected_ = true;\n        }\n        else\n        {\n            cmd = getCommand(cleanString(newMsg));\n        }\n    }\n    return cmd;\n}\n\nString RobotServer::cleanString(String message)\n{\n  int idx_start = message.indexOf(\"{\");\n  int idx_end = message.lastIndexOf(\"}\") + 1;\n  return message.substring(idx_start, idx_end);\n}\n\nString RobotServer::getAnyIncomingMessage()\n{\n    String msg = \"\";\n    if(serial_.available())\n    {\n        msg = serial_.readString();\n    }\n    return msg;\n}\n\nRobotServer::COMMAND RobotServer::getCommand(String message)\n{\n    COMMAND cmd = COMMAND::NONE;\n    StaticJsonDocument<256> doc;\n  
  DeserializationError err = deserializeJson(doc, message);\n\n    debug_.print(\"[RobotServer] GetCommand(): \");\n    debug_.println(message);\n\n    if(err)\n    {\n        debug_.print(\"[RobotServer] Error parsing JSON: \");\n        debug_.println(err.c_str());   \n        cmd = COMMAND::ERROR_BAD_JSON;\n        sendErr(\"bad_json\");\n    }\n    else\n    {\n        String type = doc[\"type\"];\n        if(type == \"move\")\n        {\n            debug_.println(\"[RobotServer] Got MOVE command \");\n            cmd = COMMAND::MOVE;\n            sendAck(type);\n        }\n        else if(type == \"place\")\n        {\n            debug_.println(\"[RobotServer] Got PLACE command \");\n            cmd = COMMAND::PLACE;\n            sendAck(type);\n        }\n        else if(type == \"dock\")\n        {\n            debug_.println(\"[RobotServer] Got DOCK command \");\n            cmd = COMMAND::DOCK;\n            sendAck(type);\n        }\n        else if(type == \"undock\")\n        {\n            debug_.println(\"[RobotServer] Got UNDOCK command \");\n            cmd = COMMAND::UNDOCK;\n            sendAck(type);\n        }\n        else if(type == \"dropoff\")\n        {\n            debug_.println(\"[RobotServer] Got DROPOFF command \");\n            cmd = COMMAND::DROPOFF;\n            sendAck(type);\n        }\n        else if(type == \"pickup\")\n        {\n            debug_.println(\"[RobotServer] Got PICKUP command \");\n            cmd = COMMAND::PICKUP;\n            sendAck(type);\n        }\n        else if(type == \"position\")\n        {\n            debug_.println(\"[RobotServer] Got POSITION command \");\n            cmd = COMMAND::POSITION;\n            sendAck(type);\n        }\n        else if(type == \"status\")\n        {\n            debug_.println(\"[RobotServer] Got STATUS command \");\n            cmd = COMMAND::STATUS;\n            sendStatus();\n        }\n        else if(type == \"\")\n        {\n            
debug_.println(\"[RobotServer] ERROR: Type field empty or not specified \");\n            cmd = COMMAND::ERROR_NO_TYPE;\n            sendErr(\"no_type\");\n        }\n        else\n        {\n            debug_.println(\"[RobotServer] ERROR: Unkown type field \");\n            cmd = COMMAND::ERROR_UNKOWN_TYPE;\n            sendErr(\"unkown_type\");\n        }\n    }\n    return cmd;    \n}\n\nvoid RobotServer::sendMsg(String msg)\n{\n    serial_.print(msg + '\\0');\n    debug_.print(\"[RobotServer] Send: \");\n    debug_.println(msg);\n}\n\nvoid RobotServer::sendAck(String data)\n{\n    StaticJsonDocument<256> doc;\n    doc[\"type\"] = \"ack\";\n    doc[\"data\"] = data;\n    String msg;\n    serializeJson(doc, msg);\n    sendMsg(msg);\n}\n\nvoid RobotServer::sendErr(String data)\n{\n    StaticJsonDocument<256> doc;\n    doc[\"type\"] = \"ack\";\n    doc[\"data\"] = data;\n    String msg;\n    serializeJson(doc, msg);\n    sendMsg(msg);\n}\n\nvoid RobotServer::sendStatus()\n{\n    StaticJsonDocument<256> doc;\n    doc[\"type\"] = \"status\";\n    doc[\"data\"] = \"not implimented\";\n    String msg;\n    serializeJson(doc, msg);\n    sendMsg(msg);\n}\n"
  },
  {
    "path": "experimental_testing/RobotArduinoTesting/WifiServerWithESP/RobotServer.h",
    "content": "/*\n* Robot server that handles gettings messages from master\n* and responding correctly\n*/\n\n#ifndef RobotServer_h\n#define RobotServer_h\n\n#include <HardwareSerial.h>\n\nclass RobotServer\n{\n  public:\n\n    enum COMMAND\n    {\n       NONE,\n       ERROR_NO_TYPE,\n       ERROR_UNKOWN_TYPE,\n       ERROR_BAD_JSON,\n       MOVE,\n       PLACE,\n       DOCK,\n       UNDOCK,\n       DROPOFF,\n       PICKUP,\n       POSITION,\n       STATUS\n    };\n    \n    // Constructor\n    RobotServer(HardwareSerial& serial, HardwareSerial& debug);\n\n    int oneLoop();\n\n  private:\n\n    HardwareSerial& serial_;\n    HardwareSerial& debug_;\n    bool clientConnected_;\n    bool wifiConnected_;\n\n    String getAnyIncomingMessage();\n    String cleanString(String message);\n    COMMAND getCommand(String message);\n    \n    void sendMsg(String msg);\n    void sendAck(String data);\n    void sendErr(String data);\n    void sendStatus();\n\n};\n\n\n\n#endif\n"
  },
  {
    "path": "experimental_testing/RobotArduinoTesting/WifiServerWithESP/WifiServerWithESP.ino",
    "content": "#include \"RobotServer.h\"\n\n\nRobotServer server = RobotServer(Serial3, Serial);\n\n\nvoid setup() {\n  // put your setup code here, to run once:\n\n  // Communication with the host computer\n  Serial.begin(9600); \n  Serial.println(\"Robot server starting\"); \n\n}\n\nvoid loop() {\n  // put your main code here, to run repeatedly:\n\n  server.oneLoop();\n  delay(50);\n\n}\n"
  },
  {
    "path": "experimental_testing/RobotArduinoTesting/WifiTesting/WifiTesting.ino",
    "content": "\n\n//  Arduino pin 15 (RX3) to ESP8266 TX\n//  Arduino pin 14 (TX3) to voltage divider then to ESP8266 RX\n//  Connect GND from the Arduino to GND on the ESP8266\n//  Pull ESP8266 CH_PD HIGH\n\n// When a command is entered into the serial monitor on the computer\n// the Arduino will relay it to the ESP8266\n \nvoid setup() \n{\n \n    Serial.begin(9600);     // communication with the host computer\n \n    // Start the serial for communication with the ESP8266\n    Serial3.begin(115200);  \n \n    Serial.println(\"\");\n    Serial.println(\"Remember to set both NL & CR in the serial monitor.\");\n    Serial.println(\"Ready\");\n    Serial.println(\"\");    \n}\n \nvoid loop() \n{\n    // listen for communication from the ESP8266 and then write it to the serial monitor\n    if ( Serial3.available() )   {  Serial.write( Serial3.read() );  }\n \n    // listen for user input and send it to the ESP8266\n    if ( Serial.available() )       {  Serial3.write( Serial.read() );  }\n}\n"
  },
  {
    "path": "experimental_testing/RobotArduinoTesting/all_motors/all_motors.ino",
    "content": "// Pinouts\n#define PIN_ENABLE 52\n#define PIN_FR_PWM 3\n#define PIN_FR_DIR 49\n#define PIN_FL_PWM 2\n#define PIN_FL_DIR 51\n#define PIN_BR_PWM 5\n#define PIN_BR_DIR 50\n#define PIN_BL_PWM 6\n#define PIN_BL_DIR 48\n\n// Motor index\n#define FR 0\n#define FL 1\n#define BR 2\n#define BL 3\n\n// Motor arrays\nconst int PWM_ARRAY[4] = {PIN_FR_PWM, PIN_FL_PWM, PIN_BR_PWM, PIN_BL_PWM};\nconst int DIR_ARRAY[4] = {PIN_FR_DIR, PIN_FL_DIR, PIN_BR_DIR, PIN_BL_DIR};\nconst int SPEED_ARRAY[4] = {130, 75, 75, 100}; \n\nconst int NUM_DIRECTIONS = 4;\n\n// Direction vectors\nconst int MOTOR_DIRECTIONS[NUM_DIRECTIONS][4] = \n{\n  {1, 1, 1, 1},   //Forward\n  {0, 0, 0, 0},   //Backward\n  {1, 0, 0, 1},   //Left\n  {0, 1, 1, 0}    //Right\n//  {1, 0, 1, 0},   //Forward-Left\n//  {1, 0, 0, 1},   //Forward-Right\n//  {0, 1, 0, 1},   //Backward-Left\n//  {0, 1, 1, 0}    //Backward-Right\n};\n\n\nvoid commandMotors(int direction)\n{\n  for(int i = 0; i < 4; i++)\n  {\n    analogWrite(PWM_ARRAY[i],SPEED_ARRAY[i]);\n    digitalWrite(DIR_ARRAY[i],MOTOR_DIRECTIONS[direction][i]);\n  } \n  digitalWrite(PIN_ENABLE,HIGH);\n}\n\n\nvoid setup() \n{\n  // Setup pin modes\n  pinMode(PIN_ENABLE,OUTPUT);\n  for(int i = 0; i < 4; i++)\n  {\n    pinMode(PWM_ARRAY[i],OUTPUT);\n    pinMode(DIR_ARRAY[i],OUTPUT);\n  }\n  \n  Serial.begin(9600);\n}\n\n\nint stage = 0;\nvoid loop() \n{\n  // Command motors on\n  commandMotors(stage);\n  Serial.print(\"On: \");\n  Serial.println(stage);\n\n  // Wait for a little bit\n  delay(1000);\n\n  // Disable the driver and zero the PWM so the motors coast\n  digitalWrite(PIN_ENABLE,LOW);\n  for(int i = 0; i < 4; i++)\n  {\n    analogWrite(PWM_ARRAY[i],0);\n  } \n  Serial.println(\"Off\");\n\n  // Wait for a little bit\n  delay(1500);\n\n  // Increment stage\n  if(stage == NUM_DIRECTIONS-1)\n  {\n    stage = 0;\n  }\n  else\n  {\n    stage++;\n  }\n}\n"
  },
  {
    "path": "experimental_testing/RobotArduinoTesting/all_motors_speed_control/Motor.cpp",
    "content": "#include \"Motor.h\"\n#include <math.h>\n\nMotor::Motor(int pwmPin, int dirPin, int encPinA, int encPinB, double Kp, double Ki, double Kd, float velFilterFreq)\n: pwmPin_(pwmPin),\n  dirPin_(dirPin),\n  inputVel_(0.0),\n  currentVelRaw_(0.0),\n  currentVelFiltered_(0.0),\n  outputCmd_(0.0),\n  prevCount_(0),\n  prevMillis_(millis()),\n  enc_(encPinA, encPinB),\n  controller_(&currentVelFiltered_, &outputCmd_, &inputVel_, Kp, Ki, Kd, DIRECT),\n  velFilter_(LOWPASS, velFilterFreq)\n{\n  pinMode(pwmPin_, OUTPUT);\n  pinMode(dirPin_, OUTPUT);\n  controller_.SetMode(AUTOMATIC);\n  controller_.SetOutputLimits(-255, 255);\n}\n\nvoid Motor::setCommand(double vel)\n{\n  inputVel_ = vel;  \n}\n\nvoid Motor::runLoop()\n{\n\n  // Read current values\n  long curCount = enc_.read();\n  unsigned long curMillis = millis();\n\n  // Compute delta\n  long deltaCount = curCount - prevCount_;\n  double deltaRevs = static_cast<double>(deltaCount) / COUNTS_PER_REV;\n  unsigned long deltaMillis = curMillis - prevMillis_;\n\n  if(deltaMillis == 0)\n  {\n    // Called again within the same millisecond; skip to avoid dividing by zero\n    return;\n  }\n\n  // Compute current velocity in revs/second\n  currentVelRaw_ = 1000.0 * deltaRevs / static_cast<double>(deltaMillis);\n  currentVelFiltered_ = velFilter_.input(currentVelRaw_);\n\n  // Run PID controller\n  controller_.Compute();\n\n  // Update output values\n  if(outputCmd_ < 0)\n  {\n    digitalWrite(dirPin_,0);\n  }\n  else\n  {\n    digitalWrite(dirPin_,1);\n  }\n  analogWrite(pwmPin_, abs(outputCmd_));\n\n  // Copy current values into previous\n  prevCount_ = curCount;\n  prevMillis_ = curMillis;\n//\n//  Serial.print(curCount);\n//  Serial.print(\" \");\n//  Serial.print(currentVelRaw_);\n//  Serial.print(\" \");\n//  Serial.print(currentVelFiltered_);\n//  Serial.print(\" \");\n//  Serial.print(inputVel_);\n//  Serial.print(\" \");\n//  Serial.print(outputCmd_);\n//  Serial.println(\"\");\n  \n}\n"
  },
  {
    "path": "experimental_testing/RobotArduinoTesting/all_motors_speed_control/Motor.h",
    "content": "/* Requires libraries:\n *  Filters: https://github.com/JonHub/Filters\n *  PID: https://playground.arduino.cc/Code/PIDLibrary/\n *  Encoder: https://www.pjrc.com/teensy/td_libs_Encoder.html\n */\n\n#ifndef Motor_h\n#define Motor_h\n\n#include <Encoder.h>\n#include <PID_v1.h>\n#include <Filters.h>\n\n#define COUNTS_PER_REV 835\n\nclass Motor\n{\n  public:\n\n    /*\n     * Motor object constructor\n     * pwmPin - which pin the motor PWM is connected to\n     * dirPin - which pin the motor direction is connected to\n     * encPinA - which pin the encoder wire A is connected to. This should be a pin capable of handling interrupts\n     * encPinB - which pin the encoder wire B is connected to. This should ideally (but not necessarily) be a pin capable of handling interrupts\n     * Kp - Proportional gain\n     * Ki - Integral gain\n     * Kd - Derivative gain\n     * velFilterFreq - frequency to use for velocity lowpass filter\n     */\n    Motor(int pwmPin, int dirPin, int encPinA, int encPinB, double Kp, double Ki, double Kd, float velFilterFreq);\n    \n    /*\n     * Set the desired velocity in revs/second\n     */\n    void setCommand(double vel);\n    \n    /*\n     * Run the actual controller. Make sure this gets called reasonably quickly (i.e. every 20 ms or so)\n     */\n    void runLoop();\n\n  private:\n\n    // Pins\n    int pwmPin_;\n    int dirPin_;   \n\n    // Values for computation\n    double inputVel_;              // Desired velocity in revs/sec\n    double currentVelRaw_;         // Raw current velocity in revs/sec\n    double currentVelFiltered_;    // Filtered velocity in revs/sec\n    double outputCmd_;             // Output command in [-255, 255]\n    long prevCount_;               // Encoder count from previous loop\n\n    // Timer\n    unsigned long prevMillis_;   // Timer from previous loop\n\n    // Other objects\n    Encoder enc_;                  // Encoder object\n    PID controller_;               // PID controller\n    FilterOnePole velFilter_;      // Velocity lowpass filter\n};\n\n#endif // Motor_h\n"
  },
  {
    "path": "experimental_testing/RobotArduinoTesting/all_motors_speed_control/all_motors_speed_control.ino",
    "content": "/* Requires libraries:\n *  Filters: https://github.com/JonHub/Filters\n *  PID: https://playground.arduino.cc/Code/PIDLibrary/\n *  Encoder: https://www.pjrc.com/teensy/td_libs_Encoder.html\n */\n\n#include \"Motor.h\"\n\n// Pinouts\n#define PIN_ENABLE 52\n#define PIN_FR_PWM 3\n#define PIN_FR_DIR 49\n#define PIN_FL_PWM 2\n#define PIN_FL_DIR 51\n#define PIN_BR_PWM 5\n#define PIN_BR_DIR 50\n#define PIN_BL_PWM 6\n#define PIN_BL_DIR 48\n\n#define PIN_FR_ENC_A 20\n#define PIN_FR_ENC_B 35\n#define PIN_FL_ENC_A 21\n#define PIN_FL_ENC_B 41\n#define PIN_BR_ENC_A 18\n#define PIN_BR_ENC_B 23\n#define PIN_BL_ENC_A 19\n#define PIN_BL_ENC_B 29\n\n#define VEL_FILTER_FREQ 0.5 // HZ\n\nconst double Kp = 100;\nconst double Ki = 1;\nconst double Kd = 1;\n\nMotor allMotors[4] = {\nMotor(PIN_FR_PWM, PIN_FR_DIR, PIN_FR_ENC_A, PIN_FR_ENC_B, Kp, Ki, Kd, VEL_FILTER_FREQ),\nMotor(PIN_FL_PWM, PIN_FL_DIR, PIN_FL_ENC_A, PIN_FL_ENC_B, Kp, Ki, Kd, VEL_FILTER_FREQ),\nMotor(PIN_BR_PWM, PIN_BR_DIR, PIN_BR_ENC_A, PIN_BR_ENC_B, Kp, Ki, Kd, VEL_FILTER_FREQ),\nMotor(PIN_BL_PWM, PIN_BL_DIR, PIN_BL_ENC_A, PIN_BL_ENC_B, Kp, Ki, Kd, VEL_FILTER_FREQ) };\n\nconst int MOTOR_DIRECTIONS[4][4] = \n{\n  {1, 1, 1, 1},   //Forward\n  {-1, -1, -1, -1},   //Backward\n  {1, -1, -1, 1},   //Left\n  {-1, 1, 1, -1}   //Right\n//  {1, 0, 1, 0},   //Forward-Left\n//  {1, 0, 0, 1},   //Forward-Right\n//  {0, 1, 0, 1},   //Backward-Left\n//  {0, 1, 1, 0}    //Backward-Right\n};\n\nconst float SPEED[4] = {2, 1, 1, 0.5}; // scale factor for rev/s\n\n\nvoid setup() \n{\n  // Setup pin modes\n  pinMode(PIN_ENABLE,OUTPUT);\n  \n  Serial.begin(250000);\n}\n\nint waitForInput()\n{\n  int motion = 0;\n  bool gotInput = false;\n  while(!gotInput)\n  {\n    if(Serial.available() > 0)\n    {\n//      Serial.println(\"Serial available\");\n       int incomingByte = Serial.read();\n//      Serial.print(\"Read: \");\n//      Serial.println(incomingByte);\n      if(incomingByte == 49)\n      {\n        motion = 1;\n        
gotInput = true;  \n      }\n      else if(incomingByte == 50)\n      {\n        motion = 2;\n        gotInput = true;  \n      }\n      else if(incomingByte == 51)\n      {\n        motion = 3;\n        gotInput = true;  \n      }\n      else if(incomingByte == 52)\n      {\n        motion = 4;\n        gotInput = true;  \n      }\n      else\n      {\n        // Do nothing\n      }\n    }\n  }\n  return motion;  \n}\n\nvoid setMotorCommands(int motion)\n{\n  for(int i = 0; i < 4; i++)\n  {\n    int curDirection = MOTOR_DIRECTIONS[motion-1][i];\n    float curSpeed = curDirection * SPEED[i];\n    Serial.print(\"Motor \");\n    Serial.print(i);\n    Serial.print(\", Direction \");\n    Serial.print(curDirection);\n    Serial.print(\", Speed \");\n    Serial.println(curSpeed);\n    \n    allMotors[i].setCommand(curSpeed);\n  }\n  digitalWrite(PIN_ENABLE, 1);\n}\n\nvoid runMotors()\n{\n  unsigned long prevTime = millis();\n  unsigned long startTime = prevTime;\n  int targetDelta = 20;\n  int targetTotalTime = 2000;\n\n  while (millis() - startTime < targetTotalTime)\n  {\n    if(millis() - prevTime > targetDelta)\n    {\n      prevTime = millis();\n      for(int i = 0; i < 4; i++)\n      {\n        allMotors[i].runLoop();\n      }\n    }\n  }\n}\n\nvoid loop() \n{\n\n  int motion = waitForInput();\n\n  setMotorCommands(motion);\n\n  runMotors();\n\n  // Stop motors\n  digitalWrite(PIN_ENABLE, 0);\n  \n}\n"
  },
  {
    "path": "experimental_testing/RobotArduinoTesting/kalman_test/KalmanFilter.cpp",
    "content": "#include \"KalmanFilter.h\"\n\nKalmanFilter::KalmanFilter(\n    double dt,\n    const mat& A,\n    const mat& B,\n    const mat& C,\n    const mat& Q,\n    const mat& R,\n    const mat& P)\n  : A(A), B(B), C(C), Q(Q), R(R), P0(P),\n    m(C.get_rows()), n(A.get_rows()), dt(dt),\n    I(mat::identity(n)), x_hat(n,1), x_hat_new(n,1)\n{\n}\n\nvoid KalmanFilter::init(double t0, const mat& x0) \n{\n  x_hat = x0;\n  P = P0;\n  this->t0 = t0;\n  t = t0;\n}\n\nvoid KalmanFilter::init() \n{\n  x_hat = mat::zeros(n,1);\n  P = P0;\n  t0 = 0;\n  t = t0;\n}\n\nvoid KalmanFilter::predict(double dt, const mat& A, const mat& u)\n{\n  this->A = A;\n  this->dt = dt;\n  x_hat_new = A * x_hat + B * u;\n  P = A*P*A.t() + Q;\n  t += dt;\n  x_hat = x_hat_new;\n}\n\nvoid KalmanFilter::update(const mat& y) \n{\n  K = P*C.t()*(C*P*C.t() + R).inv();\n  x_hat_new += K * (y - C*x_hat);\n  P = (I - K*C)*P;\n  x_hat = x_hat_new;\n}\n"
  },
  {
    "path": "experimental_testing/RobotArduinoTesting/kalman_test/KalmanFilter.h",
    "content": "// Loosely based on kalman-cpp: https://github.com/hmartiro/kalman-cpp\n\n#ifndef KalmanFilter_h\n#define KalmanFilter_h\n\n#include <LinearAlgebra.h>\n\nclass KalmanFilter \n{\n\n  public:\n\n    /**\n     * Create a Kalman filter with the specified matrices.\n     *   A - System dynamics matrix\n     *   B - Control matrix\n     *   C - Output matrix\n     *   Q - Process noise covariance\n     *   R - Measurement noise covariance\n     *   P - Estimate error covariance\n     */\n    KalmanFilter(\n        double dt,\n        const mat& A,\n        const mat& B,\n        const mat& C,\n        const mat& Q,\n        const mat& R,\n        const mat& P\n    );\n\n    /**\n     * Initialize the filter with initial states as zero.\n     */\n    void init();\n\n    /**\n     * Initialize the filter with a guess for initial states.\n     */\n    void init(double t0, const mat& x0);\n\n    /**\n     * Predict estimated state based on the time step, dynamics matrix, and control input\n     */\n    void predict(double dt, const mat& A, const mat& u);\n\n    /**\n     * Update the estimated state based on measured values.\n     */\n    void update(const mat& y);\n\n    /**\n     * Return the current state and time.\n     */\n    mat state() { return x_hat; };\n    double time() { return t; };\n\n  private:\n\n    // Matrices for computation\n    mat A, B, C, Q, R, P, K, P0;\n\n    // System dimensions\n    uint8_t m, n;\n\n    // Initial and current time\n    double t0, t;\n\n    // Discrete time step\n    double dt;\n\n    // n-size identity\n    mat I;\n\n    // Estimated states\n    mat x_hat, x_hat_new;\n};\n\n#endif\n"
  },
  {
    "path": "experimental_testing/RobotArduinoTesting/kalman_test/kalman_test.ino",
    "content": "#include \"KalmanFilter.h\"\n#include \"LinearAlgebra.h\"\n\n\nmat A = mat::identity(6);\nmat B = mat::zeros(6,3);\nmat C = mat::zeros(3,6);\nmat Q = mat::identity(6);\nmat R = mat::zeros(3,3);\nmat P = mat::identity(6);\nKalmanFilter kf = KalmanFilter(0.1, A, B, C, Q, R, P);\n\nvoid setup()\n{\n    A(3,3) = 0;\n    A(4,4) = 0;\n    A(5,5) = 0;\n    B(3,0) = 1;\n    B(4,1) = 1;\n    B(5,2) = 1;\n    C(0,0) = 1;\n    C(1,1) = 1;\n    C(2,2) = 1;\n    kf.init();\n}\n\n\nvoid loop()\n{\n    double dt = 0.1;\n    A(0,3) = dt;\n    A(1,4) = dt;\n    A(2,5) = dt;\n\n    // Control input and measurement as column vectors to match B (6x3) and C (3x6)\n    mat u = mat::ones(3,1);\n    mat y = mat::ones(3,1);\n\n    kf.predict(dt, A, u);\n    kf.update(y);\n\n    mat x = kf.state();\n    String s;\n    x.print(s);\n\n}\n"
  },
  {
    "path": "experimental_testing/RobotArduinoTesting/lifter_nano_test/lifter_nano_test.ino",
    "content": "\n#define LIFTER_PIN_0 1\n#define LIFTER_PIN_1 2\n#define LIFTER_PIN_2 3\n#define FEEDBACK_PIN 4\n\n\nvoid setup ()\n{\n    Serial.begin(115200);\n    pinMode(LIFTER_PIN_0, OUTPUT);\n    pinMode(LIFTER_PIN_1, OUTPUT);\n    pinMode(LIFTER_PIN_2, OUTPUT);\n    pinMode(FEEDBACK_PIN, INPUT_PULLUP);\n}\n\nint getCommand()\n{\n    String instr = \"\";\n    int outCmd = 0;\n    if(Serial.available())\n    {\n        instr = Serial.readString();\n        outCmd = instr.toInt();       \n    }\n    return outCmd;\n}\n\nvoid sendPos(int num)\n{\n    // Write the position as a 3-bit value, one bit per pin\n    digitalWrite(LIFTER_PIN_0, num & 1);\n    digitalWrite(LIFTER_PIN_1, (num >> 1) & 1);\n    digitalWrite(LIFTER_PIN_2, (num >> 2) & 1);\n}\n\nbool getFeedback()\n{\n    return !digitalRead(FEEDBACK_PIN);\n}\n\n\nvoid loop()\n{\n\n    int cmd = getCommand();\n\n    bool in_progress = getFeedback();\n\n    if(!in_progress && cmd > 0)\n    {\n        sendPos(cmd);\n    }\n\n    delay(100);\n\n}\n"
  },
  {
    "path": "experimental_testing/RobotArduinoTesting/motor_driver_test_script/motor_driver_test_script.ino",
    "content": "#include \"ClearCore.h\"\n\n\n#define LIFTER_MOTOR ConnectorM3\n#define INCREMENTAL_UP_PIN DI7\n#define INCREMENTAL_DOWN_PIN DI6\n#define LATCH_SERVO_PIN IO0 // Only IO0 does pwm\n#define HOMING_SWITCH_PIN IO4\n\n#define LIFTER_STEPS_PER_REV 800\n\n#define LIFTER_MAX_VEL 7    // revs/sec\n#define LIFTER_MAX_ACC 10  // rev/sec^2\n\n// Max num steps ~51000\n\n#define SAFETY_MAX_POS 120  // Revs, Sanity check on desired position to make sure it isn't larger than this\n#define SAFETY_MIN_POS 0 // Revs, Sanity check on desired position to make sure it isn't less than this\n\n#define LATCH_ACTIVE_MS 2000\n#define LATCH_OPEN_DUTY_CYCLE 50\n#define LATCH_CLOSE_DUTY_CYCLE 200\n\n\nvoid setup() \n{\n    \n    pinMode(INCREMENTAL_UP_PIN, INPUT);\n    pinMode(INCREMENTAL_DOWN_PIN, INPUT);\n    pinMode(HOMING_SWITCH_PIN, INPUT);\n    pinMode(LATCH_SERVO_PIN, OUTPUT);\n\n    // Sets the input clocking rate. This normal rate is ideal for ClearPath\n    // step and direction applications.\n    MotorMgr.MotorInputClocking(MotorManager::CLOCK_RATE_LOW);\n    // Sets all motor connectors into step and direction mode.\n    MotorMgr.MotorModeSet(MotorManager::MOTOR_ALL, Connector::CPM_MODE_STEP_AND_DIR);\n\n    \n    LIFTER_MOTOR.VelMax(LIFTER_MAX_VEL*LIFTER_STEPS_PER_REV);\n    LIFTER_MOTOR.AccelMax(LIFTER_MAX_ACC*LIFTER_STEPS_PER_REV);\n    LIFTER_MOTOR.HlfbMode(MotorDriver::HLFB_MODE_STATIC);\n    LIFTER_MOTOR.PolarityInvertSDEnable(true);\n    LIFTER_MOTOR.EnableRequest(true);\n\n    Serial.begin(115200);\n    Serial.println(\"Test script starting\");\n}\n\n\nvoid test_motor()\n{\n    bool vel_up = digitalRead(INCREMENTAL_UP_PIN);\n    bool vel_down = digitalRead(INCREMENTAL_DOWN_PIN);\n    Serial.print(\"Up: \");\n    Serial.print(vel_up);\n    Serial.print(\", down: \");\n    Serial.println(vel_down);\n\n    if(vel_up)\n    {\n        LIFTER_MOTOR.MoveVelocity(LIFTER_MAX_VEL*LIFTER_STEPS_PER_REV);\n        Serial.println(LIFTER_MOTOR.PositionRefCommanded());\n    
}\n    else if(vel_down)\n    {\n        LIFTER_MOTOR.MoveVelocity(-1*LIFTER_MAX_VEL*LIFTER_STEPS_PER_REV);\n        Serial.println(LIFTER_MOTOR.PositionRefCommanded());\n    }  \n    else \n    {\n        LIFTER_MOTOR.MoveStopAbrupt();\n    }\n}\n\nvoid test_homing_switch()\n{\n    bool switch_active = !digitalRead(HOMING_SWITCH_PIN);\n\n    if(switch_active) {\n      Serial.print(\"Homing switch: \");\n      Serial.println(switch_active);\n      LIFTER_MOTOR.PositionRefSet(0);\n    }\n}\n\nvoid test_servo()\n{\n    Serial.println(\"Servo open\");\n    analogWrite(LATCH_SERVO_PIN, LATCH_OPEN_DUTY_CYCLE);\n    delay(1000);\n    Serial.println(\"Servo closed\");\n    analogWrite(LATCH_SERVO_PIN, LATCH_CLOSE_DUTY_CYCLE);\n    delay(1000);\n}\n\nvoid loop()\n{\n    test_motor();\n    test_homing_switch();\n    //test_servo();\n    delay(100);\n}\n"
  },
  {
    "path": "experimental_testing/RobotArduinoTesting/new_dc_motor_tuning/Motor.cpp",
    "content": "#include \"Motor.h\"\n#include \"constants.h\"\n\nconstexpr double COUNTS_TO_RADS = 2.0 * PI / static_cast<double>(COUNTS_PER_OUTPUT_SHAFT_REV);\n\nMotor::Motor(int pwmPin, int dirPin, int encPinA, int encPinB, double Kp, double Ki, double Kd)\n: pwmPin_(pwmPin),\n  dirPin_(dirPin),\n  inputVel_(0.0),\n  currentVelRaw_(0.0),\n  currentVelFiltered_(0.0),\n  pidOut_(0.0),\n  outputCmd_(0.0),\n  prevCount_(0),\n  prevMicros_(micros()),\n  enc_(encPinA, encPinB),\n  controller_(&currentVelFiltered_, &pidOut_, &inputVel_, Kp, Ki, Kd, DIRECT),\n  velFilter_(LOWPASS, VEL_FILTER_FREQ)\n{\n  pinMode(pwmPin_, OUTPUT);\n  pinMode(dirPin_, OUTPUT);\n  controller_.SetMode(AUTOMATIC);\n  controller_.SetOutputLimits(-255, 255);\n  controller_.SetSampleTime(5);\n}\n\nvoid Motor::setCommand(double vel)\n{\n  inputVel_ = vel;  \n}\n\nfloat Motor::getCurrentVelocity()\n{\n  return static_cast<float>(currentVelFiltered_);\n}\n\nlong Motor::getCounts()\n{\n  return enc_.read();\n}\n\nvoid Motor::runLoop(bool pub)\n{\n\n  // Read current values\n  long curCount = -1*enc_.read();\n  unsigned long curMicros = micros();\n\n  // Compute delta\n  long deltaCount = curCount - prevCount_;\n  double deltaRads = static_cast<double>(deltaCount) * COUNTS_TO_RADS;\n  unsigned long deltaMicros = curMicros - prevMicros_;\n\n  if(deltaMicros == 0)\n  {\n    // This will break things due to div by 0, so just skip\n    return;\n  }\n\n  // Copy current values into previous\n  prevCount_ = curCount;\n  prevMicros_ = curMicros;\n\n  // Do a check for large time deltas. 
This likely means the controller hasn't run for a while and we should ignore this value\n  if(deltaMicros > 100000)\n  {\n    return;\n  }\n\n  // Compute current velocity in rads/second\n  currentVelRaw_ = 1000000.0 * deltaRads / static_cast<double>(deltaMicros);\n  currentVelFiltered_ = currentVelRaw_; // velFilter_.input(currentVelRaw_);\n\n  // Run PID controller\n  controller_.Compute();\n  \n  /* Use output from PID to update our current command. Since this is a velocity controller, when the error is 0\n  *  and the PID controller drives the output to 0, we actually want to maintain a certain PWM value. Hence, the \n  *  PID output is used to modify our control signal and not drive it directly.\n  */\n  outputCmd_ += int(pidOut_);\n  //outputCmd_ = static_cast<int>(pidOut_ + inputVel_ * 10);\n\n  // Set a deadband region based on input vel to avoid integral windup\n  if(fabs(inputVel_) < 0.001)\n  {\n    outputCmd_ = 0;\n  }\n\n  // Make sure we don't exceed max/min power\n  if(outputCmd_ > 255)\n  {\n    outputCmd_ = 255;\n  }\n  else if(outputCmd_ < -255)\n  {\n    outputCmd_ = -255;\n  }\n\n  // Update output direction\n  if(outputCmd_ < 0)\n  {\n    digitalWrite(dirPin_,1);\n  }\n  else\n  {\n    digitalWrite(dirPin_,0);\n  }\n  \n  // Actually write out the motor power\n  analogWrite(pwmPin_, abs(outputCmd_));\n\n  if (pub)\n  {\n    // Debugging prints\n    //Serial.print(deltaMicros);\n    //Serial.print(\" \");\n    //Serial.print(curCount);\n    //Serial.print(\" \");\n    Serial.print(\"currentVelRaw_:\");\n    Serial.print(currentVelRaw_, 5);\n    Serial.print(\" \");\n    Serial.print(\"currentVelFiltered_:\");\n    Serial.print(currentVelFiltered_, 5);\n    Serial.print(\" \");\n    Serial.print(\"inputVel_:\");\n    Serial.print(inputVel_, 5);\n    Serial.print(\" \");\n    //Serial.print(pidOut_, 5);\n    //Serial.print(\" \");\n    Serial.print(\"outputCmd_:\");\n    Serial.print(outputCmd_/10.0);\n    Serial.println(\"\");\n  }\n}\n"
  },
  {
    "path": "experimental_testing/RobotArduinoTesting/new_dc_motor_tuning/Motor.h",
    "content": "#ifndef Motor_h\n#define Motor_h\n\n#include <Encoder.h>\n#include <PID_v1.h>\n#include <Filters.h>\n\nclass Motor\n{\n  public:\n\n    /*\n     * Motor object constructor\n     * pwmPin - which pin the motor PWM is connected to\n     * dirPin - which pin the motor direction is connected to\n     * encPinA - which pin the encoder wire A is connected to. This should be a pin capable of handling interrupts\n     * encPinB - which pin the encoder wire B is connected to. This should ideally (but not necessarily) be a pin capable of handling interrupts\n     * Kp - Proportional gain\n     * Ki - Integral gain\n     * Kd - Derivative gain\n     */\n    Motor(int pwmPin, int dirPin, int encPinA, int encPinB, double Kp, double Ki, double Kd);\n    \n    /*\n     * Set the desired velocity in rad/second\n     */\n    void setCommand(double vel);\n    \n    /*\n     * Run the actual controller. Make sure this gets called reasonably quickly (i.e. every 20 ms or so)\n     */\n    void runLoop(bool pub);\n\n    /*\n    * Get the current measured motor velocity in rad/s\n    */\n    float getCurrentVelocity();\n\n    // Get number of counts from encoder - for debugging\n    long getCounts();\n\n  private:\n\n    // Pins\n    int pwmPin_;\n    int dirPin_;   \n\n    // Values for computation\n    double inputVel_;              // Desired velocity in rad/sec\n    double currentVelRaw_;         // Raw current velocity in rad/sec\n    double currentVelFiltered_;    // Filtered velocity in rad/sec\n    double pidOut_;                // Output from PID controller\n    int outputCmd_;                // Output command in [-255, 255]\n    long prevCount_;               // Encoder count from previous loop\n\n    // Timer\n    unsigned long prevMicros_;   // Timer from previous loop\n\n    // Other objects\n    Encoder enc_;                  // Encoder object\n    PID controller_;               // PID controller\n    FilterOnePole velFilter_;      // Velocity lowpass filter\n};\n\n#endif // Motor_h\n"
  },
  {
    "path": "experimental_testing/RobotArduinoTesting/new_dc_motor_tuning/constants.h",
    "content": "#ifndef Constants_h\n#define Constants_h\n\n// Pins\n#define PIN_PWM_0 3   \n#define PIN_PWM_1 4\n#define PIN_PWM_2 5\n#define PIN_PWM_3 6\n\n#define PIN_DIR_0 41\n#define PIN_DIR_1 40\n#define PIN_DIR_2 39\n#define PIN_DIR_3 38\n\n#define PIN_ENABLE_0 49\n#define PIN_ENABLE_1 48\n#define PIN_ENABLE_2 47\n#define PIN_ENABLE_3 46\n\n#define PIN_ENC_A_0 21\n#define PIN_ENC_B_0 25\n#define PIN_ENC_A_1 20\n#define PIN_ENC_B_1 23\n#define PIN_ENC_A_2 19\n#define PIN_ENC_B_2 24\n#define PIN_ENC_A_3 18\n#define PIN_ENC_B_3 22\n\n#define PIN_LATCH_SERVO_PIN 12\n#define PIN_TRAY_STEPPER_LEFT_PULSE 46\n#define PIN_TRAY_STEPPER_LEFT_DIR 47\n#define PIN_TRAY_STEPPER_RIGHT_PULSE 44\n#define PIN_TRAY_STEPPER_RIGHT_DIR 43\n#define PIN_TRAY_HOME_SWITCH 53\n\n// Velocity limits\n#define MAX_TRANS_SPEED_FINE   0.08  // m/s\n#define MAX_ROT_SPEED_FINE     0.5   // rad/s\n#define MAX_TRANS_SPEED_COARSE 0.5   // m/s\n#define MAX_ROT_SPEED_COARSE   1.0   // rad/s\n\n// Acceleration limits\n#define MAX_TRANS_ACC_FINE   0.1    // m/s^2\n#define MAX_ROT_ACC_FINE     0.5    // rad/s^2\n#define MAX_TRANS_ACC_COARSE 0.5    // m/s^2\n#define MAX_ROT_ACC_COARSE   1.0    // rad/s^2\n\n// Cartesian control gains\n#define CART_TRANS_KP 2\n#define CART_TRANS_KI 0.1\n#define CART_TRANS_KD 0\n#define CART_ROT_KP 3\n#define CART_ROT_KI 0.5\n#define CART_ROT_KD 0\n\n// Motor control gains\n#define MOTOR_KP 1\n#define MOTOR_KI 0\n#define MOTOR_KD 0\n\n// Motor control constants\n#define VEL_FILTER_FREQ 20            // HZ\n#define COUNTS_PER_OUTPUT_SHAFT_REV 36000 // Manually measured/estimated\n\n// Physical dimensions\n#define WHEEL_DIAMETER 0.152 // meters\n#define WHEEL_DIST_FROM_CENTER 0.4572 // meters\n\n// Scaling factors\n#define TRAJ_MAX_FRACTION 0.7  // Only generate a trajectory to this fraction of max speed to give motors headroom to compensate\n\n// Kalman filter scales\n#define PROCESS_NOISE_SCALE 0.08\n#define MEAS_NOISE_SCALE 0.01\n#define MEAS_NOISE_VEL_SCALE_FACTOR 10000\n\n// Position accuracy targets\n#define TRANS_POS_ERR_COARSE 0.10 // m\n#define ANG_POS_ERR_COARSE   0.08 // rad\n#define TRANS_VEL_ERR_COARSE 0.05 // m/s\n#define ANG_VEL_ERR_COARSE   0.05 // rad/s\n#define TRANS_POS_ERR_FINE   0.01 // m\n#define ANG_POS_ERR_FINE     0.02 // rad\n#define TRANS_VEL_ERR_FINE   0.01 // m/s\n#define ANG_VEL_ERR_FINE     0.01 // rad/s\n\n// Set debug printing, comment out to skip debug printing\n#define PRINT_DEBUG true\n\n// Tray stepper control values\n#define TRAY_STEPPER_STEPS_PER_REV 200  // Steps per rev for tray servos\n#define TRAY_DIST_PER_REV 1.8           // mm of linear travel per stepper revolution\n#define TRAY_MAX_LINEAR_TRAVEL 300      // mm of total linear travel possible\n// Note: all tray positions measured in mm from home pos\n#define TRAY_DEFAULT_POS_MM 100         // Default position for driving in mm\n#define TRAY_LOAD_POS_MM 50             // Loading position in mm\n#define TRAY_PLACE_POS_MM 250           // Placing position in mm\n#define TRAY_MAX_SPEED 1000             // Max tray speed in steps/sec\n#define TRAY_MAX_ACCEL 2000             // Max tray acceleration in steps/sec/sec\n\n// Tray servo control values\n#define LATCH_CLOSE_POS 20              // Latch servo position for close in degrees\n#define LATCH_OPEN_POS 150              // Latch servo position for open in degrees\n#define TRAY_SERVO_MIN_PW 1000          // Min pulse width in microseconds corresponding to 0 position\n#define TRAY_SERVO_MAX_PW 2000          // Max pulse width in microseconds corresponding to 180 position\n\n#define TRAY_PLACEMENT_PAUSE_TIME 3000  // How many ms to wait after opening the latch for placement\n\nenum COMMAND\n{\n    NONE,\n    MOVE,\n    MOVE_REL,\n    MOVE_FINE,\n    PLACE_TRAY,\n    LOAD_TRAY,\n    INITIALIZE_TRAY,\n    POSITION,\n    ESTOP,\n    LOAD_COMPLETE,\n};\n\n#endif\n"
  },
  {
    "path": "experimental_testing/RobotArduinoTesting/new_dc_motor_tuning/new_dc_motor_tuning.ino",
    "content": "#include \"constants.h\"\n#include \"Motor.h\"\n\ndouble Kp = 14;\ndouble Ki = 0.1;\ndouble Kd = 0.08;\nMotor m = Motor(PIN_PWM_2, PIN_DIR_2, PIN_ENC_A_2, PIN_ENC_B_2, Kp, Ki, Kd);\n\nint step = 0;\nunsigned long prevStepTime = millis();\nunsigned long prevLoopTime = millis();\ndouble cur_speed = 0;\n\ndouble max_vel = 2;\ndouble max_acc = 5;\ndouble max_acc_time = 1000 * max_vel / max_acc;\n\ntemplate <typename T> int sgn(T val) {\n    return (T(0) < val) - (val < T(0));\n}\n\nvoid setup()\n{\n    pinMode(PIN_ENABLE_3, OUTPUT);\n    digitalWrite(PIN_ENABLE_3, HIGH);\n    Serial.begin(115200);\n    //Serial.println(\"Start motor tuning\");\n}\n\nvoid computeAcc(double dt, double acc)\n{\n  cur_speed += dt/1000.0 * acc;\n  if (fabs(cur_speed) > max_vel)\n  {\n    cur_speed = sgn(cur_speed) * max_vel;\n  }\n}\n\nvoid runAccStep(int waitTime, double acc)\n{\n    unsigned long curTime = millis();\n    unsigned long stepTime = curTime - prevStepTime;\n    if (stepTime > waitTime)\n    {\n      step += 1;\n      prevStepTime = millis();\n      prevLoopTime = millis();\n    }\n    else\n    {\n        unsigned long loopTime = curTime - prevLoopTime;\n        computeAcc(loopTime, acc);\n        prevLoopTime = millis();\n    }\n}\n\nvoid runCVStep(int waitTime, double vel)\n{\n    unsigned long curTime = millis();\n    unsigned long stepTime = curTime - prevStepTime;\n    if (stepTime > waitTime)\n    {\n      step += 1;\n      prevStepTime = millis();\n      prevLoopTime = millis();\n    }\n    else\n    {\n       cur_speed = vel;\n    }\n  \n}\n\nvoid runSteps()\n{\n    if (step == 0)\n    {   \n      int stepTime = max_acc_time;\n      double stepAcc = max_acc;\n      runAccStep(stepTime, stepAcc);\n    }\n    else if (step == 1)\n    {\n      int stepTime = 2000;\n      runCVStep(stepTime, max_vel);\n    }\n    else if (step == 2)\n    {\n      int stepTime = max_acc_time;\n      double stepAcc = -1*max_acc;\n      runAccStep(stepTime, stepAcc);\n
   }\n    else if (step == 3)\n    {\n      int stepTime = 2000;\n      runCVStep(stepTime, 0);\n    }\n    else if (step == 4)\n    {   \n      int stepTime = max_acc_time;\n      double stepAcc = -1 * max_acc;\n      runAccStep(stepTime, stepAcc);\n    }\n    else if (step == 5)\n    {\n      int stepTime = 2000;\n      runCVStep(stepTime, -1*max_vel);\n    }\n    else if (step == 6)\n    {\n      int stepTime = max_acc_time;\n      double stepAcc = max_acc;\n      runAccStep(stepTime, stepAcc);\n    }\n    else if (step == 7)\n    {\n      int stepTime = 2000;\n      runCVStep(stepTime, 0);\n    }\n    else if (step >= 8)\n    {\n      step = 0;\n    }\n}\n\n\nvoid loop()\n{\n  runSteps();\n  m.setCommand(cur_speed);\n\n  bool motor_pub = true;\n  if(step == 3 || step == 7) {motor_pub  = false;}\n  m.runLoop(motor_pub);\n//  analogWrite(PIN_PWM_2, s);\n//  Serial.println(s);\n//  s += 1;\n delay(15);\n}\n"
  },
  {
    "path": "experimental_testing/RobotArduinoTesting/single_motor_speed_control/Motor.cpp",
    "content": "#include \"Motor.h\"\n#include <math.h>\n\nMotor::Motor(int pwmPin, int dirPin, int encPinA, int encPinB, double Kp, double Ki, double Kd, float velFilterFreq)\n: pwmPin_(pwmPin),\n  dirPin_(dirPin),\n  inputVel_(0.0),\n  currentVelRaw_(0.0),\n  currentVelFiltered_(0.0),\n  outputCmd_(0.0),\n  prevCount_(0),\n  prevMillis_(millis()),\n  enc_(encPinA, encPinB),\n  controller_(&currentVelFiltered_, &outputCmd_, &inputVel_, Kp, Ki, Kd, DIRECT),\n  velFilter_(LOWPASS, velFilterFreq)\n{\n  pinMode(pwmPin_, OUTPUT);\n  pinMode(dirPin_, OUTPUT);\n  controller_.SetMode(AUTOMATIC);\n  controller_.SetOutputLimits(-255, 255);\n  controller_.SetSampleTime(5);\n}\n\nvoid Motor::setCommand(double vel)\n{\n  inputVel_ = vel;  \n}\n\nvoid Motor::runLoop()\n{\n\n  // Read current values\n  long curCount = enc_.read();\n  unsigned long curMillis = millis();\n\n  // Compute delta\n  long deltaCount = curCount - prevCount_;\n  double deltaRevs = static_cast<double>(deltaCount) / COUNTS_PER_SHAFT_REV;\n  unsigned long deltaMillis = curMillis - prevMillis_;\n\n  // Copy current values into previous\n  prevCount_ = curCount;\n  prevMillis_ = curMillis;\n\n  // Do a check for large time deltas. 
This likely means the controller hasn't run for a while and we should ignore this value.\n  // Also skip zero deltas so we never divide by zero when computing velocity\n  if(deltaMillis > 100 || deltaMillis == 0)\n  {\n    return;\n  }\n\n  // Compute current velocity in revs/second\n  currentVelRaw_ = 1000.0 * deltaRevs / static_cast<double>(deltaMillis);\n  currentVelFiltered_ = velFilter_.input(currentVelRaw_);\n\n  // Run PID controller\n  controller_.Compute();\n\n  // Update output values\n  if(outputCmd_ < 0)\n  {\n    digitalWrite(dirPin_,0);\n  }\n  else\n  {\n    digitalWrite(dirPin_,1);\n  }\n  analogWrite(pwmPin_, abs(outputCmd_));\n\n//  Serial.print(curCount);\n//  Serial.print(\" \");\n//  Serial.print(currentVelRaw_);\n//  Serial.print(\" \");\n  Serial.print(currentVelFiltered_);\n  Serial.print(\" \");\n  Serial.print(inputVel_);\n  Serial.print(\" \");\n  Serial.print(outputCmd_/255.0);\n//  Serial.print(\" \");\n//  Serial.print(deltaMillis);\n  Serial.println(\"\");\n  \n}\n"
  },
  {
    "path": "experimental_testing/RobotArduinoTesting/single_motor_speed_control/Motor.h",
    "content": "/* Requires libraries:\n *  Filters: https://github.com/JonHub/Filters\n *  PID: https://playground.arduino.cc/Code/PIDLibrary/\n *  Encoder: https://www.pjrc.com/teensy/td_libs_Encoder.html\n */\n\n#ifndef Motor_h\n#define Motor_h\n\n#include <Encoder.h>\n#include <PID_v1.h>\n#include <Filters.h>\n\nconst unsigned int COUNTS_PER_MOTOR_REV = 44; \nconst unsigned int MOTOR_GEAR_RATIO = 40;\nconst double FUDGE_FACTOR = 1.5;\nconst double COUNTS_PER_SHAFT_REV = COUNTS_PER_MOTOR_REV * MOTOR_GEAR_RATIO * FUDGE_FACTOR;\n\nclass Motor\n{\n  public:\n\n    /*\n     * Motor object constructor\n     * pwmPin - which pin the motor PWM is connected to\n     * dirPin - which pin the motor direction is connected to\n     * encPinA - which pin the encoder wire A is connected to. This should be a pin capable of handling interrupts\n     * encPinB - which pin the encoder wire B is connected to. This should ideally (but not required) be a pin capable of handling interrupts\n     * Kp - Proportional gain\n     * Ki - Integral gain\n     * Kd - Derrivative gain\n     * velFilterFreq - frequency to use for velocity lowpass filter\n     */\n    Motor(int pwmPin, int dirPin, int encPinA, int encPinB, double Kp, double Ki, double Kd, float velFilterFreq);\n    \n    /*\n     * Set the desired velocity in revs/second\n     */\n    void setCommand(double vel);\n    \n    /*\n     * Run the actual controller. Make sure this gets called reasonably quickly (i.e. 
every 20 ms or so)\n     */\n    void runLoop();\n\n  private:\n\n    // Pins\n    int pwmPin_;\n    int dirPin_;   \n\n    // Values for computation\n    double inputVel_;              // Desired velocity in revs/sec\n    double currentVelRaw_;            // Raw current velocity in revs/sec\n    double currentVelFiltered_;\n    double outputCmd_;             // Output command in [-255, 255]\n    long prevCount_;               // Encoder count from previous loop\n\n    // Timer\n    unsigned long prevMillis_;   // Timer from previous loop\n\n    // Other objects\n    Encoder enc_;                  // Encoder object\n    PID controller_;               // PID controller\n    FilterOnePole velFilter_;      // Velocity lowpass filter\n};\n\n#endif Motor_h\n"
  },
  {
    "path": "experimental_testing/RobotArduinoTesting/single_motor_speed_control/single_motor_speed_control.ino",
    "content": "/* Requires libraries:\n *  Filters: https://github.com/JonHub/Filters\n *  PID: https://playground.arduino.cc/Code/PIDLibrary/\n *  Encoder: https://www.pjrc.com/teensy/td_libs_Encoder.html\n */\n\n#include \"Motor.h\"\n\n// Pinouts\n#define PIN_ENABLE 52\n\n#define PIN_BL_PWM 6\n#define PIN_BL_DIR 48\n\n\n#define PIN_BL_ENC_A 19\n#define PIN_BL_ENC_B 29\n\n#define VEL_FILTER_FREQ 10 // HZ\n\nconst double Kp = 200;\nconst double Ki = 2000;\nconst double Kd = 0;\n\nMotor myMotor = Motor(PIN_BL_PWM, PIN_BL_DIR, PIN_BL_ENC_A, PIN_BL_ENC_B, Kp, Ki, Kd, VEL_FILTER_FREQ);\n\n\nvoid setup() \n{\n  // Setup pin modes\n  pinMode(PIN_ENABLE,OUTPUT);\n  \n  Serial.begin(250000);\n}\n\nvoid setMotorCommands(float rps)\n{\n  int curDirection = 1;\n  float curSpeed = curDirection * rps;\n//  Serial.print(\"Direction \");\n//  Serial.print(curDirection);\n//  Serial.print(\", Speed \");\n//  Serial.println(curSpeed);\n  \n  myMotor.setCommand(curSpeed);\n  digitalWrite(PIN_ENABLE, 1);\n}\n\nvoid runMotors()\n{\n  unsigned long prevTime = millis();\n  unsigned long startTime = prevTime;\n  int targetDelta = 15;\n  int targetTotalTime = 3000;\n\n  while (millis() - startTime < targetTotalTime)\n  {\n    if(millis() - prevTime > targetDelta)\n    {\n      prevTime = millis();\n      myMotor.runLoop();\n    }\n  }\n}\n\nvoid loop() \n{\n\n//  // Get revolutions per second\n//  Serial.println(\"Please enter target revolutions per second\");\n//  bool gotInput = false;\n//  float rps = 0.0;\n//  while(!gotInput)\n//  {\n//    if(Serial.available() > 0)\n//    {\n//      rps = Serial.parseFloat();\n//      if(rps == 0)\n//      {\n//        continue;\n//      }\n//      else\n//      {\n//        break;\n//      }\n//    }\n//    delay(10);\n//  }\n\n  delay(1000);\n  float rps = 0.5;\n\n  setMotorCommands(rps);\n\n  runMotors();\n\n  // Stop motors\n  digitalWrite(PIN_ENABLE, 0);\n\n  \n\n  \n}\n"
  },
  {
    "path": "experimental_testing/RobotArduinoTesting/stepper_nano_test/stepper_nano_test.ino",
    "content": "// Uses stepper driver library from here: https://www.airspayce.com/mikem/arduino/AccelStepper/classAccelStepper.html\n#include <AccelStepper.h>\n\n#define STEP_PIN 2\n#define DIR_PIN 1   // Note that the DIR pin is actually controlled by the mster arduino so this won't be connected to anything\n\n#define MILLIS_BETWEEN_PRINTING 100\nconst int max_vel = 800; //steps/second\n\nAccelStepper motor;\nunsigned long count = 0;\nunsigned long prevMillisPrint;\n\nvoid setup() \n{\n    Serial.begin(115200);\n    Serial.println(\"Starting...\");\n    motor = AccelStepper(AccelStepper::DRIVER, STEP_PIN, DIR_PIN);\n    motor.setMaxSpeed(max_vel);\n    motor.setAcceleration(max_vel);\n    motor.setMinPulseWidth(10); // Min from docs is 2.5 microseconds\n    prevMillisPrint = millis();\n\n    motor.move(800);\n}\n\nvoid loop()\n{\n\n    if(motor.isRunning() && millis() - prevMillisPrint > MILLIS_BETWEEN_PRINTING)\n    {       \n       Serial.print(motor.speed());\n       Serial.print(\", \");\n       Serial.println(motor.currentPosition());\n       prevMillisPrint = millis();\n       count = 0;      \n    }\n    \n    motor.run();\n    count++;\n    \n}\n"
  },
  {
    "path": "experimental_testing/RobotArduinoTesting/test_encoders/test_encoders.ino",
    "content": "// Pins\n#define PIN_ENCA_1 21\n#define PIN_ENCA_2 20\n#define PIN_ENCA_3 19\n#define PIN_ENCA_4 18\n\n#define PIN_ENCB_1 25\n#define PIN_ENCB_2 23\n#define PIN_ENCB_3 24\n#define PIN_ENCB_4 22\n\n#include <Encoder.h>\n\nEncoder e[4] = { Encoder(PIN_ENCA_1, PIN_ENCB_1),\n                 Encoder(PIN_ENCA_2, PIN_ENCB_2), \n                 Encoder(PIN_ENCA_3, PIN_ENCB_3), \n                 Encoder(PIN_ENCA_4, PIN_ENCB_4) } ;\n\nvoid setup() {\n  // put your setup code here, to run once:\n  Serial.begin(115200);\n  Serial.println(\"Encoder Test:\");\n}\n\nvoid loop() {\n  \n  Serial.print(\"Pos: [\");\n  for (int i = 0; i < 4; ++i )\n  {\n    Serial.print(e[i].read());\n    Serial.print(\", \");\n  }\n  Serial.println(\"]\");\n\n\n  // if a character is sent from the serial monitor,\n  // reset both back to zero.\n  if (Serial.available())\n  {\n    Serial.read();\n    Serial.println(\"Reset position to zero\");\n    e[0].write(0);\n    e[1].write(0);\n    e[2].write(0);\n    e[3].write(0);\n  }\n\n  delay(100);\n\n\n}\n"
  },
  {
    "path": "experimental_testing/TrajGen.py",
    "content": "# Testing out trajectory generation where I have a gui\n\nimport math\nimport numpy as np\nfrom matplotlib import pyplot as plt\n\n# Parameters\nMAX_VEL = 3.0  # m/s\nMAX_ACC = 2.0    # m/s^2\n\n# Find min time and position change to reach full speed\nT_ACC = MAX_VEL / MAX_ACC\nprint(\"T_ACC: {}\".format(T_ACC))\nP_ACC = 0.5 * MAX_ACC * T_ACC * T_ACC\nprint(\"P_ACC: {}\".format(P_ACC))\n\n\ndef sign(n):\n    if n > 0:\n        return 1\n    elif n < 0:\n        return -1\n    else:\n        return 0\n\n\ndef gen_traj2(p1, p2, dt):\n\n    position_change = abs(p2 - p1)\n\n    # Determine if position change is less than required for steady state\n    if position_change < 2*P_ACC:\n        return gen_triangle_traj(p1, p2, dt)\n    else:\n        return gen_trap_traj(p1, p2, dt)\n\ndef gen_triangle_traj(p1, p2, dt):\n    turn_pt = (p2 - p1)/2\n    dir = sign(p2 - p1)\n    traj = {'time': [0], 'pos': [p1], 'vel': [0], 'acc': [dir * MAX_ACC]}\n    gen_const_acc_until_pos(traj, dt, dir*MAX_ACC, turn_pt, dir)\n    gen_const_acc_until_pos(traj, dt, -1*dir*MAX_ACC, p2, dir)\n    return traj\n\ndef gen_trap_traj(p1, p2, dt):\n    dir = sign(p2 - p1)\n    max_vel_pt = dir*P_ACC\n    slow_down_pt = p2 - dir * P_ACC\n    traj = {'time': [0], 'pos': [p1], 'vel': [0], 'acc': [dir * MAX_ACC]}\n    gen_const_acc_until_pos(traj, dt, dir*MAX_ACC, max_vel_pt, dir)\n    gen_const_acc_until_pos(traj, dt, 0, slow_down_pt, dir)\n    gen_const_acc_until_pos(traj, dt, -1*dir*MAX_ACC, p2, dir)\n    return traj\n\n\ndef gen_const_acc_until_pos(traj, dt, acc, stop_pos, stop_dir):\n    \"\"\"\n    Generate constant acceleration trajectory\n    \"\"\"\n    n_iter = 0\n    while True:\n        new_vel = traj['vel'][-1] + acc * dt\n        new_pos = traj['pos'][-1] + traj['vel'][-1] * dt\n        new_time = traj['time'][-1] + dt\n\n        if stop_dir > 0 and new_vel < 0:\n            break\n        if stop_dir < 0 and new_vel > 0:\n            break\n\n        if stop_dir > 0 and 
new_pos >= stop_pos:\n            break\n        elif stop_pos < 0 and new_pos <= stop_pos:\n            break\n        elif n_iter > 100000:\n            raise StopIteration(\"Too many iterations\")\n\n        traj['time'].append(new_time)\n        traj['pos'].append(new_pos)\n        traj['vel'].append(new_vel)\n        traj['acc'].append(acc)\n        n_iter += 1\n\n    return traj\n\n\ndef plot_traj(pvt_data):\n    \"\"\" \n    Makes a plot of position, velocity, and acceleration versus time\n    \"\"\"\n    ax = plt.axes()\n    ax.plot(pvt_data[\"time\"], pvt_data[\"pos\"],'r',label='Position')\n    ax.plot(pvt_data[\"time\"], pvt_data[\"vel\"],'g',label='Velocity')\n    ax.plot(pvt_data[\"time\"], pvt_data[\"acc\"],'b',label='Acceleration')\n    ax.legend()\n    plt.show()\n\n\n\nif __name__ == '__main__':\n    p1 = 0\n    p2 = 20\n    dt = 0.05\n    pvt = gen_traj2(p1, p2, dt)\n    plot_traj(pvt)"
  },
  {
    "path": "experimental_testing/TrajGenv2.py",
    "content": "import math\nimport numpy as np\nfrom matplotlib import pyplot as plt\n\np_target = 20.0 # m\nv_max = 1.0  #m/s\na_max = 2.0  #m/s^2\nj_max = 8.0 #m/s^3\nalpha = 0.8 # velocity decay\nbeta = 0.8 # acceleation decay\nloop_limit = 10 # max loops for convergence\n\nplot_timestep = 0.01\n\n\ndef generate(target_position):\n\n    v_lim = v_max\n    a_lim = a_max\n    j_lim = j_max\n\n    output = {}\n    output['done'] = False\n\n    for i in range(loop_limit):\n        print(\"Starting loop {}\".format(i))\n        output = generate_once(target_position, v_lim, a_lim, j_lim)\n\n        if output['done']:\n            break\n        else:\n            v_lim = output['v_lim']\n            a_lim = output['a_lim']\n            j_lim = output['j_lim']\n            \n\n    if output['done']:\n        print(\"Trajectory found\")\n        return output\n    else:\n        print(\"Trajectory not found\")\n        return {}\n\n    \n\n\ndef generate_once(p, v_lim, a_lim, j_lim):\n\n    output = {}\n    output['done'] = False\n    output['v_lim'] = v_lim\n    output['a_lim'] = a_lim\n    output['j_lim'] = j_lim\n    output['t'] = []\n\n    # Constant jerk regions\n    dt_j = a_lim / j_lim\n    print(\"dt_j = {}\".format(dt_j))\n    dv_j = 0.5 * j_lim * dt_j ** 2  # Positive jerk region\n    dp_j1 = 1/6.0 * j_lim * dt_j ** 3 # Positive jerk region\n    dp_j2 = (v_lim - dv_j) * dt_j + 0.5 * a_lim * dt_j ** 2 - 1/6 * j_lim * dt_j ** 3  # Negative jerk region\n    print(\"dv_j: {}, dp_j1: {}, dp_j2: {}\".format(dv_j, dp_j1, dp_j2))\n\n    # Constant accel region\n    dt_a = (v_lim - 2*dv_j ) / a_lim\n    print(\"dt_a = {}\".format(dt_a))\n    if dt_a <= 0:\n        output['a_lim'] = a_lim * beta\n        print(\"New a_lim = {}\".format(output['a_lim'] ))\n        return output\n    dp_a = dv_j * dt_a + 0.5 * a_lim * dt_a ** 2\n    print(\"dp_a: {}\".format(dp_a))\n\n    # Constant vel region\n    dt_v = (p - 2 * dp_j1 - 2 * dp_j2 - 2 * dp_a) / v_lim\n    print(\"dt_v 
= {}\".format(dt_v))\n    if dt_v <= 0:\n        output['v_lim'] = alpha * v_lim\n        print(\"New v_lim = {}\".format(output['v_lim'] ))\n        return output\n    dp_v = v_lim * dt_v\n    print(\"dp_v: {}\".format(dp_v))\n\n    # If we get here, the genertation should be correct, so compute time regions and return\n    output['done'] = True\n    t = [0,0,0,0,0,0,0,0]\n    t[0] = 0\n    t[1] = t[0] + dt_j   \n    t[2] = t[1] + dt_a\n    t[3] = t[2] + dt_j                \n    t[4] = t[3] + dt_v\n    t[5] = t[4] + dt_j\n    t[6] = t[5] + dt_a\n    t[7] = t[6] + dt_j\n    output['t'] = t\n\n    print(\"Times: {}\".format(t))\n    \n    return output\n\ndef generate_inverse(p, dt_j, dt_a, dt_v):\n    output = {}\n    output['done'] = True\n    t = [0,0,0,0,0,0,0,0]\n    t[0] = 0\n    t[1] = t[0] + dt_j   \n    t[2] = t[1] + dt_a\n    t[3] = t[2] + dt_j                \n    t[4] = t[3] + dt_v\n    t[5] = t[4] + dt_j\n    t[6] = t[5] + dt_a\n    t[7] = t[6] + dt_j\n    output['t'] = t\n\n    # Solve system of equations to get kinematic limits\n    A = np.array([\n        [dt_j,                                          -1,         0    ],\n        [dt_j ** 2,                                     dt_a,      -1    ],\n        [(dt_j ** 2) * (dt_a - dt_j),     dt_j ** 2 + dt_a ** 2,     dt_v + 2* dt_j ]])\n    b = np.array([0, 0, p])\n    lims = np.linalg.solve(A, b)\n\n    output['v_lim'] = lims[2]\n    output['a_lim'] = lims[1]\n    output['j_lim'] = lims[0]\n\n    return output\n\ndef generate_profile_from_params(output, timestep):\n\n    j_lim = output['j_lim']\n    a_lim = output['a_lim']\n    v_lim = output['v_lim']\n\n    T = output['t']\n    t_vals = np.arange(0, T[7], timestep)\n    p = [0]\n    v = [0]\n    a = [0]\n    j = [j_lim]\n\n    for t in t_vals[1:]:\n        if t >= T[0] and t < T[1]:\n            p.append(p[-1] + v[-1] * timestep + 0.5 * a[-1] * timestep **2 + 1/6.0 * j[-1] * timestep ** 3)\n            v.append(v[-1] + a[-1] * timestep + 0.5* j[-1] 
* timestep ** 2)\n            a.append(a[-1] + j[-1] * timestep)\n            j.append(j_lim)        \n        elif t >= T[1] and t < T[2]:\n            p.append(p[-1] + v[-1] * timestep + 0.5 * a[-1] * timestep **2 + 1/6.0 * j[-1] * timestep ** 3)\n            v.append(v[-1] + a[-1] * timestep + 0.5* j[-1] * timestep ** 2)\n            a.append(a_lim)\n            j.append(0)\n        elif t >= T[2] and t < T[3]:\n            p.append(p[-1] + v[-1] * timestep + 0.5 * a[-1] * timestep **2 + 1/6.0 * j[-1] * timestep ** 3)\n            v.append(v[-1] + a[-1] * timestep + 0.5* j[-1] * timestep ** 2)\n            a.append(a[-1] + j[-1] * timestep)\n            j.append(-j_lim)\n        elif t >= T[3] and t < T[4]:\n            p.append(p[-1] + v[-1] * timestep + 0.5 * a[-1] * timestep **2 + 1/6.0 * j[-1] * timestep ** 3)\n            v.append(v_lim)\n            a.append(0)\n            j.append(0)\n        elif t >= T[4] and t < T[5]:\n            p.append(p[-1] + v[-1] * timestep + 0.5 * a[-1] * timestep **2 + 1/6.0 * j[-1] * timestep ** 3)\n            v.append(v[-1] + a[-1] * timestep + 0.5* j[-1] * timestep ** 2)\n            a.append(a[-1] + j[-1] * timestep)\n            j.append(-j_lim)\n        elif t >= T[5] and t < T[6]:\n            p.append(p[-1] + v[-1] * timestep + 0.5 * a[-1] * timestep **2 + 1/6.0 * j[-1] * timestep ** 3)\n            v.append(v[-1] + a[-1] * timestep + 0.5* j[-1] * timestep ** 2)\n            a.append(-a_lim)\n            j.append(0)\n        elif t >= T[6] and t < T[7]:\n            p.append(p[-1] + v[-1] * timestep + 0.5 * a[-1] * timestep **2 + 1/6.0 * j[-1] * timestep ** 3)\n            v.append(v[-1] + a[-1] * timestep + 0.5* j[-1] * timestep ** 2)\n            a.append(a[-1] + j[-1] * timestep)\n            j.append(j_lim)\n   \n    return (t_vals, p, v, a, j)\n\n\ndef plot_data(t,p,v,a,j, output):\n    ax = plt.axes()\n    ax.plot(t, p,'r',label='Position', linewidth=4)\n    ax.plot(t, v,'g',label='Velocity', linewidth=4)\n    
ax.plot(t, a,'b',label='Acceleration', linewidth=4)\n    ax.plot(t, j,'k',label='Jerk', linewidth=4)\n    ax.legend()\n    ax.set_xlabel('Time (s)')\n    ax.set_ylabel(\"Value (m, m/s, m/s^2, m/s^3)\")\n    ax.set_title(\"Trajectory Generation\\nDist: {} m, Vel limit: {} m/s, Acc limit: {} m/s^2, Jerk limit: {} m/s^3\".format(p_target, v_max, a_max, j_max))\n\n    # Plot vertical lines\n    # ax.set_ylabel(\"Velocity (m/s)\")\n    # ax.set_ylim((-0.5, 2))\n    # t_vert = output['t']\n    # y_vert = np.linspace(-10.0, 10.0, 100)\n    # for new_t in t_vert:\n    #     ax.plot([new_t]*100, y_vert, 'k--', linewidth=3)\n    # ax.grid(True)\n    # ax.text(t_vert[0] + (t_vert[1] - t_vert[0])/6.0, -0.2, \"Constant Jerk\", fontsize='x-large')\n    # ax.text(t_vert[1] + (t_vert[2] - t_vert[1])/6.0, -0.1, \"Constant Accel\", fontsize='x-large')\n    # ax.text(t_vert[2] + (t_vert[3] - t_vert[2])/6.0, -0.2, \"Constant Jerk\", fontsize='x-large')\n    # ax.text(t_vert[3] + (t_vert[4] - t_vert[3])/3.0, 0.0, \"Constant Velocity\", fontsize='x-large')\n    # ax.text(t_vert[4] + (t_vert[5] - t_vert[4])/6.0, -0.2, \"Constant Jerk\", fontsize='x-large')\n    # ax.text(t_vert[5] + (t_vert[6] - t_vert[5])/6.0, -0.1, \"Constant Accel\", fontsize='x-large')\n    # ax.text(t_vert[6] + (t_vert[7] - t_vert[6])/6.0, -0.2, \"Constant Jerk\", fontsize='x-large')\n\n    plt.show()\n\n\ndef plot_trapazoidal_profile():\n\n    n_divisions = 91\n    a_steps = int(n_divisions/3)\n    t = np.linspace(0, 5, n_divisions)\n    a = [1]*a_steps + [0]*a_steps + [-1]*a_steps + [0]\n    v = [0]\n    p = [0]\n    for i in range(1, n_divisions):\n        dt = t[i] - t[i-1]\n        new_v = v[i-1] + a[i-1] * dt\n        new_p = p[i-1] + v[i-1] * dt + 0.5 * a[i-1] * dt * dt\n        v.append(new_v)\n        p.append(new_p)\n\n    ax = plt.axes()\n    # ax.plot(t, p,'r',label='Position')\n    ax.plot(t, v,'g',label='Velocity',linewidth=5)\n    # ax.plot(t, a,'b',label='Acceleration')\n    ax.set_xlabel('Time 
(s)')\n    ax.set_ylabel(\"Velocity (m/s)\")\n    ax.legend()\n\n    # Plot vertical lines\n    t_vert = [t[0], t[a_steps], t[2*a_steps], t[3*a_steps]]\n    y_vert = np.linspace(-10.0, 10.0, 100)\n    for new_t in t_vert:\n        ax.plot([new_t]*100, y_vert, 'k--', linewidth=3)\n    ax.set_ylim((-0.5, 3))\n    ax.grid(True)\n    ax.text(t_vert[0] + (t_vert[1] - t_vert[0])/3.0, -0.2, \"Constant Accel\", fontsize='x-large')\n    ax.text(t_vert[1] + (t_vert[2] - t_vert[1])/3.0, -0.1, \"Constant Velocity\", fontsize='x-large')\n    ax.text(t_vert[2] + (t_vert[3] - t_vert[2])/3.0, -0.2, \"Constant Accel\", fontsize='x-large')\n\n\n\n    plt.show()\n\n    \n\nif __name__ == '__main__':\n\n    output = generate(p_target)\n    if output:\n\n        # Test inverse\n        dt_j = output['t'][1]\n        dt_a = output['t'][2] - output['t'][1]\n        dt_v = output['t'][4] - output['t'][3]\n        output2 = generate_inverse(p_target, dt_j, dt_a, dt_v)\n\n        eps = 0.01\n        all_valid = True\n        if output2:\n            if abs(output2['v_lim'] - output['v_lim']) / output['v_lim'] > eps:\n                print(\"v_lim not close. Expected: {}, Actual: {}, Percent diff: {}\".format(output['v_lim'], output2['v_lim'], abs(output2['v_lim'] - output['v_lim']) / output['v_lim'] ))\n                all_valid = False\n            if abs(output2['a_lim'] - output['a_lim']) / output['a_lim'] > eps:\n                print(\"a_lim not close. Expected: {}, Actual: {}, Percent diff: {}\".format(output['a_lim'], output2['a_lim'], abs(output2['a_lim'] - output['a_lim']) / output['a_lim'] ))\n                all_valid = False\n            if abs(output2['j_lim'] - output['j_lim']) / output['j_lim'] > eps:\n                print(\"j_lim not close. 
Expected: {}, Actual: {}, Percent diff: {}\".format(output['j_lim'], output2['j_lim'], abs(output2['j_lim'] - output['j_lim']) / output['j_lim'] ))\n                all_valid = False\n\n            if all_valid:\n                print(\"All inverse values valid\")\n        else:\n            print(\"Inverse failed\")\n\n        # Generate plot\n        data = generate_profile_from_params(output, plot_timestep)\n        plot_data(*data, output)\n\n    # plot_trapazoidal_profile()\n\n        \n"
  },
  {
    "path": "src/master/FieldPlanner.py",
    "content": "import matplotlib.image as mpimg\nimport matplotlib.pyplot as plt\nimport skimage.transform as sktf\nimport numpy as np\nimport matplotlib.patches as patches\nimport math\nimport enum\nimport logging\nimport os\nimport Utils\nfrom Utils import ActionTypes\nimport config\nimport pickle\nimport csv\nimport copy\n\n\ndef generateVisionOffsetMap(cfg, max_x, max_y):\n    \n    vision_offset_map = { (x,y): cfg.default_vision_offset for x in range(max_x) for y in range(max_y) }\n    with open(cfg.vision_offset_file) as csvfile:\n        reader = csv.reader(csvfile)\n        for idx,row in enumerate(reader):\n            if idx == 0:\n                continue\n            if row[0] == 'X':\n                tile_x = [x for x in range(max_x)]\n            else:\n                tile_x = [int(row[0])]\n            \n            if row[1] == 'X':\n                tile_y = [ y for y in range(max_y)]\n            else:\n                tile_y = [int(row[1])]\n\n            offset_x_meters = int(row[2]) / 1000.0\n            offset_y_meters = int(row[3]) / 1000.0\n            offset_a_degrees = float(row[4])\n\n            for x in tile_x:\n                for y in tile_y:\n                    key = (x,y)\n                    add_value = np.array((offset_x_meters,offset_y_meters,offset_a_degrees))\n                    vision_offset_map[key] = vision_offset_map[key] + add_value\n\n    return vision_offset_map\n\n\nclass DominoField:\n    \"\"\"\n    Data and methods for parsing and image and generating a field of tiles of dominos\n    \"\"\"\n\n    def __init__(self, cfg):\n        self.cfg = cfg\n        self.img = None\n        self.img_scaled = None\n        self.img_parsed_ids = None\n        self.img_parsed_color = None\n        self.n_tiles_x = 0\n        self.n_tiles_y = 0\n        self.tiles = []\n\n    def generate(self):\n\n        logging.info('Generating domino field from image...')\n        self._generateField()\n        logging.info('done.')\n\n       
 logging.info('Generating tiles from field...')\n        self._generateTiles()\n        logging.info('done.')\n\n\n    def printStats(self):\n        # Output some metrics\n        logging.info(\"Original image size: {}\".format(self.img.shape[:2]))\n        logging.info(\"Scaled image size: {}\".format(self.img_scaled.shape[:2]))\n        logging.info('Domino usage:')\n        logging.info('Total number of dominos: ' + str(self.img_parsed_ids.size))\n        logging.info('Colors:')\n        unique_colors, counts = np.unique(self.img_parsed_ids, return_counts=True)\n        for i, id in enumerate(unique_colors):\n            logging.info('  ' + self.cfg.dominos[id][0] + ': ' + str(counts[i]))\n\n    def show_image_parsing(self):\n        # Plot images\n        fig, axes = plt.subplots(nrows=1, ncols=3)\n        ax = axes.ravel()\n\n        ax[0].imshow(self.img)\n        ax[0].set_title('Original')\n        ax[1].imshow(self.img_scaled)\n        ax[1].set_title('Scaled')\n        ax[2].imshow(self.img_parsed_color)\n        ax[2].set_title('Dominos')\n\n        plt.tight_layout()\n        figManager = plt.get_current_fig_manager()\n        figManager.window.state('zoomed')\n        plt.show()\n\n    def render_domino_image_tiles(self):\n\n        # Allocate memory for image\n        tile_size_x_in_px = (self.cfg.domino_width_px + self.cfg.domino_spacing_width_px) * self.cfg.tile_width\n        tile_size_y_in_px = (self.cfg.domino_height_px + self.cfg.domino_spacing_height_px) * self.cfg.tile_height\n        array_size_x = tile_size_x_in_px * self.n_tiles_x\n        array_size_y = tile_size_y_in_px * self.n_tiles_y\n        image_array = np.zeros((array_size_x, array_size_y, 3))\n\n        # Generate image\n        for tile in self.tiles:\n            tile.draw(image_array)\n\n        # Modify array to show image correctly\n        image_array = np.transpose(image_array, (1, 0, 2))\n        image_array = np.flip(image_array, 0)\n\n        # Actually show image\n     
   plt.imshow(image_array)\n        figManager = plt.get_current_fig_manager()\n        figManager.window.state('zoomed')\n        plt.show()\n\n    def show_tile_ordering(self):\n\n        # Build array of order to show and write\n        order_array = np.zeros((self.n_tiles_x,self.n_tiles_y))\n        # print(self.n_tiles_x)\n        # print(self.n_tiles_y)\n        for tile in self.tiles:\n            # print(tile.coordinate)\n            order_array[tile.coordinate] = tile.order\n\n        # Modify array to show image correctly\n        order_array = np.transpose(order_array, (1, 0))\n\n        # Show figure with colors\n        fig = plt.figure()\n        ax = fig.add_subplot(111)\n        im = ax.imshow(order_array, origin='lower', cmap='cool', interpolation='None')\n\n        for x in range(self.n_tiles_x):\n            for y in range(self.n_tiles_y):\n                label = int(order_array[y, x])\n                ax.text(x, y, label, color='black', ha='center', va='center')\n\n        fig.colorbar(im)\n        plt.show()\n\n    def _generateField(self):\n        # Load original image\n        img = mpimg.imread(self.cfg.image_name)\n\n        # Skip A value for RGBA files\n        if img.shape[2] == 4:\n            img = img[:,:,:3]\n\n        # Do some modifications to MR logo to get it the right shape\n        if self.cfg.MR_LOGO_PLAN:\n\n            # Crop vertical\n            vert_size = img.shape[0]\n            crop_factor = 0.4\n            lb = int(vert_size/2 - crop_factor*vert_size)\n            ub = int(vert_size/2 + crop_factor*vert_size)\n            img = img[lb:ub,:,:]\n\n            # Pad horizontal\n            hz_size = img.shape[1]\n            pad_amount = int(0.1 * hz_size)\n            img = np.pad(img, pad_width=[(0, 0), (pad_amount, pad_amount),(0, 0)], mode='constant')\n\n        # Scaled image\n        img_scaled = sktf.resize(img, (self.cfg.desired_height_dominos, self.cfg.desired_width_dominos), anti_aliasing=False)\n\n       
 # Parse image into domino IDs by choosing 'closest' color\n        img_parsed_color = np.zeros((self.cfg.desired_width_dominos * self.cfg.desired_height_dominos, 3))\n        img_parsed_ids = np.zeros_like(img_parsed_color, dtype=np.int_)[:, 0]\n        count = 0\n        for row in img_scaled:\n            for px in row:\n                best_id = -1\n                best_score = 9999999\n                for id, val in enumerate(self.cfg.dominos):\n                    color = val[1]\n                    score = np.linalg.norm(px - color)\n                    if score < best_score:\n                        best_score = score\n                        best_id = id\n                img_parsed_ids[count] = best_id\n                img_parsed_color[count] = self.cfg.dominos[best_id][1]\n                count = count + 1\n\n        img_parsed_ids = img_parsed_ids.reshape((self.cfg.desired_height_dominos, self.cfg.desired_width_dominos))\n        img_parsed_color = img_parsed_color.reshape((self.cfg.desired_height_dominos, self.cfg.desired_width_dominos, 3))\n\n        self.img = img\n        self.img_scaled = img_scaled\n        self.img_parsed_ids = img_parsed_ids\n        self.img_parsed_color = img_parsed_color\n\n    def _addTile(self, tile_coordinate, tile_values, tile_order, vision_offset_map):\n\n        vision_offset = vision_offset_map[tile_coordinate]\n        # print(\"Tile: order {}, coord {}, vision offset: {}\".format(tile_order, tile_coordinate, vision_offset))\n        new_tile = Tile(self.cfg, tile_coordinate, tile_values, tile_order, vision_offset)\n        self.tiles.append(new_tile)\n\n    def _generateTiles(self):\n\n        # Check sizes and make sure things line up\n        if self.cfg.desired_height_dominos % self.cfg.tile_height != 0:\n            raise ValueError('Field height is not evenly divisible by tile height!')\n        if self.cfg.desired_width_dominos % self.cfg.tile_width != 0:\n            raise ValueError('Field width is not evenly 
divisible by tile width!')\n\n        # Determine number of tiles needed in x and y\n        self.n_tiles_x = int(self.cfg.desired_width_dominos / self.cfg.tile_width)\n        self.n_tiles_y = int(self.cfg.desired_height_dominos / self.cfg.tile_height)\n        print(\"Generating tiles {} x {}\".format(self.n_tiles_x, self.n_tiles_y))\n\n        order_map = self._generateTileOrdering(self.n_tiles_x, self.n_tiles_y)\n        vision_offset_map = generateVisionOffsetMap(self.cfg, self.n_tiles_x, self.n_tiles_y)\n\n        # Loop over tiles and assign id and colors to each\n        for i in range(self.n_tiles_x):\n            x_start_idx = i * self.cfg.tile_width\n            x_end_idx = (i + 1) * self.cfg.tile_width\n\n            for j in range(self.n_tiles_y):\n                # Need to account for flipped y axis with array\n                y_start_idx =  - j * self.cfg.tile_height - 1\n                y_end_idx = - (j + 1) * self.cfg.tile_height - 1\n\n                tile_values = np.copy(self.img_parsed_ids[y_start_idx:y_end_idx:-1, x_start_idx:x_end_idx])\n                tile_coord = (i,j)\n                tile_order = order_map[tile_coord]\n                self._addTile(tile_coord, tile_values, tile_order, vision_offset_map)\n\n        # Sort tile array so they are in order\n        self.tiles = sorted(self.tiles, key = lambda tile: tile.order)\n\n    @classmethod\n    def _generateTileOrdering(cls, num_x, num_y):\n        \"\"\"\n        Generates an ordering that maps x y tile coordinate -> order number\n        \"\"\"\n\n        # return cls._generateTileOrderingDiagonal(num_x, num_y)\n        return cls._generateTileOrderingColumns(num_x, num_y)\n\n    @classmethod\n    def _generateTileOrderingColumns(cls, num_x, num_y):\n        def coordToOrderLR(coord, num_x, num_y):\n            return coord[0]*num_y + num_y - coord[1] -1\n        def coordToOrderRL(coord, num_x, num_y):\n            return (num_x - coord[0] - 1)*num_y + num_y - coord[1] -1\n        
all_coords = [(x,y) for x in range(num_x) for y in range(num_y)]\n        order_map = {coord: coordToOrderRL(coord, num_x, num_y) for coord in all_coords}\n        return order_map\n\n    @classmethod\n    def _generateTileOrderingDiagonal(cls, num_x, num_y):\n\n        coord = [num_x - 1, num_y - 1] # Starting at top right\n        order = 0\n        order_map = {tuple(coord): order}\n        done = False\n        next_pass = np.copy(coord)\n        next_pass[1] = coord[1] - 1 # Next pass starts at far right, next row down\n        order = order + 1\n        while not done:\n            # If you are at the top or left, go to next pass\n            if coord[1] + 1 >= num_y or coord[0] - 1 < 0:\n\n                coord = np.copy(next_pass)\n\n                # Compute new value for next pass\n                if next_pass[1] > 0:\n                    next_pass[1] = next_pass[1] - 1 # Next pass starts at far right, next row down\n                elif next_pass[1] == 0:\n                    next_pass[0] = next_pass[0] - 1  # If we reach the bottom row, shift to the left instead\n                else:\n                    raise ValueError(\"Whoops! 
Shouldn't ever get here!\")\n\n            else:\n                coord[0] = coord[0] - 1 # Move up and left\n                coord[1] = coord[1] + 1\n\n            # Save the order in the map for this coordinate\n            order_map[tuple(coord)] = order\n            order = order + 1\n\n            # If we got to 0,0 then we are done\n            if coord[0] == 0 and coord[1] == 0:\n                done = True\n\n        return order_map\n\n\nclass Tile:\n    \"\"\"\n    Holds info and useful methods related to an individual domino tile\n    \"\"\"\n\n    def __init__(self, cfg, coordinate, values, order, vision_offset):\n        self.coordinate = coordinate\n        self.vision_offset = vision_offset\n        self.values = values\n        self.cfg = cfg\n        self.order = order\n\n    def getPlacementPositionInMeters(self):\n        # Returns bottom left corner position of the tile relative to the origin of the field\n\n        tile_start_x = self.coordinate[0] * self.cfg.tile_size_width_meters\n        tile_start_y = self.coordinate[1] * self.cfg.tile_size_height_meters\n        return (tile_start_x, tile_start_y)\n\n\n    def draw(self, array):\n        # Note that this function draws pixels assuming the array is indexed as x,y instead of rows and columns\n        # the array is flipped to plot as an image in the parent function\n\n\n        # Determine tile location\n        tile_size_x_in_px = (self.cfg.domino_width_px + self.cfg.domino_spacing_width_px) * self.cfg.tile_width\n        tile_size_y_in_px = (self.cfg.domino_height_px + self.cfg.domino_spacing_height_px) * self.cfg.tile_height\n        tile_start_x_px = self.coordinate[0] * tile_size_x_in_px\n        tile_start_y_px = self.coordinate[1] * tile_size_y_in_px\n        tile_end_x_px = tile_start_x_px + tile_size_x_in_px\n        tile_end_y_px = tile_start_y_px + tile_size_y_in_px\n\n        self.draw_single(array, tile_start_x_px, tile_start_y_px, tile_end_x_px, tile_end_y_px)\n\n    def 
draw_single(self, array, tile_start_x_px, tile_start_y_px, tile_end_x_px, tile_end_y_px, draw_edge=True):\n\n        # Fill in tile with background color\n        array[tile_start_x_px:tile_end_x_px, tile_start_y_px:tile_end_y_px, 0] = self.cfg.tile_background_color[0]\n        array[tile_start_x_px:tile_end_x_px, tile_start_y_px:tile_end_y_px, 1] = self.cfg.tile_background_color[1]\n        array[tile_start_x_px:tile_end_x_px, tile_start_y_px:tile_end_y_px, 2] = self.cfg.tile_background_color[2]\n\n        # Fill in tile edge color (only have to do start locations since next tile over will fill in end locations)\n        if draw_edge:\n            array[tile_start_x_px:tile_end_x_px, tile_start_y_px, 0] = self.cfg.tile_edge_color[0]\n            array[tile_start_x_px:tile_end_x_px, tile_start_y_px, 1] = self.cfg.tile_edge_color[1]\n            array[tile_start_x_px:tile_end_x_px, tile_start_y_px, 2] = self.cfg.tile_edge_color[2]\n            array[tile_start_x_px, tile_start_y_px:tile_end_y_px, 0] = self.cfg.tile_edge_color[0]\n            array[tile_start_x_px, tile_start_y_px:tile_end_y_px, 1] = self.cfg.tile_edge_color[1]\n            array[tile_start_x_px, tile_start_y_px:tile_end_y_px, 2] = self.cfg.tile_edge_color[2]\n\n        # Draw dominos\n        for i in range(self.cfg.tile_width):\n            domino_start_x = tile_start_x_px + self.cfg.domino_spacing_width_px + (self.cfg.domino_width_px + self.cfg.domino_spacing_width_px) * i\n            domino_end_x = domino_start_x + self.cfg.domino_width_px\n            for j in range(self.cfg.tile_height):\n                domino_start_y = tile_start_y_px + self.cfg.domino_spacing_height_px + (self.cfg.domino_height_px + self.cfg.domino_spacing_height_px) * j\n                domino_end_y = domino_start_y + self.cfg.domino_height_px\n                domino_id = self.values[j, i]\n                domino_color = self.cfg.dominos[domino_id][1]\n                array[domino_start_x:domino_end_x, 
domino_start_y:domino_end_y, 0] = domino_color[0]\n                array[domino_start_x:domino_end_x, domino_start_y:domino_end_y, 1] = domino_color[1]\n                array[domino_start_x:domino_end_x, domino_start_y:domino_end_y, 2] = domino_color[2]\n\n\nclass Action:\n\n    def __init__(self, action_type, name):\n        self.action_type = action_type\n        self.name = name\n\n    def draw(self, ax, text=\"\"):\n        pass\n\nclass SetPoseAction(Action):\n    def __init__(self, action_type, name, x, y, a):\n        # action_type (enum)\n        # string name\n        # X position [m]\n        # Y position [m]\n        # Angle [deg]\n\n        super().__init__(action_type, name)\n\n        self.x = float(x)\n        self.y = float(y)\n        self.a = math.radians(float(a))\n\nclass WaitAction(Action):\n    def __init__(self,action_type, name, time):\n        super().__init__(action_type, name)\n        self.time = time\n\nclass MoveConstVelAction(Action):\n\n    def __init__(self, action_type, name, vx, vy, va, t):\n        # action_type (enum)\n        # string name\n        # X velocity [m/s]\n        # Y velocity [m/s]\n        # Angle [rad/s]\n        # time [sec]\n\n        super().__init__(action_type, name)\n\n        self.vx = float(vx)\n        self.vy = float(vy)\n        self.va = float(va)\n        self.t = float(t)\n\nclass MoveAction(Action):\n\n    def __init__(self, action_type, name, x, y, a):\n        # action_type (enum)\n        # string name\n        # X position [m]\n        # Y position [m]\n        # Angle [deg]\n\n        super().__init__(action_type, name)\n\n        self.x = float(x)\n        self.y = float(y)\n        self.a = math.radians(float(a))\n\n    def getPos(self):\n        return np.array([self.x, self.y])\n\n    def getAngleDegrees(self):\n        return math.degrees(self.a)\n\n    def getAngleRadians(self):\n        return self.a\n\n    def draw(self, ax, text=\"\", show_label=True):\n\n        if self.action_type 
in [ActionTypes.MOVE_WITH_VISION, ActionTypes.MOVE_REL, ActionTypes.MOVE_REL_SLOW]:\n            return\n\n        # Base triangle at 0 degrees\n        scale = 0.3\n        p1 = np.array([scale, 0])\n        s = math.sin(45 * math.pi / 180.0)\n        c = math.cos(45 * math.pi / 180.0)\n        p2 = np.array([-scale*s, scale*c])\n        p3 = np.array([-scale*s, -scale*c])\n        points = np.vstack((p1, p2, p3))\n\n        # Rotate for orientation\n        s = math.sin(self.getAngleRadians())\n        c = math.cos(self.getAngleRadians())\n        R = np.array([[c, -s],[s, c]])\n\n        for i in range(3):\n            # Do local rotation\n            points[i,:] = np.matmul(R, np.reshape(points[i,:],(2,1))).ravel()\n\n            # Then offset for position\n            points[i, :] = points[i, :] + self.getPos()\n\n        ax.add_patch(patches.Polygon(points,\n                                    fill=True,\n                                    edgecolor='c',\n                                    facecolor='c'))\n        text_point = points[0]\n        text_to_show = self.name\n        if text != \"\":\n            text_to_show = text\n        if show_label:\n            ax.annotate(text_to_show, xy=text_point[:2], xytext=[1, 1], textcoords=\"offset points\", fontsize=8, color=\"green\")\n\n\ndef generate_full_action_sequence(cfg, tile):\n    \"\"\"\n    Standard sequence:\n    - Move to load\n    - Do load\n    - Move out of load\n    - Move to field entry\n    - Move to coarse drop off\n    - Wait for localization\n    - Move to fine drop off\n    - Drop off\n    - Move to coarse drop off\n    - Move to field exit\n    - Move to near load\n    \"\"\"\n\n    # Setup positions in field frame\n    tile_pos_in_field_frame = np.array(tile.getPlacementPositionInMeters())\n    robot_placement_pos_field_frame = tile_pos_in_field_frame + Utils.TransformPos(cfg.tile_to_robot_offset, [0,0], cfg.field_to_robot_frame_angle)\n    robot_placement_fine_pos_field_frame = 
robot_placement_pos_field_frame + Utils.TransformPos(cfg.tile_placement_fine_offset, [0,0], cfg.field_to_robot_frame_angle)\n    robot_placement_coarse_pos_field_frame = robot_placement_fine_pos_field_frame + Utils.TransformPos(cfg.tile_placement_coarse_offset, [0,0], cfg.field_to_robot_frame_angle)\n\n    # Convert positions to global frame\n    robot_placement_coarse_pos_global_frame = Utils.TransformPos(robot_placement_coarse_pos_field_frame, cfg.domino_field_origin, cfg.domino_field_angle)\n    robot_placement_fine_pos_global_frame = Utils.TransformPos(robot_placement_fine_pos_field_frame, cfg.domino_field_origin, cfg.domino_field_angle)\n    robot_field_angle = cfg.domino_field_angle + cfg.field_to_robot_frame_angle\n\n    # Setup various entry and intermediate positions\n    intermediate_entry_pos_global_frame = np.array([cfg.highway_x, cfg.intermediate_entry_hz_y])\n    entry_y = robot_placement_coarse_pos_global_frame[1]+cfg.enter_position_distance\n    field_entry_pos_global_frame = np.array([cfg.highway_x+0.75, entry_y])\n    intermediate_place_pos_global_frame = np.array([robot_placement_coarse_pos_global_frame[0], entry_y])\n\n    # Make pose adjustments based on config\n    coord = tile.coordinate\n    x_offset_row = cfg.x_offset_rows[coord[1]]\n    y_offset_row = cfg.y_offset_rows[coord[1]]\n    y_offset_col = cfg.y_offset_cols[coord[0]]\n    extra_y_offset_fine = y_offset_col + y_offset_row\n    robot_placement_fine_pos_global_frame[0] += x_offset_row\n    robot_placement_fine_pos_global_frame[1] += extra_y_offset_fine\n    robot_placement_coarse_pos_global_frame[1] += extra_y_offset_fine\n    intermediate_place_pos_global_frame[1] += extra_y_offset_fine\n    field_entry_pos_global_frame[1] += extra_y_offset_fine\n\n    # Figure out if intermediate steps are needed\n    intermediate_hz = robot_placement_coarse_pos_global_frame[1] < cfg.intermediate_entry_hz_y - 1\n    intermediate_vt = robot_placement_coarse_pos_global_frame[0] > 
cfg.intermediate_place_vt_x + 1\n\n    # Tiles near the back wall don't have enough space for backwards offset\n    relative_tile_offset = copy.deepcopy(cfg.tile_placement_coarse_offset)\n    if robot_placement_coarse_pos_global_frame[0] < 2:\n        intermediate_place_pos_global_frame[0] = robot_placement_fine_pos_global_frame[0]\n        robot_placement_coarse_pos_global_frame[0] = robot_placement_fine_pos_global_frame[0]\n        relative_tile_offset = np.asarray([0, 1])\n\n    actions = []\n\n    name = \"Move to load waypoint - coarse\"\n    actions.append(MoveAction(ActionTypes.MOVE_COARSE, name, cfg.load_waypoint[0], cfg.load_waypoint[1], cfg.highway_angle))\n\n    # name = \"Wait for localization\"\n    # actions.append(Action(ActionTypes.WAIT_FOR_LOCALIZATION, name))\n\n    name = \"Move to near load area - coarse\"\n    actions.append(MoveAction(ActionTypes.MOVE_COARSE, name, cfg.base_station_prep_pos[0], cfg.base_station_prep_pos[1], cfg.base_station_target_angle))\n\n    # name = \"Pause plan for load\"\n    # actions.append(Action(ActionTypes.PAUSE_PLAN, name))\n\n    # name = \"Start cameras\"\n    # actions.append(Action(ActionTypes.START_CAMERAS, name))\n\n    # name = \"Wait for localization\"\n    # actions.append(Action(ActionTypes.WAIT_FOR_LOCALIZATION, name))\n\n    # name = \"Move to load prep - fine\"\n    # actions.append(MoveAction(ActionTypes.MOVE_FINE, name, cfg.base_station_prep_pos[0], cfg.base_station_prep_pos[1], cfg.base_station_target_angle))\n\n    # name = \"Move to load prep - vision\"\n    # actions.append(MoveAction(ActionTypes.MOVE_WITH_VISION, name, cfg.base_station_prep_vision_offset[0], cfg.base_station_prep_vision_offset[1], cfg.base_station_prep_vision_offset[2]))\n    \n    # name = \"Move to load - relative slow\"\n    # actions.append(MoveAction(ActionTypes.MOVE_REL_SLOW, name, cfg.base_station_relative_offset[0], cfg.base_station_relative_offset[1], cfg.base_station_relative_offset[2]))\n\n    # name = \"Align with 
load\"\n    # actions.append(MoveAction(ActionTypes.MOVE_WITH_VISION, name, cfg.base_station_vision_offset[0], cfg.base_station_vision_offset[1], cfg.base_station_vision_offset[2]))\n\n    # name = \"Stop cameras\"\n    # actions.append(Action(ActionTypes.STOP_CAMERAS, name))\n\n    name = \"Load tile\"\n    actions.append(Action(ActionTypes.LOAD, name))\n\n    # name = \"Move away from load - relative slow\"\n    # actions.append(MoveAction(ActionTypes.MOVE_REL_SLOW, name, -1.3*cfg.base_station_relative_offset[0], -1.3*cfg.base_station_relative_offset[1], -1.3*cfg.base_station_relative_offset[2]))\n\n    # name = \"Pause plan for QC\"\n    # actions.append(Action(ActionTypes.PAUSE_PLAN, name))\n\n    name = \"Move to load waypoint - coarse\"\n    actions.append(MoveAction(ActionTypes.MOVE_COARSE, name, cfg.load_waypoint[0], cfg.load_waypoint[1], cfg.highway_angle))\n\n    # name = \"Wait for localization\"\n    # actions.append(Action(ActionTypes.WAIT_FOR_LOCALIZATION, name))\n\n    # if intermediate_hz:\n    #     name = \"Move to intermediate enter - coarse\"\n    #     actions.append(MoveAction(ActionTypes.MOVE_COARSE, name, intermediate_entry_pos_global_frame[0], intermediate_entry_pos_global_frame[1], cfg.highway_angle))\n\n    #     name = \"Wait for motor cooldown (long)\"\n    #     actions.append(WaitAction(ActionTypes.WAIT, name, 20))\n\n    name = \"Move to enter - coarse\"\n    actions.append(MoveAction(ActionTypes.MOVE_COARSE, name, field_entry_pos_global_frame[0], field_entry_pos_global_frame[1], robot_field_angle))\n\n    name = \"Wait for localization\"\n    actions.append(Action(ActionTypes.WAIT_FOR_LOCALIZATION, name))\n\n    # name = \"Wait for motor cooldown\"\n    # actions.append(WaitAction(ActionTypes.WAIT, name, 10))\n\n    # if intermediate_vt:\n    # name = \"Move to intermediate place - coarse\"\n    # actions.append(MoveAction(ActionTypes.MOVE_COARSE, name, intermediate_place_pos_global_frame[0], intermediate_place_pos_global_frame[1], 
robot_field_angle))\n\n    # name = \"Wait for localization\"\n    # actions.append(Action(ActionTypes.WAIT_FOR_LOCALIZATION, name))\n\n    #     name = \"Wait for motor cooldown\"\n    #     actions.append(WaitAction(ActionTypes.WAIT, name, 10))\n\n    name = \"Move to near place - coarse\"\n    actions.append(MoveAction(ActionTypes.MOVE_COARSE, name, robot_placement_coarse_pos_global_frame[0], robot_placement_coarse_pos_global_frame[1], robot_field_angle))\n\n    name = \"Start cameras\"\n    actions.append(Action(ActionTypes.START_CAMERAS, name))\n\n    name = \"Wait for localization\"\n    actions.append(Action(ActionTypes.WAIT_FOR_LOCALIZATION, name))\n\n    name = \"Move to place - fine\"\n    actions.append(MoveAction(ActionTypes.MOVE_FINE_STOP_VISION, name, robot_placement_fine_pos_global_frame[0], robot_placement_fine_pos_global_frame[1], robot_field_angle + cfg.angle_adjust_fine))\n\n    name = \"Move to place - vision\"\n    actions.append(MoveAction(ActionTypes.MOVE_WITH_VISION, name, tile.vision_offset[0], tile.vision_offset[1], tile.vision_offset[2]))\n\n    name = \"Stop cameras\"\n    actions.append(Action(ActionTypes.STOP_CAMERAS, name))\n\n    # name = \"Pause plan for QC\"\n    # actions.append(Action(ActionTypes.PAUSE_PLAN, name))\n\n    name = \"Place tile\"\n    actions.append(Action(ActionTypes.PLACE, name))\n\n    name = \"Move away from place - relative slow\"\n    actions.append(MoveAction(ActionTypes.MOVE_REL_SLOW, name, relative_tile_offset[0], relative_tile_offset[1], 0))\n\n    # if intermediate_vt:\n    #     name = \"Move to intermediate place - coarse\"\n    #     actions.append(MoveAction(ActionTypes.MOVE_COARSE, name, intermediate_place_pos_global_frame[0], intermediate_place_pos_global_frame[1], robot_field_angle))\n\n    #     name = \"Wait for motor cooldown\"\n    #     actions.append(WaitAction(ActionTypes.WAIT, name, 10))\n\n    name = \"Move to intermediate place - coarse\"\n    
actions.append(MoveAction(ActionTypes.MOVE_COARSE, name, intermediate_place_pos_global_frame[0], intermediate_place_pos_global_frame[1], robot_field_angle))\n\n    name = \"Move to exit - coarse\"\n    actions.append(MoveAction(ActionTypes.MOVE_COARSE, name, field_entry_pos_global_frame[0], field_entry_pos_global_frame[1], cfg.highway_angle))\n\n    name = \"Wait for localization\"\n    actions.append(Action(ActionTypes.WAIT_FOR_LOCALIZATION, name))\n\n    # name = \"Wait for motor cooldown\"\n    # actions.append(WaitAction(ActionTypes.WAIT, name, 10))\n\n    # if intermediate_hz:\n    #     name = \"Move to intermediate exit - coarse\"\n    #     actions.append(MoveAction(ActionTypes.MOVE_COARSE, name, intermediate_entry_pos_global_frame[0], intermediate_entry_pos_global_frame[1], cfg.highway_angle))\n\n    #     name = \"Wait for motor cooldown (long)\"\n    #     actions.append(WaitAction(ActionTypes.WAIT, name, 20))\n\n    return actions\n\n\ndef generate_hax_action_sequence(cfg, tile):\n\n    actions = []\n\n    x_pose = 7\n\n    name = \"Move 1 - coarse 90\"\n    actions.append(MoveAction(ActionTypes.MOVE_COARSE, name, x_pose, 5, 90))\n    name = \"Move 1 - coarse 0\"\n    actions.append(MoveAction(ActionTypes.MOVE_COARSE, name, x_pose, 5, 0))\n    name = \"Move 1 - coarse -90\"\n    actions.append(MoveAction(ActionTypes.MOVE_COARSE, name, x_pose, 5, -90))\n    name = \"Wait 1\"\n    actions.append(WaitAction(ActionTypes.WAIT, name, 5))\n    name = \"Move 2 - coarse -90\"\n    actions.append(MoveAction(ActionTypes.MOVE_COARSE, name, x_pose, -3.5, -90))\n    name = \"Move 2 - coarse 0 \"\n    actions.append(MoveAction(ActionTypes.MOVE_COARSE, name, x_pose, -3.5, 0))\n    name = \"Move 2 - coarse 90\"\n    actions.append(MoveAction(ActionTypes.MOVE_COARSE, name, x_pose, -3.5, 90))\n    name = \"Wait 2\"\n    actions.append(WaitAction(ActionTypes.WAIT, name, 5))\n\n    return actions\n\n\ndef generate_small_testing_action_sequence(cfg, tile):\n    \"\"\"\n    
Short sequence for testing\n    Load\n    Move to near place - coarse\n    Wait for localization\n    Move to place - fine\n    Place\n    Move away from place - fine\n    Move to load - coarse\n    \"\"\"\n\n    # Setup positions in field frame\n    tile_pos_in_field_frame = np.array(tile.getPlacementPositionInMeters())\n    robot_placement_fine_pos_field_frame = tile_pos_in_field_frame + Utils.TransformPos(cfg.tile_to_robot_offset, [0,0], cfg.field_to_robot_frame_angle)\n    robot_placement_coarse_pos_field_frame = robot_placement_fine_pos_field_frame + Utils.TransformPos(cfg.tile_placement_coarse_offset, [0,0], cfg.field_to_robot_frame_angle)\n    enter_field_prep_pos_field_frame = [robot_placement_coarse_pos_field_frame[0], - cfg.prep_position_distance]\n    exit_field_prep_pos_field_frame = [-cfg.exit_position_distance, robot_placement_coarse_pos_field_frame[1]]\n\n    # Convert positions to global frame\n    tile_pos_in_global_frame = Utils.TransformPos(tile_pos_in_field_frame, cfg.domino_field_origin, cfg.domino_field_angle)\n    robot_placement_coarse_pos_global_frame = Utils.TransformPos(robot_placement_coarse_pos_field_frame, cfg.domino_field_origin, cfg.domino_field_angle)\n    robot_placement_fine_pos_global_frame = Utils.TransformPos(robot_placement_fine_pos_field_frame, cfg.domino_field_origin, cfg.domino_field_angle)\n    robot_field_angle = cfg.domino_field_angle + cfg.field_to_robot_frame_angle\n    enter_field_prep_global_frame = Utils.TransformPos(enter_field_prep_pos_field_frame, cfg.domino_field_origin, cfg.domino_field_angle)\n    exit_field_prep_global_frame = Utils.TransformPos(exit_field_prep_pos_field_frame, cfg.domino_field_origin, cfg.domino_field_angle)\n\n    # print(tile.order)\n    # print(tile.coordinate)\n    # print(tile_pos_in_field_frame)\n    # print(tile_pos_in_global_frame)\n    # print(robot_placement_coarse_pos_global_frame)\n    # print(robot_placement_fine_pos_global_frame)\n\n    actions = []\n\n    name = \"Load 
tile\"\n    actions.append(Action(ActionTypes.LOAD, name))\n\n    name = \"Move to prep - coarse\"\n    actions.append(MoveAction(ActionTypes.MOVE_COARSE, name, enter_field_prep_global_frame[0], enter_field_prep_global_frame[1], robot_field_angle))\n\n    name = \"Wait for localization\"\n    actions.append(Action(ActionTypes.WAIT_FOR_LOCALIZATION, name))\n\n    name = \"Move to near place - coarse\"\n    actions.append(MoveAction(ActionTypes.MOVE_COARSE, name, robot_placement_coarse_pos_global_frame[0], robot_placement_coarse_pos_global_frame[1], robot_field_angle))\n\n    name = \"Wait for localization\"\n    actions.append(Action(ActionTypes.WAIT_FOR_LOCALIZATION, name))\n\n    name = \"Start cameras\"\n    actions.append(Action(ActionTypes.START_CAMERAS, name))\n\n    name = \"Move to place - fine\"\n    actions.append(MoveAction(ActionTypes.MOVE_FINE, name, robot_placement_fine_pos_global_frame[0], robot_placement_fine_pos_global_frame[1], robot_field_angle))\n\n    name = \"Move to place - vision\"\n    actions.append(MoveAction(ActionTypes.MOVE_WITH_VISION, name, tile.vision_offset[0], tile.vision_offset[1], tile.vision_offset[2]))\n\n    name = \"Stop cameras\"\n    actions.append(Action(ActionTypes.STOP_CAMERAS, name))\n\n    name = \"Place tile\"\n    actions.append(Action(ActionTypes.PLACE, name))\n\n    name = \"Move away from place - fine\"\n    actions.append(MoveAction(ActionTypes.MOVE_COARSE, name, robot_placement_coarse_pos_global_frame[0], robot_placement_coarse_pos_global_frame[1], robot_field_angle))\n\n    name = \"Move to exit - coarse\"\n    actions.append(MoveAction(ActionTypes.MOVE_COARSE, name, enter_field_prep_global_frame[0], enter_field_prep_global_frame[1], robot_field_angle))\n\n    name = \"Move to near load - coarse\"\n    actions.append(MoveAction(ActionTypes.MOVE_COARSE, name, cfg.load_pose[0], cfg.load_pose[1], cfg.load_pose[2]))\n\n    return actions\n\ndef draw_env(cfg):\n    \"\"\"\n    Draws a figure of the environment\n    
\"\"\"\n\n    fig,ax = plt.subplots(1)\n\n    # Draw overall map boundaries\n    ax.add_patch(patches.Rectangle(cfg.robot_boundaries[0],\n                                    cfg.robot_boundaries[1][0] - cfg.robot_boundaries[0][0],\n                                    cfg.robot_boundaries[1][1] - cfg.robot_boundaries[0][1],\n                                    fill=False,\n                                    edgecolor='b'))\n\n    # Draw field boundaries\n    ax.add_patch(patches.Rectangle(cfg.domino_field_origin,\n                                    cfg.domino_field_top_right[0] - cfg.domino_field_origin[0],\n                                    cfg.domino_field_top_right[1] - cfg.domino_field_origin[1],\n                                    fill=False,\n                                    edgecolor='r'))\n\n    # Draw base station\n    ax.add_patch(patches.Rectangle(cfg.base_station_boundaries[0],\n                                    cfg.base_station_boundaries[1][0] - cfg.base_station_boundaries[0][0],\n                                    cfg.base_station_boundaries[1][1] - cfg.base_station_boundaries[0][1],\n                                    fill=True,\n                                    edgecolor='k',\n                                    facecolor='k'))\n\n\n    ax.set_xlim(left=-1, right=cfg.robot_boundaries[1][0]+1)\n    ax.set_ylim(bottom=-1, top=cfg.robot_boundaries[1][1]+1)\n    ax.axis('equal')\n\n    return ax\n\n\nclass Cycle:\n\n    def __init__(self, id, cfg, robot_id, tile, action_sequence):\n        self.id = id\n        self.cfg = cfg\n        self.robot_id = robot_id\n        self.tile = tile\n        self.action_sequence = action_sequence\n        \n\n    def draw_cycle(self, ax):\n        for action in self.action_sequence:\n            action.draw(ax)\n\n    def draw_action(self, ax, idx, text=\"\"):\n        self.action_sequence[idx].draw(ax, text)\n\n\ndef generate_standard_cycles(cfg, field, cycle_generator_fn):\n    start_num = 1\n    
robot_num = start_num\n    n_robots = len(cfg.ip_map)\n    cycles = []\n\n    for tile in field.tiles:\n        action_sequence = cycle_generator_fn(cfg, tile)\n        cycles.append(Cycle(tile.order, cfg, \"robot{}\".format(robot_num), tile, action_sequence))\n        robot_num += 1\n        if robot_num > n_robots:\n            robot_num = start_num\n\n    return cycles\n\n\nclass BasePlan:\n\n    def __init__(self, cycles):\n        self.cycles = cycles\n\n    def get_cycle(self, cycle_num):\n        try:\n            return self.cycles[cycle_num]\n        except IndexError:\n            return None\n\n    def get_action(self, cycle_id, action_id):\n        cycle = self.get_cycle(cycle_id)\n        if cycle:\n            try:\n                return cycle.action_sequence[action_id]\n            except IndexError:\n                return None\n        else:\n            return None\n\nclass RealPlan(BasePlan):\n\n    def __init__(self, cfg, field, cycles):\n        self.cfg = cfg\n        self.field = field\n        super().__init__(cycles)\n\n    def draw_cycle(self, cycle_num):\n        ax = draw_env(self.cfg)\n        self.get_cycle(cycle_num).draw_cycle(ax)\n        plt.show()\n\n    def find_pose_move_idx(self, cycle):\n        # Figure out what id the tile pose is\n        place_idx = -1\n        for i,action in enumerate(cycle.action_sequence):\n            if action.action_type == ActionTypes.PLACE:\n                place_idx = i\n                break\n        if place_idx == -1:\n            raise ValueError(\"Couldn't find placement index\")\n        tile_pose_move_idx = -1\n        for j in range(place_idx, 0, -1):\n            action = cycle.action_sequence[j]\n            if action.action_type == ActionTypes.MOVE_FINE or action.action_type == ActionTypes.MOVE_COARSE :\n                tile_pose_move_idx = j\n                break\n        if tile_pose_move_idx == -1:\n            raise ValueError(\"Couldn't find movement index\")\n\n        return 
tile_pose_move_idx\n\n    def draw_all_tile_poses(self):\n        ax = draw_env(self.cfg)\n        for cycle in self.cycles:\n            tile_pose_move_idx = self.find_pose_move_idx(cycle)\n            cycle.draw_action(ax, tile_pose_move_idx, text=str(cycle.tile.order))\n        plt.show()\n\nclass Plan(RealPlan):\n\n    def __init__(self, cfg, cycle_generator_fn):\n        field = DominoField(cfg)\n        field.generate()\n        logging.info('Generating robot actions...')\n        cycles = generate_standard_cycles(cfg, field, cycle_generator_fn)\n        logging.info('done.')\n        super().__init__(cfg, field, cycles)\n\n\nclass SubsectionPlan(RealPlan):\n\n    def __init__(self, full_plan):\n        self.full_plan = full_plan\n        self.start_coords = full_plan.cfg.start_coords\n        self.end_coords = full_plan.cfg.end_coords\n        # Start/end coords are inclusive, so the subsection spans delta + 1 tiles per axis\n        self.delta_coords = (self.end_coords[0] - self.start_coords[0] + 1,\n                             self.end_coords[1] - self.start_coords[1] + 1)\n        new_field = DominoField(full_plan.cfg)\n        new_field.n_tiles_x = self.delta_coords[0]\n        new_field.n_tiles_y = self.delta_coords[1]\n        counter = 0\n        new_field.tiles = []\n        new_cycles = []\n        for i in range(len(full_plan.field.tiles)):\n            tile = full_plan.field.tiles[i]\n            cycle = full_plan.cycles[i]\n            tile_coords = tile.coordinate\n            if tile_coords[0] >= self.start_coords[0] and \\\n               tile_coords[0] <= self.end_coords[0] and \\\n               tile_coords[1] >= self.start_coords[1] and \\\n               tile_coords[1] <= self.end_coords[1]:\n                # Make new tile\n                new_tile = copy.deepcopy(tile)\n                new_tile.order = counter\n                new_field.tiles.append(new_tile)\n                # Make new cycle\n                new_cycle = copy.deepcopy(cycle)\n                new_cycle.id = counter\n                new_cycle.tile = new_tile\n                new_cycles.append(new_cycle)\n                counter += 1\n\n        super().__init__(self.full_plan.cfg, new_field, new_cycles)\n\n\n\nclass TestPlan(BasePlan):\n    \"\"\"\n    Test plan used for debugging and testing various action sequences\n    \"\"\"\n\n    def __init__(self):\n        actions = []\n        actions.append(MoveAction(ActionTypes.MOVE_COARSE, \"TestMoveCoarse\", 0.5, 0.5, 0))\n        actions.append(MoveAction(ActionTypes.MOVE_FINE, 'TestMoveFine', 1,1,0))\n        actions.append(MoveAction(ActionTypes.MOVE_COARSE, 'Blah', 0,0,3.14))\n\n        cycles = [Cycle(i,None,'robot1','TestCycle{}'.format(i), actions) for i in range(3)]\n        super().__init__(cycles)\n\n\ndef RunFieldPlanning(autosave=False):\n    cfg = config.Config()\n\n    logging.basicConfig(\n        level=logging.INFO,\n        format=\"%(asctime)s [%(levelname)s] %(message)s\",\n        handlers=[\n            logging.FileHandler(os.path.join(cfg.log_folder,\"planner.log\")),\n            logging.StreamHandler()\n        ]\n    )\n\n    plan = None\n    if cfg.USE_SMALL_TESTING_CONFIG:\n        plan = Plan(cfg, generate_small_testing_action_sequence)\n    else:\n        plan = Plan(cfg, generate_full_action_sequence)\n        # plan = Plan(cfg, generate_hax_action_sequence)\n\n    if cfg.USE_SUBSECTION:\n        plan = SubsectionPlan(plan)\n\n    if autosave:\n        fname = os.path.join(cfg.plans_dir,\"autosaved.p\")\n        with open(fname, 'wb') as f:\n            pickle.dump(plan, f)\n            logging.info(\"Saved plan to {}\".format(fname))\n\n    return plan\n\n\ndef GeneratePDF(plan):\n\n    from reportlab.lib.pagesizes import letter\n    from reportlab.pdfgen import canvas\n    from PIL import Image\n\n    logging.info(\"Generating PDF\")\n\n    # Initialize pdf\n    name = \"domino_plan.pdf\"\n    if plan.cfg.MR_LOGO_PLAN:\n        name = \"domino_plan_logo.pdf\"\n    full_path = os.path.join(plan.cfg.plans_dir, name)\n    page_height, page_width = letter # 
Flipped to get landscape\n    c = canvas.Canvas(full_path, pagesize=(page_width, page_height))\n\n    # Pre allocate image array\n    tile_size_x_in_px = (plan.cfg.domino_width_px + plan.cfg.domino_spacing_width_px) * plan.cfg.tile_width\n    tile_size_y_in_px = (plan.cfg.domino_height_px + plan.cfg.domino_spacing_height_px) * plan.cfg.tile_height\n\n    for i in range(len(plan.field.tiles)):\n        # Get tile to draw on this page\n        tile = plan.field.tiles[i]\n\n        # Draw title\n        text_width = 0.5 * page_width\n        text_height = 0.9 * page_height\n        text = \"Tile {}, Coordinate: ({}, {})\".format(i, tile.coordinate[0], tile.coordinate[1])\n        c.setFont(\"Helvetica\", 20)\n        c.drawCentredString(text_width,text_height,text)\n\n        # Draw orientation note\n        text_width = 0.5 * page_width\n        text_height = 0.1 * page_height\n        text = \"This side towards robot body\"\n        c.drawCentredString(text_width,text_height,text)\n\n        # Draw image\n        image_array = np.zeros((tile_size_x_in_px, tile_size_y_in_px, 3))\n        tile.draw_single(image_array, 0, 0, tile_size_x_in_px, tile_size_y_in_px, draw_edge=False)\n        image_array = np.transpose(image_array, (1, 0, 2))\n        image_array = np.flip(image_array, 0)\n        im = Image.fromarray(np.uint8(255*image_array), mode='RGB')\n        image_fraction = 0.7\n        start_width = (1-image_fraction)/2.0 * page_width\n        start_height = (1-image_fraction)/2.0 * page_height\n        image_width = image_fraction * page_width\n        image_height = image_fraction * page_height\n        c.drawInlineImage(im, start_width ,  start_height, width=image_width, height=image_height) \n\n        # Complete page\n        c.showPage()\n\n    c.save()\n\n\n\n\nif __name__ == '__main__':\n\n    import PySimpleGUI as sg\n\n    plan = RunFieldPlanning(autosave=False)\n\n    # plan.field.printStats()\n    # plan.field.show_image_parsing()\n    # 
plan.field.render_domino_image_tiles()\n    # plan.field.show_tile_ordering()\n    plan.draw_cycle(0)\n    plan.draw_cycle(10)\n    plan.draw_cycle(19)\n    plan.draw_all_tile_poses()\n\n    # GeneratePDF(plan)\n\n    # sg.change_look_and_feel('Dark Blue 3')\n    # clicked_value = sg.popup_yes_no('Save plan to file?')\n    # if clicked_value == \"Yes\":\n    #     fname = sg.popup_get_file(\"Location to save\", save_as=True)\n    #     with open(fname, 'wb') as f:\n    #         pickle.dump(plan, f)\n    #         logging.info(\"Saved plan to {}\".format(fname))\n\n"
  },
  {
    "path": "src/master/MarvelMindHandler.py",
    "content": "\"\"\"\nPython wrapper for Marvelmind C API\n\"\"\"\n\nimport ctypes\nimport logging\nfrom collections import defaultdict\nfrom Utils import NonBlockingTimer\n\n\nclass MarvelmindWrapper:\n\n    def __init__(self, cfg):\n        self.lib = ctypes.windll.LoadLibrary(cfg.mm_api_path)\n        self.devices = defaultdict(dict)\n        self.expected_devices = []\n        self.wake_timer = None\n        for name,beacons in cfg.device_map.items():\n            for b in beacons:\n                self.expected_devices.append(b)\n        self._open_serial_port()\n\n    def __del__(self):\n        self._close_serial_port()\n\n    def _open_serial_port(self):\n        \"\"\"\n        Establish communication with marvelmind router. Required for all other functions.\n        \"\"\"\n\n        logging.info(\"Opening communication with MarvelMind\")\n        fn = self.lib.mm_open_port\n        fn.restype = ctypes.c_bool\n        fn.argtypes = [ctypes.POINTER(ctypes.c_void_p)]\n        status = fn(None)\n\n    def _close_serial_port(self):\n        \"\"\"\n        Close communication with marvelmind router.\n        \"\"\"\n\n        fn = self.lib.mm_close_port\n        fn.restype = ctypes.c_bool\n        fn.argtypes = [ctypes.POINTER(ctypes.c_void_p)]\n        status = fn(None)\n\n    def check_all_devices_status(self):\n        \"\"\"\n        Checks to see if all devices are awake and ready\n        \"\"\"\n\n        logging.info(\"Getting device list\")\n        fn = self.lib.mm_get_devices_list\n        fn.restype = ctypes.c_bool\n        fn.argtypes = [ctypes.POINTER(ctypes.c_uint8)]\n\n        buff_size = 255\n        dataptr = (ctypes.c_uint8*buff_size)()\n        dataptr = ctypes.cast(dataptr, ctypes.POINTER(ctypes.c_uint8))\n        status = fn(dataptr)\n\n        num_devices = dataptr[0]\n\n        logging.info(\"Num devices found: \" + str(num_devices))\n        idx = 1\n        device_offset = 9\n        addr_offset = 0\n        sleep_offset = 2\n      
  for i in range(num_devices):\n            addr = dataptr[idx + addr_offset]\n            sleep = dataptr[idx + sleep_offset]\n\n            self.devices[addr] = {'sleep': bool(sleep)}\n\n            idx += device_offset\n\n        ready = True\n        if self.wake_timer:\n            if not self.wake_timer.check():\n                ready = False\n        else:\n            for d in self.expected_devices:\n                if d not in self.devices.keys():\n                    ready = False\n                    logging.warning(\"Could not find expected device {}\".format(d))\n                    continue\n\n                if self.devices[d]['sleep'] is True:\n                    ready = False\n                    logging.warning(\"Device {} is not awake\".format(d))\n                    continue\n\n        return ready\n\n    def wake_all_devices(self):\n        for d in self.expected_devices:\n            self.wake_device(d)\n\n    def wake_all_devices_only_if_needed(self):\n        ok = self.check_all_devices_status()\n        if not ok:\n            logging.info(\"Triggering marvelmind device wakeup. 
Will wait for a moment to let them warm up\")\n            self.wake_timer = NonBlockingTimer(30)\n            self.wake_all_devices()\n\n    def sleep_all_devices(self):\n        for d in self.expected_devices:\n            self.sleep_device(d)\n\n    def wake_device(self, address):\n        \"\"\"\n        Wake up the device with the given address\n        \"\"\"\n\n        logging.info(\"Waking device: {}\".format(address))\n        fn = self.lib.mm_wake_device\n        fn.restype = ctypes.c_bool\n        fn.argtypes = [ctypes.c_uint8]\n\n        status = fn(address)\n        if not status:\n            logging.warning(\"Unable to wake device {}\".format(address))\n\n    def sleep_device(self, address):\n        \"\"\"\n        Sleep the device with the given address\n        \"\"\"\n\n        logging.info(\"Sleeping device: {}\".format(address))\n        fn = self.lib.mm_send_to_sleep_device\n        fn.restype = ctypes.c_bool\n        fn.argtypes = [ctypes.c_uint8]\n\n        status = fn(address)\n        if not status:\n            logging.warning(\"Unable to sleep device {}\".format(address))\n\n    def get_metrics(self):\n        return self.devices\n\n\nclass MockMarvelmindWrapper:\n    def __init__(self, cfg):\n        pass\n    def check_all_devices_status(self):\n        return True\n    def wake_all_devices_only_if_needed(self):\n        pass\n    def wake_all_devices(self):\n        pass\n    def sleep_all_devices(self):\n        pass\n    def wake_device(self, address):\n        pass\n    def sleep_device(self, address):\n        pass\n    def get_metrics(self):\n        return {}\n"
  },
  {
    "path": "src/master/MasterMain.py",
    "content": "import time\nimport math\nimport copy\nimport config\nimport os\nimport sys\nimport logging\nimport PySimpleGUI as sg\nimport traceback\n\nfrom FieldPlanner import *\nfrom Runtime import RuntimeManager, PlanStatus\nimport Utils\n\n\nSTATUS_PANEL_OK_COLOR = \"green\"\nSTATUS_PANEL_BAD_COLOR = \"red\"\n\ndef status_panel(name):\n    width = 40\n    height = 20\n    return [[sg.Text(\"{} status\".format(name))], [sg.Text(\"{} offline\".format(name), size=(width, height), \\\n        relief=sg.RELIEF_RIDGE, key='_{}_STATUS_'.format(name.upper()), background_color=STATUS_PANEL_BAD_COLOR) ]]\n\n\ndef setup_gui_layout(config, panel_names, target_names):\n    plan_button_size = [9,2]\n    plan_button_pad = (2, 10)\n\n    # Left hand column with status panels\n    col1 = []\n    for name in panel_names:\n        col1 += status_panel(name)\n\n    # Plan cycle/action modification buttons\n    incremenet_cycle_button = sg.Button('Cycle +', button_color=('white','blue'), size=plan_button_size, pad=plan_button_pad, key='_INC_CYCLE_', disabled=True) \n    decremenet_cycle_button = sg.Button('Cycle -', button_color=('white','blue'), size=plan_button_size, pad=plan_button_pad, key='_DEC_CYCLE_', disabled=True) \n    incremenet_action_button = sg.Button('Action +', button_color=('white','blue'), size=plan_button_size, pad=plan_button_pad, key='_INC_ACTION_', disabled=True) \n    decremenet_action_button = sg.Button('Action -', button_color=('white','blue'), size=plan_button_size, pad=plan_button_pad, key='_DEC_ACTION_', disabled=True) \n    set_cycle_field = sg.Input(key='_CYCLE_FIELD_', size=(5,1), pad=(0,20))\n    set_action_field = sg.Input(key='_ACTION_FIELD_', size=(5,1), pad=(0,20))\n    set_cycle_button = sg.Button('Set Cycle', button_color=('white','blue'), size=plan_button_size, pad=plan_button_pad, key='_SET_CYCLE_', disabled=True) \n    set_action_button = sg.Button('Set Action', button_color=('white','blue'), size=plan_button_size, pad=plan_button_pad, 
key='_SET_ACTION_', disabled=True)\n    cycle_buttons = [[sg.Column([[decremenet_cycle_button], [decremenet_action_button]]), sg.Column([[incremenet_cycle_button], [incremenet_action_button]]),  \n                     sg.Column([[set_cycle_field], [set_action_field]]), sg.Column([[set_cycle_button], [set_action_button]])  ]]\n    col1 += cycle_buttons\n\n    # Middle column with plot and buttons\n    target_element = [ [sg.Text(\"Target: \")], [sg.Combo(target_names, key='_TARGET_', default_value='robot1')] ]\n\n    actions = [a for a in ActionTypes]\n    action_element = [ [sg.Text(\"Action: \")], [sg.Combo(actions, key='_ACTION_')] ]\n\n    data_element = [ [sg.Text('Data:')], [sg.Input(key='_ACTION_DATA_')] ]\n\n    plan_file_field = sg.Input(key='_PLAN_FILE_', visible=False, enable_events=True)\n    load_plan_button = sg.FileBrowse(button_text='Load Plan', button_color=('white','blue'), size=plan_button_size, pad=plan_button_pad, \\\n        key='_LOAD_PLAN_', file_types=(('Robot Plans', ('*.json','*.p')),)) \n    run_plan_button = sg.Button('Run Plan', button_color=('white','blue'), size=plan_button_size, pad=plan_button_pad, key='_RUN_PLAN_', disabled=True) \n    pause_plan_button = sg.Button('Pause Plan', button_color=('white','blue'), size=plan_button_size, pad=plan_button_pad, key='_PAUSE_PLAN_', disabled=True) \n    abort_plan_button = sg.Button('Abort Plan', button_color=('white','blue'), size=plan_button_size, pad=plan_button_pad, key='_ABORT_PLAN_', disabled=True)\n    plan_buttons = [[sg.Column([[plan_file_field, load_plan_button], [run_plan_button]]), sg.Column([[pause_plan_button], [abort_plan_button]])]]\n\n    button_size = [20,6]\n    button_pad = (2,10)\n    estop_button = [[sg.Button('ESTOP', button_color=('white','red'), size=button_size, pad=button_pad) ]]\n    manual_button = [[sg.Button('Send Command', button_color=('white','green'), size=button_size, pad=button_pad) ]]\n    clear_error_button = [[sg.Button('Clear Error', 
button_color=('white','orange red'), size=[10,6], pad=button_pad)]]\n\n    col2 = [[sg.Graph(canvas_size=(700,700), graph_bottom_left=config.graph_bottom_left, graph_top_right=config.graph_top_right, float_values=True, key=\"_GRAPH_\", background_color=\"light grey\") ],\n            [sg.Column(target_element), sg.Column(action_element), sg.Column(data_element)],\n            [sg.Column(plan_buttons), sg.Column(estop_button), sg.Column(manual_button), sg.Column(clear_error_button)]  ]\n\n    # Right hand column with text output\n    col3 = [[sg.Output(size=(70, 50), echo_stdout_stderr=True)]]\n\n    return [[ sg.Column(col1), sg.Column(col2), sg.Column(col3)]]\n\n\nclass CmdGui:\n\n    def __init__(self, config):\n\n        self.config = config\n\n        sg.change_look_and_feel('DarkBlack')\n\n        panel_names = [\"{}\".format(n) for n in self.config.ip_map]\n        panel_names += ['plan']\n        target_names = copy.deepcopy(panel_names)\n        target_names.remove('plan')\n        layout = setup_gui_layout(config, panel_names, target_names)\n\n        self.window = sg.Window('Robot Controller', layout, return_keyboard_events=True, use_default_focus=False)\n        self.window.finalize()\n\n        self.viz_figs = {}\n        self.manual_action_debounce_timer = Utils.NonBlockingTimer(1.0)\n        self.drawn_plan_grid = False\n        self.last_cycle_number_drawn = -1\n\n        self._draw_environment()\n\n    def close(self):\n        self.window.close()\n\n    def update(self):\n\n        event, values = self.window.read(timeout=20)\n        # if event != \"__TIMEOUT__\":\n        #     print(event)\n        #     print(values)\n\n        # At exit, check if we should keep marvelmind on\n        if event is None or event == 'Exit':\n            if self.config.OFFLINE_TESTING or self.config.SKIP_MARVELMIND:\n                return 'Exit', None\n            else:\n                clicked_value = sg.popup_yes_no('Do you want to keep the 
Marvelmind running')\n                if clicked_value == \"Yes\":\n                    return \"ExitMM\", None\n                else:\n                    return 'Exit', None\n        \n        # Sending a manual action (via button or pressing enter)\n        if event in ('Send Command', \"Clear Error\", '\\r', '\\n'):\n            manual_action = None\n            if self.manual_action_debounce_timer.check():\n                if event == \"Clear Error\":\n                    manual_action = (values['_TARGET_'], Action(ActionTypes.CLEAR_ERROR, \"ClearError\"))\n                else:\n                    manual_action = self._parse_manual_action(values)\n            if manual_action is not None:\n                self.window['_ACTION_DATA_'].update(\"\")\n                self.manual_action_debounce_timer = Utils.NonBlockingTimer(1.0)\n                return 'Action', manual_action\n\n        # Pressing the run plan button\n        if event == \"_RUN_PLAN_\":\n            clicked_value = sg.popup_yes_no('Ready to start plan?')\n            if clicked_value == \"Yes\":\n                return \"Run\", None\n\n        if event == \"_PLAN_FILE_\":\n            return \"Load\", values[\"_PLAN_FILE_\"]\n\n        if event == \"_PAUSE_PLAN_\":\n            return \"Pause\", None\n\n        if event == \"_ABORT_PLAN_\":\n            clicked_value = sg.popup_yes_no('Abort plan?')\n            if clicked_value == \"Yes\":\n                return \"Abort\", None\n\n        if event in [\"_INC_CYCLE_\",\"_DEC_CYCLE_\",\"_INC_ACTION_\",\"_DEC_ACTION_\"]:\n            return event, None\n\n        if event in [\"_SET_CYCLE_\", \"_SET_ACTION_\"]:\n            return event, (values['_CYCLE_FIELD_'], values[\"_ACTION_FIELD_\"])\n\n        if event == \"ESTOP\" or event == \" \":\n            return \"ESTOP\", None\n\n        return None, None\n    \n    def update_status_panels(self, metrics):\n        for key, metric in metrics.items():\n            if key == 'mm' or key == 
'base':\n                continue\n            elif key == 'plan':\n                self._update_plan_panel(metric)\n            else:\n                self._update_robot_panel(key, metric) \n        \n\n    def _parse_manual_action(self, values):\n        target = values['_TARGET_']\n        action_type = ActionTypes(values['_ACTION_'])\n        data_str = values['_ACTION_DATA_']\n        name = 'ManualAction'\n\n        action = None\n        if action_type in [ActionTypes.MOVE_COARSE, ActionTypes.MOVE_REL, ActionTypes.MOVE_REL_SLOW, \n                           ActionTypes.MOVE_FINE, ActionTypes.MOVE_FINE_STOP_VISION]:\n            data = data_str.split(',')\n            data = [x.strip() for x in data]\n            if len(data) != 3:\n                logging.warning(\"Invalid data: {}\".format(data))\n                return None\n            action = MoveAction(action_type, name, data[0], data[1], data[2])\n        elif action_type in [ActionTypes.MOVE_CONST_VEL]:\n            data = data_str.split(',')\n            data = [x.strip() for x in data]\n            if len(data) != 4:\n                logging.warning(\"Invalid data: {}\".format(data))\n                return None\n            action = MoveConstVelAction(action_type, name, data[0], data[1], data[2], data[3])\n        elif action_type in [ActionTypes.SET_POSE]:\n            data = data_str.split(',')\n            data = [x.strip() for x in data]\n            if len(data) != 3:\n                logging.warning(\"Invalid data: {}\".format(data))\n                return None\n            action = SetPoseAction(action_type, name, data[0], data[1], data[2])\n        elif action_type == ActionTypes.MOVE_WITH_VISION:\n            data = data_str.split(',')\n            data = [x.strip() for x in data]\n            if len(data) != 3:\n                data = (0,0,0)\n                logging.warning(\"Assuming position of (0,0,0) for vision move\")\n            action = MoveAction(action_type, name, data[0], 
data[1], data[2])\n        elif action_type == ActionTypes.WAIT:\n            data = data_str.split(',')\n            data = [x.strip() for x in data]\n            action = WaitAction(action_type, name, float(data[0]))\n        else:\n            action = Action(action_type, name)\n\n        return (target, action)\n\n    def _udpate_cycle_button_status(self, disabled):\n        self.window['_INC_CYCLE_'].update(disabled=disabled)\n        self.window['_DEC_CYCLE_'].update(disabled=disabled)\n        self.window['_INC_ACTION_'].update(disabled=disabled)\n        self.window['_DEC_ACTION_'].update(disabled=disabled)\n        self.window['_CYCLE_FIELD_'].update(disabled=disabled)\n        self.window['_ACTION_FIELD_'].update(disabled=disabled)\n        self.window['_SET_CYCLE_'].update(disabled=disabled)\n        self.window['_SET_ACTION_'].update(disabled=disabled)\n\n    def _update_plan_button_status(self, plan_status):\n        if plan_status == PlanStatus.NONE:\n            self.window['_RUN_PLAN_'].update(text='Run', disabled=True)\n            self.window['_LOAD_PLAN_'].update(disabled=False)\n            self.window['_PAUSE_PLAN_'].update(disabled=True)\n            self.window['_ABORT_PLAN_'].update(disabled=True)\n            self._udpate_cycle_button_status(disabled=True)\n        elif plan_status == PlanStatus.LOADED or plan_status == PlanStatus.DONE:\n            self.window['_RUN_PLAN_'].update(text='Run', disabled=False)\n            self.window['_LOAD_PLAN_'].update(disabled=False)\n            self.window['_PAUSE_PLAN_'].update(disabled=True)\n            self.window['_ABORT_PLAN_'].update(disabled=True)\n            self._udpate_cycle_button_status(disabled=False)\n        elif plan_status == PlanStatus.RUNNING:\n            self.window['_RUN_PLAN_'].update(text='Run', disabled=True)\n            self.window['_LOAD_PLAN_'].update(disabled=True)\n            self.window['_PAUSE_PLAN_'].update(disabled=False)\n            
self.window['_ABORT_PLAN_'].update(disabled=False)\n            self._udpate_cycle_button_status(disabled=True)\n        elif plan_status == PlanStatus.PAUSED:\n            self.window['_RUN_PLAN_'].update(text='Resume', disabled=False)\n            self.window['_LOAD_PLAN_'].update(disabled=True)\n            self.window['_PAUSE_PLAN_'].update(disabled=True)\n            self.window['_ABORT_PLAN_'].update(disabled=False)\n            self._udpate_cycle_button_status(disabled=False)\n        elif plan_status == PlanStatus.ABORTED:\n            self.window['_RUN_PLAN_'].update(text='Restart', disabled=False)\n            self.window['_LOAD_PLAN_'].update(disabled=False)\n            self.window['_PAUSE_PLAN_'].update(disabled=True)\n            self.window['_ABORT_PLAN_'].update(disabled=True)\n            self._udpate_cycle_button_status(disabled=False)\n        else:\n            logging.warning(\"Unhandled plan status for button state: {}\".format(plan_status))\n\n    def _update_plan_panel(self, status_dict):\n        status_str = \"Plan is not running\"\n        color_str = STATUS_PANEL_BAD_COLOR\n        if status_dict:\n            try:\n                self._update_plan_button_status(status_dict['status'])\n\n                status_str = \"\"\n                plan_state_str = \"{}\".format(status_dict['status']).split('.')[1]\n                status_str += \"Plan status: {}\\n\".format(plan_state_str)\n                status_str += \"Plan filename: {}\\n\".format(status_dict['filename'])\n                status_str += \"Next cycle num: {}\\n\".format(status_dict['next_cycle_number'])\n                status_str += \"Idle bots: {}\\n\".format(status_dict['idle_bots'])\n                for id, data in status_dict['robots'].items():\n                    needs_restart_str = ''\n                    if data['needs_restart']:\n                        needs_restart_str = \"(Needs Restart)\"\n                    status_str += \"{}{}:\\n\".format(id, 
needs_restart_str)\n                    status_str += \"  Cycle id: {}\\n\".format(data[\"cycle_id\"])\n                    status_str += \"  Action id: {}\\n\".format(data[\"action_id\"])\n                    status_str += \"  Action name: {}\\n\".format(data[\"action_name\"])\n                    status_str += \"  Tile Coordinate: {}\\n\".format(data[\"tile_coordinate\"])\n                    status_str += \"  Vision offset: {}\\n\".format(data[\"vision_offset\"])\n                \n                # Set panel coloring based on state\n                if plan_state_str == \"PAUSED\" or plan_state_str == \"ABORTED\":\n                    color_str = STATUS_PANEL_BAD_COLOR\n                elif plan_state_str != \"NONE\":\n                    color_str = STATUS_PANEL_OK_COLOR\n            except Exception as e:\n                status_str = \"Bad dict: \" + str(status_dict)\n\n        self.window['_PLAN_STATUS_'].update(status_str, background_color=color_str)\n\n    def _update_robot_panel(self, robot_id, status_dict):\n        status_str = \"Cannot get {} status\".format(robot_id)\n        color_str = STATUS_PANEL_BAD_COLOR\n        if status_dict:\n            try:\n                robot_pose = [status_dict['pos_x'], status_dict['pos_y'], math.degrees(status_dict['pos_a'])]\n                last_mm_pose = [status_dict['last_mm_x'],status_dict['last_mm_y'], math.degrees(status_dict['last_mm_a']), status_dict['last_mm_used']]\n                status_str = \"\"\n                status_str += \"Position: [{0:.3f} m, {1:.3f} m, {2:.2f} deg]\\n\".format(robot_pose[0], robot_pose[1], robot_pose[2])\n                status_str += \"Last MM Pose: [{:.3f} m, {:.3f} m, {:.2f} deg]\\n\".format(\n                        last_mm_pose[0], last_mm_pose[1], last_mm_pose[2])\n                status_str += \"Velocity: [{0:.3f} m/s, {1:.3f} m/s, {2:.2f} deg/s]\\n\".format(status_dict['vel_x'],status_dict['vel_y'], math.degrees(status_dict['vel_a']))\n                status_str += 
\"Vision Pose : [{0:.3f} m, {1:.3f} m, {2:.2f} deg]\\n\".format(status_dict['vision_x'],status_dict['vision_y'], math.degrees(status_dict['vision_a']))\n                status_str += \"Camera Raw Pose : [{0:.3f} m, {1:.3f} m, {2:.2f} deg]\\n\".format(status_dict['cam_pose_x'],status_dict['cam_pose_y'], math.degrees(status_dict['cam_pose_a']))\n                status_str += \"Cam Status: Both-{} | Side-{} | Rear-{}\\n\".format(status_dict[\"cam_both_ok\"],status_dict[\"cam_side_ok\"], status_dict[\"cam_rear_ok\"])\n                status_str += \"Raw Camera px: Side [{:d}, {:d}] Rear: [{:d}, {:d}]\\n\".format( \\\n                    int(status_dict['cam_side_u']),int(status_dict['cam_side_v']), int(status_dict['cam_rear_u']), int(status_dict['cam_rear_v']))\n                status_str += \"Localization total confidence: {:.1f}%\\n\".format(status_dict['localization_total_confidence']*100)\n                status_str += \"  Axes confidence: [{:.1f}%, {:.1f}%, {:.1f}%]\\n\".format(\n                    status_dict['localization_confidence_x']*100,status_dict['localization_confidence_y']*100,status_dict['localization_confidence_a']*100)\n                status_str += \"Localization position uncertainty: {:.2f}\\n\".format(status_dict['last_position_uncertainty'])\n                status_str += \"Controller timing: {} ms\\n\".format(status_dict['controller_loop_ms'])\n                status_str += \"Position timing:   {} ms\\n\".format(status_dict['position_loop_ms'])\n                status_str += \"Camera timing:   {} ms\\n\".format(status_dict['cam_loop_ms'])\n                status_str += \"Current action:   {}\\n\".format(status_dict['current_action'].split('.')[-1])\n                status_str += \"Motion in progress: {}\\n\".format(status_dict[\"in_progress\"])\n                status_str += \"Has error: {}\\n\".format(status_dict[\"error_status\"])\n                status_str += \"Counter:   {}\\n\".format(status_dict['counter'])\n\n                # Also 
update the visualization position\n                self._update_robot_viz_position(robot_id, robot_pose)\n                self._update_last_mm_pose(robot_id, last_mm_pose)\n                # If there is target position data populated, draw the target too\n                if 'current_move_data' in status_dict.keys():\n                    self._update_target_viz_position(robot_id, robot_pose, status_dict['current_move_data'])\n                color_str = STATUS_PANEL_OK_COLOR\n                if status_dict[\"error_status\"]:\n                    color_str = STATUS_PANEL_BAD_COLOR\n\n            except Exception as e:\n                if \"offline\" in str(status_dict):\n                    status_str = str(status_dict)\n                else:\n                    status_str = \"Bad dict: \" + str(status_dict)\n\n        self.window['_{}_STATUS_'.format(robot_id.upper())].update(status_str, background_color=color_str)\n\n    def _update_robot_viz_position(self, robot_id, robot_pose):\n        if robot_id in self.viz_figs.keys():\n            for f in self.viz_figs[robot_id]:\n                self.window['_GRAPH_'].DeleteFigure(f)\n        self.viz_figs[robot_id] = self._draw_robot(robot_pose, use_target_color=False)\n\n    def _update_target_viz_position(self, robot_id, robot_pose, target_pose):\n        viz_key = \"{}_target\".format(robot_id)\n        if viz_key in self.viz_figs.keys():\n            for f in self.viz_figs[viz_key]:\n                self.window['_GRAPH_'].DeleteFigure(f)\n\n        if robot_pose and target_pose:\n            target_line = self.window['_GRAPH_'].draw_line(point_from=robot_pose[:2], point_to=target_pose[:2], color='yellow', width=2)\n            self.viz_figs[viz_key] = self._draw_robot(target_pose, use_target_color=True)\n            self.viz_figs[viz_key].append(target_line)  \n\n    def _update_last_mm_pose(self, robot_id, last_mm_pose):\n        viz_key = \"{}_last_mm\".format(robot_id)\n        if viz_key in 
self.viz_figs.keys():\n            for f in self.viz_figs[viz_key]:\n                self.window['_GRAPH_'].DeleteFigure(f)\n\n        used = last_mm_pose[3]\n        if used:\n            color = 'green'\n            thickness = 2\n        else:\n            color = 'red'\n            thickness = 2\n        global_pose_2d = np.array(last_mm_pose[:2])\n        angle = last_mm_pose[2]\n        viz_handles = []\n\n        direction_pt = np.array([self.config.robot_direction_indicator_length, 0])\n        direction_arrow_x = self.config.robot_direction_indicator_arrow_length*math.cos(math.radians(self.config.robot_direction_indicator_arrow_angle))\n        direction_arrow_y = self.config.robot_direction_indicator_arrow_length*math.sin(math.radians(self.config.robot_direction_indicator_arrow_angle))\n        direction_arrow_left =  direction_pt + np.array([-direction_arrow_x,  direction_arrow_y])\n        direction_arrow_right = direction_pt + np.array([-direction_arrow_x, -direction_arrow_y])\n        global_direction_pt = Utils.TransformPos(direction_pt, global_pose_2d, angle)\n        viz_handles.append(self.window['_GRAPH_'].draw_line(point_from=list(global_pose_2d), point_to=list(global_direction_pt), color=color, width=thickness))\n        global_direction_arrow_left = Utils.TransformPos(direction_arrow_left, global_pose_2d, angle)\n        viz_handles.append(self.window['_GRAPH_'].draw_line(point_from=list(global_direction_pt), point_to=list(global_direction_arrow_left), color=color, width=thickness))\n        global_direction_arrow_right = Utils.TransformPos(direction_arrow_right, global_pose_2d, angle)\n        viz_handles.append(self.window['_GRAPH_'].draw_line(point_from=list(global_direction_pt), point_to=list(global_direction_arrow_right), color=color, width=thickness))\n        self.viz_figs[viz_key] = viz_handles\n\n    def _draw_robot(self, robot_pose, use_target_color):\n\n        robot_line_color = 'black'\n        robot_line_thickness = 2\n        
direction_line_color = 'red'\n        direction_line_thickness = 1\n        if use_target_color:\n            robot_line_color = 'green'\n            robot_line_thickness = 2\n            direction_line_color = 'tomato'\n            direction_line_thickness = 1\n\n\n        robot_global_pos_2d = np.array([robot_pose[0],robot_pose[1]])\n        robot_angle = robot_pose[2]\n        robot_fig_handles = []\n        \n        # Robot chassis points in robot frame\n        chassis_fl = np.array([self.config.robot_rotation_center_offset,  self.config.robot_chassis_size/2.0])\n        chassis_fr = np.array([self.config.robot_rotation_center_offset, -self.config.robot_chassis_size/2.0])\n        chassis_bl = np.array([self.config.robot_rotation_center_offset - self.config.robot_chassis_size,  self.config.robot_chassis_size/2.0])\n        chassis_br = np.array([self.config.robot_rotation_center_offset - self.config.robot_chassis_size, -self.config.robot_chassis_size/2.0])\n\n        # Tile points in robot frame\n        tile_bl = -self.config.tile_to_robot_offset\n        tile_br = tile_bl + np.array([0, -self.config.tile_size_width_meters])\n        tile_fl = tile_bl + np.array([self.config.tile_size_height_meters, 0])\n        tile_fr = tile_br + np.array([self.config.tile_size_height_meters, 0])\n        tile_front_center = (tile_fl + tile_fr) / 2.0\n\n        # Robot direction indicator\n        scale_factor = 0.7\n        direction_pt = np.array([scale_factor*self.config.robot_direction_indicator_length, 0])\n        direction_arrow_x = scale_factor*self.config.robot_direction_indicator_arrow_length*math.cos(math.radians(self.config.robot_direction_indicator_arrow_angle))\n        direction_arrow_y = scale_factor*self.config.robot_direction_indicator_arrow_length*math.sin(math.radians(self.config.robot_direction_indicator_arrow_angle))\n        direction_arrow_left =  direction_pt + np.array([-direction_arrow_x,  direction_arrow_y])\n        direction_arrow_right = 
direction_pt + np.array([-direction_arrow_x, -direction_arrow_y])\n\n        # Collect, transform, and draw chassis points\n        chassis_points_robot_frame = [chassis_fl, chassis_fr, chassis_br, chassis_bl] # order matters for drawing\n        chassis_points_global_frame = []\n        for pt in chassis_points_robot_frame:\n            global_pt = Utils.TransformPos(pt, robot_global_pos_2d, robot_angle)\n            chassis_points_global_frame.append(global_pt)\n        for i in range(4):\n            start_pt = list(chassis_points_global_frame[i])\n            end_pt = list(chassis_points_global_frame[(i+1)%4])\n            robot_fig_handles.append(self.window['_GRAPH_'].draw_line(point_from=start_pt, point_to=end_pt, color=robot_line_color, width = robot_line_thickness))\n\n        # Collect, transform, and draw tile points\n        tile_points_robot_frame = [tile_bl, tile_br, tile_fr, tile_fl] # order matters for drawing\n        tile_points_global_frame = []\n        for pt in tile_points_robot_frame:\n            global_pt = Utils.TransformPos(pt, robot_global_pos_2d, robot_angle)\n            tile_points_global_frame.append(global_pt)\n        for i in range(4):\n            start_pt = list(tile_points_global_frame[i])\n            end_pt = list(tile_points_global_frame[(i+1)%4])\n            robot_fig_handles.append(self.window['_GRAPH_'].draw_line(point_from=start_pt, point_to=end_pt, color=robot_line_color, width = robot_line_thickness))\n\n        # Draw angle indicator line\n        angle_root_pt = Utils.TransformPos(tile_front_center, robot_global_pos_2d, robot_angle)\n        global_direction_pt = Utils.TransformPos(direction_pt, angle_root_pt, robot_angle)\n        robot_fig_handles.append(self.window['_GRAPH_'].draw_line(point_from=list(angle_root_pt), point_to=list(global_direction_pt), color=direction_line_color, width=direction_line_thickness))\n        global_direction_arrow_left = Utils.TransformPos(direction_arrow_left, angle_root_pt, 
robot_angle)\n        robot_fig_handles.append(self.window['_GRAPH_'].draw_line(point_from=list(global_direction_pt), point_to=list(global_direction_arrow_left), color=direction_line_color, width=direction_line_thickness))\n        global_direction_arrow_right = Utils.TransformPos(direction_arrow_right, angle_root_pt, robot_angle)\n        robot_fig_handles.append(self.window['_GRAPH_'].draw_line(point_from=list(global_direction_pt), point_to=list(global_direction_arrow_right), color=direction_line_color, width=direction_line_thickness))\n\n        return robot_fig_handles\n\n    def _draw_environment(self):\n\n        # Robot boundaries\n        self.viz_figs[\"boundaires\"] = self.window['_GRAPH_'].draw_rectangle(self.config.robot_boundaries[0], self.config.robot_boundaries[1], line_color='red')\n\n        # Domino field boundaries\n        bottom_left = self.config.domino_field_origin\n        rect_width_height = Utils.TransformPos(np.array([self.config.field_width, self.config.field_height]), [0,0], self.config.domino_field_angle)\n        top_left = (bottom_left[0], bottom_left[1]+rect_width_height[1])\n        bottom_right = (bottom_left[0] + rect_width_height[0], bottom_left[1])\n        self.viz_figs[\"field\"] = self.window['_GRAPH_'].draw_rectangle(top_left, bottom_right, line_color='green', line_width=2)\n\n        # Base station\n        base_station_top_left = (self.config.base_station_boundaries[0][0], self.config.base_station_boundaries[1][1])\n        base_station_bottom_right = (self.config.base_station_boundaries[1][0], self.config.base_station_boundaries[0][1])\n        self.viz_figs[\"base\"] = self.window['_GRAPH_'].draw_rectangle(base_station_top_left, base_station_bottom_right, line_color='blue')\n\n        # X axis\n        left_side = (self.config.graph_bottom_left[0], 0)\n        right_side = (self.config.graph_top_right[0], 0)\n        self.viz_figs[\"xaxis\"] = self.window['_GRAPH_'].draw_line(left_side, right_side, color=\"black\", 
width=2)\n        x_text_location = (right_side[0] - self.config.axes_label_offset, right_side[1] + self.config.axes_label_offset)\n        self.viz_figs[\"xaxis_label\"] = self.window['_GRAPH_'].draw_text(\"X\", x_text_location)\n        x_ticks = [i for i in range(0, int(right_side[0]), self.config.axes_tick_spacing)] + [i for i in range(0, int(left_side[0]), -self.config.axes_tick_spacing)]\n        x_ticks = np.sort(np.unique(np.array(x_ticks)))\n        for val in x_ticks:\n            bottom = (val, -self.config.axes_tick_size/2)\n            top = (val, self.config.axes_tick_size/2)\n            self.viz_figs[\"xtick_{}\".format(val)] = self.window['_GRAPH_'].draw_line(bottom, top, color=\"black\")\n            label_location = (bottom[0] + self.config.axes_label_offset, bottom[1])\n            self.viz_figs[\"xtick_label_{}\".format(val)] = self.window['_GRAPH_'].draw_text(\"{}\".format(val), label_location)\n\n        # Y axis\n        bottom_side = (0, self.config.graph_bottom_left[1])\n        top_side = (0, self.config.graph_top_right[1])\n        self.viz_figs[\"yaxis\"] = self.window['_GRAPH_'].draw_line(bottom_side, top_side, color=\"black\", width=2)\n        y_text_location = (top_side[0] + self.config.axes_label_offset, top_side[1] - self.config.axes_label_offset)\n        self.viz_figs[\"yaxis_label\"] = self.window['_GRAPH_'].draw_text(\"Y\", y_text_location)\n        y_ticks = [i for i in range(0, int(top_side[1]), self.config.axes_tick_spacing)] + [i for i in range(0, int(bottom_side[1]), -self.config.axes_tick_spacing)]\n        y_ticks = np.sort(np.unique(np.array(y_ticks)))\n        for val in y_ticks:\n            left = (-self.config.axes_tick_size/2, val)\n            right = (self.config.axes_tick_size/2, val)\n            self.viz_figs[\"ytick_{}\".format(val)] = self.window['_GRAPH_'].draw_line(left, right, color=\"black\")\n            label_location = (right[0], right[1]+self.config.axes_label_offset)\n            
self.viz_figs[\"ytick_label_{}\".format(val)] = self.window['_GRAPH_'].draw_text(\"{}\".format(val), label_location)\n\n\n    def update_plan_display(self, plan_info):\n        if plan_info is None:\n            return\n        plan = plan_info[0]\n        plan_status = plan_info[1]\n        cycle_number = plan_info[2]\n\n        if not self.drawn_plan_grid:\n            rect_width_height = Utils.TransformPos(np.array([self.config.field_width, self.config.field_height]), [0,0], self.config.domino_field_angle)\n            bottom_left = self.config.domino_field_origin\n            top_left = (bottom_left[0], bottom_left[1]+rect_width_height[1])\n            bottom_right = (bottom_left[0] + rect_width_height[0], bottom_left[1])\n            for i in range(1, self.config.num_tiles_width):\n                dx = Utils.TransformPos(np.array([self.config.tile_size_width_meters, 0]), [0,0], self.config.domino_field_angle)\n                p1 = list(bottom_left + dx * i)\n                p2 = list(top_left + dx * i)\n                self.viz_figs[\"plan_vert_{}\".format(i)] = self.window['_GRAPH_'].draw_line(p1, p2, color = \"grey17\")\n            for i in range(1, self.config.num_tiles_height):\n                dy = Utils.TransformPos(np.array([0, self.config.tile_size_height_meters]), [0,0], self.config.domino_field_angle)\n                p1 = list(bottom_left + dy * i)\n                p2 = list(bottom_right + dy * i)\n                self.viz_figs[\"plan_hz_{}\".format(i)] = self.window['_GRAPH_'].draw_line(p1, p2, color = \"grey17\")\n\n            if type(plan) is SubsectionPlan:\n                origin = self.config.domino_field_origin\n                bottom_left = Utils.TransformPos(np.array(plan.field.tiles[-1].getPlacementPositionInMeters()), origin, self.config.domino_field_angle)\n                top_right = Utils.TransformPos(np.array(plan.field.tiles[0].getPlacementPositionInMeters())+\n                    
np.array((self.config.tile_size_width_meters,self.config.tile_size_height_meters)), origin, self.config.domino_field_angle)\n                self.viz_figs[\"subfield\"] = self.window['_GRAPH_'].draw_rectangle(bottom_left, top_right, line_color='green', line_width=3)\n\n            self.drawn_plan_grid = True\n\n        if cycle_number != self.last_cycle_number_drawn:\n            tile_color = \"red\"\n            if plan_status in [PlanStatus.RUNNING, PlanStatus.PAUSED]:\n                tile_color = \"blue\"\n\n            # Only draw a marker for a valid tile index\n            if cycle_number < len(plan.field.tiles):\n                if cycle_number < 0:\n                    cycle_number = 0\n                if \"plan_marker\" in self.viz_figs:\n                    self.window['_GRAPH_'].DeleteFigure(self.viz_figs[\"plan_marker\"])\n                tile = plan.field.tiles[cycle_number]\n                location_field_frame = np.array(tile.getPlacementPositionInMeters()) + np.array([self.config.tile_size_width_meters, self.config.tile_size_height_meters])/2.0\n                location_global_frame = list(Utils.TransformPos(location_field_frame, self.config.domino_field_origin, self.config.domino_field_angle))\n                self.viz_figs[\"plan_marker\"] = self.window['_GRAPH_'].draw_circle(location_global_frame, 0.25, fill_color=tile_color)\n                self.last_cycle_number_drawn = cycle_number\n\n\n\nclass Master:\n\n    def __init__(self, cfg, gui_handle):\n\n        self.cfg = cfg\n        self.cmd_gui = gui_handle\n        self.keep_mm_running = False\n\n        logging.info(\"Initializing Master\")\n        self.runtime_manager = RuntimeManager(self.cfg)\n        self.runtime_manager.initialize()\n\n    def loop(self):\n\n        ready_to_exit = False\n        while not ready_to_exit:\n            try:\n                self.runtime_manager.update()\n                ready_to_exit = self.update_gui_and_handle_input()\n            except Exception as e:\n                logging.warning(\"Unhandled 
exception {}\".format(str(e)))\n                if self.runtime_manager.get_plan_status() is PlanStatus.RUNNING:\n                    self.runtime_manager.set_plan_status(PlanStatus.PAUSED)\n\n        # Clean up whenever loop exits\n        self.runtime_manager.shutdown(self.keep_mm_running)\n        self.cmd_gui.close()\n\n    def update_gui_and_handle_input(self):\n        # Handle any input from gui\n        event_type, event_data = self.cmd_gui.update()\n        if event_type == \"Exit\":\n            return True\n        if event_type == \"ExitMM\":\n            self.keep_mm_running = True\n            return True\n        if event_type == \"Run\":\n            if self.runtime_manager.get_plan_status() == PlanStatus.PAUSED:\n                logging.info(\"PLAN RESUMED\")\n            self.runtime_manager.set_plan_status(PlanStatus.RUNNING)\n        if event_type == \"Load\":\n            if event_data != '':\n                self.runtime_manager.load_plan(event_data)\n        if event_type == \"Pause\":\n            logging.info(\"PLAN PAUSED\")\n            self.runtime_manager.estop()\n            self.runtime_manager.set_plan_status(PlanStatus.PAUSED)\n        if event_type == \"Abort\":\n            logging.warning(\"PLAN ABORTED\")\n            self.runtime_manager.estop()\n            self.runtime_manager.set_plan_status(PlanStatus.ABORTED)\n        if event_type == \"Action\":\n            self.runtime_manager.run_manual_action(event_data)\n        if event_type == \"_INC_CYCLE_\":\n            self.runtime_manager.increment_robot_cycle(\"robot1\")\n        if event_type == \"_DEC_CYCLE_\":\n            self.runtime_manager.decrement_robot_cycle(\"robot1\")\n        if event_type == \"_INC_ACTION_\":\n            self.runtime_manager.increment_robot_action(\"robot1\")\n        if event_type == \"_DEC_ACTION_\":\n            self.runtime_manager.decrement_robot_action(\"robot1\")\n        if event_type == \"_SET_CYCLE_\":\n            
self.runtime_manager.set_cycle(\"robot1\", int(event_data[0]))\n        if event_type == \"_SET_ACTION_\":\n            self.runtime_manager.set_action(\"robot1\", int(event_data[1]))\n        if event_type == \"ESTOP\":\n            self.runtime_manager.estop()\n            if self.runtime_manager.get_plan_status() == PlanStatus.RUNNING:\n                self.runtime_manager.set_plan_status(PlanStatus.PAUSED)\n                logging.warning(\"Pausing plan due to ESTOP event\")\n\n        # Get metrics and update the displayed info\n        metrics = self.runtime_manager.get_all_metrics()\n        self.cmd_gui.update_status_panels(metrics)\n        self.cmd_gui.update_plan_display(self.runtime_manager.get_plan_info())\n\n        return False\n\n\n\ndef configure_logging(path):\n\n    rootLogger = logging.getLogger()\n    rootLogger.setLevel(logging.INFO)\n\n    if not os.path.isdir(path):\n        os.mkdir(path)\n\n    fileHandler = logging.FileHandler(os.path.join(path,\"master.log\"), 'w+')\n    fileFormatter = logging.Formatter(\"%(asctime)s [%(levelname)s] %(message)s\")\n    fileHandler.setFormatter(fileFormatter)\n    rootLogger.addHandler(fileHandler)\n\n    consoleHandler = logging.StreamHandler()\n    consoleFormatter = logging.Formatter(\"%(message)s\")\n    consoleHandler.setFormatter(consoleFormatter)\n    rootLogger.addHandler(consoleHandler)\n\n\nif __name__ == '__main__':\n    # Setup config and gui\n    cfg = config.Config()\n    gui = CmdGui(cfg)\n    # Need to setup gui before logging to ensure that output pane captures logs correctly\n    configure_logging(cfg.log_folder)\n    # Startup master and loop forever\n    m = Master(cfg, gui)\n    m.loop()\n\n\n"
  },
  {
    "path": "src/master/README.md",
    "content": "# **Warning! This is probably out of date and might be inaccurate. Use at your own risk.**\n\n# Master\nThis is the top level 'Master' software for controlling the Domino Robot System. It is the central hub for communication and control of all other robots, provides a GUI for visualization and debugging, and handles executing the top level plan to set up all the dominos.\n\nThere are two programs that can be run within the master software:\n- `MasterMain.py` is the main program that runs the GUI and controls everything and is the one the rest of this README will primarily focus on.\n- `FieldPlanner.py` will generate a plan using an input image, available domino colors, and many other configuration parameters (all defined in `config.py`). This plan can then be saved to a file (in the `src/master/plans/` folder) and loaded at a later point for execution. You probably shouldn't ever need to run this.\n\n# Table of Contents\n<!-- TOC -->\n\n- [**Warning! This is probably out of date and might be inaccurate. Use at your own risk.**](#warning-this-is-probably-out-of-date-and-might-be-inaccurate-use-at-your-own-risk)\n- [Master](#master)\n- [Table of Contents](#table-of-contents)\n- [Installation](#installation)\n- [Usage](#usage)\n    - [Robot Setup](#robot-setup)\n    - [Running MasterMain](#running-mastermain)\n    - [Using the MasterMain GUI](#using-the-mastermain-gui)\n        - [Window Overview](#window-overview)\n        - [Running Commands](#running-commands)\n        - [Summary of available commands](#summary-of-available-commands)\n\n<!-- /TOC -->\n\n# Installation\n\n1. Ensure you have a version of Python 3 installed by installing from https://www.python.org/. This project was developed using Python 3.7, but any recent Python version should be fine. I've only used it on Windows 10 so the instructions here are for Windows. It may or may not work on other operating systems.\n2. 
Open a new terminal window and navigate to the folder where you downloaded the DominoRobot repository\n```\ncd path\\to\\DominoRobot\n```\n3. Create a Python virtual environment in the DominoRobot folder\n```\npython -m venv venv\n```\n4. Activate the virtual environment (your terminal should show `(venv)` at the front of the line to indicate it is active):\n```\n.\\venv\\Scripts\\activate\n```\n5. Install required Python packages in the virtual environment:\n```\npip install -r src\\master\\requirements.txt\n```\n\n# Usage\n\n## Robot Setup\nAny robot you wish to connect to and control must be set up and running before the master program can do anything.\n1. Ensure you are connected to the DominoNet wifi. Talk to Alex if you need help getting set up on the network.\n2. Power on the robot by plugging in both sets of batteries and turning on all power switches. You can leave the hardware ESTOP button pressed for now while finishing the setup, but just make sure you release it before trying to send movement commands or it won't go anywhere!\n3. Connect to the robot over SSH.\n    - If this is your first time using SSH, you may need to install an SSH program like [PuTTY](https://www.putty.org/) or set up [Windows SSH](https://www.pugetsystems.com/labs/hpc/How-To-Use-SSH-Client-and-Server-on-Windows-10-1470/)\n    - Figure out the IP address of the robot you are interested in connecting to (If this list becomes out of date, the up-to-date info will be in the `ip_map` variable in `config.py`)\n    \n    | Target        | IP           |\n    |---------------|--------------|\n    | robot1        | 192.168.1.5  |\n\n    - SSH to `pi@<ip_addr>` and enter the password to connect\n\n4. Once you are connected to the robot, move to the DominoRobot directory:\n```\ncd DominoRobot\n```\n5. Ensure that the robot has the latest version of the correct software\n```\ngit checkout master && git pull\n```\n6. 
Recompile the robot software (this could take a minute or two if it has to compile a lot of stuff)\n```\nmake\n```\n7. Start the robot software\n```\nrun_robot\n```\n8. This should start printing out a bunch of info on the screen and the robot should be ready to connect to the master (See the next section on [Running MasterMain](#running-mastermain)). Note that you must keep this terminal window running as closing it will stop the robot software. If at any point you want to stop the software, just use `Ctrl+C` to stop it. You can then restart it with `run_robot` again.\n9. **IMPORTANT**: When you are done and want to shut everything down, make sure you shut down the Raspberry Pi by running the shutdown command BEFORE turning off the power. If you just turn off the power without shutting down, there is a possibility that the SD card can get corrupted.\n```\nsudo shutdown -h now\n```\n\n## Running MasterMain\n1. Open a new terminal window and navigate to the folder where you downloaded the DominoRobot repository (if you aren't there already from the installation step)\n```\ncd path\\to\\DominoRobot\n```\n2. Ensure that your virtual environment is active (your terminal should show `(venv)` at the front of the line to indicate it is active):\n```\n.\\venv\\Scripts\\activate\n```\n3. Run the program:\n```\npython src\\master\\MasterMain.py\n```\n4. It may take a second for the window to show up, but as long as a window pops up and the command line doesn't spit out a bunch of errors, you should be good to go. You can stop the program at any time with the X in the top right of the window, or using `Ctrl+C` on the command line.\n5. 
If you need to stop and re-run the program, you only need to re-run step 3 above as long as you don't close the terminal window (In most terminals you can just push the up arrow to get back the previous command you ran).\n\n## Using the MasterMain GUI\n### Window Overview\nThe window will look something like this when you launch it:\n\n![GUI window](readme_images/gui_start.png)\n\nOn the left are the status panels for each component connected to the system. They are all red right now because nothing is connected and will turn green and display information when that component is active, like this:\n\n![Active robot panel](readme_images/robot_status_green.png)\n\nIn the middle is a display that shows where the robot and other objects are in the scene. As components are connected and move around, this will update to show their location. On the bottom of the middle section are the control buttons, which are your main way of interacting with the system:\n\n![Control buttons](readme_images/control_panel.png)\n\nOn the right side of the window, the program prints out information about what it is doing. This information is also mirrored to the console and logged into `DominoRobot\\log\\master.log` (which is overwritten on each new run).\n\n### Running Commands\nTo send a command to a robot, a few prerequisites must be met.\n\n1. The robot has to be connected to the master (i.e. the box should be green and the counter value should be changing). If you followed the [setup](#robot-setup) steps above, this probably should be the case as soon as you run `MasterMain` on your computer. If not, you'll have to debug why (check that your SSH connection is okay and that the robot program hasn't crashed)\n2. The robot is programmed to run only a single command at a time and will ignore any commands sent while another one is active, so make sure a previous command isn't still running. 
If there is no motion from the robot/lifter and the \"Motion in progress\" value is False, it should let you run a command.\n\nTo actually run the command, you need 3 pieces of information.\n1. The `Target` field is which robot/device to send the command to. Make sure you have the right target selected from the drop-down menu.\n\n![Target menu](readme_images/control_target_select.png)\n\n2. The `Action` field is which action you would like to run on the target. See below for a summary of what the various actions do.\n\n![Action menu](readme_images/control_action_type.png)\n\n3. Some actions require additional data to be specified. If additional data is needed for an action, you must specify it in the `Data` field by typing it in. See the table below for which actions require additional data. If an action does not require additional data, any information (blank or otherwise) in the data field is ignored.\n\nOnce all the prerequisites are met and the required fields are filled in, you may use the `<Enter>` key or the `Send Command` button to run the command on the target. If everything is successful, you will see values change in the status panel of the target and text printed out in the pane on the right.\n\nIf something goes wrong, you can use the big red ESTOP button on the command window to send an emergency stop signal to all robots. Note that this is a software ESTOP and requires the master to be functional enough to send the ESTOP signal and the target functional enough to receive and act on the ESTOP signal. There are certain failure modes where this will not be the case and this button will not function (e.g. the master software crashes and you can't push the button). 
In that case, use the physical ESTOP buttons on the robots to cut power.\n\n### Summary of available commands\n\n| Name  | Valid targets | Summary | Additional data|\n|-------|---------------|---------|----------------|\n|MOVE_COARSE | robot | Move the robot coarsely to an absolute position. | Comma-separated x,y,a pose (Example: `1.0,2,0` would tell the robot to move to x=1, y=2, a=0). Units are meters, meters, radians. | \n|MOVE_FINE | robot |  Move the robot very precisely to an absolute position.| Comma-separated x,y,a pose (Example: `0.1,-0.5,3.14` would tell the robot to move to x=0.1, y=-0.5, a=3.14). Units are meters, meters, radians.|\n|MOVE_REL | robot |Move the robot coarsely to a position relative to its current position.|Comma-separated x,y,a pose (Example: `1.0,2,0` would tell the robot to move to x=x_current+1, y=y_current+2, a=a_current+0). Units are meters, meters, radians. |\n|MOVE_CONST_VEL |robot | Move the robot coarsely at a constant velocity for a specified time.|Comma-separated x,y,a,t velocity and time (Example: `0.1,0.2,0,4` would tell the robot to move at a velocity of x_vel=0.1, y_vel=0.2, a_vel=0 for t=4 seconds). Units are meters/sec, meters/sec, radians/sec, seconds. |\n|NET | all |Check if the robot is connected to the network. | None |\n|LOAD| robot | Run the tray loading sequence on the robot. *Note that the tray will pause in the \"ready to load\" position until a LOAD_COMPLETE action is received*.| None |\n|PLACE| robot | Run the tray placement sequence on the robot.| None|\n|TRAY_INIT | robot | Run the tray initialization sequence on the robot. | None|\n|LOAD_COMPLETE|robot| Indicate to the robot that the base station has completed the loading process. | None |\n|ESTOP|all| Tell the target to immediately stop any current action/motion. | None |\n|WAIT_FOR_LOCALIZATION| robot | Tell the target to wait for the localization confidence to reach a good level. 
| None |\n|CLEAR_ERROR |all| Clear an error state that is preventing the target from starting a new action.| None |\n"
  },
  {
    "path": "src/master/RobotClient.py",
    "content": "# This is a basic TCP client for a robot that wraps up sending\n# and receiving various commands\n\nimport socket\nimport select\nimport json\nimport time\nimport logging\nimport copy\n\nPORT = 8123\nNET_TIMEOUT = 0.1 # seconds\nSTART_CHAR = \"<\"\nEND_CHAR = \">\"\n\nclass TcpClient:\n\n    def __init__(self, ip, port, timeout):\n        # create_connection raises an exception on failure, so the socket is valid past this point\n        self.socket = socket.create_connection((ip, port), timeout)\n        self.socket.setblocking(False)\n\n    def send(self, msg, print_debug=True):\n        totalsent = 0\n        if print_debug:\n            logging.info(\"TX: \" + msg)\n        msg = START_CHAR + msg + END_CHAR\n        msg_bytes = msg.encode()\n        while totalsent < len(msg_bytes):\n            sent = 0\n            try:\n                sent = self.socket.send(msg_bytes[totalsent:])\n            except socket.error:\n                pass\n            if sent == 0:\n                raise RuntimeError(\"socket connection broken\")\n            totalsent = totalsent + sent\n\n    def recieve(self, timeout=0.1, print_debug=True):\n\n        # logging.info(\"Checking socket ready\")\n        # socket_ready, _, _ = select.select([self.socket], [], [])\n        # if not socket_ready:\n        #     logging.info(\"Socket not ready\")\n        #     return \"\"\n\n        new_msg = \"\"\n        new_msg_ready = False\n        start_time = time.time()\n        while not new_msg_ready and time.time() - start_time < timeout:\n            # Get new data\n            try:\n                data = self.socket.recv(2048)\n            except socket.error:\n                continue\n            if data == b'':\n                raise RuntimeError(\"socket connection broken\")\n\n            # Decode data and parse into message\n            new_str = data.decode(encoding='UTF-8',errors='strict')\n\n            start_idx = new_str.find(START_CHAR)\n            end_idx = new_str.find(END_CHAR)\n\n            if start_idx != -1 and 
end_idx != -1:\n                new_msg = new_str[start_idx+1:end_idx]\n                new_msg_ready = True\n            elif start_idx != -1:\n                new_msg += new_str[start_idx+1:]\n            elif end_idx != -1:\n                new_msg += new_str[:end_idx]\n                new_msg_ready = True\n            else:\n                new_msg += new_str\n\n        if print_debug and new_msg:\n            logging.info(\"RX: \" + new_msg)\n        if not new_msg_ready:\n            #logging.info(\"Socket timeout\")\n            new_msg = \"\"\n\n        return new_msg\n\nclass ClientBase:\n\n    def __init__(self, ip): \n        self.client = TcpClient(ip, PORT, NET_TIMEOUT)\n        # Queue is used to help handle out-of-order messages received\n        self.incoming_queue = []\n\n    def wait_for_server_response(self, expected_msg_type, timeout=0.1, print_debug=True):\n        \"\"\"\n        Waits for specified time to get a reply from the server\n        Must set expected_msg_type to a string and this function will return the first message\n        received matching that type. Other messages are added to the queue and checked the\n        next time this function is called. 
If no matching message is received before timeout,\n        this function returns None\n        \"\"\"\n\n        # First, check the existing queue to see if the message we are looking for is there\n        for ix in range(len(self.incoming_queue)):\n            # Since type field existence is checked before insertion into queue, it should\n            # always be safe to check the field directly\n            if self.incoming_queue[ix]['type'] == expected_msg_type:\n                msg_copy = copy.copy(self.incoming_queue[ix])\n                del self.incoming_queue[ix]\n                return msg_copy\n\n        # If there was nothing in the queue matching the expected type, try to receive a message\n        start_time = time.time()\n        while time.time() - start_time < timeout:\n            try:\n                incoming_msg = self.client.recieve(timeout=0.1, print_debug=print_debug)\n            except socket.timeout:\n                break\n            if incoming_msg:\n                try:\n                    new_msg = json.loads(incoming_msg)\n                    if 'type' not in new_msg:\n                        logging.warning(\"Discarded msg for missing type field: {}\".format(new_msg))\n                        continue\n                    elif new_msg['type'] == expected_msg_type:\n                        return new_msg\n                    else:\n                        self.incoming_queue.append(new_msg)\n                except json.JSONDecodeError:\n                    logging.warning(\"Error decoding json: {}\".format(incoming_msg))\n                    break\n\n        # Will get here if timeout is reached or decode error happens\n        return None\n\n    def send_msg_and_wait_for_ack(self, msg, print_debug=False, timeout=0.1):\n        \"\"\" \n        Sends msg and ensures that the correct ack is returned\n        Logs a warning if ack is not received or an incorrect ack is received\n        \"\"\"\n\n        
self.client.send(json.dumps(msg,separators=(',',':')), print_debug=print_debug) # Make sure json dump is compact for transmission\n        num_wait_checks = 3\n        counter = 0\n        resp_ok = False\n        while counter < num_wait_checks and not resp_ok:\n            counter += 1\n            resp = self.wait_for_server_response(expected_msg_type='ack', print_debug=print_debug)\n            if not resp:\n                logging.warning(\"Did not receive ack. Try {} of {}\".format(counter, num_wait_checks))\n                continue\n            else:\n                if resp['type'] != 'ack':\n                    logging.warning(\"Expecting return type ack. Try {} of {}\".format(counter, num_wait_checks))\n                    continue\n                elif resp['data'] != msg['type']:\n                    logging.warning(\"Incorrect ack type. Expecting: {}, Got: {}. Try {} of {}\".format(\n                        msg['type'], resp['data'], counter, num_wait_checks))\n                    continue\n\n            resp_ok = True\n\n        if resp_ok:\n            return resp\n        else:\n            raise ValueError(\"Bad/no response\")\n\n    def net_status(self):\n        \"\"\" Check if the network connection is ok\"\"\"\n        msg = {'type': 'check'}\n        status = False\n        try:\n            self.send_msg_and_wait_for_ack(msg)\n            status = True\n        except Exception:\n            pass\n        return status\n\n    def request_status(self):\n        \"\"\" Request status from server \"\"\"\n        msg = {'type' : 'status'}\n        self.client.send(json.dumps(msg), print_debug=False)\n        resp = self.wait_for_server_response(expected_msg_type='status', print_debug=False)\n        if not resp:\n            logging.warning(\"Did not receive status response\")\n            return None\n        return resp['data']\n\n    def estop(self):\n        \"\"\" Tell client to estop \"\"\"\n        msg = {'type': 
'estop'}\n        self.send_msg_and_wait_for_ack(msg)\n\n\nclass RobotClient(ClientBase):\n\n    def __init__(self, cfg, robot_id):\n        super().__init__(cfg.ip_map[robot_id])\n\n        self.robot_id = robot_id\n        self.cfg = cfg\n\n    def move(self, x, y, a):\n        \"\"\" Tell robot to move to specific location \"\"\"\n        msg = {'type': 'move', 'data': {'x': x, 'y': y, 'a': a}}\n        self.send_msg_and_wait_for_ack(msg)\n\n    def move_rel(self, x, y, a):\n        \"\"\" Tell robot to move to a relative location \"\"\"\n        msg = {'type': 'move_rel', 'data': {'x': x, 'y': y, 'a': a}}\n        self.send_msg_and_wait_for_ack(msg)\n\n    def move_rel_slow(self, x, y, a):\n        \"\"\" Tell robot to move to a relative location but slowly \"\"\"\n        msg = {'type': 'move_rel_slow', 'data': {'x': x, 'y': y, 'a': a}}\n        self.send_msg_and_wait_for_ack(msg)\n\n    def move_fine(self, x, y, a):\n        \"\"\" Tell robot to move to a specific location with fine precision \"\"\"\n        msg = {'type': 'move_fine', 'data': {'x': x, 'y': y, 'a': a}}\n        self.send_msg_and_wait_for_ack(msg)\n\n    def move_fine_stop_vision(self, x, y, a):\n        \"\"\" Tell robot to move to a specific location with fine precision but stop if vision markers are detected \"\"\"\n        msg = {'type': 'move_fine_stop_vision', 'data': {'x': x, 'y': y, 'a': a}}\n        self.send_msg_and_wait_for_ack(msg)\n\n    def move_with_vision(self, x, y, a):\n        \"\"\" Tell robot to move to a location relative to the vision measurements from cameras\"\"\"\n        msg = {'type': 'move_vision', 'data': {'x': x, 'y': y, 'a': a}}\n        self.send_msg_and_wait_for_ack(msg)\n\n    def move_const_vel(self, vx, vy, va, t):\n        \"\"\" Tell robot to move at constant velocity for a specific amount of time\"\"\"\n        msg = {'type': 'move_const_vel', 'data': {'vx': vx, 'vy': vy, 'va': va, 't': t}}\n        self.send_msg_and_wait_for_ack(msg)\n\n    def 
place(self):\n        \"\"\" Tell robot to place pallet \"\"\"\n        msg = {'type': 'place'}\n        self.send_msg_and_wait_for_ack(msg)\n    \n    def load(self):\n        \"\"\" Tell robot to load pallet \"\"\"\n        msg = {'type': 'load'}\n        self.send_msg_and_wait_for_ack(msg)\n\n    def tray_init(self):\n        \"\"\" Tell robot to initialize tray \"\"\"\n        msg = {'type': 'init'}\n        self.send_msg_and_wait_for_ack(msg)\n\n    def load_complete(self):\n        \"\"\" Tell robot that base station load is complete \"\"\"\n        msg = {'type': 'lc'}\n        self.send_msg_and_wait_for_ack(msg)\n\n    def clear_error(self):\n        \"\"\" Tell robot to clear an existing error \"\"\"\n        msg = {'type': 'clear_error'}\n        self.send_msg_and_wait_for_ack(msg)\n\n    def wait_for_localization(self):\n        \"\"\" Tell robot to wait for localization\"\"\"\n        msg = {'type': 'wait_for_loc'}\n        self.send_msg_and_wait_for_ack(msg)\n\n    def set_pose(self, x, y, a):\n        \"\"\" Sets the pose of the robot explicitly, bypassing any localization code \"\"\"\n        msg = {'type': 'set_pose', 'data': {'x': x, 'y': y, 'a': a}}\n        self.send_msg_and_wait_for_ack(msg)\n\n    def toggle_vision_debug(self):\n        \"\"\" Tell robot to toggle vision debug output\"\"\"\n        msg = {'type': 'toggle_vision_debug'}\n        self.send_msg_and_wait_for_ack(msg)\n\n    def start_cameras(self):\n        \"\"\" Tell robot to start the cameras\"\"\"\n        msg = {'type': 'start_cameras'}\n        self.send_msg_and_wait_for_ack(msg)\n\n    def stop_cameras(self):\n        \"\"\" Tell robot to stop the cameras\"\"\"\n        msg = {'type': 'stop_cameras'}\n        self.send_msg_and_wait_for_ack(msg)\n\nclass BaseStationClient(ClientBase):\n    \n    def __init__(self, cfg):\n        super().__init__(cfg.base_station_ip)\n        self.cfg = cfg\n\n    def load(self):\n        \"\"\" Tells the station to load the next set of dominos 
into the robot\"\"\"\n        msg = {'type': 'load'}\n        self.send_msg_and_wait_for_ack(msg)\n\n\n# Hacky Mocks to use for testing\nclass MockRobotClient:\n    def __init__(self, cfg, robot_id):\n        super().__init__()\n        self.robot_id = robot_id\n        self.cfg = cfg\n\n    def move(self, x, y, a):\n        pass\n\n    def move_rel(self, x, y, a):\n        pass\n\n    def move_rel_slow(self, x, y, a):\n        pass\n\n    def move_fine(self, x, y, a):\n        pass\n\n    def place(self):\n        pass\n\n    def net_status(self):\n        return True\n\n    def estop(self):\n        pass\n\n    def load(self):\n        pass\n\n    def request_status(self):\n        return {\"in_progress\": False, \"pos_x\": 1, \"pos_y\": 2, \"pos_a\": 0}\n    \n    def clear_error(self):\n        pass\n\n    def wait_for_localization(self):\n        pass\n\n    def toggle_distance(self):\n        pass\n\nclass MockBaseStationClient:\n    \n    def __init__(self, cfg):\n        super().__init__()\n        self.cfg = cfg\n\n    def load(self):\n        pass\n\n    def net_status(self):\n        return True\n    \n    def estop(self):\n        pass\n\n    def request_status(self):\n        return {}\n    \n\nif __name__ == '__main__':\n    import config\n    from MasterMain import configure_logging\n    cfg = config.Config()\n    configure_logging(cfg.log_folder)\n    r = RobotClient(cfg, 'robot1')\n    time.sleep(1)\n\n    r.request_status()\n    #r.request_status()\n    time.sleep(1)\n    #r.request_status()\n    #time.sleep(1)\n    \n    # while(True):\n    #     speed = input(\"Input move speed [x,y,a]: \").strip().split(',')\n    #     if len(speed) != 3:\n    #         logging.info(\"Need to provide comma separated values.\")\n    #     else:\n    #         x = float(speed[0])\n    #         y = float(speed[1])\n    #         a = float(speed[2])\n    #         r.move(x,y,a)\n        \n\n    #     time.sleep(3)"
  },
  {
    "path": "src/master/Runtime.py",
    "content": "import time\nimport logging\nimport enum\nimport pprint\nimport pickle\nimport os\nimport json\nimport copy\nimport math\n\nfrom MarvelMindHandler import MarvelmindWrapper, MockMarvelmindWrapper\nfrom RobotClient import RobotClient, BaseStationClient, MockRobotClient, MockBaseStationClient\nfrom FieldPlanner import ActionTypes, Action, TestPlan, RunFieldPlanning\nfrom Utils import write_file, NonBlockingTimer, ActionValidator\n\nclass RobotInterface:\n    def __init__(self, config, robot_id):\n        self.comms_online = False\n        self.robot_client = None\n        self.config = config\n        self.robot_id = robot_id\n        self.last_status_time = 0\n        self.last_status = None\n        self.simple_action_map = {}\n        self.current_action = ActionTypes.NONE\n        self.current_move_data = []\n        self.current_action_timer = None\n        self.wait_timer = None\n        self.waiting_active = False\n\n    def bring_online(self, use_mock=False):\n        self._bring_comms_online(use_mock)\n\n    def check_online(self):\n        return self.comms_online\n\n    def get_robot_metrics(self):\n        metrics = copy.copy(self.last_status)\n        if metrics and isinstance(metrics, dict):\n            metrics['current_action'] = str(self.current_action)\n            metrics['current_move_data'] = self.current_move_data\n        if metrics and self.waiting_active:\n            metrics['in_progress'] = True\n        return metrics\n\n    def _setup_action_map(self):\n        self.simple_action_map = {\n            ActionTypes.LOAD: self.robot_client.load,\n            ActionTypes.PLACE: self.robot_client.place,\n            ActionTypes.TRAY_INIT: self.robot_client.tray_init,\n            ActionTypes.LOAD_COMPLETE: self.robot_client.load_complete,\n            ActionTypes.ESTOP: self.robot_client.estop,\n            ActionTypes.CLEAR_ERROR: self.robot_client.clear_error,\n            
ActionTypes.WAIT_FOR_LOCALIZATION: self.robot_client.wait_for_localization,\n            ActionTypes.TOGGLE_VISION_DEBUG: self.robot_client.toggle_vision_debug,\n            ActionTypes.START_CAMERAS: self.robot_client.start_cameras,\n            ActionTypes.STOP_CAMERAS: self.robot_client.stop_cameras,\n        }\n\n    def _bring_comms_online(self, use_mock=False):\n        try:\n            logging.info(\"Attempting to connect to {} over wifi\".format(self.robot_id))\n\n            if use_mock:\n                self.robot_client = MockRobotClient(self.config, self.robot_id)\n            else:\n                self.robot_client = RobotClient(self.config, self.robot_id)\n\n            if self.robot_client.net_status():\n                self.comms_online = True\n                self._setup_action_map()\n                logging.info(\"Connected to {} over wifi\".format(self.robot_id))\n        except Exception as e:\n            logging.info(\"Couldn't connect to {} over wifi: {}\".format(self.robot_id, e))\n            return\n\n    def _get_status_from_robot(self):\n        self.last_status = self.robot_client.request_status()\n        self.last_status_time = time.time()\n        \n        # Reset current action and move data if the robot finished an action\n        if self.current_action != ActionTypes.NONE and self.last_status is not None and \\\n            'in_progress' in self.last_status.keys() and not self.last_status['in_progress'] and \\\n            self.current_action_timer is not None and self.current_action_timer.check():\n            self.current_action = ActionTypes.NONE\n            self.current_move_data = []\n\n\n    def update(self):\n\n        if self.comms_online:\n\n            # Request a status update if needed\n            if time.time() - self.last_status_time > self.config.robot_status_wait_time:\n                try:\n                    self._get_status_from_robot()\n                except RuntimeError:\n                    logging.info(\"Network connection with {} lost\".format(self.robot_id))\n   
                 self.comms_online = False\n                    self.last_status = \"{} offline\".format(self.robot_id)\n\n            if self.waiting_active:\n                if self.wait_timer.check():\n                    self.waiting_active = False\n                    self.wait_timer = None\n\n    def run_action(self, action):\n        \n        if not self.comms_online:\n            return\n\n        logging.info(\"Running {} on {}\".format(action.action_type, self.robot_id))\n\n        try:\n            if action.action_type is not None:\n                self.current_action = action.action_type\n            \n            if action.action_type == ActionTypes.MOVE_COARSE:\n                self.robot_client.move(action.x, action.y, action.a)\n                self.current_move_data = [action.x, action.y, action.getAngleDegrees()]\n            elif action.action_type == ActionTypes.MOVE_REL:\n                self.robot_client.move_rel(action.x, action.y, action.a)\n            elif action.action_type == ActionTypes.MOVE_REL_SLOW:\n                self.robot_client.move_rel_slow(action.x, action.y, action.a)\n            elif action.action_type == ActionTypes.MOVE_FINE:\n                self.robot_client.move_fine(action.x, action.y, action.a)\n                self.current_move_data = [action.x, action.y, action.getAngleDegrees()]\n            elif action.action_type == ActionTypes.MOVE_FINE_STOP_VISION:\n                self.robot_client.move_fine_stop_vision(action.x, action.y, action.a)\n                self.current_move_data = [action.x, action.y, action.getAngleDegrees()]\n            elif action.action_type == ActionTypes.MOVE_WITH_VISION:\n                self.robot_client.move_with_vision(action.x, action.y, action.a)\n            elif action.action_type == ActionTypes.MOVE_CONST_VEL:\n                self.robot_client.move_const_vel(action.vx, action.vy, action.va, action.t)\n            elif action.action_type == ActionTypes.SET_POSE:\n                
self.robot_client.set_pose(action.x, action.y, action.a)\n            elif action.action_type == ActionTypes.NET:\n                status = self.robot_client.net_status()\n                logging.info(\"Robot {} network status is: {}\".format(self.robot_id, status))\n            elif action.action_type == ActionTypes.WAIT:\n                self.wait_timer = NonBlockingTimer(action.time)\n                self.waiting_active = True\n            elif action.action_type in self.simple_action_map.keys():\n                self.simple_action_map[action.action_type]()\n            else:\n                logging.info(\"Unknown action: {}\".format(action.action_type))\n\n            self.current_action_timer = NonBlockingTimer(self.config.robot_next_action_wait_time)\n        except RuntimeError:\n            logging.info(\"Network connection with {} lost\".format(self.robot_id))\n            self.comms_online = False\n            self.last_status = \"{} offline\".format(self.robot_id)\n\n\n\nclass BaseStationInterface:\n\n    def __init__(self, config):\n        self.config = config\n        self.client = None\n        self.comms_online = False\n        self.last_status = None\n        self.last_status_time = 0\n\n    def bring_online(self, use_mock=False):\n        logging.info(\"Bringing BaseStation comms online\")\n        if use_mock:\n            self.client = MockBaseStationClient(self.config)\n        else:\n            self.client = BaseStationClient(self.config)\n\n        if self.client.net_status():\n            self.comms_online = True\n\n    def get_last_status(self):\n        return self.last_status\n\n    def check_online(self):\n        return self.comms_online\n\n    def update(self):\n        # Request a status update if needed\n        if time.time() - self.last_status_time > self.config.base_station_status_wait_time:\n            try:\n                self._get_status_from_base_station()\n            except RuntimeError:\n                
logging.info(\"Network connection with base station lost\")\n                self.comms_online = False\n                self.last_status = \"Base station offline\"\n\n    def _get_status_from_base_station(self):\n        self.last_status = self.client.request_status()\n        if self.last_status is None or self.last_status == \"\":\n            self.last_status = \"Base station status not available!\"\n        self.last_status_time = time.time()\n\n    def run_action(self, action):\n\n        if not self.comms_online:\n            return\n\n        try:\n            if action.action_type == ActionTypes.ESTOP:\n                self.client.estop()\n            elif action.action_type == ActionTypes.NET:\n                status = self.client.net_status()\n                logging.info(\"Base station network status is: {}\".format(status))\n            elif action.action_type == ActionTypes.LOAD:\n                self.client.load()\n            else:\n                logging.info(\"Unknown action: {}\".format(action.action_type))\n\n        except RuntimeError:\n            logging.info(\"Network connection with base station lost\")\n            self.comms_online = False\n            self.last_status = \"Base station offline\"\n\nclass PlanStatus(enum.Enum):\n    # Note: no trailing commas here, or each member's value becomes a 1-tuple\n    NONE = 0\n    LOADED = 1\n    RUNNING = 2\n    PAUSED = 3\n    ABORTED = 4\n    DONE = 5\n\nclass RuntimeManager:\n\n    STATUS_NOT_INITIALIZED = 0\n    STATUS_PARTIALLY_INITIALIZED = 1\n    STATUS_FULLY_INITIALIZED = 2\n\n    def __init__(self, config):\n\n        self.config = config\n\n        self.robots = {id: RobotInterface(config, id) for id in self.config.ip_map.keys()}\n        self.base_station = BaseStationInterface(config)\n\n        self.initialization_status = RuntimeManager.STATUS_NOT_INITIALIZED\n\n        self.component_initialization_status = {id: False for id in self.config.ip_map.keys()}\n        self.component_initialization_status['mm'] = False\n        
self.component_initialization_status['base_station'] = False\n\n        if self.config.OFFLINE_TESTING or self.config.SKIP_MARVELMIND:\n            self.mm_wrapper = MockMarvelmindWrapper(config)\n        else:\n            self.mm_wrapper = MarvelmindWrapper(config)\n\n        self.last_metrics = {}\n        self.idle_bots = set([n for n in self.robots.keys()])\n        self.initialization_timer = None\n\n        self.cycle_tracker = {n: {'action_id': None, 'cycle_id': None, 'timer': None, 'needs_restart': False, 'action_validator': ActionValidator()} for n in self.robots.keys()}\n        self.next_cycle_number = 0\n        self.plan = None\n        self.plan_path = \"\"\n        self.plan_status = PlanStatus.NONE\n        if self.config.USE_TEST_PLAN:\n            self._load_plan_from_file('')\n        elif self.config.REGEN_PLAN:\n            logging.info(\"Regenerating and loading plan\")\n            plan = RunFieldPlanning(autosave=True)\n            self._load_plan_from_object(plan, os.path.join(self.config.plans_dir, \"autosaved.p\"))\n        elif self.config.AUTO_LOAD_PLAN and self.config.AUTO_LOAD_PLAN_NAME:\n            plan_path = os.path.join(self.config.plans_dir, self.config.AUTO_LOAD_PLAN_NAME)\n            self._load_plan_from_file(plan_path)\n\n    def initialize(self):\n\n        if self.initialization_timer and not self.initialization_timer.check():\n            return\n\n        # Handle testing/mock case\n        use_base_station_mock = False\n        use_robot_mock = False\n        if self.config.OFFLINE_TESTING:\n            use_base_station_mock = True\n            use_robot_mock = True\n        if self.config.SKIP_BASE_STATION:\n            use_base_station_mock = True\n        \n        # Bring everything online if needed\n        if not self.component_initialization_status['base_station']:\n            self.base_station.bring_online(use_base_station_mock)\n        if not self.component_initialization_status['mm']:\n            
self.mm_wrapper.wake_all_devices_only_if_needed()\n        for id, robot in self.robots.items():\n            if not self.component_initialization_status[id]:\n                robot.bring_online(use_robot_mock)\n        \n        # Check the status of initialization\n        self._check_initialization_status()\n        if self.initialization_status != RuntimeManager.STATUS_FULLY_INITIALIZED:\n            self.initialization_timer = NonBlockingTimer(3)\n            logging.info(\"Unable to fully initialize RuntimeManager, will try again in 3 seconds\")\n            logging.info(\"Current component initialization status:\")\n            logging.info(pprint.pformat(self.component_initialization_status, indent=2, width=10))\n\n\n    def shutdown(self, keep_mm_awake):\n        logging.info(\"Shutting down\")\n        if self.initialization_status != RuntimeManager.STATUS_NOT_INITIALIZED:\n            if not keep_mm_awake:\n                self.mm_wrapper.sleep_all_devices()\n\n    def get_initialization_status(self):\n        self._check_initialization_status()\n        return self.initialization_status\n\n    def get_all_metrics(self):\n        return self.last_metrics\n\n    def update(self):\n        self._check_initialization_status()\n        if self.initialization_status != RuntimeManager.STATUS_FULLY_INITIALIZED:\n            self.initialize()\n\n        self.base_station.update()\n        for robot in self.robots.values():\n            robot.update()\n\n        self._update_all_metrics()\n\n        if self.plan_status == PlanStatus.RUNNING:\n            self._update_plan()\n            self._update_cycle_actions()\n            self._cycle_state_to_file()\n\n    def run_manual_action(self, manual_action):\n        target = manual_action[0]\n        action = manual_action[1]\n        self._run_action(target, action)\n\n    def estop(self):\n        logging.warning(\"ESTOP\")\n        self._run_action('base', Action(ActionTypes.ESTOP, 'ESTOP'))\n        for 
robot in self.robots.values():\n            robot.run_action(Action(ActionTypes.ESTOP, 'ESTOP'))\n\n    def load_plan(self, plan_file):\n        if plan_file and os.path.splitext(plan_file)[1] == '.json':\n            self._load_cycle_state_from_file(plan_file)\n        else:\n            self._load_plan_from_file(plan_file)        \n\n    def set_plan_status(self, status):\n        \n        # Clean up if we don't want to save the plan state\n        if status == PlanStatus.ABORTED:\n            self.next_cycle_number = 0\n            self.cycle_tracker = {n: {'action_id': None, 'cycle_id': None, 'timer': None, 'needs_restart': False, 'action_validator': ActionValidator()} for n in self.robots.keys()}\n            self.idle_bots = set([n for n in self.robots.keys()])\n        elif status == PlanStatus.PAUSED:\n            for data in self.cycle_tracker.values():\n                data['needs_restart'] = True\n\n        self.plan_status = status\n\n    def increment_robot_cycle(self, target):\n        self._modify_cycle_state(target, add_cycle_id=1)\n    \n    def decrement_robot_cycle(self, target):\n        self._modify_cycle_state(target, add_cycle_id=-1)\n\n    def increment_robot_action(self, target):\n        self._modify_cycle_state(target, add_action_id=1)\n\n    def decrement_robot_action(self, target):\n        self._modify_cycle_state(target, add_action_id=-1)\n\n    def set_cycle(self, target, cycle_num):\n        self._modify_cycle_state(target, add_cycle_id=cycle_num, absolute=True)\n\n    def set_action(self, target, action_num):\n        self._modify_cycle_state(target, add_action_id=action_num, absolute=True)\n\n    def get_plan_status(self):\n        return self.plan_status\n\n    def get_plan_info(self):\n        if not self.plan:\n            return None\n        else:\n            return (self.plan, self.plan_status, self.next_cycle_number-1)\n\n    def _load_plan_from_object(self, plan, plan_path=\"\"):\n        self.plan = plan\n     
   self.plan_status = PlanStatus.LOADED\n        self.plan_path = plan_path\n        for key in self.robots.keys():\n            self.set_cycle(key, 0)\n            self.set_action(key, 0)\n\n    def _load_plan_from_file(self, plan_file):\n        if self.config.USE_TEST_PLAN or plan_file == \"testplan\":\n            self._load_plan_from_object(TestPlan(), \"testplan\")\n        else:\n            with open(plan_file, 'rb') as f:\n                plan = pickle.load(f)\n                self._load_plan_from_object(plan, plan_file)\n                logging.info(\"Loaded plan from {}\".format(plan_file))  \n        for key in self.robots.keys():\n            self.set_cycle(key, 0)\n            self.set_action(key, 0)\n\n    def _update_plan(self):\n        # If we have an idle robot, send it the next cycle to execute\n        if self.plan_status == PlanStatus.RUNNING and self._any_idle_bots():\n            next_cycle = self.plan.get_cycle(self.next_cycle_number)\n            \n            # If we get none, that means we are done with the plan\n            if next_cycle is None:\n                self.next_cycle_number = 0\n                logging.info(\"Completed plan!\")\n                self.plan_status = PlanStatus.DONE\n                self._erase_cycle_state_file()\n            else:\n                logging.info(\"Sending cycle {} for execution\".format(self.next_cycle_number))\n                self.next_cycle_number += 1\n                self._assign_new_cycle(next_cycle)\n\n    def _any_idle_bots(self):\n        return len(self.idle_bots) > 0\n\n    def _assign_new_cycle(self, cycle):\n        expected_robot = cycle.robot_id\n        cycle_id = cycle.id\n        if expected_robot not in self.idle_bots:\n            raise ValueError(\"Expected cycle {} to be run with {} but only available robots are {}\".format(cycle_id, expected_robot, self.idle_bots))\n        else:\n            logging.info(\"Assigning cycle {} to {}\".format(cycle_id, expected_robot))\n      
      self.cycle_tracker[expected_robot]['cycle_id'] = cycle_id\n            self.cycle_tracker[expected_robot]['action_id'] = None\n\n    def _run_action(self, target, action):\n        if action.action_type == ActionTypes.PAUSE_PLAN:\n            self.set_plan_status(PlanStatus.PAUSED)\n            self.increment_robot_action(target)\n        elif target == 'base':\n            self.base_station.run_action(action)\n        elif 'robot' in target:\n            self.robots[target].run_action(action)\n            if target in self.idle_bots:\n                self.idle_bots.remove(target)\n        else:\n            logging.info(\"Unknown target: {}\".format(target))\n\n    def _check_initialization_status(self):\n        self.component_initialization_status['base_station'] = self.base_station.check_online()\n        self.component_initialization_status['mm'] = self.mm_wrapper.check_all_devices_status()\n        for id,robot in self.robots.items():\n            self.component_initialization_status[id] = robot.check_online()\n        \n        ready = [i for i in self.component_initialization_status.values()]\n        if all(ready):\n            self.initialization_status = RuntimeManager.STATUS_FULLY_INITIALIZED\n        elif any(ready):\n            self.initialization_status = RuntimeManager.STATUS_PARTIALLY_INITIALIZED\n\n    def _update_all_metrics(self):\n        self.last_metrics = {}\n        self.last_metrics['mm'] = self.mm_wrapper.get_metrics()\n        self.last_metrics['base'] = self.base_station.get_last_status()\n        self.last_metrics['plan'] = self._get_plan_metrics()\n        for robot in self.robots.values():\n            robot_metrics = robot.get_robot_metrics()\n            self.last_metrics[str(robot.robot_id)] = robot_metrics\n\n            # Check if the robot has an error that would require pausing the plan\n            robot_has_error = robot_metrics and \"error_status\" in robot_metrics and robot_metrics[\"error_status\"]\n            
plan_running = self.plan_status == PlanStatus.RUNNING\n            if plan_running and robot_has_error:\n                logging.warning(\"Pausing plan due to error on {}. Please address before proceeding.\".format(robot.robot_id))\n                self.set_plan_status(PlanStatus.PAUSED)\n\n    def _get_plan_metrics(self):\n        plan_metrics = {}\n        plan_metrics['status'] = self.plan_status\n        plan_metrics['filename'] = os.path.basename(self.plan_path)\n        plan_metrics['next_cycle_number'] = self.next_cycle_number\n        plan_metrics['idle_bots'] = list(self.idle_bots)\n        robot_metrics = {}\n        for id, data in self.cycle_tracker.items():\n            robot_metrics[id] = {}\n            robot_metrics[id] = {'cycle_id': 'None', 'action_name': 'None', 'action_id': 'None', 'vision_offset': 'None','tile_coordinate': 'None'}\n            if data['cycle_id'] is not None:\n                robot_metrics[id]['cycle_id'] = data['cycle_id'] \n                robot_metrics[id]['tile_coordinate'] = self.plan.field.tiles[data['cycle_id']].coordinate               \n            if data['action_id'] is not None:\n                robot_metrics[id]['action_id'] = data['action_id']\n                action = self.plan.get_action(data['cycle_id'], data['action_id'])\n                if action:\n                    robot_metrics[id]['action_name'] = action.name\n                if action and action.action_type == ActionTypes.MOVE_WITH_VISION:\n                    robot_metrics[id]['vision_offset'] = (action.x, action.y, math.degrees(action.a))\n                else:\n                    robot_metrics[id]['vision_offset'] = 'None'\n            robot_metrics[id]['needs_restart'] = data['needs_restart']\n\n        plan_metrics['robots'] = robot_metrics\n        return plan_metrics\n\n    def _update_cycle_actions(self):\n\n        # Use metrics to update in progress actions\n        for robot_id, metric in self.last_metrics.items():\n            # No updates 
needed for pos tracker, base station, or plan\n            if robot_id in ['mm', 'plan', 'base']:\n                continue\n\n            if metric is None or 'in_progress' not in metric.keys():\n                continue\n\n            # Figure out if we need to do any updates on the running actions\n            tracker_data = self.cycle_tracker[robot_id]\n            if tracker_data['cycle_id'] is not None:\n                cycle = self.plan.get_cycle(tracker_data['cycle_id'])\n\n                # Check if this action needs to be restarted due to being paused or interrupted\n                if tracker_data['needs_restart']:\n                    tracker_data['timer'] = NonBlockingTimer(self.config.robot_next_action_wait_time)\n                    action = cycle.action_sequence[tracker_data['action_id']]\n                    logging.info(\"Re-starting action {} ({}) on {}\".format(tracker_data['action_id'], action.name, robot_id))\n                    self._run_action(robot_id, action)\n                    tracker_data['action_validator'].update_expected_action(action.action_type)\n                    tracker_data['needs_restart'] = False\n\n                # The timer check here is to give a delay for the robot to actually start the action \n                # before master checks if it is finished\n                action_timer_ready = tracker_data['timer'] is None or tracker_data['timer'].check()\n\n                # If we got a new cycle but haven't started an action yet, start the first action\n                start_next_action = False\n                if tracker_data['action_id'] is None:\n                    start_next_action = True\n\n                # If the robot was doing an action and is now finished, start the next one\n                action_assigned = tracker_data['action_id'] is not None\n                action_finished = not metric['in_progress']\n                previous_action_validated = 
tracker_data['action_validator'].update_action_validation(metric)\n                if action_assigned and action_finished and action_timer_ready:\n                    start_next_action = True\n                    if not previous_action_validated:\n                        logging.warning(\"Everything else finished but previous action not validated\")\n\n                if start_next_action:\n                    # Check if there is a new action to run for this cycle, if not, end the cycle\n                    if tracker_data['action_id'] is not None and (tracker_data['action_id'] + 1) >= len(cycle.action_sequence):\n                        logging.info(\"{} finished cycle {}\".format(robot_id, cycle.id))\n                        tracker_data['cycle_id'] = None\n                        tracker_data['action_id'] = None\n\n                    else:\n                        # If there is a new action to run, start it\n                        if tracker_data['action_id'] is None:\n                            tracker_data['action_id'] = 0\n                        else:\n                            tracker_data['action_id'] += 1\n                        tracker_data['timer'] = NonBlockingTimer(self.config.robot_next_action_wait_time)\n                        next_action = cycle.action_sequence[tracker_data['action_id']]\n                        logging.info(\"Starting action {} ({}) on {}\".format(tracker_data['action_id'], next_action.name, robot_id))\n                        self._run_action(robot_id, next_action)\n                        tracker_data['action_validator'].update_expected_action(next_action.action_type)\n\n        # Update if any robots are idle\n        for robot, data in self.cycle_tracker.items():\n            if data['action_id'] is None and data['cycle_id'] is None:\n                self.idle_bots.add(robot)\n\n    def _modify_cycle_state(self, target, add_cycle_id=None, add_action_id=None, absolute=False):\n        if self.plan_status == 
PlanStatus.RUNNING:\n            logging.warning(\"Cannot update cycle state while plan is running.\")\n            return\n        if target not in self.cycle_tracker.keys():\n            logging.warning(\"Invalid target: {}\".format(target))\n            return\n        \n        cur_cycle_id = self.cycle_tracker[target][\"cycle_id\"]\n        if cur_cycle_id is None:\n            cur_cycle_id = 0\n        new_cycle_id = cur_cycle_id\n        if add_cycle_id is not None:\n            if absolute:\n                new_cycle_id = add_cycle_id\n            else:                \n                new_cycle_id = cur_cycle_id + add_cycle_id\n            if new_cycle_id < 0 or (self.plan and new_cycle_id >= len(self.plan.cycles)):\n                logging.warning(\"Cycle id {} out of bounds\".format(new_cycle_id))\n                return\n        \n        cur_action_id = self.cycle_tracker[target][\"action_id\"]\n        if cur_action_id is None:\n            cur_action_id = 0\n        new_action_id = cur_action_id\n        if add_action_id is not None:\n            if absolute:\n                new_action_id = add_action_id\n            else:\n                new_action_id = cur_action_id + add_action_id    \n            if new_action_id < 0 or (self.plan and new_action_id >= len(self.plan.get_cycle(new_cycle_id).action_sequence)):\n                logging.warning(\"Action id {} out of bounds\".format(new_action_id))\n                return\n\n        data = self.cycle_tracker[target]\n        data['action_id'] = new_action_id\n        data['cycle_id'] = new_cycle_id\n        data['timer'] = None\n        data['needs_restart'] = True\n        data['action_validator'] = ActionValidator()\n        self.next_cycle_number = new_cycle_id + 1\n        # discard() is a no-op if the target was not idle\n        self.idle_bots.discard(target)\n\n\n    def _cycle_state_to_file(self):\n        if self.plan_status == PlanStatus.RUNNING:\n            # Copy data and prepare to write\n         
   robot_cycle_states = copy.deepcopy(self.cycle_tracker)\n            for tracker_data in robot_cycle_states.values():\n                del tracker_data['timer']\n                del tracker_data['action_validator']\n\n            data_to_dump = {}\n            data_to_dump['plan_path'] = self.plan_path\n            data_to_dump['next_cycle_number'] = self.next_cycle_number\n            data_to_dump['robots'] = robot_cycle_states\n\n            with open(self.config.cycle_state_file, 'w') as f:\n                json.dump(data_to_dump, f)\n\n    def _load_cycle_state_from_file(self, filepath):\n        logging.info(\"Loading cycle state information from {}\".format(filepath))\n        with open(filepath, 'r') as f:\n            loaded_data = json.load(f)\n        \n        # Set the various bits of data from the file\n        self.plan_path = loaded_data['plan_path']\n        self.next_cycle_number = loaded_data['next_cycle_number']\n        self.cycle_tracker = loaded_data['robots']\n\n        # Need to handle robot states carefully\n        for robot, data in self.cycle_tracker.items():\n            # Need to re-add timer state and tell the controller the action needs to be restarted\n            data['timer'] = None\n            data['needs_restart'] = True\n            data['action_validator'] = ActionValidator()\n\n            # Need to remove the robot from the idle list if it was actually doing something at the time it stopped\n            # (discard avoids a KeyError if the robot was not in the idle set)\n            if data['action_id'] is not None or data['cycle_id'] is not None:\n                self.idle_bots.discard(robot)\n        \n        # Reload the plan data and start the status as PAUSED so that it can be resumed\n        self._load_plan_from_file(self.plan_path)\n        self.plan_status = PlanStatus.PAUSED\n\n    def _erase_cycle_state_file(self):\n        fname = self.config.cycle_state_file       \n        if os.path.exists(fname):\n            os.remove(fname)\n\n\n\n        \n\n"
  },
  {
    "path": "src/master/Utils.py",
    "content": "import time\nimport numpy as np\nimport math\nimport enum\n\nclass ActionTypes(enum.Enum):\n    # Note: no trailing commas here, or each member's value becomes a 1-tuple\n    MOVE_COARSE = 0\n    MOVE_FINE = 1\n    MOVE_REL = 2\n    NET = 3\n    LOAD = 4\n    PLACE = 5\n    TRAY_INIT = 6\n    LOAD_COMPLETE = 7\n    ESTOP = 8\n    WAIT_FOR_LOCALIZATION = 9\n    MOVE_CONST_VEL = 10\n    CLEAR_ERROR = 11\n    NONE = 12\n    SET_POSE = 13\n    MOVE_WITH_VISION = 14\n    TOGGLE_VISION_DEBUG = 15\n    START_CAMERAS = 16\n    STOP_CAMERAS = 17\n    MOVE_REL_SLOW = 18\n    WAIT = 19\n    MOVE_FINE_STOP_VISION = 20\n    PAUSE_PLAN = 21\n\nclass NonBlockingTimer:\n\n    def __init__(self, trigger_time):\n        self.start_time = time.time()\n        self.trigger_time = trigger_time\n\n    def check(self):\n        return time.time() - self.start_time > self.trigger_time\n\ndef write_file(filename, text):\n    with open(filename, 'w+') as f:\n        f.write(text)\n\n\n# pos and offset are expected as length-2 numpy vectors, angle is expected in degrees\n# Returns a length-2 numpy vector\ndef TransformPos(pos_2d, frame_offset_2d, frame_angle):\n\n    frame_angle = math.radians(frame_angle)\n    # Make a transform\n    tf = np.array([[ np.cos(frame_angle), -np.sin(frame_angle), frame_offset_2d[0] ],\n                   [ np.sin(frame_angle),  np.cos(frame_angle), frame_offset_2d[1] ],\n                   [0, 0, 1 ]])\n\n    pos_2d = np.concatenate((pos_2d,[1])).reshape((3,))\n    new_pos = np.matmul(tf, pos_2d)\n    return new_pos[:2].reshape((2,))\n\n\nclass ActionValidator:\n\n    def __init__(self):\n        self.expected_action = str(ActionTypes.NONE)\n        self.action_validated = False\n\n    def update_expected_action(self, expected_action):\n        self.expected_action = str(expected_action)\n        self.action_validated = False\n\n    def update_action_validation(self, metric):\n        if metric is not None and \\\n           'in_progress' in 
metric and \\\n           'current_action' in metric and \\\n            metric['current_action'] == self.expected_action:\n            # expected_action is stored as a string, so compare against the string form of the enum\n            if self.expected_action in [str(ActionTypes.START_CAMERAS), str(ActionTypes.STOP_CAMERAS)]:\n                self.action_validated = True\n            elif metric['in_progress']:\n                self.action_validated = True\n        return self.action_validated\n\n\n\n\nif __name__ == '__main__':\n\n    test_pos = np.array([1,1])\n    test_offset = np.array([2,2])\n    test_angle = 90\n    expected_pos = np.array([1,3])\n    new_pos = TransformPos(test_pos, test_offset, test_angle)\n    print(\"Actual: {}, expected: {}\".format(new_pos, expected_pos))"
  },
  {
    "path": "src/master/__init__.py",
    "content": ""
  },
  {
    "path": "src/master/camera_utils.py",
    "content": "# Helper script to get camera images and debug info from raspi\n\nimport paramiko\nfrom scp import SCPClient\nimport os\nimport matplotlib.image as mpimg\nimport matplotlib.pyplot as plt\nimport config\nfrom datetime import datetime\n\ndef scp_image(ip, remote_path, local_path):\n    ssh_client = paramiko.SSHClient()\n    ssh_client.load_system_host_keys()\n    ssh_client.connect(ip, username=\"pi\", password='dominobot')\n    with SCPClient(ssh_client.get_transport()) as scp:\n        scp.get(remote_path, local_path)\n    if not os.path.exists(local_path):\n        raise ValueError(\"SCP image did not complete successfully\")\n\ndef display_debug_image(local_path):\n    img = mpimg.imread(local_path)\n    plt.imshow(img)\n    figManager = plt.get_current_fig_manager()\n    figManager.window.state('zoomed')\n    plt.show()\n\ndef get_and_display_multiple_images(cam_name, remote_ip, remote_path, local_path, img_data):\n    fig, axes = plt.subplots(nrows=2, ncols=3)\n    fig.suptitle(\"{} camera\".format(cam_name), fontsize=16)\n    ax = axes.ravel()\n\n    for i,data in enumerate(img_data):\n        cur_time_str = datetime.now().strftime(\"%Y%m%d%H%M%S\")\n        remote_path_to_file = remote_path + data['file']\n        local_path_to_file = os.path.join(local_path, cur_time_str+\"_\"+cam_name+\"_\"+data['file'])\n        scp_image(remote_ip, remote_path_to_file, local_path_to_file)\n        img = mpimg.imread(local_path_to_file)\n        if data[\"color\"]:\n            ax[i].imshow(img)\n        else:\n            ax[i].imshow(img, cmap='gray', vmin=0, vmax=255)\n        ax[i].set_title(data['title'])\n    \n    plt.tight_layout()\n    figManager = plt.get_current_fig_manager()\n    figManager.window.state('zoomed')\n    plt.show()\n\nif __name__ == '__main__':\n    cfg = config.Config()\n\n    robot_ip = cfg.ip_map['robot1']\n    remote_path = \"/home/pi/images/debug/\"\n    local_path = os.path.join(cfg.log_folder,\"images\")\n    img_data = []\n 
   img_data.append({\n        \"file\": \"img_raw.jpg\",\n        \"title\": \"raw\",\n        \"color\": False\n    })\n    img_data.append({\n        \"file\": \"img_undistorted.jpg\",\n        \"title\": \"undistorted\",\n        \"color\": False\n    })\n    img_data.append({\n        \"file\": \"img_thresh.jpg\",\n        \"title\": \"threshold\",\n        \"color\": False\n    })\n    img_data.append({\n        \"file\": \"img_keypoints.jpg\",\n        \"title\": \"detections\",\n        \"color\": False\n    })\n    img_data.append({\n        \"file\": \"img_best_keypoint.jpg\",\n        \"title\": \"best\",\n        \"color\": False\n    })\n\n    DISPLAY_SIDE = True\n    DISPLAY_REAR = True\n\n    if DISPLAY_SIDE:\n        side_remote_path = remote_path + 'side/'\n        get_and_display_multiple_images(\"side\", robot_ip, side_remote_path, local_path, img_data)\n    if DISPLAY_REAR:\n        rear_remote_path = remote_path + 'rear/'\n        get_and_display_multiple_images(\"rear\", robot_ip, rear_remote_path, local_path, img_data)\n\n"
  },
  {
    "path": "src/master/config.py",
    "content": "import numpy as np\nimport os\nimport Utils\n\nclass Config:\n\n    # Various debug/test flags\n    # Set to override config values for home network\n    USING_HOME_NETWORK = False\n    # Set for laptop vs desktop\n    USING_DESKTOP = False\n    # Set to override config values for small scale testing\n    USE_SMALL_TESTING_CONFIG = False\n    # Set to skip connecting to robot\n    OFFLINE_TESTING = False\n    # Set to skip connecting to base station\n    SKIP_BASE_STATION = True\n    # Set to skip connecting to Marvelmind\n    SKIP_MARVELMIND = True\n    # Set to use fake plan instead of loading a generated one\n    USE_TEST_PLAN = False\n    # MR LOGO plan\n    MR_LOGO_PLAN = False\n    # Set to auto-load this plan on master startup\n    AUTO_LOAD_PLAN = False\n    AUTO_LOAD_PLAN_NAME = \"FullPlan_DominoBros.p\"\n    # Set to regenerate and auto load plan on master startup\n    REGEN_PLAN = True\n    # Set to true to use just a subsection of the overal plan\n    USE_SUBSECTION = True\n\n    # ====== PATHS ========\n    \n    root_path = \"C:\\\\Users\\\\alexb\\\\Documents\\\\Github\\\\DominoRobot\\\\\"  # Laptop\n    if USING_DESKTOP:\n        root_path = \"C:\\\\Users\\\\alexb\\\\Data\\\\Github\\\\DominoRobot\\\\\"   # Desktop\n    mm_api_relative_path = \"marvelmind_SW_20202_04_19\\\\API\\\\api_windows_64bit\\\\dashapi.dll\"\n    config_dir_path = os.path.dirname(os.path.realpath(__file__))\n    mm_api_path = os.path.join(root_path, mm_api_relative_path)\n    log_folder = os.path.join(root_path, 'log')\n    plans_dir = os.path.join(config_dir_path, 'plans')\n    cycle_state_file = os.path.join(plans_dir, 'previous_plan_state.json')\n\n\n    # ====== ROBOT CONFIG ========\n\n    # Maps robot (or static) to sets of marvel mind beacons\n    device_map = {\n    \"static\": (11, 12),\n    \"robot1\": (1, 2)\n    }\n\n    # Specifies which IP address each robot has\n    ip_map = {'robot1': '10.0.0.3'}   # Workshop\n    if USING_HOME_NETWORK:\n        
ip_map = {'robot1': '192.168.1.5'}   # Home\n    base_station_ip = '10.0.0.100'\n\n    # ====== PLAN GENERATION ========\n\n    # Image configuration\n    image_name = os.path.join(config_dir_path, 'DominoDesign-Questions.psd')\n    num_tiles_width = 18\n    num_tiles_height = 19\n    dominos = np.array(\n                [('black', (0,0,0)),\n                ('red',   (1,0,0)),\n                ('blue',  (0.188,0.5,0.886)),\n                ('green', (0,1,0)),\n                ('white', (1,1,1)),\n                ('brown', (1,0.51,0)),\n                ('yellow', (1,0.867,0)),\n                ], dtype=object)\n\n    if USE_SMALL_TESTING_CONFIG:  \n        num_tiles_width = 2\n        num_tiles_height = 4\n    if MR_LOGO_PLAN:\n        image_name = os.path.join(config_dir_path, 'logo.jpg')\n        num_tiles_width = 5\n        num_tiles_height = 5\n        dominos = np.array(\n                [('black', (0,0,0)),\n                ('white', (1,1,1)),\n                ], dtype=object)\n\n    # Physical dimensions of dominos\n    domino_width  = 0.025 # meters\n    domino_height = 0.010 # meters\n    domino_spacing_width = 0.037 # meters\n    domino_spacing_height = 0.024 # meters\n\n    # Spacing for drawing dominos as pixels instead of rectangles\n    meters_per_pixel = 0.008\n    domino_width_px = round(domino_width / meters_per_pixel)\n    domino_height_px = round(domino_height / meters_per_pixel)\n    domino_spacing_width_px = round(domino_spacing_width / meters_per_pixel)\n    domino_spacing_height_px = round(domino_spacing_height / meters_per_pixel)\n\n    # Tile configuration\n    tile_width = 15\n    tile_height = 20\n    tile_background_color = (0.9,0.9,0.9)\n    tile_edge_color = (0,0,1)\n    tile_size_width_meters = 0.930 #tile_width * (domino_spacing_width + domino_width)\n    tile_size_height_meters = 0.665 #tile_height * (domino_spacing_height + domino_height)\n    desired_width_dominos = tile_width * num_tiles_width\n    desired_height_dominos = 
tile_height * num_tiles_height\n\n    # Map configuration\n    grid_top_left = np.array([13.2, 6.9])\n    grid_top_right = np.array([13.2, -10.07])\n    grid_num_cols = 19\n    grid_bottom_left = np.array([1.18,6.9])\n    grid_bottom_right = np.array([1.18,-10.07])\n    grid_num_rows = 19\n    grid_tile_width = ((grid_top_left - grid_top_right)/grid_num_cols)[1]\n    grid_tile_height = ((grid_top_left - grid_bottom_left)/grid_num_rows)[0]\n    tile_size_width_meters = grid_tile_width\n    # tile_size_height_meters = grid_tile_height\n\n    # Vision offset configuration\n    default_vision_offset = np.array((0,0,-0.5))\n    vision_offset_file = os.path.join(plans_dir, 'vision_offsets_full_plan.csv')\n\n    # ====== ENVIRONMENT CONFIGURATION ========\n\n    # Map configuration (distances in meters, angles in degrees)\n    robot_boundaries = np.array([[1,-11],[15,11]])              # Bottom left, top right, global frame\n    highway_x = 2.5                                       # \"Highway\" coordinate\n    load_waypoint = np.array([highway_x, -8.5])                    # xya (global frame) for waypoint to go to first before load prep\n    highway_angle = -90\n\n    base_station_boundaries = np.array([[2.5,10],[3.5,11]])         # Bottom left, top right, global frame\n    base_station_target_angle = -90                             # Target angle (deg) for base station in global frame\n    base_station_relative_offset = np.array([0.9, 0, 0])           # Relative position of base station from prep pos - robot frame (x,y,a)\n    base_station_vision_offset = np.array([0.01,0.010,-0.5])     # Vision offset for base station alignment\n    base_station_prep_pos = np.array([highway_x, -10.5])                   # Pose outside of base station to align with before going in to dock\n    base_station_prep_vision_offset = np.array([0,0.01,-1])      # Vision offset to use for base station prep pose\n\n    robot_pose_top_left = grid_top_left                   # Robot pose in global 
frame for top left of tile position of domino field\n    domino_field_angle = -90                                     # Domino field angle (deg), global frame\n    tile_placement_fine_offset = np.array([0, 0])              # Offset position for fine tile placement [x,y], in robot coordinate frame (to avoid hitting next column)\n    tile_placement_coarse_offset = np.array([-0.5,0.5])         # Offset position for tile placement [x,y], in robot coordinate frame\n    tile_to_robot_offset = np.array([-0.3, -tile_size_width_meters/2.0])  # Offset from bottom left of tile to robot center [x,y], in robot coordinate frame     \n    enter_position_distance = 0.25                                  # How far out of field boundaries to do robot prep move\n    intermediate_entry_hz_y = 0                                 # Y coordinate for horizontal intermediate position\n    intermediate_place_vt_x = 8                                 # X coordinate for vertical intermediate position\n    field_to_robot_frame_angle = 90                             # In case robot frame and field frame ever need to be rotated relative to each other\n\n    # Used for testing sub-sections of the larger pattern\n    if USE_SUBSECTION:\n        start_coords = (6, 5)\n        end_coords = (18, 6)\n        # start_coords = (8, 4)\n        # end_coords = (18, 4)\n\n    # Left side\n    # if USE_SMALL_TESTING_CONFIG:  \n    #     load_pose = np.array([7.5,7.5,0])            \n    #     robot_pose_top_left = np.array([12.20,9.472])  \n    #     domino_field_angle = -90\n    #     tile_placement_coarse_offset = np.array([-0.5,-0.5])\n    #     tile_to_robot_offset = np.array([-0.3, -tile_size_width_meters/2.0 ])                             \n\n    # Right side\n    if USE_SMALL_TESTING_CONFIG:  \n        load_pose = np.array([8,-6.5,0])            \n        robot_pose_top_left = np.array([12.74,-6.94])  \n        domino_field_angle = -90\n        tile_placement_coarse_offset = np.array([-0.5,-0.5])\n        
tile_to_robot_offset = np.array([-0.3, -tile_size_width_meters/2.0 ])    \n\n    # Fine motion y offset adjustments\n    # y_offset_cols = np.linspace(0.05, 0, num_tiles_height)\n    y_offset_cols = np.linspace(0.51, 0, num_tiles_width)\n    y_offset_rows = np.linspace(0, 0.0, num_tiles_height)\n    x_offset_rows = np.linspace(0, 0.3, num_tiles_height)\n    x_offset_rows[0:2] = 0\n    # Angle adjustment for fine motion to try and prevent wheel from hitting\n    angle_adjust_fine = 4   # degrees\n\n    # Computed - don't change\n    field_width = tile_size_width_meters * desired_width_dominos/tile_width\n    field_height = tile_size_height_meters * desired_height_dominos/tile_height\n    # Fix me\n    domino_field_top_left = robot_pose_top_left + np.array([tile_size_height_meters-tile_to_robot_offset[0], -tile_to_robot_offset[1]]) \\\n         - np.array([0, 2*tile_size_width_meters])\n        #Utils.TransformPos(np.array([tile_size_height_meters-tile_to_robot_offset[0], -tile_to_robot_offset[1]]), [0,0], domino_field_angle)\n    domino_field_origin = domino_field_top_left + Utils.TransformPos(np.array([0,-field_height]), [0,0], domino_field_angle)\n    domino_field_top_right = domino_field_origin + Utils.TransformPos(np.array([field_width,field_height]), [0,0], domino_field_angle)\n    domino_field_boundaries = np.array([domino_field_origin, domino_field_top_right])  \n\n    # ====== GUI CONFIGURATION ========\n    graph_padding = 0.5                          # How many meters of padding to have around edge of graph\n    axes_label_offset = 0.25                     # How many meters to offset labels on axes\n    axes_tick_spacing = 5                        # How many meters of space between tick marks\n    axes_tick_size = 0.5                         # How many meters long the tick marks should be\n    robot_chassis_size = 0.6                     # How many meters wide and long the robot chassis is\n    robot_rotation_center_offset = 0.245         # How many meters 
from front of robot to center of rotation\n    robot_direction_indicator_length = 0.8       # How many meters long to make the direction line for the robot render\n    robot_direction_indicator_arrow_length = 0.4  # How many meters long to make direction arrow lines\n    robot_direction_indicator_arrow_angle = 25    # How many degrees to make arrow angle lines\n\n    # Computed - don't change\n    graph_bottom_left = np.array((\n        min(robot_boundaries[0][0], 0) - graph_padding,\n        robot_boundaries[0][1] - graph_padding\n    ))\n    graph_size = np.max(robot_boundaries[1] - robot_boundaries[0]) + 2*graph_padding\n    graph_top_right = graph_bottom_left + np.array([graph_size,graph_size])\n\n\n    # ====== RUNTIME CONFIGURATION ========\n    robot_status_wait_time = 0.2    # How many seconds to wait between status requests for each robot\n    base_station_status_wait_time = 1 # How many seconds to wait between status requests for the base station\n    robot_next_action_wait_time = 2.0 # How many seconds to wait before checking if robot is finished with current plan action\n"
  },
  {
    "path": "src/master/plans/vision_offsets small_testing_area.csv",
    "content": "#tile_x, tile_y, offset_x (millimeters), offset_y(millimeters), offset_a(degrees), add to default (True) or exact (False)\n0,0,5,5,0,True\n1,1,0,15,0,True\n1,0,10,25,0,True"
  },
  {
    "path": "src/master/plans/vision_offsets_full_plan.csv",
    "content": "#tile_x, tile_y, offset_x (millimeters), offset_y(millimeters), offset_a(degrees)\nX,17,0,-4,0"
  },
  {
    "path": "src/master/plot_logs2.py",
    "content": "import paramiko\nfrom scp import SCPClient\nimport os\nimport matplotlib.pyplot as plt\nimport config\nimport csv\nimport numpy as np\nimport datetime\n\npossible_rows = [\n    'time',\n    'pos_x',\n    'pos_y',\n    'pos_a',\n    'vel_x',\n    'vel_y',\n    'vel_a',\n    'target_pos_x',\n    'target_pos_y',\n    'target_pos_a',\n    'target_vel_x',\n    'target_vel_y',\n    'target_vel_a',\n    'control_vel_x',\n    'control_vel_y',\n    'control_vel_a'\n]\n\ndef scp_last_motion_log(ip, remote_path, local_path):\n    ssh_client = paramiko.SSHClient()\n    ssh_client.load_system_host_keys()\n    ssh_client.connect(ip, username=\"pi\", password='dominobot')\n    with SCPClient(ssh_client.get_transport()) as scp:\n        scp.get(remote_path, local_path)\n    if not os.path.exists(local_path):\n        raise ValueError(\"SCP image did not complete successfully\")\n\ndef init_data():\n    data = {name:[] for name in possible_rows}\n    data['title'] = \"\"\n    data['start_time'] = None\n    return data\n\ndef parse_log_file(path):\n\n    data = init_data()\n    with open(local_file) as csvfile:\n        reader = csv.reader(csvfile, delimiter=\";\")\n        for id,row in enumerate(reader):\n            if id == 0:\n                continue\n            elif id == 1:\n                data['title'] = row[-1].replace('\"','')\n            else:\n                parse_row(row, data)\n\n    return data\n\ndef parse_row(row, data):\n    entry_data = row[-1].replace('\"','').split(\",\")\n    entry_name = entry_data[0]\n    entry_values = entry_data[1:]\n    entry_time = datetime.datetime.strptime(row[1], '%H:%M:%S.%f')\n\n    if entry_name == \"time\":\n        if not data[\"start_time\"]:\n            data[\"start_time\"] = entry_time\n            data[\"time\"].append(0)\n        else:\n            dt = (entry_time - data['start_time'])/ datetime.timedelta(seconds=1)\n            data[\"time\"].append(dt)\n\n    else:\n        for i,axis in 
enumerate(['x','y','a']):\n            axis_name = entry_name+'_'+axis\n            axis_value = float(entry_values[i])\n            data[axis_name].append(axis_value)\n\ndef plot_data(data, rows_to_plot=None):\n    if not rows_to_plot:\n        rows_to_plot = possible_rows[1:]\n\n    plt.figure()\n    ax = plt.gca()\n\n    for row in rows_to_plot:\n        ax.plot('time', row, data=data, label=row)\n\n    ax.legend()\n    ax.set_title(data['title'])\n    plt.xlabel('Time (s)')\n    plt.show()\n\n\ndef plot_rows_axes(data, rows, axes):\n    rows_to_plot = [row+'_'+axis for axis in axes for row in rows]\n    plot_data(data, rows_to_plot)\n\n\nif __name__ == '__main__':\n    cfg = config.Config()\n\n    GET_FILE = True\n    EXISTING_LOCAL_FILENAME = \"WiggleVision.csv\"\n\n    if GET_FILE:\n        robot_ip = cfg.ip_map['robot1']\n        remote_file = \"/home/pi/DominoRobot/src/robot/log/last_motion_log.csv\"\n        local_file = os.path.join(cfg.log_folder, \"last_motion_log.csv\")\n        scp_last_motion_log(robot_ip, remote_file, local_file)\n    else:\n        local_file = os.path.join(cfg.log_folder, EXISTING_LOCAL_FILENAME)\n\n    parsed_data = parse_log_file(local_file)\n    # plot_data(parsed_data, rows_to_plot=['control_vel_x','control_vel_y','control_vel_a'])\n    plot_rows_axes(parsed_data, ['pos','vel','target_pos','target_vel','control_vel'], ['a'])"
  },
  {
    "path": "src/master/requirements.txt",
    "content": "﻿cycler==0.10.0\r\ndecorator==4.4.2\r\nimageio==2.9.0\r\nkiwisolver==1.3.1 #Was originally 1.2.0, but wasn't building with Python 3.9.6\r\nmatplotlib==3.3.0\r\nnetworkx==2.5\r\nnumpy==1.21.0 #Changed from 1.19.1  \r\nopencv-python==4.5.1.48\r\nparamiko==2.7.2\r\nPillow==8.2.0\r\npyparsing==2.4.7\r\npyserial==3.4\r\nPySimpleGUI==4.32.1\r\npython-dateutil==2.8.1\r\nPyWavelets==1.1.1\r\nrope==0.18.0\r\nscikit-image==0.18.0 #0.17.2\r\nscipy==1.7.0 #1.5.2\r\nscp==0.13.3\r\nsix==1.15.0\r\ntifffile==2020.9.3\r\nvirtualenv==16.7.8\r\n"
  },
  {
    "path": "src/robot/Makefile",
    "content": "# Started with https://spin.atomicobject.com/2016/08/26/makefile-c-projects/\n# Additional inspiration from https://codereview.stackexchange.com/questions/157780/generic-makefile-handling-unit-testing-and-folder-structure\n\nTARGET_EXEC ?= robot-main\nTEST_EXEC ?= test-main\n\nBUILD_DIR ?= build\nSRC_DIRS ?= src\nTEST_DIRS ?=  test\n\nSRCS := $(shell find $(SRC_DIRS) -name *.cpp -or -name *.c -or -name *.s)\nOBJS := $(SRCS:%=$(BUILD_DIR)/%.o)\nDEPS := $(OBJS:.o=.d)\n\nINC_DIRS := ./lib ./src /usr/local/include\nINC_FLAGS := $(addprefix -I,$(INC_DIRS))\n\nCPPFLAGS ?= $(INC_FLAGS) -MMD -MP\n# -g for debug symbols, -O2 for run speed\nCXXFLAGS ?= -std=c++17 -Wall -Wextra -Wwrite-strings -Wno-parentheses -pthread -O3 `pkg-config --cflags opencv4`\nLDFLAGS += -ldl -lconfig++ -lstdc++fs `pkg-config --libs opencv4` \nLIBS += /usr/local/lib/libserial.a ./lib/marvelmind/marvelmind.a\n\nTEST_SRCS := $(shell find $(TEST_DIRS) -name *.cpp -or -name *.c -or -name *.s)\nTEST_OBJS := $(TEST_SRCS:%=$(BUILD_DIR)/%.o) $(filter-out build/src/main.cpp.o, $(OBJS))\n\nvpath %.cpp $(SRC_DIRS)\n\n.PHONY: all\nall: $(BUILD_DIR)/$(TARGET_EXEC)\n\n.PHONY: test\ntest: $(BUILD_DIR)/$(TEST_EXEC)\n\n# Target file\n$(BUILD_DIR)/$(TARGET_EXEC): $(OBJS)\n\t$(CXX) $(CXXFLAGS) $(OBJS) -o $@ $(LIBS) $(LDFLAGS)\n\n# Test file\n$(BUILD_DIR)/$(TEST_EXEC): $(TEST_OBJS)\n\t$(CXX) $(CXXFLAGS) $(TEST_OBJS) -o $@ $(LIBS) $(LDFLAGS)\n\n# Source files\n$(BUILD_DIR)/%.cpp.o: %.cpp\n\t$(MKDIR_P) $(dir $@)\n\t$(CXX) $(CPPFLAGS) $(CXXFLAGS) -c $< -o $@\n\n\n.PHONY: clean\nclean:\n\t$(RM) -r $(BUILD_DIR)\n\n\n-include $(DEPS)\nMKDIR_P ?= mkdir -p\n\n.DELETE_ON_ERROR:\n"
  },
  {
    "path": "src/robot/README.md",
    "content": "# **Warning! This is probably out of date and might be inaccurate. Use at your own risk.**\n\n## Libraries needed for installation on Raspberry Pi\nRequired header libraries at src/robot/lib\n- ArduinoJson\n- Catch (for tests)\n- Eigen\n- plog\n- marvelmind library, compiled as .a - https://marvelmind.com/download/ I used 2020_04_10_C_example. This required the addition of `#include <pthread.h>` in `marvelmind.h` to compile correctly\n\nRequired system libraries at /usr/local/include\n- libSerial: https://github.com/crayzeewulf/libserial (have to build from source due to outdated apt package)\n\nRequired libraries at /usr/local/lib\n- libserial.a\n- opencv: (following instructions here: https://docs.opencv.org/master/d7/d9f/tutorial_linux_install.html and fix from here: https://github.com/cggos/DIPDemoQt/issues/1)\n\nRequired libraries installed with apt-get\n - libconfig++-dev\n\n ## Build and run\n Go to `DominoRobot/src/robot` folder and run `make` to build. You can build tests with `make test` and remove all outputs with `make clean`.\n\n Run main or test using `build/robot-main` or `build/robot-test` respectively. The Raspberry Pi on the robot has these aliased to `run_robot` and `run_test` for ease of use."
  },
  {
    "path": "src/robot/src/KalmanFilter.cpp",
    "content": "#include \"KalmanFilter.h\"\r\n\r\nKalmanFilter::KalmanFilter(\r\n    Eigen::MatrixXf A,\r\n    Eigen::MatrixXf B,\r\n    Eigen::MatrixXf C,\r\n    Eigen::MatrixXf Q,\r\n    Eigen::MatrixXf R )\r\n:  KalmanFilter(A.rows(), C.rows())\r\n{\r\n    A_ = A; \r\n    B_ = B; \r\n    C_ = C; \r\n    Q_ = Q; \r\n    R_ = R; \r\n}\r\n\r\nKalmanFilter::KalmanFilter(int n, int m) \r\n: n_(n),\r\n  m_(m),\r\n  A_(Eigen::MatrixXf::Identity(n,n)), \r\n  B_(Eigen::MatrixXf::Zero(n,n)), \r\n  C_(Eigen::MatrixXf::Zero(m,n)), \r\n  I_(Eigen::MatrixXf::Identity(n,n)),\r\n  K_(Eigen::MatrixXf::Zero(n,m)),\r\n  P_(Eigen::MatrixXf::Identity(n,n)),\r\n  Q_(Eigen::MatrixXf::Identity(n,n)), \r\n  R_(Eigen::MatrixXf::Identity(m,m)), \r\n  S_(Eigen::MatrixXf::Identity(m,m)),\r\n  x_hat_(Eigen::VectorXf::Zero(n))\r\n{}\r\n\r\nvoid KalmanFilter::predict(const Eigen::VectorXf& u)\r\n{\r\n    x_hat_ = A_ * x_hat_ + B_ * u;\r\n    P_ = A_ * P_ * A_.transpose() + Q_;\r\n}\r\n\r\nvoid KalmanFilter::update(const Eigen::VectorXf& y, Eigen::MatrixXf R) \r\n{\r\n    R_ = R;\r\n    update(y);\r\n}\r\n\r\nvoid KalmanFilter::update(const Eigen::VectorXf& y) \r\n{\r\n    S_ = C_ * P_ * C_.transpose() + R_;\r\n    K_ = P_ * C_.transpose() * S_.inverse();\r\n    x_hat_ = x_hat_ + K_ * (y - C_ * x_hat_);\r\n    P_ = (I_ - K_ * C_) * P_;\r\n}"
  },
  {
    "path": "src/robot/src/KalmanFilter.h",
    "content": "/**\r\n* Kalman filter loosely based on\r\n* https://github.com/hmartiro/kalman-cpp\r\n* and\r\n* https://github.com/tysik/kalman_filters\r\n*/\r\n\r\n#ifndef KalmanFilter_h\r\n#define KalmanFilter_h\r\n\r\n#include <Eigen/Dense>\r\n\r\nclass KalmanFilter \r\n{\r\n\r\n  public:\r\n\r\n    KalmanFilter(\r\n        Eigen::MatrixXf A,\r\n        Eigen::MatrixXf B,\r\n        Eigen::MatrixXf C,\r\n        Eigen::MatrixXf Q,\r\n        Eigen::MatrixXf R\r\n    );\r\n\r\n    // Size only simplify initializing in a class\r\n    KalmanFilter(int n, int m);\r\n\r\n    // Prediction step with input u\r\n    void predict(const Eigen::VectorXf& u);\r\n\r\n    // Update the estimated state based on measured values.\r\n    void update(const Eigen::VectorXf& y);\r\n    void update(const Eigen::VectorXf& y, Eigen::MatrixXf R);\r\n\r\n    // Get current state estimate\r\n    Eigen::VectorXf state() { return x_hat_; };\r\n    Eigen::MatrixXf covariance() { return P_; };\r\n\r\n    void update_covariance(Eigen::MatrixXf P) {P_ = P;}\r\n\r\n  private:\r\n\r\n    // System dimensions\r\n    int n_;\r\n    int m_;\r\n\r\n    // Matrices for computation\r\n    Eigen::MatrixXf A_;\r\n    Eigen::MatrixXf B_;\r\n    Eigen::MatrixXf C_;\r\n    Eigen::MatrixXf I_;\r\n    Eigen::MatrixXf K_;\r\n    Eigen::MatrixXf P_;\r\n    Eigen::MatrixXf Q_;\r\n    Eigen::MatrixXf R_;\r\n    Eigen::MatrixXf S_;\r\n\r\n    // Estimated state\r\n    Eigen::VectorXf x_hat_;\r\n};\r\n\r\n#endif //KalmanFilter_h"
  },
  {
    "path": "src/robot/src/Localization.cpp",
    "content": "#include \"Localization.h\"\r\n\r\n#include <math.h>\r\n#include <algorithm> \r\n#include <plog/Log.h>\r\n#include \"constants.h\"\r\n\r\nLocalization::Localization()\r\n: pos_(0,0,0),\r\n  vel_(0,0,0),\r\n  mm_x_offset_(cfg.lookup(\"localization.mm_x_offset\")),\r\n  mm_y_offset_(cfg.lookup(\"localization.mm_y_offset\")),\r\n  variance_ref_trans_(cfg.lookup(\"localization.variance_ref_trans\")),\r\n  variance_ref_angle_(cfg.lookup(\"localization.variance_ref_angle\")),\r\n  meas_trans_cov_(cfg.lookup(\"localization.kf_meas_trans_cov\")),\r\n  meas_angle_cov_(cfg.lookup(\"localization.kf_meas_angle_cov\")),\r\n  localization_uncertainty_scale_(cfg.lookup(\"localization.kf_uncertainty_scale\")),\r\n  min_vel_uncertainty_(cfg.lookup(\"localization.min_vel_uncertainty\")),\r\n  vel_uncertainty_slope_(cfg.lookup(\"localization.vel_uncertainty_slope\")),\r\n  max_vel_uncetainty_(cfg.lookup(\"localization.max_vel_uncetainty\")),\r\n  vel_uncertainty_decay_time_(cfg.lookup(\"localization.vel_uncertainty_decay_time\")),\r\n  metrics_(),\r\n  time_since_last_motion_(),\r\n  kf_(3,3)\r\n{\r\n    Eigen::MatrixXf A = Eigen::MatrixXf::Identity(3,3);\r\n    Eigen::MatrixXf B = Eigen::MatrixXf::Identity(3,3);\r\n    Eigen::MatrixXf C = Eigen::MatrixXf::Identity(3,3);\r\n    Eigen::MatrixXf Q(3,3);\r\n    float predict_trans_cov = cfg.lookup(\"localization.kf_predict_trans_cov\");\r\n    float predict_angle_cov = cfg.lookup(\"localization.kf_predict_angle_cov\");\r\n    Q << predict_trans_cov,0,0, \r\n         0,predict_trans_cov,0,\r\n         0,0,predict_angle_cov;\r\n    Eigen::MatrixXf R(3,3);\r\n    R << meas_trans_cov_,0,0, \r\n         0,meas_trans_cov_,0,\r\n         0,0,meas_angle_cov_;\r\n    kf_ = KalmanFilter(A,B,C,Q,R);\r\n}\r\n\r\nvoid Localization::updatePositionReading(Point global_position)\r\n{\r\n    if (fabs(global_position.a) > 3.2) \r\n    {\r\n        PLOGE << \"INVALID ANGLE - should be in radians between +/- pi\";\r\n        return;\r\n    
}\r\n    \r\n    // The x,y,a here is from the center of the marvelmind pair, so we need to transform it to the actual center of the\r\n    // robot (i.e. center of rotation)\r\n    Eigen::Vector3f raw_measured_position = {global_position.x, global_position.y, global_position.a};\r\n    Eigen::Vector3f adjusted_measured_position = marvelmindToRobotCenter(raw_measured_position);\r\n    \r\n    // Generate an uncertainty based on the current velocity and time since motion since we know the beacons are less accurate when moving\r\n    const float position_uncertainty = computePositionUncertainty();\r\n    metrics_.last_position_uncertainty = position_uncertainty;\r\n    Eigen::MatrixXf R(3,3);\r\n    R << meas_trans_cov_ + position_uncertainty*localization_uncertainty_scale_,0,0, \r\n         0,meas_trans_cov_+ position_uncertainty*localization_uncertainty_scale_,0,\r\n         0,0,meas_angle_cov_+ position_uncertainty*localization_uncertainty_scale_;\r\n\r\n    // PLOGI << \"position_uncertainty: \" << position_uncertainty;\r\n    // PLOGI << \"R: \" << R;\r\n    // PLOGI << \"cov1: \" << kf_.covariance();\r\n\r\n    kf_.update(adjusted_measured_position, R);\r\n    Eigen::VectorXf est = kf_.state();\r\n    pos_.x = est[0];\r\n    pos_.y = est[1];\r\n    pos_.a = wrap_angle(est[2]);\r\n    \r\n    // PLOGI << \"cov2: \" << kf_.covariance();\r\n\r\n    PLOGI_(LOCALIZATION_LOG_ID).printf(\"\\nPosition update:\");\r\n    PLOGI_(LOCALIZATION_LOG_ID).printf(\"  Raw input: [%4.3f, %4.3f, %4.3f]\", \r\n        raw_measured_position(0), raw_measured_position(1), raw_measured_position(2));\r\n    PLOGI_(LOCALIZATION_LOG_ID).printf(\"  Adjusted input: [%4.3f, %4.3f, %4.3f]\", \r\n        adjusted_measured_position(0), adjusted_measured_position(1), adjusted_measured_position(2));\r\n    PLOGI_(LOCALIZATION_LOG_ID).printf(\"  position_uncertainty: %4.3f\", position_uncertainty);\r\n    PLOGI_(LOCALIZATION_LOG_ID).printf(\"  Current position: %s\\n\", 
pos_.toString().c_str());\r\n}\r\n\r\nvoid Localization::updateVelocityReading(Velocity local_cart_vel, float dt)\r\n{\r\n    // Convert local cartesian velocity to global cartesian velocity using the last estimated angle\r\n    float cA = cos(pos_.a);\r\n    float sA = sin(pos_.a);\r\n    vel_.vx = cA * local_cart_vel.vx - sA * local_cart_vel.vy;\r\n    vel_.vy = sA * local_cart_vel.vx + cA * local_cart_vel.vy;\r\n    vel_.va = local_cart_vel.va;\r\n\r\n    // Apply prediction step with velocity to get estimated position\r\n    Eigen::Vector3f u = {vel_.vx, vel_.vy, vel_.va};\r\n    Eigen::Vector3f udt = dt*u;\r\n    kf_.predict(udt);\r\n    // PLOGI << \"cov3: \" << kf_.covariance();\r\n    time_since_last_motion_.reset();\r\n\r\n    Eigen::VectorXf est = kf_.state();\r\n    pos_.x = est[0];\r\n    pos_.y = est[1];\r\n    pos_.a = wrap_angle(est[2]);\r\n\r\n    // Use the covariance matrix from the Kalman filter to estimate a fractional 'confidence'\r\n    // in our positioning relative to some reference amount.\r\n    Eigen::MatrixXf cov = kf_.covariance();\r\n\r\n    // Compute confidence. As long as the variance isn't larger than the reference values\r\n    // these will be between 0 and 1, with 1 meaning full confidence in the positioning\r\n    metrics_.confidence_x = std::max(1-cov(0,0)/variance_ref_trans_, 0.0f);\r\n    metrics_.confidence_y = std::max(1-cov(1,1)/variance_ref_trans_, 0.0f);\r\n    metrics_.confidence_a = std::max(1-cov(2,2)/variance_ref_angle_, 0.0f);\r\n    metrics_.total_confidence = (metrics_.confidence_x + metrics_.confidence_y + metrics_.confidence_a)/3;\r\n}\r\n\r\nEigen::Vector3f Localization::marvelmindToRobotCenter(Eigen::Vector3f mm_global_position) \r\n{\r\n    Eigen::Vector3f local_offset = {mm_x_offset_/1000.0f, mm_y_offset_/1000.0f, 0.0f};\r\n    float cA = cos(mm_global_position(2));\r\n    float sA = sin(mm_global_position(2));\r\n    Eigen::Matrix3f rotation;\r\n    rotation << cA, -sA, 0.0f, \r\n                sA, cA, 0.0f,\r\n                0.0f, 0.0f, 1.0f;\r\n    Eigen::Vector3f adjusted_position = mm_global_position - rotation * local_offset;\r\n\r\n    // PLOGD_(LOCALIZATION_LOG_ID).printf(\"\\nMM to robot frame:\");\r\n    // PLOGD_(LOCALIZATION_LOG_ID).printf(\"  mm position: [%4.3f, %4.3f, %4.3f]\", mm_global_position(0), mm_global_position(1), mm_global_position(2));\r\n    // PLOGD_(LOCALIZATION_LOG_ID).printf(\"  R: \\n[%4.3f, %4.3f, %4.3f]\\n[%4.3f, %4.3f, %4.3f]\\n[%4.3f, %4.3f, %4.3f]\", \r\n    //  rotation(0,0), rotation(0,1), rotation(0,2), rotation(1,0), rotation(1,1), rotation(1,2), rotation(2,0), rotation(2,1), rotation(2,2));\r\n    // PLOGD_(LOCALIZATION_LOG_ID).printf(\"  new position: [%4.3f, %4.3f, %4.3f]\", adjusted_position(0), adjusted_position(1), adjusted_position(2));\r\n\r\n    return adjusted_position;\r\n}\r\n\r\nfloat Localization::computePositionUncertainty()\r\n{\r\n    Eigen::Vector3f v = {vel_.vx, vel_.vy, vel_.va};\r\n    float total_v = v.norm();\r\n\r\n    float position_uncertainty = 0.0f;\r\n    if(total_v > 0.01)\r\n    {\r\n        position_uncertainty = 
std::min(min_vel_uncertainty_ + vel_uncertainty_slope_*total_v, max_vel_uncetainty_);\r\n    }\r\n    else \r\n    {\r\n        position_uncertainty = std::max(min_vel_uncertainty_ * (vel_uncertainty_decay_time_ - time_since_last_motion_.dt_s()) / vel_uncertainty_decay_time_, 0.0f);\r\n    }\r\n\r\n    PLOGD_(LOCALIZATION_LOG_ID).printf(\"\\nPosition Uncertainty:\");\r\n    PLOGD_(LOCALIZATION_LOG_ID).printf(\"  Total v: %4.3f, time since motion: %4.3f, position_uncertainty: %4.3f\", total_v, time_since_last_motion_.dt_s(), position_uncertainty);\r\n\r\n    return position_uncertainty;\r\n}\r\n\r\nvoid Localization::resetAngleCovariance() \r\n{\r\n    Eigen::MatrixXf covariance = kf_.covariance();\r\n    covariance(2,2) = variance_ref_angle_;\r\n    kf_.update_covariance(covariance);\r\n}"
  },
  {
    "path": "src/robot/src/Localization.h",
    "content": "#ifndef Localization_h\r\n#define Localization_h\r\n\r\n#include \"utils.h\"\r\n#include <Eigen/Dense>\r\n#include \"KalmanFilter.h\"\r\n\r\nclass Localization\r\n{\r\n  public:\r\n    Localization();\r\n\r\n    void updatePositionReading(Point global_position);\r\n\r\n    void updateVelocityReading(Velocity local_cart_vel, float dt);\r\n\r\n    Point getPosition() {return pos_; };\r\n\r\n    Velocity getVelocity() {return vel_; };\r\n\r\n    void forceZeroVelocity() {vel_ = {0,0,0}; };\r\n    \r\n    // Force the position to a specific value, bypassing localization algorithms (used for testing/debugging)\r\n    void forceSetPosition(Point global_position) {pos_ = global_position;};\r\n\r\n    LocalizationMetrics getLocalizationMetrics() { return metrics_; };\r\n\r\n    void resetAngleCovariance();\r\n\r\n  private:\r\n\r\n    // Convert position reading from marvelmind frame to robot frame\r\n    Eigen::Vector3f marvelmindToRobotCenter(Eigen::Vector3f mm_global_position);\r\n\r\n    // Compute a multiplier based on the current velocity that informs how trustworthy the current reading might be\r\n    float computePositionUncertainty();\r\n\r\n    \r\n    // Current position and velocity\r\n    Point pos_;\r\n    Velocity vel_;\r\n    \r\n    // Parameters for localization algorithms\r\n    float mm_x_offset_;\r\n    float mm_y_offset_;\r\n    float variance_ref_trans_;\r\n    float variance_ref_angle_;\r\n    float meas_trans_cov_;\r\n    float meas_angle_cov_;\r\n    float localization_uncertainty_scale_;\r\n    float min_vel_uncertainty_;\r\n    float vel_uncertainty_slope_;\r\n    float max_vel_uncetainty_;\r\n    float vel_uncertainty_decay_time_;\r\n\r\n    LocalizationMetrics metrics_;\r\n    Timer time_since_last_motion_; \r\n    KalmanFilter kf_;\r\n};\r\n\r\n#endif //Localization_h"
  },
  {
    "path": "src/robot/src/MarvelmindWrapper.cpp",
"content": "#include \"MarvelmindWrapper.h\"\n#include \"constants.h\"\n#include \"plog/Log.h\"\n#include <filesystem>\n\n\nMarvelmindWrapper::MarvelmindWrapper()\n: hedge_(createMarvelmindHedge()),\n  ready_(false)\n{\n    // I expect 2 devices to be connected, but because of the fun of linux and udev rules, I don't know \n    // which of the three possible names the devices could have. I could later change this to first \n    // do an `ls /dev/ | grep marvelmind` and find the right ports, but this should work fine too.\n    if(std::filesystem::exists(MARVELMIND_USB_0))\n    {\n        hedge_->ttyFileName = MARVELMIND_USB_0;\n        PLOGI << \"Found device \" << MARVELMIND_USB_0;\n    }\n    else if(std::filesystem::exists(MARVELMIND_USB_1))\n    {\n        hedge_->ttyFileName = MARVELMIND_USB_1;\n        PLOGI << \"Found device \" << MARVELMIND_USB_1;\n    }\n    else if(std::filesystem::exists(MARVELMIND_USB_2))\n    {\n        hedge_->ttyFileName = MARVELMIND_USB_2;\n        PLOGI << \"Found device \" << MARVELMIND_USB_2;\n    }\n    else\n    {\n        PLOGW << \"No marvelmind devices connected\";\n        return;\n    }\n\n    // Now actually try to connect\n    hedge_->verbose = true;\n    startMarvelmindHedge(hedge_);\n    ready_ = true;\n}\n\nstd::vector<float> MarvelmindWrapper::getPositions()\n{\n    PositionValue position_data;\n    std::vector<float> output;\n    \n    if (!ready_)\n    {\n        return output;\n    }\n\n    bool ok = getPositionFromMarvelmindHedge(hedge_, &position_data);\n\n    if (ok)\n    {\n        //PLOGI.printf(\"Position data - ready: %i, addr: %i, ts: %i, x: %i, y: %i, a:%f\", position_data.ready, position_data.address, position_data.timestamp, position_data.x, position_data.y, position_data.angle);\n\n        if (position_data.ready)\n        {\n            output.push_back(position_data.x/1000.0);\n            output.push_back(position_data.y/1000.0);\n            output.push_back(position_data.angle);\n        }\n    
}\n\n    return output;\n}\n\n\nMarvelmindWrapper::~MarvelmindWrapper()\n{\n    if(ready_){\n        stopMarvelmindHedge (hedge_);\n    }\n    destroyMarvelmindHedge (hedge_);\n}\n\n\n\n// MarvelmindWrapper::MarvelmindWrapper()\n// {\n//     marvelmindAPILoad();\n//     PLOGI << \"Marvelmind API loaded\";\n\n//     // I expect to connect to 2 devices, but because of the fun of linux and udev rules, I don't know \n//     // which of the three possible names the devices could have. I could later change this to first \n//     // do an `ls /dev/ | grep marvelmind` and find the right ports, but this should work fine too.\n//     // char portName0[64] = MARVELMIND_USB_0;\n//     // char portName1[64] = MARVELMIND_USB_1;\n//     // char portName2[64] = MARVELMIND_USB_2;\n//     // PLOGI << portName0;\n//     // bool success0 = mmOpenPortByName(portName0);\n//     // bool success1 = mmOpenPortByName(portName1);\n//     // bool success2 = mmOpenPortByName(portName2);\n//     // PLOGI << \"S: \" << success0 << success1 << success2;\n//     // int success = static_cast<int>(success0) + static_cast<int>(success1) + static_cast<int>(success2);\n//     // if(success == 2)\n//     // {\n//     //     PLOGI << \"Successfully connected to 2 Marvelmind devices\";\n//     // }\n//     // else if (success == 1)\n//     // {\n//     //     PLOGW << \"Only able to connect to 1 Marvelmind device\";\n//     // }\n//     // else if (success == 0)\n//     // {\n//     //     PLOGW << \"Unable to connect to any Marvelmind devices\";\n//     // }\n//     // else\n//     // {\n//     //     PLOGE << \"Managed to connect to \" << success << \"Marvelmind devices. 
That shouldn't happen!\";\n//     // }\n\n//     // Open loop\n//     bool open = false;\n//     int count = 0;\n//     while(!open && count < 10000)\n//     {\n//         open = mmOpenPort();\n//         count++;\n//         usleep(1000);\n//     }\n//     PLOGI << \"Open status: \" << open;\n\n//     // Get usb loop\n//     bool usb_ready = false;\n//     count = 0;\n//     MarvelmindDeviceVersion usbDevVersion;\n//     while(!usb_ready && count < 10000)\n//     {\n//         usb_ready = mmGetVersionAndId(MM_USB_DEVICE_ADDRESS, &usbDevVersion);\n//         count++;\n//         usleep(1000);\n//     }\n//     PLOGI <<\"USB ready: \" << usb_ready;\n//     if(usb_ready)\n//     {\n//         PLOGI << \"Device type: \" << (usbDevVersion.fwVerDeviceType);\n//         PLOGI << \"Device is hedge: \" << mmDeviceIsHedgehog(usbDevVersion.fwVerDeviceType);\n//     }\n\n//     // bool device_list_ok = false;\n//     // count = 0;\n//     // MarvelmindDevicesList device_list;\n//     // while(!device_list_ok && count < 10)\n//     // {\n//     //     device_list_ok = mmGetDevicesList(&device_list);\n//     //     count++;\n//     //     PLOGI << count;\n//     //     usleep(1000);\n//     // }\n//     // PLOGI << \"Device list status: \" << device_list_ok;\n\n//     // if(!device_list_ok)\n//     // {\n//     //     PLOGW << \"Could not retrieve list of Marvelmind devices\";\n//     // }\n//     // else\n//     // {\n//     //     for(int i = 0; i < device_list.numDevices; i++)\n//     //     {\n//     //         MarvelmindDeviceInfo data = device_list.devices[i];\n//     //         PLOGI << \"Found device \" << data.address;\n//     //         // if(data.address == MARVELMIND_DEVICE_ID0 || data.address == MARVELMIND_DEVICE_ID1)\n//     //         // {\n//     //         //     PLOGI << \"Found device \" << data.address;\n//     //             // if (data.isSleeping)\n//     //             // {\n//     //             //     PLOGI << \"Device \" << data.address << \" is 
sleeping, triggering wake up\";\n//     //             //     mmWakeDevice(data.address);\n//     //             // }\n//     //             // else\n//     //             // {\n//     //             //     PLOGI << \"Device \" << data.address << \" is already awake\";\n//     //             // }\n//     //         // }\n//     //     }\n//     // }\n// }\n\n"
  },
  {
    "path": "src/robot/src/MarvelmindWrapper.h",
    "content": "#ifndef MarvelmindWrapper_h\n#define MarvelmindWrapper_h\n\n#include <vector>\n\nextern \"C\" {\n    #include <marvelmind/marvelmind.h>\n}\n\nclass MarvelmindWrapper\n{\n  public:\n    MarvelmindWrapper();\n    ~MarvelmindWrapper();\n    std::vector<float> getPositions();\n\n  private:\n    MarvelmindHedge* hedge_;\n    bool ready_;\n\n};\n\n\n#endif\n"
  },
  {
    "path": "src/robot/src/RobotController.cpp",
    "content": "#include \"RobotController.h\"\n\n#include <math.h>\n#include <plog/Log.h>\n#include <Eigen/Dense>\n\n#include \"constants.h\"\n#include \"serial/SerialCommsFactory.h\"\n#include \"robot_controller_modes/RobotControllerModePosition.h\"\n#include \"robot_controller_modes/RobotControllerModeVision.h\"\n#include \"robot_controller_modes/RobotControllerModeStopFast.h\"\n\n\nRobotController::RobotController(StatusUpdater& statusUpdater)\n: statusUpdater_(statusUpdater),\n  serial_to_motor_driver_(SerialCommsFactory::getFactoryInstance()->get_serial_comms(CLEARCORE_USB)),\n  localization_(),\n  prevOdomLoopTimer_(),\n  cartPos_(),\n  cartVel_(),\n  trajRunning_(false),\n  limits_mode_(LIMITS_MODE::FINE),\n  controller_rate_(cfg.lookup(\"motion.controller_frequency\")),\n  logging_rate_(cfg.lookup(\"motion.log_frequency\")),\n  log_this_cycle_(false),\n  fake_perfect_motion_(cfg.lookup(\"motion.fake_perfect_motion\")),\n  fake_local_cart_vel_(0,0,0),\n  max_cart_vel_limit_({cfg.lookup(\"motion.translation.max_vel.coarse\"),\n                       cfg.lookup(\"motion.translation.max_vel.coarse\"),\n                       cfg.lookup(\"motion.rotation.max_vel.coarse\")}),\n  loop_time_averager_(20)\n{    \n    if(fake_perfect_motion_) PLOGW << \"Fake robot motion enabled\";\n}\n\nvoid RobotController::moveToPosition(float x, float y, float a)\n{\n    reset_last_motion_logger();\n    limits_mode_ = LIMITS_MODE::COARSE;\n    setCartVelLimits(limits_mode_);\n    Point goal_pos = Point(x,y,a);\n    PLOGI_(MOTION_CSV_LOG_ID).printf(\"MoveToPosition: %s\",goal_pos.toString().c_str());\n\n    auto position_mode = std::make_unique<RobotControllerModePosition>(fake_perfect_motion_);\n    bool ok = position_mode->startMove(cartPos_, goal_pos, limits_mode_);\n   \n    if (ok) \n    { \n        startTraj(); \n        controller_mode_ = std::move(position_mode);\n    }\n    else { statusUpdater_.setErrorStatus(); }\n}\n\nvoid RobotController::moveToPositionRelative(float 
dx_local, float dy_local, float da_local)\n{\n    reset_last_motion_logger();\n    limits_mode_ = LIMITS_MODE::COARSE;\n    setCartVelLimits(limits_mode_);\n\n    float dx_global =  cos(cartPos_.a) * dx_local - sin(cartPos_.a) * dy_local;\n    float dy_global =  sin(cartPos_.a) * dx_local + cos(cartPos_.a) * dy_local;\n    float da_global = da_local;\n    Point goal_pos = Point(cartPos_.x + dx_global, cartPos_.y + dy_global, wrap_angle(cartPos_.a + da_global));\n    PLOGI_(MOTION_CSV_LOG_ID).printf(\"MoveToPositionRelative: %s\",goal_pos.toString().c_str());\n\n    auto position_mode = std::make_unique<RobotControllerModePosition>(fake_perfect_motion_);\n    bool ok = position_mode->startMove(cartPos_, goal_pos, limits_mode_);\n   \n    if (ok) \n    { \n        startTraj(); \n        controller_mode_ = std::move(position_mode);\n    }\n    else { statusUpdater_.setErrorStatus(); }\n}\n\nvoid RobotController::moveToPositionRelativeSlow(float dx_local, float dy_local, float da_local)\n{\n    reset_last_motion_logger();\n    limits_mode_ = LIMITS_MODE::SLOW;\n    setCartVelLimits(limits_mode_);\n\n    float dx_global =  cos(cartPos_.a) * dx_local - sin(cartPos_.a) * dy_local;\n    float dy_global =  sin(cartPos_.a) * dx_local + cos(cartPos_.a) * dy_local;\n    float da_global = da_local;\n    Point goal_pos = Point(cartPos_.x + dx_global, cartPos_.y + dy_global, wrap_angle(cartPos_.a + da_global));\n    PLOGI_(MOTION_CSV_LOG_ID).printf(\"MoveToPositionRelativeSlow: %s\",goal_pos.toString().c_str());\n\n    auto position_mode = std::make_unique<RobotControllerModePosition>(fake_perfect_motion_);\n    bool ok = position_mode->startMove(cartPos_, goal_pos, limits_mode_);\n   \n    if (ok) \n    { \n        startTraj(); \n        controller_mode_ = std::move(position_mode);\n    }\n    else { statusUpdater_.setErrorStatus(); }\n}\n\nvoid RobotController::moveToPositionFine(float x, float y, float a)\n{\n    reset_last_motion_logger();\n    limits_mode_ = 
LIMITS_MODE::FINE;\n    setCartVelLimits(limits_mode_);\n    Point goal_pos = Point(x,y,a);\n    PLOGI_(MOTION_CSV_LOG_ID).printf(\"MoveToPositionFine: %s\",goal_pos.toString().c_str());\n\n    auto position_mode = std::make_unique<RobotControllerModePosition>(fake_perfect_motion_);\n    bool ok = position_mode->startMove(cartPos_, goal_pos, limits_mode_);\n   \n    if (ok) \n    { \n        startTraj(); \n        controller_mode_ = std::move(position_mode);\n    }\n    else { statusUpdater_.setErrorStatus(); }\n}\n\nvoid RobotController::moveConstVel(float vx , float vy, float va, float t)\n{\n    (void) vx;\n    (void) vy;\n    (void) va;\n    (void) t;\n    PLOGE << \"Not implemented\";\n}\n\nvoid RobotController::moveWithVision(float x, float y, float a)\n{\n    reset_last_motion_logger();\n    limits_mode_ = LIMITS_MODE::VISION;\n    setCartVelLimits(limits_mode_);\n    Point goal = Point(x,y,a);\n    PLOGI_(MOTION_CSV_LOG_ID).printf(\"MoveWithVision: %s\",goal.toString().c_str());\n    auto vision_mode = std::make_unique<RobotControllerModeVision>(fake_perfect_motion_, statusUpdater_);\n    bool ok = vision_mode->startMove(goal);\n   \n    if (ok) \n    { \n        startTraj(); \n        controller_mode_ = std::move(vision_mode);\n    }\n    else { statusUpdater_.setErrorStatus(); }\n}\n\nvoid RobotController::stopFast()\n{\n    reset_last_motion_logger();\n    limits_mode_ = LIMITS_MODE::FINE;\n    setCartVelLimits(limits_mode_);\n    PLOGI_(MOTION_CSV_LOG_ID).printf(\"Stop Fast\");\n\n    auto stop_fast_mode = std::make_unique<RobotControllerModeStopFast>(fake_perfect_motion_);\n    stop_fast_mode->startMove(cartPos_, cartVel_);\n    startTraj(); \n    controller_mode_ = std::move(stop_fast_mode);\n}\n\n\nvoid RobotController::startTraj()\n{\n    trajRunning_ = true;\n    enableAllMotors();\n    PLOGI.printf(\"Starting move\");\n}\n\nvoid RobotController::estop()\n{\n    PLOGW.printf(\"Estopping robot control\");\n    PLOGD_(MOTION_LOG_ID) << 
\"\\n====ESTOP====\\n\";\n    trajRunning_ = false;\n    limits_mode_ = LIMITS_MODE::FINE;\n    disableAllMotors();\n}\n\nvoid RobotController::update()\n{    \n    // Ensures controller runs at approximately constant rate\n    if (!controller_rate_.ready()) { return; }\n    // Ensures logging doesn't get out of hand\n    log_this_cycle_ = logging_rate_.ready();\n    \n    // Create a command based on the trajectory, or zero velocity if not moving\n    Velocity target_vel;\n    if(trajRunning_)\n    {\n        target_vel = controller_mode_->computeTargetVelocity(cartPos_, cartVel_, log_this_cycle_);\n        // Check if we are finished with the trajectory\n        if (controller_mode_->checkForMoveComplete(cartPos_, cartVel_))\n        {\n            PLOGI.printf(\"Reached goal\");\n            PLOGD_(MOTION_LOG_ID) << \"Trajectory complete\";\n            target_vel = {0,0,0};\n            disableAllMotors();\n            trajRunning_ = false;\n            // Bit of a hack for resetting angle covariance after vision motion\n            // This avoids weird rotations from error accumulated in the Kalman filter\n            if (limits_mode_ == LIMITS_MODE::VISION)\n            {\n                localization_.resetAngleCovariance();\n            }\n            // Re-enable fine mode at the end of a trajectory\n            limits_mode_ = LIMITS_MODE::FINE;\n        }\n    }\n    else\n    {       \n        // Force zero velocity\n        target_vel = {0,0,0};\n        \n        // Force current velocity to 0\n        localization_.forceZeroVelocity();\n\n        // Force flags to default values\n        limits_mode_ = LIMITS_MODE::FINE;\n    }\n\n    // Run controller and odometry update\n    setCartVelCommand(target_vel);\n    computeOdometry();\n\n    // Update status \n    statusUpdater_.updatePosition(cartPos_.x, cartPos_.y, cartPos_.a);\n    statusUpdater_.updateVelocity(cartVel_.vx, cartVel_.vy, cartVel_.va);\n    loop_time_averager_.mark_point();\n    
statusUpdater_.updateControlLoopTime(loop_time_averager_.get_ms());\n    statusUpdater_.updateLocalizationMetrics(localization_.getLocalizationMetrics());\n}\n\n\nvoid RobotController::enableAllMotors()\n{\n    if (serial_to_motor_driver_->isConnected())\n    {\n        serial_to_motor_driver_->send(\"base:Power:ON\");\n        PLOGI << \"Motors enabled\";\n        PLOGD_(MOTION_LOG_ID) << \"Motors enabled\";\n    }\n    else\n    {\n        PLOGW << \"No connection: Skipping motor enable\";\n    }\n}\n\nvoid RobotController::disableAllMotors()\n{\n    if (serial_to_motor_driver_->isConnected())\n    {\n        serial_to_motor_driver_->send(\"base:Power:OFF\");\n        PLOGI << \"Motors disabled\";\n        PLOGD_(MOTION_LOG_ID) << \"Motors disabled\";\n    }\n    else\n    {\n        PLOGW << \"No connection: Skipping motor disable\";\n    }\n}\n\nvoid RobotController::inputPosition(float x, float y, float a)\n{\n    bool last_mm_used = false;\n    if(limits_mode_ == LIMITS_MODE::FINE || limits_mode_ == LIMITS_MODE::COARSE)\n    {\n        localization_.updatePositionReading({x,y,a});\n        cartPos_ = localization_.getPosition();\n        last_mm_used = true;\n    }\n    statusUpdater_.updateLastMarvelmindPose({x,y,a}, last_mm_used);\n}\n\nvoid RobotController::forceSetPosition(float x, float y, float a)\n{\n    localization_.forceSetPosition({x,y,a});\n    cartPos_ = localization_.getPosition();\n}\n\nvoid RobotController::setCartVelLimits(LIMITS_MODE limits_mode)\n{\n    if(limits_mode == LIMITS_MODE::VISION)\n    {\n        max_cart_vel_limit_ = { cfg.lookup(\"motion.translation.max_vel.vision\"), \n                                cfg.lookup(\"motion.translation.max_vel.vision\"), \n                                cfg.lookup(\"motion.rotation.max_vel.vision\")};\n    }\n    else if(limits_mode == LIMITS_MODE::FINE || limits_mode == LIMITS_MODE::SLOW)\n    {\n        max_cart_vel_limit_ = { cfg.lookup(\"motion.translation.max_vel.fine\"), \n                  
              cfg.lookup(\"motion.translation.max_vel.fine\"), \n                                cfg.lookup(\"motion.rotation.max_vel.fine\")};\n    }\n    else\n    {\n        max_cart_vel_limit_ = { cfg.lookup(\"motion.translation.max_vel.coarse\"), \n                                cfg.lookup(\"motion.translation.max_vel.coarse\"), \n                                cfg.lookup(\"motion.rotation.max_vel.coarse\")};\n    }\n}\n\nbool RobotController::readMsgFromMotorDriver(Velocity* decodedVelocity)\n{\n    std::string msg = \"\";\n    std::vector<float> tmpVelocity = {0,0,0};\n    if (serial_to_motor_driver_->isConnected())\n    {\n        int count = 0;\n        while(msg.empty() && count++ < 10)\n        {\n            msg = serial_to_motor_driver_->rcv_base();\n        }\n    }\n\n    if (msg.empty())\n    {\n        return false;\n    }\n    else\n    {\n        tmpVelocity = parseCommaDelimitedStringToFloat(msg);\n        if(tmpVelocity.size() != 3)\n        {\n            PLOGW.printf(\"Decode failed\");\n            return false;\n        }\n    }\n    decodedVelocity->vx = tmpVelocity[0];\n    decodedVelocity->vy = tmpVelocity[1];\n    decodedVelocity->va = tmpVelocity[2];\n    return true;\n}\n\nvoid RobotController::computeOdometry()\n{  \n    Velocity local_cart_vel = {0,0,0};\n    if(fake_perfect_motion_)\n    {\n        local_cart_vel = fake_local_cart_vel_;\n    }\n    else if(!readMsgFromMotorDriver(&local_cart_vel)) { return; }\n    \n    Velocity zero = {0,0,0};\n    if(trajRunning_ || !(local_cart_vel == zero))\n    {\n        PLOGD_IF_(MOTION_LOG_ID, log_this_cycle_).printf(\"Decoded velocity: %.3f, %.3f, %.3f\\n\", local_cart_vel.vx, local_cart_vel.vy, local_cart_vel.va);\n    }\n    // Bypassing motor feedback and just using fake_local_cart_vel_ here may give better performance, but it's hard to tell.\n\n    // Compute time since last odom update\n    float dt = prevOdomLoopTimer_.dt_s();\n    prevOdomLoopTimer_.reset();\n\n    
localization_.updateVelocityReading(local_cart_vel, dt);\n    cartPos_ = localization_.getPosition();\n    cartVel_ = localization_.getVelocity();\n\n}\n\nvoid RobotController::setCartVelCommand(Velocity target_vel)\n{\n    if (trajRunning_) \n    {\n        PLOGD_IF_(MOTION_LOG_ID, log_this_cycle_).printf(\"CartVelCmd: [vx: %.4f, vy: %.4f, va: %.4f]\", target_vel.vx, target_vel.vy, target_vel.va);\n    }\n\n    // Convert input global velocities to local velocities\n    Velocity local_cart_vel;\n    float cA = cos(cartPos_.a);\n    float sA = sin(cartPos_.a);\n    local_cart_vel.vx =  cA * target_vel.vx + sA * target_vel.vy;\n    local_cart_vel.vy = -sA * target_vel.vx + cA * target_vel.vy;\n    local_cart_vel.va = target_vel.va;\n\n    // Cap velocity for safety\n    if(fabs(local_cart_vel.vx) > max_cart_vel_limit_.vx) \n    {\n        float clamped_vel = sgn(local_cart_vel.vx) * max_cart_vel_limit_.vx;\n        PLOGW.printf(\"Attempted to command X velocity of %.3f m/s, clamping to %.3f m/s\", local_cart_vel.vx, clamped_vel);\n        PLOGW_(MOTION_LOG_ID).printf(\"Attempted to command X velocity of %.3f m/s, clamping to %.3f m/s\", local_cart_vel.vx, clamped_vel);\n        local_cart_vel.vx = clamped_vel;\n    }\n    if(fabs(local_cart_vel.vy) > max_cart_vel_limit_.vy) \n    {\n        float clamped_vel = sgn(local_cart_vel.vy) * max_cart_vel_limit_.vy;\n        PLOGW.printf(\"Attempted to command Y velocity of %.3f m/s, clamping to %.3f m/s\", local_cart_vel.vy, clamped_vel);\n        PLOGW_(MOTION_LOG_ID).printf(\"Attempted to command Y velocity of %.3f m/s, clamping to %.3f m/s\", local_cart_vel.vy, clamped_vel);\n        local_cart_vel.vy = clamped_vel;\n    }\n    if(fabs(local_cart_vel.va) > max_cart_vel_limit_.va) \n    {\n        float clamped_vel = sgn(local_cart_vel.va) * max_cart_vel_limit_.va;\n        PLOGW.printf(\"Attempted to command A velocity of %.3f rad/s, clamping to %.3f rad/s\", local_cart_vel.va, clamped_vel);\n        
PLOGW_(MOTION_LOG_ID).printf(\"Attempted to command A velocity of %.3f rad/s, clamping to %.3f rad/s\", local_cart_vel.va, clamped_vel);\n        local_cart_vel.va = clamped_vel;\n    }\n\n    // Prep velocity data to send to motor driver\n    char buff[100];\n    sprintf(buff, \"base:%.4f,%.4f,%.4f\",local_cart_vel.vx, local_cart_vel.vy, local_cart_vel.va);\n    std::string s = buff;\n\n    if (local_cart_vel.vx != 0 || local_cart_vel.vy != 0 || local_cart_vel.va != 0 )\n    {\n        PLOGD_IF_(MOTION_LOG_ID, log_this_cycle_).printf(\"Sending to motors: [%s]\", s.c_str());\n    }\n\n    if(fake_perfect_motion_)\n    {\n        fake_local_cart_vel_ = local_cart_vel;\n    }\n    else if (serial_to_motor_driver_->isConnected())\n    {\n        serial_to_motor_driver_->send(s);\n    }\n}\n"
  },
  {
    "path": "src/robot/src/RobotController.h",
"content": "#ifndef RobotController_h\n#define RobotController_h\n\n#include \"SmoothTrajectoryGenerator.h\"\n#include \"StatusUpdater.h\"\n#include \"serial/SerialComms.h\"\n#include \"utils.h\"\n#include \"Localization.h\"\n#include \"robot_controller_modes/RobotControllerModeBase.h\"\n\nclass RobotController\n{\n  public:\n\n    // Constructor\n    RobotController(StatusUpdater& statusUpdater);\n\n    // Command robot to move to a specific position with low accuracy\n    void moveToPosition(float x, float y, float a);\n\n    // Command robot to move to a specific position relative to current position with low accuracy\n    void moveToPositionRelative(float dx_local, float dy_local, float da_local);\n\n    // Command robot to move to a specific position relative to current position with low speed\n    void moveToPositionRelativeSlow(float dx_local, float dy_local, float da_local);\n    \n    // Command robot to move to a specific position with high accuracy\n    void moveToPositionFine(float x, float y, float a);\n\n    // Command robot to move with a constant velocity for some amount of time\n    void moveConstVel(float vx , float vy, float va, float t);\n\n    void moveWithVision(float x, float y, float a);\n\n    void stopFast();\n\n    // Main update loop. Should be called as fast as possible\n    void update();\n\n    // Enable all motors at once\n    void enableAllMotors();\n\n    // Disable all motors at once. 
This will cause motors to coast to a stop\n    void disableAllMotors();\n\n    // Provide a position reading from the MarvelMind sensors\n    void inputPosition(float x, float y, float a);\n\n    // Force the position to a specific value, bypassing localization algorithms (used for testing/debugging)\n    void forceSetPosition(float x, float y, float a);\n\n    // Indicates if a trajectory is currently active\n    bool isTrajectoryRunning() { return trajRunning_; };\n\n    // Stops the currently running motion\n    void estop();\n\n    Point getCurrentPosition() { return cartPos_; };\n\n  private:\n\n    //Internal methods\n    // Set the global cartesian velocity command\n    void setCartVelCommand(Velocity target_vel);\n    // Update loop for motor objects\n    void updateMotors();\n    // Calculate wheel odometry\n    void computeOdometry();\n    // Sets up everything to start the trajectory running\n    void startTraj();\n    // Reads an incoming message from the motor driver and fills the decoded\n    // velocity in the pointer, if available. 
Returns true if velocity is filled, false otherwise\n    bool readMsgFromMotorDriver(Velocity* decodedVelocity);\n\n    void setCartVelLimits(LIMITS_MODE limits_mode);\n\n    // Member variables\n    StatusUpdater& statusUpdater_;         // Reference to status updater object to input status info about the controller\n    SerialCommsBase* serial_to_motor_driver_;   // Serial connection to motor driver\n    Localization localization_;            // Object that handles localization\n    Timer prevOdomLoopTimer_;              // Timer for odom loop\n    Point cartPos_;                        // Current cartesian position\n    Velocity cartVel_;                     // Current cartesian velocity\n    bool trajRunning_;                     // If a trajectory is currently active\n    LIMITS_MODE limits_mode_;              // Which limits mode is being used.\n    RateController controller_rate_;       // Rate limit controller loops\n    RateController logging_rate_ ;         // Rate limit logging to file\n    bool log_this_cycle_;                  // Trigger for logging this cycle\n    bool fake_perfect_motion_;             // Flag used for testing to enable perfect motion without clearcore\n    Velocity fake_local_cart_vel_;         // Commanded local cartesian velocity used to fake perfect motion\n    Velocity max_cart_vel_limit_;          // Maximum velocity allowed, used to limit commanded velocity\n\n    TimeRunningAverage loop_time_averager_;        // Handles keeping average of the loop timing\n\n    std::unique_ptr<RobotControllerModeBase> controller_mode_;\n\n};\n\n#endif\n"
  },
  {
    "path": "src/robot/src/RobotServer.cpp",
    "content": "#include \"RobotServer.h\"\n\n#include <ArduinoJson/ArduinoJson.h>\n#include <plog/Log.h>\n\n#include \"sockets/SocketMultiThreadWrapperFactory.h\"\n\n#include <iostream>\n\nRobotServer::RobotServer(StatusUpdater& statusUpdater)\n: moveData_(),\n  positionData_(),\n  velocityData_(),\n  statusUpdater_(statusUpdater),\n  recvInProgress_(false),\n  recvIdx_(0),\n  buffer_(\"\"),\n  socket_(SocketMultiThreadWrapperFactory::getFactoryInstance()->get_socket())\n{\n}\n\nCOMMAND RobotServer::getCommand(std::string message)\n{\n    COMMAND cmd = COMMAND::NONE;\n    StaticJsonDocument<256> doc;\n    DeserializationError err = deserializeJson(doc, message);\n\n    if(err)\n    {\n        printIncomingCommand(message);\n        PLOGI.printf(\"Error parsing JSON: \");\n        PLOGI.printf(err.c_str());   \n        sendErr(\"bad_json\");\n    }\n    else\n    {\n        std::string type = doc[\"type\"];\n        if(type == \"move\")\n        {\n            cmd = COMMAND::MOVE;\n            moveData_.x = doc[\"data\"][\"x\"];\n            moveData_.y = doc[\"data\"][\"y\"];\n            moveData_.a = doc[\"data\"][\"a\"];\n            printIncomingCommand(message);\n            sendAck(type);\n        }\n        else if(type == \"move_rel\")\n        {\n            cmd = COMMAND::MOVE_REL;\n            moveData_.x = doc[\"data\"][\"x\"];\n            moveData_.y = doc[\"data\"][\"y\"];\n            moveData_.a = doc[\"data\"][\"a\"];\n            printIncomingCommand(message);\n            sendAck(type);\n        }\n        else if(type == \"move_rel_slow\")\n        {\n            cmd = COMMAND::MOVE_REL_SLOW;\n            moveData_.x = doc[\"data\"][\"x\"];\n            moveData_.y = doc[\"data\"][\"y\"];\n            moveData_.a = doc[\"data\"][\"a\"];\n            printIncomingCommand(message);\n            sendAck(type);\n        }\n        else if(type == \"move_fine\")\n        {\n            cmd = COMMAND::MOVE_FINE;\n            moveData_.x = 
doc[\"data\"][\"x\"];\n            moveData_.y = doc[\"data\"][\"y\"];\n            moveData_.a = doc[\"data\"][\"a\"];\n            printIncomingCommand(message);\n            sendAck(type);\n        }\n        else if(type == \"move_fine_stop_vision\")\n        {\n            cmd = COMMAND::MOVE_FINE_STOP_VISION;\n            moveData_.x = doc[\"data\"][\"x\"];\n            moveData_.y = doc[\"data\"][\"y\"];\n            moveData_.a = doc[\"data\"][\"a\"];\n            printIncomingCommand(message);\n            sendAck(type);\n        }\n        else if(type == \"move_vision\")\n        {\n            cmd = COMMAND::MOVE_WITH_VISION;\n            moveData_.x = doc[\"data\"][\"x\"];\n            moveData_.y = doc[\"data\"][\"y\"];\n            moveData_.a = doc[\"data\"][\"a\"];\n            printIncomingCommand(message);\n            sendAck(type);\n        }\n        else if(type == \"move_const_vel\")\n        {\n            cmd = COMMAND::MOVE_CONST_VEL;\n            velocityData_.vx = doc[\"data\"][\"vx\"];\n            velocityData_.vy = doc[\"data\"][\"vy\"];\n            velocityData_.va = doc[\"data\"][\"va\"];\n            velocityData_.t = doc[\"data\"][\"t\"];\n            printIncomingCommand(message);\n            sendAck(type);\n        }\n        else if(type == \"place\")\n        {\n            cmd = COMMAND::PLACE_TRAY;\n            printIncomingCommand(message);\n            sendAck(type);\n        }\n        else if(type == \"load\")\n        {\n            cmd = COMMAND::LOAD_TRAY;\n            printIncomingCommand(message);\n            sendAck(type);\n        }\n        else if(type == \"init\")\n        {\n            cmd = COMMAND::INITIALIZE_TRAY;\n            printIncomingCommand(message);\n            sendAck(type);\n        }\n        else if(type == \"p\")\n        {\n            cmd = COMMAND::POSITION;\n            positionData_.x = doc[\"data\"][\"x\"];\n            positionData_.y = doc[\"data\"][\"y\"];\n            
positionData_.a = doc[\"data\"][\"a\"];\n            sendAck(type);\n        }\n        else if(type == \"set_pose\")\n        {\n            cmd = COMMAND::SET_POSE;\n            positionData_.x = doc[\"data\"][\"x\"];\n            positionData_.y = doc[\"data\"][\"y\"];\n            positionData_.a = doc[\"data\"][\"a\"];\n            sendAck(type);\n        }\n        else if(type == \"estop\")\n        {\n            cmd = COMMAND::ESTOP;\n            printIncomingCommand(message);\n            sendAck(type);\n        }\n        else if(type == \"lc\")\n        {\n            cmd = COMMAND::LOAD_COMPLETE;\n            printIncomingCommand(message);\n            sendAck(type);\n        }\n        else if(type == \"status\")\n        {\n            sendStatus();\n        }\n        else if (type == \"check\")\n        {\n            sendAck(type);\n        }\n        else if (type == \"clear_error\")\n        {\n            statusUpdater_.clearErrorStatus();\n            sendAck(type);\n        }\n        else if (type == \"wait_for_loc\")\n        {\n            sendAck(type);\n            cmd = COMMAND::WAIT_FOR_LOCALIZATION;\n        }\n        else if (type == \"toggle_vision_debug\")\n        {\n            sendAck(type);\n            cmd = COMMAND::TOGGLE_VISION_DEBUG;\n        }\n        else if (type == \"start_cameras\")\n        {\n            sendAck(type);\n            cmd = COMMAND::START_CAMERAS;\n        }\n        else if (type == \"stop_cameras\")\n        {\n            sendAck(type);\n            cmd = COMMAND::STOP_CAMERAS;\n        }\n        else if(type == \"\")\n        {\n            printIncomingCommand(message);\n            PLOGI.printf(\"ERROR: Type field empty or not specified\");\n            sendErr(\"no_type\");\n        }\n        else\n        {\n            printIncomingCommand(message);\n            PLOGI.printf(\"ERROR: Unknown type field\");\n            sendErr(\"unkown_type\");\n        }\n    }\n    return cmd;    
\n}\n\nRobotServer::PositionData RobotServer::getMoveData()\n{\n    return moveData_;\n}\n\nRobotServer::PositionData RobotServer::getPositionData()\n{\n    return positionData_;\n}\n\nRobotServer::VelocityData RobotServer::getVelocityData()\n{\n    return velocityData_;\n}\n\nvoid RobotServer::sendStatus()\n{\n    std::string msg = statusUpdater_.getStatusJsonString();\n    sendMsg(msg, false);\n}\n\nCOMMAND RobotServer::oneLoop()\n{\n    COMMAND cmd = COMMAND::NONE;\n    std::string newMsg = getAnyIncomingMessage();\n\n    if(newMsg.length() != 0)\n    {    \n        PLOGD.printf(\"RX: %s\", newMsg.c_str());\n        cmd = getCommand(cleanString(newMsg));\n    }\n    return cmd;\n}\n\nstd::string RobotServer::cleanString(std::string message)\n{\n  int idx_start = message.find(\"{\");\n  int idx_end = message.find(\"}\") + 1;\n  if(idx_start == -1 || idx_end == 0)\n  {\n      PLOGW.printf(\"Could not find brackets in message\");\n      return message;\n  }\n  // idx_end already points one past '}', so the substring length is just the difference\n  int len = idx_end - idx_start;\n  return message.substr(idx_start, len);\n}\n\nstd::string RobotServer::getAnyIncomingMessage()\n{\n    bool newData = false;\n    std::string new_msg = \"\";\n\n    while (socket_->dataAvailableToRead() && newData == false) \n    {\n        std::string data = socket_->getData();\n        for (auto c : data)\n        {\n            if (recvInProgress_ == true) \n            {\n                if (c == START_CHAR)\n                {\n                    buffer_ = \"\";\n                }\n                else if (c != END_CHAR) \n                {\n                    buffer_ += c;\n                }\n                else \n                {\n                    recvInProgress_ = false;\n                    newData = true;\n                    new_msg = buffer_;\n                    buffer_ = \"\";\n                }\n            }\n            else if (c == START_CHAR) \n            {\n                recvInProgress_ = true;\n            }\n        }\n    }\n    return 
new_msg;\n}\n\nvoid RobotServer::sendMsg(std::string msg, bool print_debug)\n{\n    if (msg.length() == 0 && print_debug)\n    {\n      PLOGI.printf(\"Nothing to send!!!\\n\");\n    }\n    else\n    {\n        if(print_debug)\n        {\n            PLOGD.printf(\"TX: %s\", msg.c_str());\n        }\n\n        std::string send_msg = START_CHAR + msg + END_CHAR;\n        socket_->sendData(send_msg);\n    }\n}\n\nvoid RobotServer::printIncomingCommand(std::string message)\n{\n    // Use an explicit format string so '%' characters in the message are printed literally\n    PLOGI.printf(\"%s\", message.c_str());\n}\n\nvoid RobotServer::sendAck(std::string data)\n{\n    StaticJsonDocument<64> doc;\n    doc[\"type\"] = \"ack\";\n    doc[\"data\"] = data;\n    std::string msg;\n    serializeJson(doc, msg);\n    sendMsg(msg);\n}\n\nvoid RobotServer::sendErr(std::string data)\n{\n    StaticJsonDocument<64> doc;\n    doc[\"type\"] = \"err\";\n    doc[\"data\"] = data;\n    std::string msg;\n    serializeJson(doc, msg);\n    sendMsg(msg);\n}"
  },
  {
    "path": "src/robot/src/RobotServer.h",
    "content": "/*\n* Robot server that handles gettings messages from master\n* and responding correctly\n*/\n\n#ifndef RobotServer_h\n#define RobotServer_h\n\n#include <string>\n#include <memory>\n\n#include \"constants.h\"\n#include \"sockets/SocketMultiThreadWrapperBase.h\"\n#include \"StatusUpdater.h\"\n\n#define START_CHAR '<'\n#define END_CHAR '>'\n\nclass RobotServer\n{\n  public:\n\n    struct PositionData\n    {\n      float x;\n      float y;\n      float a;\n    };\n\n    struct VelocityData\n    {\n      float vx;\n      float vy;\n      float va;\n      float t;\n    };\n    \n    RobotServer(StatusUpdater& statusUpdater);\n\n    COMMAND oneLoop();\n\n    RobotServer::PositionData getMoveData();\n\n    RobotServer::PositionData getPositionData();\n\n    RobotServer::VelocityData getVelocityData();\n\n  private:\n    PositionData moveData_;\n    PositionData positionData_;\n    VelocityData velocityData_;\n    StatusUpdater& statusUpdater_;\n\n    bool recvInProgress_;\n    int recvIdx_;\n    std::string buffer_;\n    SocketMultiThreadWrapperBase* socket_;\n\n    COMMAND getCommand(std::string message);\n    void sendMsg(std::string msg, bool print_debug=true);\n    void sendAck(std::string data);\n    void sendErr(std::string data);\n    std::string getAnyIncomingMessage();\n    std::string cleanString(std::string message);\n    void printIncomingCommand(std::string message);\n    void sendStatus();\n\n};\n\n\n\n#endif\n"
  },
  {
    "path": "src/robot/src/SmoothTrajectoryGenerator.cpp",
    "content": "#include \"SmoothTrajectoryGenerator.h\"\n#include <plog/Log.h>\n#include \"constants.h\"\n#include \"utils.h\"\n\nconstexpr float d6 = 1/6.0;\n\nSmoothTrajectoryGenerator::SmoothTrajectoryGenerator()\n  : currentTrajectory_()\n{\n    currentTrajectory_.complete = false;\n    solver_params_.num_loops = cfg.lookup(\"trajectory_generation.solver_max_loops\"); \n    solver_params_.beta_decay = cfg.lookup(\"trajectory_generation.solver_beta_decay\");\n    solver_params_.alpha_decay = cfg.lookup(\"trajectory_generation.solver_alpha_decay\");\n    solver_params_.exponent_decay = cfg.lookup(\"trajectory_generation.solver_exponent_decay\");\n}\n\nPVTPoint SmoothTrajectoryGenerator::lookup(float time)\n{\n    std::vector<float> trans_values = lookup_1D(time, currentTrajectory_.trans_params);\n    std::vector<float> rot_values = lookup_1D(time, currentTrajectory_.rot_params);    \n\n    // Map translational trajectory into XY space with direction vector\n    Eigen::Vector2f trans_pos_delta = trans_values[0] * currentTrajectory_.trans_direction;\n    Eigen::Vector2f trans_vel = trans_values[1] * currentTrajectory_.trans_direction;\n    // Map rotational trajectory into angular space with direction\n    float rot_pos_delta = rot_values[0] * currentTrajectory_.rot_direction;\n    float rot_vel = rot_values[1] * currentTrajectory_.rot_direction;\n\n    // Build and return pvtpoint\n    PVTPoint pvt;\n    pvt.position = {currentTrajectory_.initialPoint.x + trans_pos_delta(0),\n                     currentTrajectory_.initialPoint.y + trans_pos_delta(1),\n                     wrap_angle(currentTrajectory_.initialPoint.a + rot_pos_delta) };\n    pvt.velocity = {trans_vel(0), trans_vel(1), rot_vel};\n    pvt.time = time;\n    return pvt;\n}\n\nstd::vector<float> lookup_1D(float time, const SCurveParameters& params)\n{\n    // Handle time before start of trajectory\n    if(time <= params.switch_points[0].t)\n    {\n        return {params.switch_points[0].p, 
params.switch_points[0].v};\n    }\n    // Handle time after the end of the trajectory\n    else if (time > params.switch_points[7].t)\n    {\n        return {params.switch_points[7].p, params.switch_points[7].v};\n    }\n    // Handle times within the trajectory\n    else\n    {\n        // Look for correct region\n        for (int i = 1; i <= 7; i++)\n        {\n            // Once region is found, compute position and velocity from previous switch point\n            if(params.switch_points[i-1].t < time && time <= params.switch_points[i].t)\n            {\n                float dt = time - params.switch_points[i-1].t;\n                std::vector<float> values = computeKinematicsBasedOnRegion(params, i, dt);\n                return {values[2], values[1]};\n            }\n        }\n    }\n\n    PLOGE << \"Should not get to this point in lookup\";\n    return {0,0};\n}\n\n\nbool SmoothTrajectoryGenerator::generatePointToPointTrajectory(Point initialPoint, Point targetPoint, LIMITS_MODE limits_mode)\n{    \n    // Print to logs\n    PLOGI.printf(\"Generating trajectory\");\n    PLOGI.printf(\"Starting point: %s\", initialPoint.toString().c_str());\n    PLOGI.printf(\"Target point: %s\", targetPoint.toString().c_str());\n    PLOGD_(MOTION_LOG_ID).printf(\"\\nGenerating trajectory\");\n    PLOGD_(MOTION_LOG_ID).printf(\"Starting point: %s\", initialPoint.toString().c_str());\n    PLOGD_(MOTION_LOG_ID).printf(\"Target point: %s\", targetPoint.toString().c_str());\n\n    MotionPlanningProblem mpp = buildMotionPlanningProblem(initialPoint, targetPoint, limits_mode, solver_params_);\n    currentTrajectory_ = generateTrajectory(mpp);\n\n    PLOGI << currentTrajectory_.toString();\n    PLOGD_(MOTION_LOG_ID) << currentTrajectory_.toString();\n\n    return currentTrajectory_.complete;\n}\n\n// TODO Implement a more accurate version of this if needed\n// Note that this implementation is a hack and isn't guaranteed to give an accurate constant velocity - so use with 
caution.\nbool SmoothTrajectoryGenerator::generateConstVelTrajectory(Point initialPoint, Velocity velocity, float moveTime, LIMITS_MODE limits_mode)\n{\n    // Print to logs\n    PLOGI.printf(\"Generating constVel (sort of) trajectory\");\n    PLOGI.printf(\"Starting point: %s\", initialPoint.toString().c_str());\n    PLOGI.printf(\"Target velocity: %s\", velocity.toString().c_str());\n    PLOGI.printf(\"Move time: %.3f\", moveTime);\n    PLOGD_(MOTION_LOG_ID).printf(\"Generating constVel (sort of) trajectory\");\n    PLOGD_(MOTION_LOG_ID).printf(\"Starting point: %s\", initialPoint.toString().c_str());\n    PLOGD_(MOTION_LOG_ID).printf(\"Target velocity: %s\", velocity.toString().c_str());\n    PLOGD_(MOTION_LOG_ID).printf(\"Move time: %.3f\", moveTime);\n\n    // This will undershoot the target velocity because we aren't considering accel/jerk here, so the \n    // solver will not quite reach this velocity - especially if the move time specified is small.\n    Point targetPoint;\n    targetPoint.x = initialPoint.x + velocity.vx * moveTime;\n    targetPoint.y = initialPoint.y + velocity.vy * moveTime;\n    targetPoint.a = initialPoint.a + velocity.va * moveTime;\n\n    MotionPlanningProblem mpp = buildMotionPlanningProblem(initialPoint, targetPoint, limits_mode, solver_params_);\n    currentTrajectory_ = generateTrajectory(mpp);\n\n    PLOGI << currentTrajectory_.toString();\n    PLOGD_(MOTION_LOG_ID) << currentTrajectory_.toString();\n\n    return currentTrajectory_.complete;\n}\n\nMotionPlanningProblem buildMotionPlanningProblem(Point initialPoint, Point targetPoint, LIMITS_MODE limits_mode, const SolverParameters& solver)\n{\n    MotionPlanningProblem mpp;\n    mpp.initialPoint = {initialPoint.x, initialPoint.y, initialPoint.a};\n    mpp.targetPoint = {targetPoint.x, targetPoint.y, targetPoint.a};\n\n    DynamicLimits translationalLimits;\n    DynamicLimits rotationalLimits;\n    if(limits_mode == LIMITS_MODE::VISION)\n    {\n        PLOGI << \"Setting trajectory 
limits mode to LIMITS_MODE::VISION\";\n        translationalLimits = { cfg.lookup(\"motion.translation.max_vel.vision\"), \n                                cfg.lookup(\"motion.translation.max_acc.vision\"), \n                                cfg.lookup(\"motion.translation.max_jerk.vision\")};\n        rotationalLimits = {    cfg.lookup(\"motion.rotation.max_vel.vision\"), \n                                cfg.lookup(\"motion.rotation.max_acc.vision\"), \n                                cfg.lookup(\"motion.rotation.max_jerk.vision\")};\n    }\n    else if(limits_mode == LIMITS_MODE::FINE || limits_mode == LIMITS_MODE::SLOW)\n    {\n        PLOGI << \"Setting trajectory limits mode to LIMITS_MODE::FINE/SLOW\";\n        translationalLimits = { cfg.lookup(\"motion.translation.max_vel.fine\"), \n                                cfg.lookup(\"motion.translation.max_acc.fine\"), \n                                cfg.lookup(\"motion.translation.max_jerk.fine\")};\n        rotationalLimits = {    cfg.lookup(\"motion.rotation.max_vel.fine\"), \n                                cfg.lookup(\"motion.rotation.max_acc.fine\"), \n                                cfg.lookup(\"motion.rotation.max_jerk.fine\")};\n    }\n    else\n    {\n        PLOGI << \"Setting trajectory limits mode to LIMITS_MODE::COARSE\";\n        translationalLimits = { cfg.lookup(\"motion.translation.max_vel.coarse\"), \n                                cfg.lookup(\"motion.translation.max_acc.coarse\"), \n                                cfg.lookup(\"motion.translation.max_jerk.coarse\")};\n        rotationalLimits = {    cfg.lookup(\"motion.rotation.max_vel.coarse\"), \n                                cfg.lookup(\"motion.rotation.max_acc.coarse\"), \n                                cfg.lookup(\"motion.rotation.max_jerk.coarse\")};\n    }\n\n    // This scaling makes sure to give some headroom for the controller to go a bit faster than the planned limits \n    // without actually violating any hard constraints\n    
mpp.translationalLimits = translationalLimits * cfg.lookup(\"motion.limit_max_fraction\");\n    mpp.rotationalLimits = rotationalLimits * cfg.lookup(\"motion.limit_max_fraction\");\n    mpp.solver_params = solver;\n\n    // Return the local directly so copy elision / implicit move can apply\n    return mpp;\n}\n\nbool sCurveWithinLimits(const SCurveParameters& params, const DynamicLimits& limits)\n{\n    return params.v_lim <= limits.max_vel && params.a_lim <= limits.max_acc && params.j_lim <= limits.max_jerk;\n}\n\nTrajectory generateTrajectory(MotionPlanningProblem problem)\n{   \n    // Figure out delta that the trajectory needs to cover\n    Eigen::Vector3f deltaPosition = problem.targetPoint - problem.initialPoint;\n    deltaPosition(2) = wrap_angle(deltaPosition(2));\n\n    // Begin building trajectory object\n    Trajectory traj;\n    traj.complete = false;\n    traj.initialPoint = {problem.initialPoint(0), problem.initialPoint(1), problem.initialPoint(2)};\n    traj.trans_direction = deltaPosition.head(2).normalized();\n    traj.rot_direction = sgn(deltaPosition(2));\n    \n    // Solve translational component\n    float dist = deltaPosition.head(2).norm();\n    SCurveParameters trans_params;\n    bool ok = generateSCurve(dist, problem.translationalLimits, problem.solver_params, &trans_params);\n    if(!ok)\n    {\n        PLOGW << \"Failed to generate translational trajectory\";\n        return traj;\n    }\n\n    // Solve rotational component\n    SCurveParameters rot_params;\n    dist = fabs(deltaPosition(2));\n    ok = generateSCurve(dist, problem.rotationalLimits, problem.solver_params, &rot_params);\n    if(!ok)\n    {\n        PLOGW << \"Failed to generate rotational trajectory\";\n        return traj;\n    }\n\n    // Synchronize trajectories so they both start and end at the same time\n    ok = synchronizeParameters(&trans_params, &rot_params);\n    if (!ok)\n    {\n        PLOGW << \"Failed to synchronize trajectory parameters\";\n        return traj; \n    }\n\n    if(!sCurveWithinLimits(trans_params, 
problem.translationalLimits))\n    {\n        PLOGW << \"Generated translational trajectory violates limits\";\n        return traj; \n    }\n    if(!sCurveWithinLimits(rot_params, problem.rotationalLimits))\n    {\n        PLOGW << \"Generated rotational trajectory violates limits\";\n        return traj; \n    }\n\n    traj.trans_params = trans_params;\n    traj.rot_params = rot_params;\n    traj.complete = true;\n\n    return traj;\n}\n\nbool generateSCurve(float dist, DynamicLimits limits, const SolverParameters& solver, SCurveParameters* params)\n{\n    // Handle case where distance is very close to 0\n    float min_dist = cfg.lookup(\"trajectory_generation.min_dist_limit\"); \n    if (fabs(dist) < min_dist)\n    {\n        params->v_lim = 0;\n        params->a_lim = 0;\n        params->j_lim = 0;\n        for (int i = 0; i < 8; i++)\n        {\n            params->switch_points[i].t = 0;\n            params->switch_points[i].p = 0;\n            params->switch_points[i].v = 0;\n            params->switch_points[i].a = 0;\n        }\n        return true;\n    }\n    \n    // Initialize parameters\n    float v_lim = limits.max_vel;\n    float a_lim = limits.max_acc;\n    float j_lim = limits.max_jerk;\n\n    bool solution_found = false;\n    int loop_counter = 0;\n    while(!solution_found && loop_counter < solver.num_loops)\n    {\n        loop_counter++;\n        PLOGI << \"Trajectory generation loop \" << loop_counter;\n\n        // Constant jerk region\n        float dt_j = a_lim / j_lim;\n        float dv_j = 0.5 * j_lim * std::pow(dt_j, 2);\n        float dp_j1 = d6 * j_lim * std::pow(dt_j, 3);\n        float dp_j2 = (v_lim - dv_j) * dt_j + 0.5 * a_lim * std::pow(dt_j, 2) - d6 * j_lim * std::pow(dt_j, 3);\n\n        // Constant accel region\n        float dt_a = (v_lim - 2 * dv_j) / a_lim;\n        if (dt_a <= 0)\n        {\n            // If dt_a is negative, it means we couldn't find a solution\n            // so adjust accel parameter and try loop 
again\n            a_lim *= std::pow(solver.beta_decay, 1 + solver.exponent_decay * loop_counter);\n            PLOGI << \"dt_a: \" << dt_a << \", trying new accel limit: \" << a_lim;\n            continue;\n        }\n        float dp_a = dv_j * dt_a + 0.5 * a_lim * std::pow(dt_a, 2);\n\n        // Constant velocity region\n        float dt_v = (dist - 2 * dp_j1 - 2 * dp_j2 - 2 * dp_a) / v_lim;\n        if (dt_v <= 0)\n        {\n            // If dt_v is not positive, it means we couldn't find a solution\n            // so adjust velocity parameter and try loop again\n            v_lim *= std::pow(solver.alpha_decay, 1 + solver.exponent_decay * loop_counter);\n            PLOGI << \"dt_v: \" << dt_v << \", trying new velocity limit: \" << v_lim;\n            continue;\n        }\n\n        // If we get here, it means we found a valid solution and can populate the rest of the \n        // switch time parameters\n        solution_found = true;\n        params->v_lim = v_lim;\n        params->a_lim = a_lim;\n        params->j_lim = j_lim;\n        PLOGI << \"Trajectory solution found\";\n        populateSwitchTimeParameters(params, dt_j, dt_a, dt_v);\n    }\n\n    return solution_found;\n}\n\nvoid populateSwitchTimeParameters(SCurveParameters* params, float dt_j, float dt_a, float dt_v)\n{\n    // Fill first point with all zeros\n    params->switch_points[0].t = 0;\n    params->switch_points[0].p = 0;\n    params->switch_points[0].v = 0;\n    params->switch_points[0].a = 0;\n\n    for (int i = 1; i < 8; i++)\n    {\n        float dt;\n        // Constant jerk regions\n        if (i == 1 || i == 3 || i == 5 || i == 7)\n        {\n            dt = dt_j;\n        }\n        // Constant acceleration regions\n        else if (i == 2 || i == 6)\n        {\n            dt = dt_a;\n        }\n        // Constant velocity region\n        else\n        {\n            dt = dt_v;\n        }\n\n        // Populate values\n        std::vector<float> values = 
computeKinematicsBasedOnRegion(*params, i, dt);\n        params->switch_points[i].a = values[0];\n        params->switch_points[i].v = values[1];\n        params->switch_points[i].p = values[2];\n        params->switch_points[i].t = params->switch_points[i-1].t + dt;\n    }\n}\n\nbool synchronizeParameters(SCurveParameters* params1, SCurveParameters* params2)\n{\n    // Figure out which parameter set is the faster one\n    float end_time_1 = params1->switch_points[7].t;\n    float end_time_2 = params2->switch_points[7].t;\n\n    // Slow down whichever one is faster to synchronize them\n    bool ok = false;\n    if(end_time_1 > end_time_2)\n    {\n        ok = slowDownParamsToMatchTime(params2, end_time_1);\n    }\n    else \n    {\n        ok = slowDownParamsToMatchTime(params1, end_time_2);\n    }\n\n    return ok;\n}\n\nbool slowDownParamsToMatchTime(SCurveParameters* params, float time_to_match)\n{\n    // Compute time delta needed to adjust by \n    float end_time = params->switch_points[7].t;\n    float dt = time_to_match - end_time;\n\n    // Do a simple adjustment by adding an equal amount of time to each region,\n    // so switch point i shifts by the cumulative amount i * dt_per_region\n    float dt_per_region = dt / 7.0;\n    for (int i = 1; i < 8; i++)\n    {\n        params->switch_points[i].t += i * dt_per_region;\n    }\n\n    // Then recompute limits based on updated times\n    return solveInverse(params);\n}\n\nbool solveInverse(SCurveParameters* params)\n{\n    // Gather parameters needed\n    const float dt_j = params->switch_points[1].t - params->switch_points[0].t;\n    const float dt_a = params->switch_points[2].t - params->switch_points[1].t;\n    const float dt_v = params->switch_points[4].t - params->switch_points[3].t;\n    const float deltaPosition = params->switch_points[7].p;\n\n    // Build linear system\n    Eigen::Matrix3f A;\n    Eigen::Vector3f b;\n    A << dt_j,                                     -1,                                     0    ,\n         std::pow(dt_j, 2),                        dt_a         
           ,              -1    ,\n         std::pow(dt_j, 2) * (dt_a - dt_j),  std::pow(dt_j, 2) + std::pow(dt_a, 2),  dt_v + 2 * dt_j;\n    b << 0, 0, deltaPosition;\n\n    // Solve system and check results\n    // Eigen::Vector3f lims = A.fullPivHouseholderQr().solve(b);\n    Eigen::Vector3f lims = A.bdcSvd(Eigen::ComputeFullU | Eigen::ComputeFullV).solve(b);\n    float relative_error = (A*lims - b).norm() / b.norm();\n    if (relative_error > 1e-5)\n    {\n        PLOGW << \"Could not find feasible inverse parameter mapping\";\n        return false;\n    }\n\n    params->j_lim = lims(0);\n    params->a_lim = lims(1);\n    params->v_lim = lims(2);\n    populateSwitchTimeParameters(params, dt_j, dt_a, dt_v);   \n    return true;\n}\n\nstd::vector<float> computeKinematicsBasedOnRegion(const SCurveParameters& params, int region, float dt)\n{\n    float j, a, v, p;\n    bool need_a = true;\n    bool need_v = true;\n\n    // Positive jerk\n    if (region == 1 || region == 7) \n    { \n        j = params.j_lim; \n    }\n    // Negative jerk\n    else if (region == 3 || region == 5) \n    { \n        j = -1 * params.j_lim; \n    }\n    // Constant positive acceleration\n    else if (region == 2)\n    {\n        j = 0;\n        a = params.a_lim;\n        need_a = false;        \n    }\n    // Constant negative acceleration\n    else if (region == 6)\n    {\n        j = 0;\n        a = -1*params.a_lim;\n        need_a = false; \n    }\n    // Constant velocity region\n    else if (region == 4)\n    {\n        j = 0;\n        a = 0;\n        v = params.v_lim;\n        need_a = false; \n        need_v = false; \n    }\n    else\n    {\n        // Error - return zeros matching the normal {a, v, p} shape\n        PLOGE << \"Invalid region value: \" << region;\n        return {0, 0, 0};\n    }\n\n    // Compute remaining values\n    if (need_a)\n    {\n        a = params.switch_points[region-1].a + j * dt;\n    }\n\n    if (need_v)\n    {\n        v = params.switch_points[region-1].v + \n            
params.switch_points[region-1].a * dt + \n            0.5 * j * std::pow(dt, 2);\n    }\n\n    p = params.switch_points[region-1].p + \n        params.switch_points[region-1].v * dt + \n        0.5 * params.switch_points[region-1].a * std::pow(dt, 2) + \n        d6 * j * std::pow(dt, 3);\n\n    return {a,v,p};\n}"
  },
  {
    "path": "src/robot/src/SmoothTrajectoryGenerator.h",
    "content": "#ifndef SmoothTrajectoryGenerator_h\n#define SmoothTrajectoryGenerator_h\n\n#include <Eigen/Dense>\n#include \"utils.h\"\n\n\n// Return structure for a trajectory point lookup that contains all the info about a point in time the controller\n// needs to drive the robot\nstruct PVTPoint\n{\n    Point position;\n    Velocity velocity;\n    float time;\n\n    std::string toString() const\n    {\n      char s[200];\n      sprintf(s, \"[Position: %s, Velocity: %s, T: %.3f]\", position.toString().c_str(), velocity.toString().c_str(), time);\n      return static_cast<std::string>(s);\n    }\n\n};\n\n// Contains info about the maximum dynamic limits of a trajectory\nstruct DynamicLimits\n{\n    float max_vel;\n    float max_acc;\n    float max_jerk;\n\n    DynamicLimits operator* (float c)\n    {\n        DynamicLimits rtn;\n        rtn.max_vel = c * max_vel;\n        rtn.max_acc = c * max_acc;\n        rtn.max_jerk = c * max_jerk;\n        return rtn;\n    }\n};\n\n// A fully defined point for switching from one region of the trajectory\n// to another - needed for efficient lookup without building a huge table\nstruct SwitchPoint\n{\n    float t;\n    float p;\n    float v;\n    float a;\n};\n\n// Parameters defining a 1-D S-curve trajectory\nstruct SCurveParameters\n{\n    float v_lim;\n    float a_lim;\n    float j_lim;\n    SwitchPoint switch_points[8];\n\n    std::string toString() const\n    {\n      char s[256];\n      sprintf(s, \"    Limits: [v: %.3f, a: %.3f, j: %.3f]\\n    Switch times: [%.3f, %.3f, %.3f, %.3f, %.3f, %.3f, %.3f, %.3f]\", v_lim, a_lim, j_lim,\n      switch_points[0].t,switch_points[1].t,switch_points[2].t,switch_points[3].t,switch_points[4].t,switch_points[5].t,switch_points[6].t,switch_points[7].t);\n      return static_cast<std::string>(s);\n    }\n};\n\n// Everything needed to define a point to point s-curve trajectory in X, Y, and angle\nstruct Trajectory\n{\n    Eigen::Vector2f trans_direction;\n    int rot_direction;\n    
Point initialPoint;\n    SCurveParameters trans_params;\n    SCurveParameters rot_params;\n    bool complete;\n\n    std::string toString() const\n    {\n      char s[1000];\n      sprintf(s, \"Trajectory Parameters:\\nTranslation:\\n  Direction: [%.2f, %.2f]\\n  S-Curve:\\n%s\\nRotation:\\n  Direction: %i\\n  S-Curve:\\n%s\\n\", \n        trans_direction[0], trans_direction[1], trans_params.toString().c_str(), rot_direction, rot_params.toString().c_str());\n      return static_cast<std::string>(s);\n    }\n};\n\nstruct SolverParameters\n{\n    int num_loops;\n    float alpha_decay;\n    float beta_decay;\n    float exponent_decay;\n};\n\nenum class LIMITS_MODE\n{\n    COARSE,\n    FINE,\n    VISION,\n    SLOW,\n};\n\n// All the pieces needed to define the motion planning problem\nstruct MotionPlanningProblem\n{\n    Eigen::Vector3f initialPoint;\n    Eigen::Vector3f targetPoint;\n    DynamicLimits translationalLimits;\n    DynamicLimits rotationalLimits;  \n    SolverParameters solver_params;\n};\n\n// Helper methods - making public for easier testing\nMotionPlanningProblem buildMotionPlanningProblem(Point initialPoint, Point targetPoint, LIMITS_MODE limits_mode, const SolverParameters& solver);\nTrajectory generateTrajectory(MotionPlanningProblem problem);\nbool generateSCurve(float dist, DynamicLimits limits, const SolverParameters& solver, SCurveParameters* params);\nvoid populateSwitchTimeParameters(SCurveParameters* params, float dt_j, float dt_a, float dt_v);\nbool synchronizeParameters(SCurveParameters* params1, SCurveParameters* params2);\nbool slowDownParamsToMatchTime(SCurveParameters* params, float time_to_match);\nbool solveInverse(SCurveParameters* params);\nstd::vector<float> lookup_1D(float time, const SCurveParameters& params);\nstd::vector<float> computeKinematicsBasedOnRegion(const SCurveParameters& params, int region, float dt);\n\n\nclass SmoothTrajectoryGenerator\n{\n\n  public:\n    SmoothTrajectoryGenerator();\n\n    // Generates a 
trajectory that starts at the initial point and ends at the target point. The limits_mode argument\n    // adjusts the dynamic limits for more accurate motion. Returns a bool indicating if trajectory generation was\n    // successful\n    bool generatePointToPointTrajectory(Point initialPoint, Point targetPoint, LIMITS_MODE limits_mode);\n\n    // Generates a trajectory that attempts to maintain the target velocity for a specified time. Note that the current implementation\n    // of this does not give a guarantee on the accuracy of the velocity if the specified velocity and move time would violate the dynamic \n    // limits of the fine or coarse movement mode. Returns a bool indicating if trajectory generation was successful\n    bool generateConstVelTrajectory(Point initialPoint, Velocity velocity, float moveTime, LIMITS_MODE limits_mode);\n\n    // Looks up a point in the current trajectory based on the time, in seconds, from the start of the trajectory\n    PVTPoint lookup(float time);\n\n  private:\n\n    // The current trajectory - this lets the generation class hold onto this and just provide a lookup method\n    // since I don't have a need to pass the trajectory around anywhere\n    Trajectory currentTrajectory_;\n    \n    // These need to be part of the class because they need to be loaded at construction time, not\n    // program initialization time (i.e. as globals). This is because the config file is not\n    // yet loaded at program start up time.\n    SolverParameters solver_params_;\n};\n\n#endif"
  },
  {
    "path": "src/robot/src/StatusUpdater.cpp",
    "content": "#include \"StatusUpdater.h\"\n#include \"constants.h\"\n\nStatusUpdater::StatusUpdater() :\n  currentStatus_()\n{\n}\n\nstd::string StatusUpdater::getStatusJsonString() \n{\n    // Return the status string\n    return currentStatus_.toJsonString();\n}\n\nvoid StatusUpdater::updatePosition(float x, float y, float a)\n{\n    currentStatus_.pos_x = x;\n    currentStatus_.pos_y = y;\n    currentStatus_.pos_a = a;\n}\n\nvoid StatusUpdater::updateVelocity(float vx, float vy, float va)\n{\n    currentStatus_.vel_x = vx;\n    currentStatus_.vel_y = vy;\n    currentStatus_.vel_a = va;\n}\n\nvoid StatusUpdater::updateInProgress(bool in_progress)\n{\n  currentStatus_.in_progress = in_progress;\n}\n\nvoid StatusUpdater::updateControlLoopTime(int controller_loop_ms)\n{\n    currentStatus_.controller_loop_ms = controller_loop_ms;\n}\n\nvoid StatusUpdater::updatePositionLoopTime(int position_loop_ms)\n{\n    currentStatus_.position_loop_ms = position_loop_ms;\n}\n\nvoid StatusUpdater::updateLocalizationMetrics(LocalizationMetrics localization_metrics)\n{\n    currentStatus_.localization_metrics = localization_metrics;\n}\n\nvoid StatusUpdater::update_motor_driver_connected(bool connected)\n{\n  currentStatus_.motor_driver_connected = connected;\n}\n\nvoid StatusUpdater::update_lifter_driver_connected(bool connected)\n{\n  currentStatus_.lifter_driver_connected = connected;\n}\n\nvoid StatusUpdater::updateVisionControllerPose(Point pose)\n{\n  currentStatus_.vision_x = pose.x;\n  currentStatus_.vision_y = pose.y;\n  currentStatus_.vision_a = pose.a;\n}\n\nvoid StatusUpdater::updateLastMarvelmindPose(Point pose, bool pose_used)\n{\n  currentStatus_.last_mm_x = pose.x;\n  currentStatus_.last_mm_y = pose.y;\n  currentStatus_.last_mm_a = pose.a;\n  currentStatus_.last_mm_used = pose_used;\n}\n"
  },
  {
    "path": "src/robot/src/StatusUpdater.h",
    "content": "\n#ifndef StatusUpdater_h\n#define StatusUpdater_h\n\n#include <ArduinoJson/ArduinoJson.h>\n#include \"utils.h\"\n\nclass StatusUpdater\n{\n  public:\n    StatusUpdater();\n\n    std::string getStatusJsonString();\n\n    void updatePosition(float x, float y, float a);\n\n    void updateVelocity(float vx, float vy, float va);\n\n    void updateControlLoopTime(int controller_loop_ms);\n\n    void updatePositionLoopTime(int position_loop_ms);\n\n    void updateInProgress(bool in_progress);\n\n    bool getInProgress() const { return currentStatus_.in_progress; };\n    \n    void setErrorStatus() { currentStatus_.error_status = true;}\n\n    void clearErrorStatus() {currentStatus_.error_status = false;};\n\n    bool getErrorStatus() const {return currentStatus_.error_status;}\n\n    void updateLocalizationMetrics(LocalizationMetrics localization_metrics);\n\n    float getLocalizationConfidence() const {return currentStatus_.localization_metrics.total_confidence;};\n\n    void update_motor_driver_connected(bool connected);\n\n    void update_lifter_driver_connected(bool connected);\n\n    void updateCameraDebug(CameraDebug camera_debug) {currentStatus_.camera_debug = camera_debug;};\n\n    void updateVisionControllerPose(Point pose);\n    \n    void updateLastMarvelmindPose(Point pose, bool pose_used);\n\n    struct Status\n    {\n      // Current position and velocity\n      float pos_x;\n      float pos_y;\n      float pos_a;\n      float vel_x;\n      float vel_y;\n      float vel_a;\n\n      // Vision tracker pose\n      float vision_x;\n      float vision_y;\n      float vision_a;\n\n      // Last pose from mm\n      float last_mm_x;\n      float last_mm_y;\n      float last_mm_a;\n      bool last_mm_used;\n\n      // Loop times\n      int controller_loop_ms;\n      int position_loop_ms;\n\n      bool in_progress;\n      bool error_status;\n      uint8_t counter; // Just to show that the status is updating. 
Okay to roll over\n\n      bool motor_driver_connected;\n      bool lifter_driver_connected;\n\n      LocalizationMetrics localization_metrics;\n      CameraDebug camera_debug;\n\n      //When adding extra fields, update toJsonString method to serialize and add additional capacity\n\n      Status():\n      pos_x(0.0),\n      pos_y(0.0),\n      pos_a(0.0),\n      vel_x(0.0),\n      vel_y(0.0),\n      vel_a(0.0),\n      vision_x(0.0),\n      vision_y(0.0),\n      vision_a(0.0),\n      last_mm_x(0.0),\n      last_mm_y(0.0),\n      last_mm_a(0.0),\n      last_mm_used(false),\n      controller_loop_ms(999),\n      position_loop_ms(999),\n      in_progress(false),\n      error_status(false),\n      counter(0),\n      motor_driver_connected(false),\n      lifter_driver_connected(false),\n      localization_metrics(),\n      camera_debug()\n      {\n      }\n\n      std::string toJsonString()\n      {\n        // Size the object correctly\n        const size_t capacity = JSON_OBJECT_SIZE(50); // Update when adding new fields\n        DynamicJsonDocument root(capacity);\n\n        // Format to match messages sent by server\n        root[\"type\"] = \"status\";\n        JsonObject doc = root.createNestedObject(\"data\");\n\n        // Fill in data\n        doc[\"pos_x\"] = pos_x;\n        doc[\"pos_y\"] = pos_y;\n        doc[\"pos_a\"] = pos_a;\n        doc[\"vel_x\"] = vel_x;\n        doc[\"vel_y\"] = vel_y;\n        doc[\"vel_a\"] = vel_a;\n        doc[\"controller_loop_ms\"] = controller_loop_ms;\n        doc[\"position_loop_ms\"] = position_loop_ms;\n        doc[\"in_progress\"] = in_progress;\n        doc[\"error_status\"] = error_status;\n        doc[\"counter\"] = counter++;\n        doc[\"motor_driver_connected\"] = motor_driver_connected;\n        doc[\"lifter_driver_connected\"] = lifter_driver_connected;\n        doc[\"localization_confidence_x\"] = localization_metrics.confidence_x;\n        doc[\"localization_confidence_y\"] = 
localization_metrics.confidence_y;\n        doc[\"localization_confidence_a\"] = localization_metrics.confidence_a;\n        doc[\"localization_total_confidence\"] = localization_metrics.total_confidence;\n        doc[\"last_position_uncertainty\"] = localization_metrics.last_position_uncertainty;\n        doc[\"cam_side_ok\"] = camera_debug.side_ok;\n        doc[\"cam_rear_ok\"] = camera_debug.rear_ok;\n        doc[\"cam_both_ok\"] = camera_debug.both_ok;\n        doc[\"cam_side_u\"] = camera_debug.side_u;\n        doc[\"cam_side_v\"] = camera_debug.side_v;\n        doc[\"cam_rear_u\"] = camera_debug.rear_u;\n        doc[\"cam_rear_v\"] = camera_debug.rear_v;\n        doc[\"cam_side_x\"] = camera_debug.side_x;\n        doc[\"cam_side_y\"] = camera_debug.side_y;\n        doc[\"cam_rear_x\"] = camera_debug.rear_x;\n        doc[\"cam_rear_y\"] = camera_debug.rear_y;\n        doc[\"cam_pose_x\"] = camera_debug.pose_x;\n        doc[\"cam_pose_y\"] = camera_debug.pose_y;\n        doc[\"cam_pose_a\"] = camera_debug.pose_a;\n        doc[\"cam_loop_ms\"] = camera_debug.loop_ms;\n        doc[\"vision_x\"] = vision_x;\n        doc[\"vision_y\"] = vision_y;\n        doc[\"vision_a\"] = vision_a;\n        doc[\"last_mm_x\"] = last_mm_x;\n        doc[\"last_mm_y\"] = last_mm_y;\n        doc[\"last_mm_a\"] = last_mm_a;\n        doc[\"last_mm_used\"] = last_mm_used;\n\n        // Serialize and return string\n        std::string msg;\n        serializeJson(root, msg);\n        return msg;\n      }\n    };\n\n    Status getStatus() { return currentStatus_;};\n\n  private:\n    Status currentStatus_;\n\n};\n\n#endif\n"
  },
  {
    "path": "src/robot/src/TrayController.cpp",
    "content": "#include \"TrayController.h\"\n#include \"constants.h\"\n#include <plog/Log.h>\n#include \"serial/SerialCommsFactory.h\"\n\nenum class LifterPosType\n{\n    DEFAULT,\n    PLACE,\n    LOAD,\n};\n\nint getPos(LifterPosType pos_type)\n{\n    float revs = 0.0;\n    switch(pos_type)\n    {\n        case LifterPosType::DEFAULT:\n            revs = cfg.lookup(\"tray.default_pos_revs\");\n            break;\n        case LifterPosType::PLACE:\n            revs = cfg.lookup(\"tray.place_pos_revs\");\n            break;\n        case LifterPosType::LOAD:\n            revs = cfg.lookup(\"tray.load_pos_revs\");\n            break;\n    }\n    int steps_per_rev = cfg.lookup(\"tray.steps_per_rev\");\n    float steps = steps_per_rev * revs;\n    return static_cast<int>(steps);\n}\n\nTrayController::TrayController()\n: serial_to_lifter_driver_(SerialCommsFactory::getFactoryInstance()->get_serial_comms(CLEARCORE_USB)),\n  action_step_running_(false),\n  load_complete_(false),\n  action_step_(0),\n  fake_tray_motion_(cfg.lookup(\"tray.fake_tray_motions\")),\n  cur_action_(ACTION::NONE),\n  controller_rate_(cfg.lookup(\"tray.controller_frequency\")),\n  is_initialized_(false)\n{\n    if(fake_tray_motion_) PLOGW << \"Fake tray motion enabled\";\n}\n\nvoid TrayController::initialize()\n{\n    cur_action_ = ACTION::INITIALIZE;\n    action_step_ = 0;\n    PLOGI << \"Starting tray action INITIALIZE\";\n}\n\nbool TrayController::place()\n{\n    if(!is_initialized_ && !fake_tray_motion_) \n    {\n        PLOGE << \"Tray is not initialized, aborting action\";\n        return false;\n    }\n    \n    cur_action_ = ACTION::PLACE;\n    action_step_ = 0;\n    PLOGI << \"Starting tray action PLACE\";\n    return true;\n}\n\nbool TrayController::load()\n{\n    if(!is_initialized_ && !fake_tray_motion_) \n    {\n        PLOGE << \"Tray is not initialized, aborting action\";\n        return false;\n    }\n    \n    cur_action_ = ACTION::LOAD;\n    action_step_ = 0;\n    PLOGI << 
\"Starting tray action LOAD\";\n    return true;\n}\n\nvoid TrayController::estop()\n{\n    cur_action_ = ACTION::NONE;\n    action_step_ = 0;\n    action_step_running_ = false;\n    PLOGW << \"Estopping tray control\";\n    if (serial_to_lifter_driver_->isConnected())\n    {\n        serial_to_lifter_driver_->send(\"lift:stop\");\n    }\n}\n\nvoid TrayController::setLoadComplete()\n{\n    if(cur_action_ == ACTION::LOAD && action_step_ == 1)\n    {\n        load_complete_ = true;\n    } \n    else \n    {\n        PLOGW << \"Received LOAD_COMPLETE signal at incorrect time. Ignoring.\";\n    }\n}\n\n\nvoid TrayController::update()\n{\n    if (!controller_rate_.ready()) { return; }\n    switch (cur_action_)\n    {\n        case ACTION::INITIALIZE:\n            updateInitialize();\n            break;\n        case ACTION::LOAD:\n            updateLoad();\n            break;\n        case ACTION::PLACE:\n            updatePlace();\n            break;\n        default:\n            // Just to make sure a serial buffer doesn't fill up somewhere\n            std::string msg = \"\";\n            msg = serial_to_lifter_driver_->rcv_lift();\n            (void) msg;\n            break;\n    }\n}\n\nvoid TrayController::runStepAndWaitForCompletion(std::string data, std::string debug_print)\n{\n    if(!action_step_running_)\n    {\n        if (serial_to_lifter_driver_->isConnected())\n        {\n            serial_to_lifter_driver_->send(data);\n        }\n        action_step_running_ = true;\n        PLOGI << debug_print;\n        action_timer_.reset();\n    }\n    else\n    {\n        // Request status and wait for command to complete\n        std::string msg = \"\";\n        if (serial_to_lifter_driver_->isConnected())\n        {\n            serial_to_lifter_driver_->send(\"lift:status_req\");\n            msg = serial_to_lifter_driver_->rcv_lift();\n        }\n        // Initial delay for action\n        if(action_timer_.dt_ms() > 1000 && (msg == \"none\" || 
fake_tray_motion_)) {\n            action_step_running_ = false;\n            action_step_++;\n        }\n    }\n}\n\nvoid TrayController::updateInitialize()\n{\n    /*\n    Sequence:\n    0 - Close latch\n    1 - Do tray init\n    2 - Move to default location\n    */\n\n    // 0 - Close latch\n    if(action_step_ == 0)\n    {\n        std::string data = \"lift:close\";\n        std::string debug = \"Closing latch\";\n        runStepAndWaitForCompletion(data, debug);\n    }\n    // 1 - Do tray init\n    if(action_step_ == 1)\n    {\n        std::string data = \"lift:home\";\n        std::string debug = \"Homing tray\";\n        runStepAndWaitForCompletion(data, debug);\n    }\n    // 2 - Move to default location\n    if(action_step_ == 2)\n    {\n        int pos = getPos(LifterPosType::DEFAULT);\n        std::string data = \"lift:pos:\" + std::to_string(pos);\n        std::string debug = \"Moving tray to default position\";\n        runStepAndWaitForCompletion(data, debug);\n    }\n    // 3 - Done with action\n    if(action_step_ == 3)\n    {\n        cur_action_ = ACTION::NONE;\n        PLOGI << \"Done with initialization\";\n        is_initialized_ = true;\n    }\n}\n\nvoid TrayController::updatePlace()\n{\n    /*\n    Sequence:\n    0 - Move tray to place\n    1 - Open latch\n    2 - Move tray to default\n    3 - Close latch\n    */\n\n    // 0 - Move tray to place\n    if(action_step_ == 0)\n    {\n        int pos = getPos(LifterPosType::PLACE);\n        std::string data = \"lift:pos:\" + std::to_string(pos);\n        std::string debug = \"Moving tray to placement position\";\n        runStepAndWaitForCompletion(data, debug);\n    }\n    // 1 - Open latch\n    if(action_step_ == 1)\n    {\n        std::string data = \"lift:open\";\n        std::string debug = \"Opening latch\";\n        runStepAndWaitForCompletion(data, debug);\n    }\n    // 2 - Move tray to default\n    if(action_step_ == 2)\n    {\n        
std::string data = \"lift:pos:\" + std::to_string(pos);\n        std::string debug = \"Moving tray to default position\";\n        runStepAndWaitForCompletion(data, debug);\n    }\n    // 3 - Close latch\n    if(action_step_ == 3)\n    {\n        std::string data = \"lift:close\";\n        std::string debug = \"Closing latch\";\n        runStepAndWaitForCompletion(data, debug);\n    }\n    // 4 - Done with action\n    if(action_step_ == 4)\n    {\n        cur_action_ = ACTION::NONE;\n        PLOGI << \"Done with place\";\n    }\n}\n\nvoid TrayController::updateLoad()\n{\n    /*\n    Sequence:\n    0 - Move tray to load\n    1 - Wait for load complete signal\n    2 - Move to default\n    */\n\n    // 0 - Move tray to load\n    if(action_step_ == 0)\n    {\n        int pos = getPos(LifterPosType::LOAD);\n        std::string data = \"lift:pos:\" + std::to_string(pos);\n        std::string debug = \"Moving tray to load position\";\n        runStepAndWaitForCompletion(data, debug);\n    }\n    // 1 - Wait for load complete signal\n    if(action_step_ == 1)\n    {\n        if(!action_step_running_) \n        {\n            action_step_running_ = true;\n            PLOGI << \"Waiting for load complete\";\n        } \n        else\n        {\n            if(load_complete_)\n            {\n                action_step_++;\n                PLOGI << \"Got load complete signal\";\n                load_complete_ = false;\n                action_step_running_ = false;\n            }\n        }\n    }\n    // 2 - Move to default\n    if(action_step_ == 2)\n    {\n        int pos = getPos(LifterPosType::DEFAULT);\n        std::string data = \"lift:pos:\" + std::to_string(pos);\n        std::string debug = \"Moving tray to default position\";\n        runStepAndWaitForCompletion(data, debug);\n    }\n    // 3 - Done with action\n    if(action_step_ == 3)\n    {\n        cur_action_ = ACTION::NONE;\n        PLOGI << \"Done with load\";\n    }\n\n}"
  },
  {
    "path": "src/robot/src/TrayController.h",
    "content": "#ifndef TrayController_h\n#define TrayController_h\n\n#include \"serial/SerialComms.h\"\n#include \"utils.h\"\n\nclass TrayController\n{\n  public:\n\n    TrayController();\n\n    void initialize();\n\n    // Returns bool indicating if action started successfully\n    bool place();\n\n    // Returns bool indicating if action started successfully\n    bool load();\n\n    bool isActionRunning() {return cur_action_ != ACTION::NONE;}\n\n    void update();\n\n    void estop();\n\n    void setLoadComplete();\n\n    // Used for testing\n    void setTrayInitialized(bool value) {is_initialized_ = value;};\n\n  private:\n\n    enum ACTION\n    {\n        NONE,\n        INITIALIZE,\n        PLACE,\n        LOAD,\n    };\n\n    SerialCommsBase* serial_to_lifter_driver_;   // Serial connection to lifter driver\n    bool action_step_running_;\n    bool load_complete_;\n    int action_step_;\n    bool fake_tray_motion_;\n    ACTION cur_action_;\n    RateController controller_rate_;       // Rate limit controller loop\n    Timer action_timer_;\n    bool is_initialized_;\n\n    void runStepAndWaitForCompletion(std::string data, std::string debug_print);\n    void updateInitialize();\n    void updateLoad();\n    void updatePlace();\n\n\n};\n\n#endif"
  },
  {
    "path": "src/robot/src/camera_tracker/CameraPipeline.cpp",
    "content": "#include \"CameraPipeline.h\"\n\n#include <plog/Log.h>\n#include \"constants.h\"\n#include <mutex>\n\nstd::mutex data_mutex;\n\ncv::Point2f getBestKeypoint(std::vector<cv::KeyPoint> keypoints)\n{\n    if(keypoints.empty())\n    {\n        return {0,0};\n    }\n    \n    // Get keypoint with largest area\n    cv::KeyPoint best_keypoint = keypoints.front();\n    for (const auto& k : keypoints) {\n        // PLOGI << \"Keypoint: \" << k.class_id;\n        // PLOGI << \"Point: \" << k.pt;\n        // PLOGI << \"Angle: \" << k.angle;\n        // PLOGI << \"Size: \" << k.size;\n        if(k.size > best_keypoint.size)\n        {\n            best_keypoint = k;\n        }\n    }\n    return best_keypoint.pt;\n}\n\nCameraPipeline::CameraPipeline(CAMERA_ID id, bool start_thread)\n: use_debug_image_(cfg.lookup(\"vision_tracker.debug.use_debug_image\")),\n  output_debug_images_(cfg.lookup(\"vision_tracker.debug.save_camera_debug\")),\n  thread_running_(false),\n  current_output_()\n{\n    initCamera(id);\n\n    // Configure detection parameters\n    threshold_ = cfg.lookup(\"vision_tracker.detection.threshold\");\n    blob_params_.minThreshold = 10;\n    blob_params_.maxThreshold = 200;\n    blob_params_.filterByArea = cfg.lookup(\"vision_tracker.detection.blob.use_area\");\n    blob_params_.minArea = cfg.lookup(\"vision_tracker.detection.blob.min_area\");\n    blob_params_.maxArea = cfg.lookup(\"vision_tracker.detection.blob.max_area\");\n    blob_params_.filterByColor = false;\n    blob_params_.filterByCircularity = cfg.lookup(\"vision_tracker.detection.blob.use_circularity\");\n    blob_params_.minCircularity = cfg.lookup(\"vision_tracker.detection.blob.min_circularity\");\n    blob_params_.maxCircularity = cfg.lookup(\"vision_tracker.detection.blob.max_circularity\");\n    blob_params_.filterByConvexity = false;\n    blob_params_.filterByInertia = false;\n    blob_detector_ = cv::SimpleBlobDetector::create(blob_params_);\n\n    // Start thread\n    if 
(start_thread) start();\n}\n\nCameraPipeline::~CameraPipeline()\n{\n    stop();\n}\n\nvoid CameraPipeline::initCamera(CAMERA_ID id)\n{\n    // Get config strings\n    std::string name = cameraIdToString(id);\n    std::string calibration_path_config_name = \"vision_tracker.\" + name + \".calibration_file\";\n    std::string camera_path_config_name = \"vision_tracker.\" + name + \".camera_path\";\n    std::string debug_output_path_config_name = \"vision_tracker.\" + name + \".debug_output_path\";\n    std::string debug_image_config_name = \"vision_tracker.\" + name + \".debug_image\";\n    std::string x_offset_config_name = \"vision_tracker.physical.\" + name + \".x_offset\";\n    std::string y_offset_config_name = \"vision_tracker.physical.\" + name + \".y_offset\";\n    std::string z_offset_config_name = \"vision_tracker.physical.\" + name + \".z_offset\";\n    std::string x_res_scale_config_name = \"vision_tracker.\" + name + \".resolution_scale_x\";\n    std::string y_res_scale_config_name = \"vision_tracker.\" + name + \".resolution_scale_y\";\n    \n    // Initialize intrinsic calibration data\n    camera_data_.id = id;\n    std::string calibration_path = cfg.lookup(calibration_path_config_name);\n    PLOGI.printf(\"Loading %s camera calibration from %s\", name.c_str(), calibration_path.c_str());\n    cv::FileStorage fs(calibration_path, cv::FileStorage::READ);\n    if(fs[\"K\"].isNone() || fs[\"D\"].isNone()) \n    {\n        PLOGE.printf(\"Missing %s calibration data\", name.c_str());\n        throw;\n    }\n    fs[\"K\"] >> camera_data_.K;\n    fs[\"D\"] >> camera_data_.D;\n\n    // Handle scale factors\n    float x_res_scale = cfg.lookup(x_res_scale_config_name);\n    camera_data_.K.at<double>(0,0) = camera_data_.K.at<double>(0,0) * x_res_scale;\n    camera_data_.K.at<double>(0,1) = camera_data_.K.at<double>(0,1) * x_res_scale;\n    camera_data_.K.at<double>(0,2) = camera_data_.K.at<double>(0,2) * x_res_scale;\n    float y_res_scale = 
cfg.lookup(y_res_scale_config_name);\n    camera_data_.K.at<double>(1,0) = camera_data_.K.at<double>(1,0) * y_res_scale;\n    camera_data_.K.at<double>(1,1) = camera_data_.K.at<double>(1,1) * y_res_scale;\n    camera_data_.K.at<double>(1,2) = camera_data_.K.at<double>(1,2) * y_res_scale;\n\n    // Initialize extrinsic data\n    float x_offset = cfg.lookup(x_offset_config_name);\n    float y_offset = cfg.lookup(y_offset_config_name);\n    float z_offset = cfg.lookup(z_offset_config_name);\n    Eigen::Vector3f camera_pose = {x_offset, y_offset, z_offset};\n    Eigen::Matrix3f camera_rotation;\n    // Hardcoding rotation matrices here for simplicity\n    if(id == CAMERA_ID::SIDE) \n    {\n        camera_rotation << 1,0,0, 0,-1,0, 0,0,-1;\n    }\n    else\n    {\n        camera_rotation << 0,1,0, 1,0,0, 0,0,-1;\n    }\n    // From https://ksimek.github.io/2012/08/22/extrinsic/\n    camera_data_.t = -1 * camera_rotation * camera_pose;\n    camera_data_.R =  camera_rotation.transpose();\n    // PLOGI << name << \" R: \" << camera_data_.R;\n    // PLOGI << name << \" t: \" << camera_data_.t.transpose();\n\n    // Precompute some inverse values for later re-use\n    camera_data_.R_inv = camera_data_.R.inverse();\n    Eigen::Matrix3f tmp;\n    // Hack to get around problems with enabling eigen support in opencv\n    tmp << camera_data_.K.at<double>(0,0), camera_data_.K.at<double>(0,1), camera_data_.K.at<double>(0,2),\n           camera_data_.K.at<double>(1,0), camera_data_.K.at<double>(1,1), camera_data_.K.at<double>(1,2),\n           camera_data_.K.at<double>(2,0), camera_data_.K.at<double>(2,1), camera_data_.K.at<double>(2,2);\n    camera_data_.K_inv = tmp.inverse();\n\n    // PLOGI << name << \" K: \" << camera_data_.K;\n    // PLOGI << name << \" Ktmp: \" << tmp;\n    // PLOGI << name << \" Kinv: \" << camera_data_.K_inv;\n    \n    // Initialize camera calibration capture or debug data\n    if(!use_debug_image_)\n    {\n        std::string camera_path = 
cfg.lookup(camera_path_config_name);\n        camera_data_.capture = cv::VideoCapture(camera_path);\n        if (!camera_data_.capture.isOpened()) \n        {\n            PLOGE.printf(\"Could not open %s camera at %s\", name.c_str(), camera_path.c_str());\n            throw;\n        }\n        PLOGI.printf(\"Opened %s camera at %s\", name.c_str(), camera_path.c_str());\n        camera_data_.capture.set(cv::CAP_PROP_FRAME_WIDTH, 320);\n        camera_data_.capture.set(cv::CAP_PROP_FRAME_HEIGHT, 240);\n        camera_data_.capture.set(cv::CAP_PROP_FPS, 30);\n\n        int width = camera_data_.capture.get(cv::CAP_PROP_FRAME_WIDTH);\n        int height = camera_data_.capture.get(cv::CAP_PROP_FRAME_HEIGHT);\n        int fps = camera_data_.capture.get(cv::CAP_PROP_FPS);\n        PLOGI.printf(\"Properties %s: resolution: %ix%i, fps: %i\", name.c_str(), width, height, fps);\n    } \n    else \n    {\n        std::string image_path = cfg.lookup(debug_image_config_name);\n        camera_data_.debug_frame = cv::imread(image_path, cv::IMREAD_GRAYSCALE);\n        if(camera_data_.debug_frame.empty())\n        {\n            PLOGE.printf(\"Could not read %s debug image from %s\", name.c_str(), image_path.c_str());\n            throw;\n        }\n        PLOGW.printf(\"Loading %s debug image from %s\", name.c_str(), image_path.c_str());\n    }\n\n    // Initialize debug output path\n    if(output_debug_images_)\n    {\n        camera_data_.debug_output_path = std::string(cfg.lookup(debug_output_path_config_name)); \n    }\n}\n\nstd::string CameraPipeline::cameraIdToString(CAMERA_ID id)\n{\n    if(id == CAMERA_ID::SIDE) return \"side\";\n    if(id == CAMERA_ID::REAR) return \"rear\";\n    PLOGE << \"Unknown camera id\";\n    return \"unk\";\n}\n\nstd::vector<cv::KeyPoint> CameraPipeline::allKeypointsInImage(cv::Mat img_raw, bool output_debug)\n{   \n    // Timer t;\n\n    // Undistort and crop\n    cv::Mat img_undistorted;\n    cv::Rect validPixROI;\n    // PLOGI << \"init memory 
time: \" << t.dt_ms();\n    // t.reset();\n    cv::Mat newcameramtx = cv::getOptimalNewCameraMatrix(camera_data_.K, camera_data_.D, img_raw.size(), /*alpha=*/1, img_raw.size(), &validPixROI);\n    // PLOGI << \"new camera time: \" << t.dt_ms();\n    // t.reset();\n    cv::undistort(img_raw, img_undistorted, camera_data_.K, camera_data_.D);\n\n    // PLOGI << \"undistort time: \" << t.dt_ms();\n    // t.reset();\n\n    // Threshold\n    cv::Mat img_thresh;\n    cv::threshold(img_undistorted, img_thresh, threshold_, 255, cv::THRESH_BINARY_INV);\n    // PLOGI <<\" Threshold\";\n\n    // PLOGI << \"threshold time: \" << t.dt_ms();\n    // t.reset();\n\n    // Blob detection\n    std::vector<cv::KeyPoint> keypoints;\n    keypoints.reserve(10);\n    // PLOGI << \"blob memory time: \" << t.dt_ms();\n    // t.reset();\n    blob_detector_->detect(img_thresh, keypoints);\n    // PLOGI <<\"Num blobs \" << keypoints.size();\n\n    // PLOGI << \"blob detect time: \" << t.dt_ms();\n    // t.reset();\n\n    if(output_debug)\n    {\n        std::string debug_path = camera_data_.id == CAMERA_ID::SIDE ? 
\n            cfg.lookup(\"vision_tracker.side.debug_output_path\") :\n            cfg.lookup(\"vision_tracker.rear.debug_output_path\");\n        cv::imwrite(debug_path + \"img_raw.jpg\", img_raw);\n        cv::imwrite(debug_path + \"img_undistorted.jpg\", img_undistorted);\n        cv::imwrite(debug_path + \"img_thresh.jpg\", img_thresh);\n\n        cv::Mat img_with_keypoints;\n        cv::drawKeypoints(img_thresh, keypoints, img_with_keypoints, cv::Scalar(0,0,255), cv::DrawMatchesFlags::DRAW_RICH_KEYPOINTS);\n        cv::imwrite(debug_path + \"img_keypoints.jpg\", img_with_keypoints);\n\n        cv::Mat img_with_best_keypoint = img_undistorted;\n        cv::Point2f best_keypoint = getBestKeypoint(keypoints);\n        cv::circle(img_with_best_keypoint, best_keypoint, 1, cv::Scalar(0,0,255), -1);\n        cv::circle(img_with_best_keypoint, best_keypoint, 10, cv::Scalar(0,0,255), 5);\n        Eigen::Vector2f best_point_m = cameraToRobot(best_keypoint);\n        std::string label_text = \"Best:\" + std::to_string(best_point_m[0]) +\"m,\"+ std::to_string(best_point_m[1]) + \"m\";\n        cv::putText(img_with_best_keypoint, //target image\n                    label_text, //text\n                    cv::Point(10, 20), //top-left position\n                    cv::FONT_HERSHEY_DUPLEX,\n                    0.5,\n                    CV_RGB(255,0,0), //font color\n                    1);\n        Eigen::Vector2f target_point_world;\n        if(camera_data_.id == CAMERA_ID::SIDE)\n        {\n            target_point_world << float(cfg.lookup(\"vision_tracker.physical.side.target_x\")), float(cfg.lookup(\"vision_tracker.physical.side.target_y\"));\n        }\n        else \n        {\n            target_point_world << float(cfg.lookup(\"vision_tracker.physical.rear.target_x\")), float(cfg.lookup(\"vision_tracker.physical.rear.target_y\"));\n        }\n        Eigen::Vector2f target_point_camera = robotToCamera(target_point_world);\n        cv::Point2f 
pt{target_point_camera[0], target_point_camera[1]};\n        cv::circle(img_with_best_keypoint, pt, 5, cv::Scalar(255,0,0), -1);\n        std::string label_text2 = \"Target:\" + std::to_string(target_point_camera[0]) +\"px, \"+ std::to_string(target_point_camera[1]) + \"px\";\n        cv::putText(img_with_best_keypoint, //target image\n                    label_text2, //text\n                    cv::Point(10, 40), //top-left position\n                    cv::FONT_HERSHEY_DUPLEX,\n                    0.5,\n                    CV_RGB(0,0,255), //font color\n                    1);\n\n        cv::imwrite(debug_path + \"img_best_keypoint.jpg\", img_with_best_keypoint);\n        PLOGI.printf(\"Writing debug images %s\",cameraIdToString(camera_data_.id).c_str());\n    }\n\n    // PLOGI << \"debug time: \" << t.dt_ms();\n    // t.reset();\n\n    return keypoints;\n}\n\nvoid CameraPipeline::start()\n{\n    if(!thread_running_)\n    {\n        PLOGI.printf(\"Starting %s camera thread\",cameraIdToString(camera_data_.id).c_str());\n        thread_running_ = true;\n        thread_ = std::thread(&CameraPipeline::threadLoop, this);\n        // thread_.detach();\n    }\n}\n\nvoid CameraPipeline::stop()\n{\n    if(thread_running_)\n    {\n        PLOGI.printf(\"Stopping %s camera thread\",cameraIdToString(camera_data_.id).c_str());\n        thread_running_ = false;\n        thread_.join();\n    }\n}\n\nvoid CameraPipeline::threadLoop()\n{\n    while(thread_running_)\n    {\n        oneLoop();\n    }\n}\n\nvoid CameraPipeline::oneLoop()\n{\n    Eigen::Vector2f best_point_m = {0,0};\n    cv::Point2f best_point_px = {0,0};\n    bool detected = false;\n    try\n    {\n\n        // Get latest frame\n        cv::Mat frame;\n        if (use_debug_image_) frame = camera_data_.debug_frame;\n        else camera_data_.capture >> frame;\n\n        // Perform keypoint detection\n        std::vector<cv::KeyPoint> keypoints = allKeypointsInImage(frame, output_debug_images_);\n\n        // Do 
post-processing if the detection was successful\n        detected = !keypoints.empty();\n        if(detected) \n        {\n            best_point_px = getBestKeypoint(keypoints);\n            best_point_m = cameraToRobot(best_point_px);\n\n        }\n    }\n    catch (cv::Exception& e)\n    {\n        PLOGW << \"Caught cv::Exception: \" << e.what();\n        detected = false;\n        best_point_m = {0,0};\n        best_point_px = {0,0};\n    }\n    \n\n    // Update output values in thread-safe manner\n    {\n        std::lock_guard<std::mutex> read_lock(data_mutex);\n        current_output_.ok = detected;\n        current_output_.timestamp = ClockFactory::getFactoryInstance()->get_clock()->now();\n        current_output_.point = best_point_m;\n        current_output_.uv = {best_point_px.x, best_point_px.y};\n    }\n}\n\nCameraPipelineOutput CameraPipeline::getData()\n{\n    std::lock_guard<std::mutex> read_lock(data_mutex);\n    CameraPipelineOutput output = current_output_;\n    current_output_.ok = false;\n    return output;\n}\n\nEigen::Vector2f CameraPipeline::cameraToRobot(cv::Point2f cameraPt)\n{\n    Eigen::Vector3f uv_point = {cameraPt.x, cameraPt.y, 1};\n\n    // Compute scaling factor for Z=0\n    Eigen::Vector3f tmp1 = camera_data_.R_inv * camera_data_.K_inv * uv_point;\n    Eigen::Vector3f tmp2 = camera_data_.R_inv * camera_data_.t;\n    float s = tmp2(2,0) / tmp1(2,0);\n\n    // Debug\n    Eigen::Vector3f scaled_cam_frame = s * camera_data_.K_inv * uv_point;\n    Eigen::Vector3f cam_frame_offset = scaled_cam_frame  - camera_data_.t;\n\n    // Convert to world coordinate\n    Eigen::Vector3f world_point = camera_data_.R_inv * cam_frame_offset;\n    // PLOGI << cameraIdToString(id);\n    // PLOGI << \"uv_point: \" << uv_point.transpose();\n    // PLOGI << \"scaled_cam_frame: \" << scaled_cam_frame.transpose();\n    // PLOGI << \"cam_frame_offset: \" << cam_frame_offset.transpose();\n    // PLOGI << \"world_point: \" << world_point.transpose();\n\n\n    
if(fabs(world_point(2)) > 1e-3) PLOGE.printf(\"Z value %f is not near zero\", world_point(2));\n\n    return {world_point(0),world_point(1)};\n}\n\nEigen::Vector2f CameraPipeline::robotToCamera(Eigen::Vector2f robotPt)\n{\n    Eigen::Vector4f robot_point_3d = {robotPt[0], robotPt[1], 0, 1};\n    Eigen::MatrixXf r_t(3,4);\n    r_t << camera_data_.R, camera_data_.t;\n\n    Eigen::Matrix3f K_tmp;\n    // Hack to get around problems with enabling eigen support in opencv\n    K_tmp << camera_data_.K.at<double>(0,0), camera_data_.K.at<double>(0,1), camera_data_.K.at<double>(0,2),\n           camera_data_.K.at<double>(1,0), camera_data_.K.at<double>(1,1), camera_data_.K.at<double>(1,2),\n           camera_data_.K.at<double>(2,0), camera_data_.K.at<double>(2,1), camera_data_.K.at<double>(2,2);\n\n    Eigen::Vector3f camera_pt_scaled = K_tmp * r_t * robot_point_3d;\n    return {camera_pt_scaled[0]/camera_pt_scaled[2], camera_pt_scaled[1]/camera_pt_scaled[2]};\n}"
  },
  {
    "path": "src/robot/src/camera_tracker/CameraPipeline.h",
    "content": "#ifndef CameraPipeline_h\n#define CameraPipeline_h\n\n#include \"utils.h\"\n#include <opencv2/opencv.hpp>\n#include <Eigen/Dense>\n#include <thread>\n#include <atomic>\n\nenum class CAMERA_ID\n{\n    REAR,\n    SIDE\n};\n\nstruct CameraPipelineOutput\n{\n  bool ok = false;\n  ClockTimePoint timestamp = ClockFactory::getFactoryInstance()->get_clock()->now();\n  Eigen::Vector2f point = {0,0};\n  Eigen::Vector2f uv = {0,0};\n};\n\nclass CameraPipeline \n{\n  public:\n\n    CameraPipeline(CAMERA_ID id, bool start_thread);\n\n    ~CameraPipeline();\n\n    CameraPipelineOutput getData();\n\n    void start();\n\n    void stop();\n\n    void oneLoop();\n\n    void toggleDebugImageOutput() {output_debug_images_ = !output_debug_images_;};\n\n    Eigen::Vector2f cameraToRobot(cv::Point2f cameraPt);\n\n    Eigen::Vector2f robotToCamera(Eigen::Vector2f robotPt);\n  \n  private:\n\n    struct CameraData \n    {\n      CAMERA_ID id;\n      cv::VideoCapture capture;\n      cv::Mat K;\n      Eigen::Matrix3f K_inv;\n      cv::Mat D;\n      Eigen::Matrix3f R;\n      Eigen::Matrix3f R_inv;\n      Eigen::Vector3f t;\n      cv::Mat debug_frame;\n      std::string debug_output_path;\n    };\n\n    void threadLoop();\n\n    std::string cameraIdToString(CAMERA_ID id);\n    \n    void initCamera(CAMERA_ID id);\n\n    std::vector<cv::KeyPoint> allKeypointsInImage(cv::Mat img_raw, bool output_debug);\n\n    CameraData camera_data_;\n    bool use_debug_image_;\n    std::atomic<bool> output_debug_images_;\n    std::atomic<bool> thread_running_;\n    std::thread thread_;\n    CameraPipelineOutput current_output_;\n\n    // Params read from config\n    cv::SimpleBlobDetector::Params blob_params_;\n    cv::Ptr<cv::SimpleBlobDetector> blob_detector_;\n    int threshold_;\n    float pixels_per_meter_u_;\n    float pixels_per_meter_v_;\n};\n\n#endif //CameraPipeline_h"
  },
  {
    "path": "src/robot/src/camera_tracker/CameraTracker.cpp",
    "content": "#include \"CameraTracker.h\"\r\n\r\n#include <plog/Log.h>\r\n#include \"constants.h\"\r\n\r\nconstexpr int num_samples_to_average = 10;\r\n\r\nCameraTracker::CameraTracker(bool start_thread)\r\n: camera_loop_time_averager_(num_samples_to_average),\r\n  rear_cam_(CAMERA_ID::REAR, start_thread),\r\n  side_cam_(CAMERA_ID::SIDE, start_thread),\r\n  output_({{0,0,0}, false, ClockFactory::getFactoryInstance()->get_clock()->now(), false}),\r\n  last_rear_cam_output_(),\r\n  last_side_cam_output_(),\r\n  side_cam_ok_filter_(0.5),\r\n  rear_cam_ok_filter_(0.5),\r\n  both_cams_ok_filter_(0.5),\r\n  running_(false)\r\n{\r\n    // Target points\r\n    robot_P_side_target_ = {cfg.lookup(\"vision_tracker.physical.side.target_x\"), \r\n                            cfg.lookup(\"vision_tracker.physical.side.target_y\")};\r\n    robot_P_rear_target_ = {cfg.lookup(\"vision_tracker.physical.rear.target_x\"), \r\n                            cfg.lookup(\"vision_tracker.physical.rear.target_y\")};\r\n}\r\n\r\nCameraTracker::~CameraTracker() {}\r\n\r\nvoid CameraTracker::update()\r\n{\r\n    CameraPipelineOutput side_output = side_cam_.getData();\r\n    CameraPipelineOutput rear_output = rear_cam_.getData();\r\n    output_.raw_detection = side_output.ok && rear_output.ok;\r\n    debug_.side_ok = side_cam_ok_filter_.update(side_output.ok);\r\n    debug_.rear_ok = rear_cam_ok_filter_.update(rear_output.ok);\r\n    bool new_output_pose_ready = false;\r\n\r\n    if(rear_output.ok)\r\n    {\r\n        last_rear_cam_output_ = rear_output;\r\n    }\r\n    if(side_output.ok)\r\n    {\r\n        last_side_cam_output_ = side_output;\r\n    }\r\n\r\n    if(last_side_cam_output_.ok && last_rear_cam_output_.ok) new_output_pose_ready = true;\r\n\r\n    // int time_delta_ms = std::chrono::duration_cast<std::chrono::milliseconds>\r\n    //     (last_side_cam_output_.timestamp - last_rear_cam_output_.timestamp).count();\r\n    // if(abs(time_delta_ms) > 400)\r\n    // {\r\n    //     PLOGW 
<< \"Camera timestamp delta too large: \" << time_delta_ms;\r\n    //     new_output_pose_ready = false;\r\n    // }\r\n\r\n    output_.ok = both_cams_ok_filter_.update(new_output_pose_ready);\r\n    debug_.both_ok = output_.ok;\r\n    if(!new_output_pose_ready) return;\r\n\r\n    // Populate output\r\n    output_.pose = computeRobotPoseFromImagePoints(last_side_cam_output_.point, last_rear_cam_output_.point);\r\n    output_.timestamp = ClockFactory::getFactoryInstance()->get_clock()->now();\r\n    camera_loop_time_averager_.mark_point();\r\n\r\n    // Populate debug\r\n    debug_.side_u = last_side_cam_output_.uv[0];\r\n    debug_.side_v = last_side_cam_output_.uv[1];\r\n    debug_.rear_u = last_rear_cam_output_.uv[0];\r\n    debug_.rear_v = last_rear_cam_output_.uv[1];\r\n    debug_.side_x = last_side_cam_output_.point[0];\r\n    debug_.side_y = last_side_cam_output_.point[1];\r\n    debug_.rear_x = last_rear_cam_output_.point[0];\r\n    debug_.rear_y = last_rear_cam_output_.point[1];\r\n    debug_.pose_x = output_.pose.x;\r\n    debug_.pose_y = output_.pose.y;\r\n    debug_.pose_a = output_.pose.a;\r\n    debug_.loop_ms = camera_loop_time_averager_.get_ms();\r\n\r\n    // Mark camera output values as not okay to make sure they are only used once\r\n    last_rear_cam_output_.ok = false;\r\n    last_side_cam_output_.ok = false;\r\n}\r\n\r\nCameraTrackerOutput CameraTracker::getPoseFromCamera()\r\n{\r\n    return output_;\r\n}\r\n\r\nCameraDebug CameraTracker::getCameraDebug()\r\n{\r\n    return debug_;\r\n}\r\n\r\nvoid CameraTracker::start()\r\n{\r\n    rear_cam_.start();\r\n    side_cam_.start();\r\n    camera_loop_time_averager_ = TimeRunningAverage(num_samples_to_average);\r\n    running_ = true;\r\n}\r\n\r\nvoid CameraTracker::stop()\r\n{\r\n    rear_cam_.stop();\r\n    side_cam_.stop();\r\n    running_ = false;\r\n}\r\n\r\nvoid CameraTracker::toggleDebugImageOutput()\r\n{\r\n    rear_cam_.toggleDebugImageOutput();\r\n    
 side_cam_.toggleDebugImageOutput();\r\n}\r\n\r\nPoint CameraTracker::computeRobotPoseFromImagePoints(Eigen::Vector2f p_side, Eigen::Vector2f p_rear)\r\n{  \r\n    // Solve the linear system T + R(a)*p = target for [x, y, sin(a), cos(a)],\r\n    // treating sin(a) and cos(a) as independent unknowns so the rotation stays linear\r\n    Eigen::Matrix4f A;\r\n    Eigen::Vector4f b;\r\n    A << 1, 0, -p_rear[1], p_rear[0],\r\n         0, 1,  p_rear[0], p_rear[1],\r\n         1, 0, -p_side[1], p_side[0],\r\n         0, 1,  p_side[0], p_side[1];\r\n    b << robot_P_rear_target_[0],\r\n         robot_P_rear_target_[1],\r\n         robot_P_side_target_[0],\r\n         robot_P_side_target_[1];\r\n    Eigen::Vector4f x = A.colPivHouseholderQr().solve(b);\r\n    \r\n    // Populate pose, recovering the angle from the solved sin/cos components\r\n    float pose_x = x[0];\r\n    float pose_y = x[1];\r\n    float pose_a = atan2(x[2], x[3]);\r\n    return {pose_x, pose_y, pose_a};\r\n}\r\n\r\n"
  },
  {
    "path": "src/robot/src/camera_tracker/CameraTracker.h",
    "content": "#ifndef CameraTracker_h\r\n#define CameraTracker_h\r\n\r\n#include \"CameraTrackerBase.h\"\r\n#include \"utils.h\"\r\n#include <Eigen/Dense>\r\n#include \"CameraPipeline.h\"\r\n\r\nclass CameraTracker : public CameraTrackerBase\r\n{\r\n  public:\r\n    CameraTracker(bool start_thread = false);\r\n    \r\n    virtual ~CameraTracker();\r\n\r\n    virtual void start() override;\r\n\r\n    virtual void stop() override;\r\n\r\n    virtual void update() override;\r\n\r\n    virtual bool running() override { return running_; };\r\n\r\n    virtual void toggleDebugImageOutput() override;\r\n\r\n    virtual CameraTrackerOutput getPoseFromCamera() override; \r\n\r\n    virtual CameraDebug getCameraDebug() override;\r\n\r\n    // Below here exposed for testing only (yes, bad practice, but useful for now)\r\n\r\n    Point computeRobotPoseFromImagePoints(Eigen::Vector2f p_side, Eigen::Vector2f p_rear);\r\n\r\n    void oneLoop();\r\n\r\n  private:\r\n\r\n    TimeRunningAverage camera_loop_time_averager_;\r\n    CameraDebug debug_;\r\n    CameraPipeline rear_cam_;\r\n    CameraPipeline side_cam_;\r\n    CameraTrackerOutput output_;\r\n    Eigen::Vector2f robot_P_side_target_;\r\n    Eigen::Vector2f robot_P_rear_target_;\r\n    CameraPipelineOutput last_rear_cam_output_;\r\n    CameraPipelineOutput last_side_cam_output_;\r\n    LatchedBool side_cam_ok_filter_;\r\n    LatchedBool rear_cam_ok_filter_;\r\n    LatchedBool both_cams_ok_filter_;\r\n    bool running_;\r\n};\r\n\r\n\r\n#endif //CameraTracker_h"
  },
  {
    "path": "src/robot/src/camera_tracker/CameraTrackerBase.h",
    "content": "#ifndef CameraTrackerBase_h\r\n#define CameraTrackerBase_h\r\n\r\n#include \"utils.h\"\r\n\r\nstruct CameraTrackerOutput\r\n{\r\n  Point pose;\r\n  bool ok;\r\n  ClockTimePoint timestamp;\r\n  bool raw_detection;\r\n};\r\n\r\nclass CameraTrackerBase\r\n{\r\n  public:\r\n\r\n    virtual void start() = 0;\r\n\r\n    virtual void stop() = 0;\r\n\r\n    virtual void update() = 0;\r\n\r\n    virtual bool running() = 0;\r\n\r\n    virtual void toggleDebugImageOutput() = 0;\r\n\r\n    virtual CameraTrackerOutput getPoseFromCamera() = 0; \r\n\r\n    virtual CameraDebug getCameraDebug() = 0;\r\n\r\n};\r\n\r\n\r\n#endif //CameraTrackerBase_h"
  },
  {
    "path": "src/robot/src/camera_tracker/CameraTrackerFactory.cpp",
    "content": "#include \"CameraTrackerFactory.h\"\r\n#include \"CameraTracker.h\"\r\n#include \"CameraTrackerMock.h\"\r\n\r\n#include <plog/Log.h> \r\n\r\nCameraTrackerFactory* CameraTrackerFactory::instance = NULL;\r\n\r\n\r\nCameraTrackerFactory* CameraTrackerFactory::getFactoryInstance()\r\n{\r\n    if(!instance)\r\n    {\r\n        instance = new CameraTrackerFactory;\r\n    }\r\n    return instance;\r\n}\r\n\r\nvoid CameraTrackerFactory::set_mode(CAMERA_TRACKER_FACTORY_MODE mode)\r\n{\r\n    mode_ = mode;\r\n}\r\n\r\nCameraTrackerBase* CameraTrackerFactory::get_camera_tracker()\r\n{\r\n    if (!camera_tracker_)\r\n    {\r\n        build_camera_tracker();\r\n    }\r\n    return camera_tracker_.get();\r\n}\r\n\r\n\r\nvoid CameraTrackerFactory::build_camera_tracker()\r\n{    \r\n    if(mode_ == CAMERA_TRACKER_FACTORY_MODE::STANDARD)\r\n    {\r\n        camera_tracker_ = std::make_unique<CameraTracker>();\r\n        PLOGI << \"Built CameraTracker\";\r\n\r\n    }\r\n    else if (mode_ == CAMERA_TRACKER_FACTORY_MODE::MOCK)\r\n    {\r\n        camera_tracker_ = std::make_unique<CameraTrackerMock>();\r\n        PLOGI << \"Built CameraTrackerMock\";\r\n    }\r\n}\r\n\r\n\r\n// Private constructor\r\nCameraTrackerFactory::CameraTrackerFactory()\r\n: mode_(CAMERA_TRACKER_FACTORY_MODE::STANDARD),\r\n  camera_tracker_()\r\n{}\r\n  "
  },
  {
    "path": "src/robot/src/camera_tracker/CameraTrackerFactory.h",
    "content": "#ifndef CameraTrackerFactory_h\r\n#define CameraTrackerFactory_h\r\n\r\n#include <memory>\r\n#include <map>\r\n\r\n#include \"CameraTrackerBase.h\"\r\n\r\nenum class CAMERA_TRACKER_FACTORY_MODE\r\n{\r\n  STANDARD,\r\n  MOCK,\r\n};\r\n\r\nclass CameraTrackerFactory\r\n{\r\n\r\n  public:\r\n\r\n    static CameraTrackerFactory* getFactoryInstance();\r\n\r\n    void set_mode(CAMERA_TRACKER_FACTORY_MODE mode);\r\n\r\n    CameraTrackerBase* get_camera_tracker();\r\n\r\n    // Delete copy and assignment constructors\r\n    CameraTrackerFactory(CameraTrackerFactory const&) = delete;\r\n    CameraTrackerFactory& operator= (CameraTrackerFactory const&) = delete;\r\n\r\n  private:\r\n\r\n    // Make standard constructor private so it can't be created\r\n    CameraTrackerFactory();\r\n\r\n    void build_camera_tracker();\r\n\r\n    static CameraTrackerFactory* instance;\r\n\r\n    CAMERA_TRACKER_FACTORY_MODE mode_;\r\n\r\n    std::unique_ptr<CameraTrackerBase> camera_tracker_;\r\n\r\n};\r\n\r\n\r\n#endif //CameraTrackerFactory_h"
  },
  {
    "path": "src/robot/src/camera_tracker/CameraTrackerMock.h",
    "content": "#ifndef CameraTrackerMock_h\r\n#define CameraTrackerMock_h\r\n\r\n#include \"CameraTrackerBase.h\"\r\n#include \"utils.h\"\r\n\r\nclass CameraTrackerMock : public CameraTrackerBase\r\n{\r\n  public:\r\n\r\n    virtual void start() override {};\r\n\r\n    virtual void stop() override {};\r\n\r\n    virtual void update() override {};\r\n\r\n    virtual bool running() override {return true;};\r\n\r\n    virtual void toggleDebugImageOutput() override {};\r\n\r\n    virtual CameraTrackerOutput getPoseFromCamera() override { return {{0,0,0},false, ClockFactory::getFactoryInstance()->get_clock()->now(),false};};\r\n\r\n    virtual CameraDebug getCameraDebug() override {return CameraDebug(); }; \r\n};\r\n\r\n\r\n#endif //CameraTrackerMock_h"
  },
  {
    "path": "src/robot/src/constants.cfg",
    "content": "name = \"Runtime constants\";\nlog_level = \"info\";   // verbose, debug, info, warning, error, fatal, none\n\nmotion = \n{\n  limit_max_fraction  = 0.8;      // Only generate a trajectory to this fraction of max speed to give motors headroom to compensate\n  controller_frequency  = 40 ;    // Hz for RobotController\n  log_frequency         = 20 ;    // HZ for logging to motion log\n  fake_perfect_motion   = false;   // Enable or disable bypassing clearcore to fake perfect motion for testing\n  rate_always_ready     = false;   // Bypasses rate limiter if set to true\n  translation = \n  {\n    max_vel = \n    {\n      vision = 0.08; // m/s\n      fine   = 0.1;  // m/s\n      coarse = 0.7;  // m/s\n    };\n    max_acc = \n    {\n      vision = 0.1;   // m/s^2\n      fine   = 0.15;  // m/s^2\n      coarse = 0.2;   // m/s^2\n    };\n    max_jerk = \n    {\n      vision = 0.1;  // m/s^3\n      fine   = 0.2;  // m/s^3\n      coarse = 0.5;  // m/s^3\n    };\n    position_threshold = \n    {\n      vision = 0.005;   // m\n      fine   = 0.04;    // m\n      coarse = 0.10;    // m\n    };\n    velocity_threshold = \n    {\n      vision = 0.01;   // m/s\n      fine   = 0.03;   // m/s\n      coarse = 0.05;   // m/s\n    };\n    gains =\n    {\n      kp = 2.0;\n      ki = 0.1;\n      kd = 0.0;\n    };\n    gains_vision =\n    {\n      kp = 1.0;\n      ki = 0.0;\n      kd = 0.9;\n    };\n  };\n\n  rotation = \n  {\n    max_vel = \n    {\n      vision = 0.1;   // rad/s\n      fine   = 0.15;  // rad/s\n      coarse = 0.4;   // rad/s\n    };\n    max_acc = \n    {\n      vision = 0.2;  // rad/s^2\n      fine   = 0.3;  // rad/s^2\n      coarse = 0.5;  // rad/s^2\n    };\n    max_jerk = \n    {\n      vision = 0.5;  // rad/s^3\n      fine   = 0.7;  // rad/s^3\n      coarse = 1.0;  // rad/s^3\n    };\n    position_threshold = \n    {\n      vision = 0.002;   // rad\n      fine   = 0.02;    // rad\n      coarse = 0.08;    // rad\n    };\n    velocity_threshold = \n    
{\n      vision = 0.02;   // rad/s\n      fine   = 0.03;   // rad/s\n      coarse = 0.05;   // rad/s\n    };\n    gains =\n    {\n      kp = 3.0;\n      ki = 0.1;\n      kd = 0.0;\n    };\n    gains_vision =\n    {\n      kp = 1.0;\n      ki = 0.0;\n      kd = 0.9;\n    };\n  };\n};\n\nphysical = \n{\n  wheel_diameter = 0.152;  // m\n  wheel_dist_from_center = 0.4794; // m\n};\n\ntrajectory_generation = \n{\n  solver_max_loops   = 30;     // Only let the solver loop this many times before giving up\n  solver_alpha_decay = 0.8;    // Decay for velocity limit\n  solver_beta_decay  = 0.8;    // Decay for acceleration limit\n  solver_exponent_decay = 0.1; // Decay exponent to apply each loop\n  min_dist_limit    = 0.0001;  // Smallest value solver will attempt to solve for\n};\n\ntray = \n{\n  default_pos_revs = 33.0;     // Default position for driving in revs from home\n  load_pos_revs    = 33.0;     // Loading position in revs from home\n  place_pos_revs   = 67.0;    // Placing position in revs from home\n  steps_per_rev    = 800;     // Number of steps per motor rev\n  controller_frequency  = 20 ;    // Hz for controller rate\n  fake_tray_motions = false;   // Flag to fake tray motions for testing\n};\n\nlocalization = \n{\n  mm_x_offset = -60.0;                      // X offset of mm pair center from center of rotation (in millimeters, + is in front, - is behind)\n  mm_y_offset = 0.0;                        // Y offset of mm pair center from center of rotation (in millimeters, + is to left, - is to right)\n  max_wait_time = 10.0;                     // Max time to wait (s) when waiting for good localization\n  confidence_for_wait = 0.97;               // Confidence level required for waiting for localization \n  kf_predict_trans_cov = 0.00005;            // How much noise is expected in the prediction step (each timestep) for position, lower is less noise\n  kf_predict_angle_cov = 0.0001;            // How much noise is expected in the prediction step for angle, 
lower is less noise\n  kf_meas_trans_cov = 0.01;                 // How much noise is expected in the marvelmind measurements of position, lower is less noise\n  kf_meas_angle_cov = 0.01;                  // How much noise is expected in the marvelmind measurements of angle, lower is less noise\n  kf_uncertainty_scale = 1.0;               // Scale factor for how much extra noise to add based on position confidence\n  variance_ref_trans = 1.0;                 // Max position variance for 0 confidence\n  variance_ref_angle = 1.57;                // Max angle variance for 0 confidence\n  min_vel_uncertainty = 0.5;                // Minimum uncertainty for any nonzero velocity\n  vel_uncertainty_slope = 0.5;              // Uncertainty/velocity scale\n  max_vel_uncetainty = 1.0;                 // Uncertainty cap\n  vel_uncertainty_decay_time = 4.0;         // Number of seconds after completing motion until uncertainty goes to 0\n};\n\n\n// USB ports on pi mapping to v4l path num\n//  USB2 - 3  |   USB3 - 1   |   Eth\n//  USB2 - 4  |   USB3 - 2   |\nvision_tracker = {\n  side = {\n    camera_path = \"/dev/v4l/by-path/platform-fd500000.pcie-pci-0000:01:00.0-usb-0:1.1.1:1.0-video-index0\"\n    calibration_file = \"/home/pi/IR_calibration_2.yml\"; // Calibration data\n    debug_output_path = \"/home/pi/images/debug/side/\"; // Path to output debug images\n    debug_image = \"/home/pi/images/dev/side_raw_2.jpg\"; // Debug image path\n    resolution_scale_x = 0.25;                             // Scale factor from calibration resolution to onboard resolution\n    resolution_scale_y = 0.333333;                         // Scale factor from calibration resolution to onboard resolution\n  }\n  rear = {\n    camera_path = \"/dev/v4l/by-path/platform-fd500000.pcie-pci-0000:01:00.0-usb-0:1.1.2:1.0-video-index0\"\n    calibration_file = \"/home/pi/IR_calibration_1.yml\"; // Calibration data\n    debug_output_path = \"/home/pi/images/debug/rear/\"; // Path to output debug images\n   
 debug_image = \"/home/pi/DominoRobot/src/robot/test/testdata/images/20210701210351_rear_img_raw.jpg\"; // Debug image path\n    resolution_scale_x = 0.25;                             // Scale factor from calibration resolution to onboard resolution\n    resolution_scale_y = 0.333333;                         // Scale factor from calibration resolution to onboard resolution\n  }\n  debug = {\n    use_debug_image = false;                   // Use debug image instead of loading camera\n    save_camera_debug = false;\n  }\n  detection = {\n    threshold = 235;                         // Threshold value for image processing\n    blob = {\n      use_area = true;                      // Use area for blob detection\n      min_area = 80.0;                      // Min area for blob detection\n      max_area = 500.0;                    // Max area for blob detection\n      use_circularity = true;               // Use circularity for blob detection\n      min_circularity = 0.6;                // Min circularity for blob detection\n      max_circularity = 1.0;                  // Max circularity for blob detection\n    }\n  }\n  physical = {\n    pixels_per_meter_u = 1362.0;              // Hack to try and get something working\n    pixels_per_meter_v = 1411.0;\n    side = {\n      x_offset = 0.0;                       // x offset (meters) from fake robot frame origin\n      y_offset = 0.93;                      // y offset (meters) from fake robot frame origin\n      z_offset = 0.485;                      // z offset (meters) from fake robot frame origin\n      target_x = 0.0;                       // target x coord (meters) in fake robot frame           \n      target_y = 0.93;                     // target y coord (meters) in fake robot frame \n    }\n    rear = {\n      x_offset = -0.68;                      // x offset (meters) from fake robot frame origin\n      y_offset = 0.0;                       // y offset (meters) from fake robot frame origin\n      z_offset = 0.49;  
                     // z offset (meters) from fake robot frame origin\n      target_x = -0.68;                    // target x coord (meters) in fake robot frame          \n      target_y = 0.0;                       // target y coord (meters) in fake robot frame \n    }\n  }\n  kf = \n  {\n    predict_trans_cov = 0.0001;               // How much noise is expected in the prediction step (each timestep) for position, lower is less noise\n    predict_angle_cov = 0.001;                // How much noise is expected in the prediction step (each timestep) for angle, lower is less noise\n    meas_trans_cov = 0.001;                   // How much noise is expected in the update step for position, lower is less noise\n    meas_angle_cov = 0.001;                    // How much noise is expected in the update step for angle, lower is less noise\n  }\n};\n\nmock_socket = \n{\n  enabled = false;\n  // Commands must be wrapped in <> symbols; a bare number pauses for that many ms\n  data = [\"2000\",\"<{'type':'move_vision','data':{'x':0,'y':0,'a':0}}>\"];\n};\n"
  },
  {
    "path": "src/robot/src/constants.h",
    "content": "#ifndef Constants_h\n#define Constants_h\n\n#include <libconfig.h++>\nextern libconfig::Config cfg;\n\n#define CONSTANTS_FILE \"/home/pi/DominoRobot/src/robot/src/constants.cfg\"\n#define TEST_CONSTANTS_FILE \"/home/pi/DominoRobot/src/robot/test/test_constants.cfg\"\n\n// USB devices\n#define CLEARCORE_USB \"/dev/clearcore\"\n#define MARVELMIND_USB_0 \"/dev/marvelmind0\" //Marvelminds could show up at any of these three links\n#define MARVELMIND_USB_1 \"/dev/marvelmind1\"\n#define MARVELMIND_USB_2 \"/dev/marvelmind2\"\n\n// Expected marvelmind device connections\n#define MARVELMIND_DEVICE_ID0 5\n#define MARVELMIND_DEVICE_ID1 6\n\n// Log file ID for motion specific stuff\n#define MOTION_LOG_ID 2\n#define LOCALIZATION_LOG_ID 3\n#define MOTION_CSV_LOG_ID 4\n\n// Commands used to communicate about behavior specified by the master\nenum class COMMAND\n{\n    NONE,\n    MOVE,\n    MOVE_REL,\n    MOVE_FINE,\n    MOVE_CONST_VEL,\n    MOVE_WITH_VISION,\n    PLACE_TRAY,\n    LOAD_TRAY,\n    INITIALIZE_TRAY,\n    POSITION,\n    ESTOP,\n    LOAD_COMPLETE,\n    CLEAR_ERROR,\n    WAIT_FOR_LOCALIZATION,\n    SET_POSE,\n    TOGGLE_VISION_DEBUG,\n    START_CAMERAS,\n    STOP_CAMERAS,\n    MOVE_REL_SLOW,\n    MOVE_FINE_STOP_VISION,\n};\n\n#endif\n"
  },
  {
    "path": "src/robot/src/main.cpp",
    "content": "#include <plog/Log.h> \n#include <plog/Init.h>\n#include <plog/Formatters/TxtFormatter.h>\n#include <plog/Formatters/MessageOnlyFormatter.h>\n#include <plog/Appenders/ColorConsoleAppender.h>\n#include <plog/Appenders/RollingFileAppender.h>\n#include <chrono>\n#include <ctime>\n#include <iostream>\n\n#include \"robot.h\"\n#include \"constants.h\"\n#include \"sockets/SocketMultiThreadWrapperFactory.h\"\n#include \"camera_tracker/CameraTrackerFactory.h\"\n\nlibconfig::Config cfg = libconfig::Config();\n\nvoid configure_logger()\n{\n    // Get current date/time\n    const std::time_t datetime =  std::chrono::system_clock::to_time_t(std::chrono::system_clock::now());\n    char datetime_str[50];\n    std::strftime(datetime_str, sizeof(datetime_str), \"%Y%m%d_%H%M%S\", std::localtime(&datetime));\n\n    // Make file names\n    std::string robot_log_file_name = std::string(\"log/robot_log_\") + std::string(datetime_str) + std::string(\".txt\");\n    std::string motion_log_file_name = std::string(\"log/motion_log_\") + std::string(datetime_str) + std::string(\".txt\");\n    std::string localization_log_file_name = std::string(\"log/localization_log_\") + std::string(datetime_str) + std::string(\".txt\");\n\n    // Initialize robot logs to go to file and console\n    std::string log_level = cfg.lookup(\"log_level\");\n    plog::Severity severity = plog::severityFromString(log_level.c_str());\n    static plog::RollingFileAppender<plog::TxtFormatter> fileAppender(robot_log_file_name.c_str(), 1000000, 5);\n    static plog::ColorConsoleAppender<plog::TxtFormatter> consoleAppender;\n    plog::init(severity, &fileAppender).addAppender(&consoleAppender); \n\n    // Initialize motion logs to go to file\n    static plog::RollingFileAppender<plog::TxtFormatter> motionFileAppender(motion_log_file_name.c_str(), 1000000, 5);\n    plog::init<MOTION_LOG_ID>(plog::debug, &motionFileAppender);\n\n    // Initialize localization logs to go to file\n    static 
 plog::RollingFileAppender<plog::TxtFormatter> localizationFileAppender(localization_log_file_name.c_str(), 1000000, 5);\n    plog::init<LOCALIZATION_LOG_ID>(plog::debug, &localizationFileAppender);\n\n    PLOGI << \"Logger ready\";\n}\n\nvoid setup_mock_socket()\n{\n    bool enabled = cfg.lookup(\"mock_socket.enabled\");\n    if(enabled)\n    {\n        auto factory = SocketMultiThreadWrapperFactory::getFactoryInstance();\n        factory->set_mode(SOCKET_FACTORY_MODE::MOCK);\n        factory->build_socket();\n        for (const auto& s : cfg.lookup(\"mock_socket.data\"))\n        {\n            factory->add_mock_data(s);\n        }\n    }\n}\n\nvoid capture_images()\n{\n    // Modify config values to ensure they are set to log debug images\n    libconfig::Setting& save_debug_config = cfg.lookup(\"vision_tracker.debug.save_camera_debug\");\n    save_debug_config = true;\n    \n    // Start camera\n    CameraTrackerBase* camera_tracker_ = CameraTrackerFactory::getFactoryInstance()->get_camera_tracker();\n    camera_tracker_->start();\n    \n    // Wait for a few seconds for images to save\n    Timer t;\n    while(t.dt_s() < 2.0) {}\n\n    camera_tracker_->stop();\n\n    PLOGI << \"Camera image capture complete...Exiting\";\n}\n\nint main(int argc, char** argv)\n{\n    try\n    {\n        cfg.readFile(CONSTANTS_FILE);\n        std::string name = cfg.lookup(\"name\");\n\n        configure_logger();\n        PLOGI << \"Loaded constants file: \" << name;\n\n        if(argc > 1 && std::string(argv[1]) == \"-c\")\n        {\n            capture_images();\n        } \n        else \n        {\n            setup_mock_socket();\n\n            Robot r;\n            r.run(); // Should loop forever until stopped\n        }\n\n    }\n    catch (const libconfig::SettingNotFoundException &e)\n    {\n        std::cerr << \"Configuration error with \" << e.getPath() << std::endl;\n        return(EXIT_FAILURE);\n    }\n\n    return(EXIT_SUCCESS);\n}\n"
  },
  {
    "path": "src/robot/src/robot.cpp",
    "content": "#include \"robot.h\"\n\n#include <plog/Log.h> \n#include \"utils.h\"\n#include \"camera_tracker/CameraTrackerFactory.h\"\n\n\nWaitForLocalizeHelper::WaitForLocalizeHelper(const StatusUpdater& statusUpdater, float max_timeout, float confidence_threshold) \n: statusUpdater_(statusUpdater),\n    timer_(),\n    max_timeout_(max_timeout),\n    confidence_threshold_(confidence_threshold)\n{}\n\nbool WaitForLocalizeHelper::isDone() \n{\n    if(timer_.dt_s() > max_timeout_)\n    {\n        PLOGW.printf(\"Exiting wait for localize due to time\");\n        return true;\n    }\n    float confidence = statusUpdater_.getLocalizationConfidence();\n    if(confidence > confidence_threshold_)\n    {\n        PLOGI.printf(\"Exiting wait for localize with confidence: %4.2f\", confidence);\n        return true;\n    }\n    return false;\n}\n\nvoid WaitForLocalizeHelper::start()\n{\n    timer_.reset();\n}\n\n\n\nRobot::Robot()\n: statusUpdater_(),\n  server_(statusUpdater_),\n  controller_(statusUpdater_),\n  tray_controller_(),\n  mm_wrapper_(),\n  position_time_averager_(10),\n  robot_loop_time_averager_(20),\n  wait_for_localize_helper_(statusUpdater_, cfg.lookup(\"localization.max_wait_time\"), cfg.lookup(\"localization.confidence_for_wait\")),\n  vision_print_rate_(10),\n  camera_tracker_(CameraTrackerFactory::getFactoryInstance()->get_camera_tracker()),\n  camera_motion_start_time_(ClockTimePoint::min()),\n  camera_trigger_time_1_(ClockTimePoint::min()),\n  camera_trigger_time_2_(ClockTimePoint::min()),\n  camera_stop_triggered_(false),\n  curCmd_(COMMAND::NONE)\n{\n    PLOGI.printf(\"Robot starting\");\n}\n\nvoid Robot::run()\n{\n    while(true)\n    {\n        runOnce();\n    }\n}\n\n\nvoid Robot::runOnce() \n{\n    // Check for new command and try to start it\n    COMMAND newCmd = server_.oneLoop();\n    bool status = tryStartNewCmd(newCmd);\n\n    // Update our current command if we successfully started a new command\n    if(status)\n    {\n        curCmd_ = 
newCmd;\n        statusUpdater_.updateInProgress(true);\n    }\n\n    // Service marvelmind\n    std::vector<float> positions = mm_wrapper_.getPositions();\n    if (positions.size() == 3)\n    {\n        position_time_averager_.mark_point();\n        float angle_rad = wrap_angle(positions[2] * M_PI / 180.0);\n        controller_.inputPosition(positions[0], positions[1], angle_rad);\n    }\n\n    // Service various modules\n    controller_.update();\n    tray_controller_.update();\n    camera_tracker_->update();\n\n    // Verify if MOVE_FINE_STOP_VISION needs to trigger stop\n    if(checkForCameraStopTrigger())\n    {\n        controller_.stopFast();\n        camera_stop_triggered_ = true;\n    }\n\n    // Check if the current command has finished\n    bool done = checkForCmdComplete(curCmd_);\n    if(done)\n    {\n        curCmd_ = COMMAND::NONE;\n        statusUpdater_.updateInProgress(false);\n    }\n\n    // Update loop time and status updater\n    statusUpdater_.updatePositionLoopTime(position_time_averager_.get_ms());\n    CameraDebug camera_debug = camera_tracker_->getCameraDebug();\n    statusUpdater_.updateCameraDebug(camera_debug);\n    robot_loop_time_averager_.mark_point();\n}\n\n\nbool Robot::tryStartNewCmd(COMMAND cmd)\n{\n    // Position info doesn't count as a real 'command' since it doesn't interrupt anything\n    // Always service it, but don't consider it starting a new command\n    if (cmd == COMMAND::POSITION)\n    {\n        RobotServer::PositionData data = server_.getPositionData();\n        controller_.inputPosition(data.x, data.y, data.a);\n\n        // Update the position rate\n        position_time_averager_.mark_point();\n\n        return false;\n    }\n    if (cmd == COMMAND::SET_POSE)\n    {\n        RobotServer::PositionData data = server_.getPositionData();\n        controller_.forceSetPosition(data.x, data.y, data.a);\n        return false;\n    }\n    if (cmd == COMMAND::TOGGLE_VISION_DEBUG)\n    {\n        
camera_tracker_->toggleDebugImageOutput();\n        return false;\n    }\n    if (cmd == COMMAND::START_CAMERAS)\n    {\n        camera_tracker_->start();\n        return false;\n    }\n    if (cmd == COMMAND::STOP_CAMERAS)\n    {\n        camera_tracker_->stop();\n        return false;\n    }\n    // Same with ESTOP\n    if (cmd == COMMAND::ESTOP)\n    {\n        controller_.estop();\n        tray_controller_.estop();\n        return false;\n    }\n    // Same with LOAD_COMPLETE\n    if (cmd == COMMAND::LOAD_COMPLETE)\n    {\n        tray_controller_.setLoadComplete();\n        return false;\n    }\n\n    // Just do nothing for NONE\n    if (cmd == COMMAND::NONE) { return false;}\n    \n    // For all other commands, we need to make sure we aren't doing anything else at the moment\n    if(statusUpdater_.getInProgress())\n    {\n        PLOGW << \"Command \" << static_cast<int>(curCmd_) << \" already running, rejecting new command: \" << static_cast<int>(cmd);\n        return false;\n    }\n    else if (statusUpdater_.getErrorStatus())\n    {\n        return false;\n    }\n    \n    // Start new command\n    if(cmd == COMMAND::MOVE)\n    {\n        RobotServer::PositionData data = server_.getMoveData();\n        controller_.moveToPosition(data.x, data.y, data.a);\n    }\n    else if(cmd == COMMAND::MOVE_REL)\n    {\n        RobotServer::PositionData data = server_.getMoveData();\n        controller_.moveToPositionRelative(data.x, data.y, data.a);\n    }\n    else if(cmd == COMMAND::MOVE_REL_SLOW)\n    {\n        RobotServer::PositionData data = server_.getMoveData();\n        controller_.moveToPositionRelativeSlow(data.x, data.y, data.a);\n    }\n    else if(cmd == COMMAND::MOVE_FINE)\n    {\n        RobotServer::PositionData data = server_.getMoveData();\n        controller_.moveToPositionFine(data.x, data.y, data.a);\n    }\n    else if(cmd == COMMAND::MOVE_FINE_STOP_VISION)\n    {\n        if(!camera_tracker_->running()) \n        {\n            PLOGW << 
\"Cannot start MOVE_FINE_STOP_VISION if camera tracker isn't running\";\n            return false;\n        }\n        RobotServer::PositionData data = server_.getMoveData();\n        controller_.moveToPositionFine(data.x, data.y, data.a);\n        resetCameraStopTriggers();\n        camera_motion_start_time_ = ClockFactory::getFactoryInstance()->get_clock()->now();\n        fine_move_target_ = {data.x, data.y, data.a};\n    \n    }\n    else if(cmd == COMMAND::MOVE_CONST_VEL)\n    {\n        RobotServer::VelocityData data = server_.getVelocityData();\n        controller_.moveConstVel(data.vx, data.vy, data.va, data.t);\n    }\n    else if (cmd == COMMAND::MOVE_WITH_VISION)\n    {\n        RobotServer::PositionData data = server_.getMoveData();\n        controller_.moveWithVision(data.x, data.y, data.a);\n    }\n    else if(cmd == COMMAND::PLACE_TRAY)\n    {\n        bool ok = tray_controller_.place();\n        if(!ok) statusUpdater_.setErrorStatus();\n        return ok;\n    }\n    else if(cmd == COMMAND::LOAD_TRAY)\n    {\n        bool ok = tray_controller_.load();\n        if(!ok) statusUpdater_.setErrorStatus();\n        return ok;\n    }\n    else if(cmd == COMMAND::INITIALIZE_TRAY)\n    {\n        tray_controller_.initialize();\n    }\n    else if (cmd == COMMAND::WAIT_FOR_LOCALIZATION)\n    {\n        wait_for_localize_helper_.start();\n    }\n    else\n    {\n        PLOGW.printf(\"Unknown command!\");\n        return false;\n    }\n\n    return true;\n}\n\nbool Robot::checkForCmdComplete(COMMAND cmd)\n{\n    if (cmd == COMMAND::NONE)\n    {\n        return true;\n    }\n    else if(cmd == COMMAND::MOVE || \n            cmd == COMMAND::MOVE_REL ||\n            cmd == COMMAND::MOVE_FINE ||\n            cmd == COMMAND::MOVE_CONST_VEL ||\n            cmd == COMMAND::MOVE_WITH_VISION || \n            cmd == COMMAND::MOVE_REL_SLOW || \n            cmd == COMMAND::MOVE_FINE_STOP_VISION)\n    {\n        return !controller_.isTrajectoryRunning();\n    }\n    else 
if(cmd == COMMAND::PLACE_TRAY ||\n            cmd == COMMAND::LOAD_TRAY ||\n            cmd == COMMAND::INITIALIZE_TRAY)\n    {\n        return !tray_controller_.isActionRunning();\n    }\n    else if (cmd == COMMAND::WAIT_FOR_LOCALIZATION) \n    {\n        return wait_for_localize_helper_.isDone();\n    }\n    else\n    {\n        PLOGE.printf(\"Completion check not implemented for command: %i\", static_cast<int>(cmd));\n        return true;\n    }\n}\n\nbool Robot::checkForCameraStopTrigger()\n{\n    if(curCmd_ != COMMAND::MOVE_FINE_STOP_VISION) return false;\n    if(camera_stop_triggered_) return false;\n\n    CameraTrackerOutput camera_output = camera_tracker_->getPoseFromCamera();\n    if(camera_output.ok)\n    {\n        bool camera_trigger_1 = camera_trigger_time_1_ > camera_motion_start_time_;\n        bool camera_trigger_2 = camera_trigger_time_2_ > camera_motion_start_time_;\n        bool timestamp_after_1 = camera_output.timestamp > camera_trigger_time_1_;\n        bool timestamp_after_2 = camera_output.timestamp > camera_trigger_time_2_;\n        bool timestamp_difference = std::chrono::duration_cast<FpSeconds>(camera_output.timestamp - camera_trigger_time_1_).count() > 0;\n        Point current_pos = controller_.getCurrentPosition();\n        Eigen::Vector2f dp = {fine_move_target_.x - current_pos.x, fine_move_target_.y - current_pos.y};\n        bool pos_in_tolerance = dp.norm() < 0.5;\n\n        if(camera_trigger_1 && camera_trigger_2 && timestamp_after_1 && timestamp_after_2 && pos_in_tolerance)\n        {\n            return true;\n        }\n        else if(camera_trigger_1 && timestamp_after_1 && timestamp_difference)\n        {\n            camera_trigger_time_2_ = camera_output.timestamp;\n            return true;\n        }\n        else\n        {\n            camera_trigger_time_1_ = camera_output.timestamp;\n            return false;\n        }\n    }\n    else\n    {\n        resetCameraStopTriggers();\n    }\n    return false;\n}\n\n
void Robot::resetCameraStopTriggers()\n{\n    camera_stop_triggered_ = false;\n    camera_trigger_time_1_ = ClockTimePoint::min();\n    camera_trigger_time_2_ = ClockTimePoint::min();\n}\n"
  },
  {
    "path": "src/robot/src/robot.h",
    "content": "#ifndef Robot_h\n#define Robot_h\n\n#include \"MarvelmindWrapper.h\"\n#include \"RobotController.h\"\n#include \"RobotServer.h\"\n#include \"StatusUpdater.h\"\n#include \"TrayController.h\"\n#include \"utils.h\"\n#include \"camera_tracker/CameraTracker.h\"\n\nclass WaitForLocalizeHelper \n{\n  public:\n    WaitForLocalizeHelper(const StatusUpdater& statusUpdater, float max_timeout, float confidence_threshold);\n    bool isDone();\n    void start();\n\n  private:\n    const StatusUpdater& statusUpdater_;\n    Timer timer_;\n    float max_timeout_;\n    float confidence_threshold_;\n};\n\n\nclass Robot\n{\n\n  public:\n\n    Robot();\n\n    void run();\n    void runOnce();\n\n    // Used for tests only\n    COMMAND getCurrentCommand() { return curCmd_; };\n    StatusUpdater::Status getStatus() { return statusUpdater_.getStatus(); };\n\n  private:\n\n    bool checkForCmdComplete(COMMAND cmd);\n    bool tryStartNewCmd(COMMAND cmd);\n    bool checkForCameraStopTrigger();\n    void resetCameraStopTriggers();\n\n    StatusUpdater statusUpdater_;\n    RobotServer server_;\n    RobotController controller_;\n    TrayController tray_controller_;\n    MarvelmindWrapper mm_wrapper_;\n\n    TimeRunningAverage position_time_averager_;    // Handles keeping average of the position update timing\n    TimeRunningAverage robot_loop_time_averager_; \n    WaitForLocalizeHelper wait_for_localize_helper_;\n    RateController vision_print_rate_;\n    CameraTrackerBase* camera_tracker_;\n    ClockTimePoint camera_motion_start_time_;\n    ClockTimePoint camera_trigger_time_1_;\n    ClockTimePoint camera_trigger_time_2_;\n    bool camera_stop_triggered_;\n    Point fine_move_target_;\n\n    COMMAND curCmd_;\n};\n\n\n#endif"
  },
  {
    "path": "src/robot/src/robot_controller_modes/RobotControllerModeBase.cpp",
    "content": "#include \"RobotControllerModeBase.h\"\r\n\r\nRobotControllerModeBase::RobotControllerModeBase(bool fake_perfect_motion)\r\n: move_start_timer_(),\r\n  move_running_(false),\r\n  fake_perfect_motion_(fake_perfect_motion)\r\n{\r\n}\r\n\r\nvoid RobotControllerModeBase::startMove()\r\n{\r\n    move_running_ = true;\r\n    move_start_timer_.reset();\r\n    loop_timer_.reset();\r\n}"
  },
  {
    "path": "src/robot/src/robot_controller_modes/RobotControllerModeBase.h",
    "content": "#ifndef RobotControllerModeBase_h\r\n#define RobotControllerModeBase_h\r\n\r\n#include \"SmoothTrajectoryGenerator.h\"\r\n#include \"utils.h\"\r\n\r\nclass RobotControllerModeBase\r\n{\r\n\r\n  public:\r\n\r\n    RobotControllerModeBase(bool fake_perfect_motion);\r\n\r\n    virtual Velocity computeTargetVelocity(Point current_position, Velocity current_velocity, bool log_this_cycle) = 0;\r\n\r\n    virtual bool checkForMoveComplete(Point current_position, Velocity current_velocity) = 0;\r\n\r\n  protected:\r\n\r\n    struct TrajectoryTolerances\r\n    {\r\n        float trans_pos_err;\r\n        float ang_pos_err;\r\n        float trans_vel_err;\r\n        float ang_vel_err;\r\n    };\r\n\r\n    void startMove();\r\n\r\n    Timer move_start_timer_;\r\n    Timer loop_timer_;\r\n    bool move_running_; \r\n    bool fake_perfect_motion_;\r\n\r\n};\r\n\r\n#endif //RobotControllerModeBase_h\r\n"
  },
  {
    "path": "src/robot/src/robot_controller_modes/RobotControllerModePosition.cpp",
    "content": "#include \"RobotControllerModePosition.h\"\r\n#include \"constants.h\"\r\n#include <plog/Log.h>\r\n\r\nRobotControllerModePosition::RobotControllerModePosition(bool fake_perfect_motion)\r\n: RobotControllerModeBase(fake_perfect_motion),\r\n  traj_gen_(),\r\n  limits_mode_(LIMITS_MODE::FINE),\r\n  goal_pos_(0,0,0),\r\n  current_target_()\r\n{\r\n    coarse_tolerances_.trans_pos_err = cfg.lookup(\"motion.translation.position_threshold.coarse\");\r\n    coarse_tolerances_.ang_pos_err = cfg.lookup(\"motion.rotation.position_threshold.coarse\");\r\n    coarse_tolerances_.trans_vel_err = cfg.lookup(\"motion.translation.velocity_threshold.coarse\");\r\n    coarse_tolerances_.ang_vel_err = cfg.lookup(\"motion.rotation.velocity_threshold.coarse\");\r\n\r\n    fine_tolerances_.trans_pos_err = cfg.lookup(\"motion.translation.position_threshold.fine\");\r\n    fine_tolerances_.ang_pos_err = cfg.lookup(\"motion.rotation.position_threshold.fine\");\r\n    fine_tolerances_.trans_vel_err = cfg.lookup(\"motion.translation.velocity_threshold.fine\");\r\n    fine_tolerances_.ang_vel_err = cfg.lookup(\"motion.rotation.velocity_threshold.fine\");\r\n\r\n    PositionController::Gains position_gains;\r\n    position_gains.kp = cfg.lookup(\"motion.translation.gains.kp\");\r\n    position_gains.ki = cfg.lookup(\"motion.translation.gains.ki\");\r\n    position_gains.kd = cfg.lookup(\"motion.translation.gains.kd\");\r\n    x_controller_ = PositionController(position_gains);\r\n    y_controller_ = PositionController(position_gains);\r\n\r\n    PositionController::Gains angle_gains;\r\n    angle_gains.kp = cfg.lookup(\"motion.rotation.gains.kp\");\r\n    angle_gains.ki = cfg.lookup(\"motion.rotation.gains.ki\");\r\n    angle_gains.kd = cfg.lookup(\"motion.rotation.gains.kd\");\r\n    a_controller_ = PositionController(angle_gains);\r\n}\r\n\r\nbool RobotControllerModePosition::startMove(Point current_position, Point target_position, LIMITS_MODE limits_mode)\r\n{\r\n    
limits_mode_ = limits_mode;\r\n    goal_pos_ = target_position;\r\n    bool ok = traj_gen_.generatePointToPointTrajectory(current_position, target_position, limits_mode);\r\n    if(ok) RobotControllerModeBase::startMove();\r\n    return ok;\r\n}\r\n\r\nVelocity RobotControllerModePosition::computeTargetVelocity(Point current_position, Velocity current_velocity, bool log_this_cycle)\r\n{\r\n    float dt_from_traj_start = move_start_timer_.dt_s();\r\n    current_target_ = traj_gen_.lookup(dt_from_traj_start);\r\n\r\n    // Print motion estimates to log\r\n    PLOGD_IF_(MOTION_LOG_ID, log_this_cycle) << \"Target: \" << current_target_.toString();\r\n    PLOGD_IF_(MOTION_LOG_ID, log_this_cycle) << \"Est Vel: \" << current_velocity.toString();\r\n    PLOGD_IF_(MOTION_LOG_ID, log_this_cycle) << \"Est Pos: \" << current_position.toString();\r\n    \r\n    Velocity output;\r\n    if(fake_perfect_motion_)\r\n    {\r\n        output = current_target_.velocity;\r\n    }\r\n    else \r\n    {\r\n        float dt_since_last_loop = loop_timer_.dt_s();\r\n        loop_timer_.reset();\r\n        output.vx = x_controller_.compute(current_target_.position.x, current_position.x, current_target_.velocity.vx, current_velocity.vx, dt_since_last_loop);\r\n        output.vy = y_controller_.compute(current_target_.position.y, current_position.y, current_target_.velocity.vy, current_velocity.vy, dt_since_last_loop);\r\n        output.va = a_controller_.compute(current_target_.position.a, current_position.a, current_target_.velocity.va, current_velocity.va, dt_since_last_loop);\r\n    }\r\n\r\n    PLOGI_(MOTION_CSV_LOG_ID).printf(\"time,%.4f\",dt_from_traj_start);\r\n    PLOGI_(MOTION_CSV_LOG_ID).printf(\"pos,%.4f,%.4f,%.4f\",current_position.x,current_position.y,current_position.a);\r\n    PLOGI_(MOTION_CSV_LOG_ID).printf(\"target_pos,%.4f,%.4f,%.4f\",current_target_.position.x,current_target_.position.y,current_target_.position.a);\r\n    
PLOGI_(MOTION_CSV_LOG_ID).printf(\"vel,%.4f,%.4f,%.4f\",current_velocity.vx,current_velocity.vy,current_velocity.va);\r\n    PLOGI_(MOTION_CSV_LOG_ID).printf(\"target_vel,%.4f,%.4f,%.4f\",current_target_.velocity.vx,current_target_.velocity.vy,current_target_.velocity.va);\r\n    PLOGI_(MOTION_CSV_LOG_ID).printf(\"control_vel,%.4f,%.4f,%.4f\",output.vx,output.vy,output.va);\r\n\r\n    return output;\r\n}\r\n\r\nbool RobotControllerModePosition::checkForMoveComplete(Point current_position, Velocity current_velocity)\r\n{\r\n    // Get the right threshold values\r\n    TrajectoryTolerances tol = limits_mode_ == LIMITS_MODE::FINE ? fine_tolerances_ : coarse_tolerances_;\r\n\r\n    // Verify our target velocity is zero (i.e. we reached the end of the trajectory)\r\n    bool zero_cmd_vel = current_target_.velocity.nearZero();\r\n\r\n    // Verify actual translational and rotational positions are within tolerance\r\n    Eigen::Vector2f dp = {goal_pos_.x - current_position.x, goal_pos_.y - current_position.y};\r\n    bool pos_in_tolerance = dp.norm() < tol.trans_pos_err &&\r\n                            fabs(angle_diff(goal_pos_.a, current_position.a)) < tol.ang_pos_err; \r\n\r\n    // Verify actual translational and rotational velocities are within tolerance\r\n    Eigen::Vector2f dv = {current_velocity.vx, current_velocity.vy};\r\n    bool vel_in_tolerance = dv.norm() < tol.trans_vel_err && fabs(current_velocity.va) < tol.ang_vel_err;\r\n\r\n    // Trajectory is done when we aren't commanding any velocity, and both our actual velocity and\r\n    // position are within the correct tolerance.\r\n    bool traj_complete = zero_cmd_vel && vel_in_tolerance && pos_in_tolerance;\r\n    \r\n    if(traj_complete) \r\n    {\r\n        move_running_ = false;\r\n    }\r\n\r\n    return traj_complete;  \r\n}"
  },
  {
    "path": "src/robot/src/robot_controller_modes/RobotControllerModePosition.h",
    "content": "#ifndef RobotControllerModePosition_h\r\n#define RobotControllerModePosition_h\r\n\r\n#include \"RobotControllerModeBase.h\"\r\n#include \"SmoothTrajectoryGenerator.h\"\r\n#include \"utils.h\"\r\n\r\nclass RobotControllerModePosition : public RobotControllerModeBase\r\n{\r\n\r\n  public:\r\n\r\n    RobotControllerModePosition(bool fake_perfect_motion);\r\n\r\n    bool startMove(Point current_position, Point target_position, LIMITS_MODE limits_mode);\r\n\r\n    virtual Velocity computeTargetVelocity(Point current_position, Velocity current_velocity, bool log_this_cycle) override;\r\n\r\n    virtual bool checkForMoveComplete(Point current_position, Velocity current_velocity) override;\r\n\r\n  protected:\r\n\r\n    SmoothTrajectoryGenerator traj_gen_; \r\n    LIMITS_MODE limits_mode_;\r\n    Point goal_pos_;\r\n    PVTPoint current_target_;\r\n\r\n    TrajectoryTolerances coarse_tolerances_;\r\n    TrajectoryTolerances fine_tolerances_;\r\n\r\n    PositionController x_controller_;\r\n    PositionController y_controller_;\r\n    PositionController a_controller_;\r\n    \r\n\r\n};\r\n\r\n#endif //RobotControllerModePosition_h"
  },
  {
    "path": "src/robot/src/robot_controller_modes/RobotControllerModeStopFast.cpp",
    "content": "#include \"RobotControllerModeStopFast.h\"\n#include \"constants.h\"\n#include <plog/Log.h>\n\nRobotControllerModeStopFast::RobotControllerModeStopFast(bool fake_perfect_motion)\n: RobotControllerModeBase(fake_perfect_motion),\n  current_target_()\n{\n    fine_tolerances_.trans_pos_err = cfg.lookup(\"motion.translation.position_threshold.fine\");\n    fine_tolerances_.ang_pos_err = cfg.lookup(\"motion.rotation.position_threshold.fine\");\n    fine_tolerances_.trans_vel_err = cfg.lookup(\"motion.translation.velocity_threshold.fine\");\n    fine_tolerances_.ang_vel_err = cfg.lookup(\"motion.rotation.velocity_threshold.fine\");\n\n    max_decel_ = {cfg.lookup(\"motion.translation.max_acc.coarse\"),\n                  cfg.lookup(\"motion.translation.max_acc.coarse\"),\n                  cfg.lookup(\"motion.rotation.max_acc.coarse\")};\n                          \n    PositionController::Gains position_gains;\n    position_gains.kp = cfg.lookup(\"motion.translation.gains.kp\");\n    position_gains.ki = cfg.lookup(\"motion.translation.gains.ki\");\n    position_gains.kd = cfg.lookup(\"motion.translation.gains.kd\");\n    x_controller_ = PositionController(position_gains);\n    y_controller_ = PositionController(position_gains);\n\n    PositionController::Gains angle_gains;\n    angle_gains.kp = cfg.lookup(\"motion.rotation.gains.kp\");\n    angle_gains.ki = cfg.lookup(\"motion.rotation.gains.ki\");\n    angle_gains.kd = cfg.lookup(\"motion.rotation.gains.kd\");\n    a_controller_ = PositionController(angle_gains);\n}\n\nvoid RobotControllerModeStopFast::startMove(Point current_position, Velocity current_velocity)\n{\n    (void) current_position;\n    initial_vel_sign_ = {\n        sgn(current_velocity.vx),\n        sgn(current_velocity.vy),\n        sgn(current_velocity.va),\n    };\n    current_decel_ = {\n        -initial_vel_sign_[0] * max_decel_[0],\n        -initial_vel_sign_[1] * max_decel_[1],\n        -initial_vel_sign_[2] * max_decel_[2],\n    
};\n    PLOGW.printf(\"Starting STOP_FAST from velocity %s at decel: %f,%f,%f\",current_velocity.toString().c_str(),\n        current_decel_[0],current_decel_[1], current_decel_[2]);\n    RobotControllerModeBase::startMove();\n}\n\nVelocity RobotControllerModeStopFast::computeTargetVelocity(Point current_position, Velocity current_velocity, bool log_this_cycle)\n{\n    float dt_from_traj_start = move_start_timer_.dt_s();\n    float dt = loop_timer_.dt_s();\n    Point target_pos = {\n        current_position.x + current_velocity.vx * dt + 0.5f * current_decel_[0] * dt * dt,\n        current_position.y + current_velocity.vy * dt + 0.5f * current_decel_[1] * dt * dt,\n        current_position.a + current_velocity.va * dt + 0.5f * current_decel_[2] * dt * dt,\n    };\n    Velocity target_vel = {\n        current_velocity.vx + current_decel_[0]*dt,\n        current_velocity.vy + current_decel_[1]*dt,\n        current_velocity.va + current_decel_[2]*dt,\n    };\n\n    if(current_velocity.vx != 0 && sgn(current_velocity.vx) != initial_vel_sign_[0]) target_vel.vx = 0;\n    if(current_velocity.vy != 0 && sgn(current_velocity.vy) != initial_vel_sign_[1]) target_vel.vy = 0;\n    if(current_velocity.va != 0 && sgn(current_velocity.va) != initial_vel_sign_[2]) target_vel.va = 0;\n\n    current_target_ = {target_pos, target_vel, dt_from_traj_start};\n\n    // Print motion estimates to log\n    PLOGD_IF_(MOTION_LOG_ID, log_this_cycle) << \"Target: \" << current_target_.toString();\n    PLOGD_IF_(MOTION_LOG_ID, log_this_cycle) << \"Est Vel: \" << current_velocity.toString();\n    PLOGD_IF_(MOTION_LOG_ID, log_this_cycle) << \"Est Pos: \" << current_position.toString();\n    \n    Velocity output;\n    if(fake_perfect_motion_)\n    {\n        output = current_target_.velocity;\n    }\n    else \n    {\n        float dt_since_last_loop = loop_timer_.dt_s();\n        loop_timer_.reset();\n        output.vx = x_controller_.compute(0, 0, current_target_.velocity.vx, current_velocity.vx, 
dt_since_last_loop);\n    output.vy = y_controller_.compute(0, 0, current_target_.velocity.vy, current_velocity.vy, dt_since_last_loop);\n    output.va = a_controller_.compute(0, 0, current_target_.velocity.va, current_velocity.va, dt_since_last_loop);\n    }\n\n    PLOGI_(MOTION_CSV_LOG_ID).printf(\"time,%.4f\",dt_from_traj_start);\n    PLOGI_(MOTION_CSV_LOG_ID).printf(\"pos,%.4f,%.4f,%.4f\",current_position.x,current_position.y,current_position.a);\n    PLOGI_(MOTION_CSV_LOG_ID).printf(\"target_pos,%.4f,%.4f,%.4f\",current_target_.position.x,current_target_.position.y,current_target_.position.a);\n    PLOGI_(MOTION_CSV_LOG_ID).printf(\"vel,%.4f,%.4f,%.4f\",current_velocity.vx,current_velocity.vy,current_velocity.va);\n    PLOGI_(MOTION_CSV_LOG_ID).printf(\"target_vel,%.4f,%.4f,%.4f\",current_target_.velocity.vx,current_target_.velocity.vy,current_target_.velocity.va);\n    PLOGI_(MOTION_CSV_LOG_ID).printf(\"control_vel,%.4f,%.4f,%.4f\",output.vx,output.vy,output.va);\n\n    return output;\n}\n\nbool RobotControllerModeStopFast::checkForMoveComplete(Point current_position, Velocity current_velocity)\n{\n    (void) current_position;\n    // Verify our target velocity is zero (i.e. we reached the end of the trajectory)\n    bool zero_cmd_vel = current_target_.velocity.nearZero();\n\n    // Verify actual translational and rotational velocities are within tolerance\n    Eigen::Vector2f dv = {current_velocity.vx, current_velocity.vy};\n    bool vel_in_tolerance = dv.norm() < fine_tolerances_.trans_vel_err && fabs(current_velocity.va) < fine_tolerances_.ang_vel_err;\n\n    // Trajectory is done when we aren't commanding any velocity and our actual\n    // velocity is within the correct tolerance.\n    bool traj_complete = zero_cmd_vel && vel_in_tolerance;\n    \n    if(traj_complete) \n    {\n        move_running_ = false;\n    }\n\n    return traj_complete;  \n}"
  },
  {
    "path": "src/robot/src/robot_controller_modes/RobotControllerModeStopFast.h",
    "content": "#ifndef RobotControllerModeStopFast_h\n#define RobotControllerModeStopFast_h\n\n#include \"RobotControllerModeBase.h\"\n#include \"SmoothTrajectoryGenerator.h\"\n#include \"utils.h\"\n\nclass RobotControllerModeStopFast : public RobotControllerModeBase\n{\n\n  public:\n\n    RobotControllerModeStopFast(bool fake_perfect_motion);\n\n    void startMove(Point current_position, Velocity current_velocity);\n\n    virtual Velocity computeTargetVelocity(Point current_position, Velocity current_velocity, bool log_this_cycle) override;\n\n    virtual bool checkForMoveComplete(Point current_position, Velocity current_velocity) override;\n\n  protected:\n\n    PVTPoint current_target_;\n    TrajectoryTolerances fine_tolerances_;\n    std::vector<float> max_decel_; \n    std::vector<float> current_decel_;\n    std::vector<int> initial_vel_sign_;  \n\n    PositionController x_controller_;\n    PositionController y_controller_;\n    PositionController a_controller_;\n    \n\n};\n\n#endif //RobotControllerModePosition_h"
  },
  {
    "path": "src/robot/src/robot_controller_modes/RobotControllerModeVision.cpp",
    "content": "#include \"RobotControllerModeVision.h\"\r\n#include \"constants.h\"\r\n#include <plog/Log.h>\r\n#include \"camera_tracker/CameraTrackerFactory.h\"\r\n\r\nRobotControllerModeVision::RobotControllerModeVision(bool fake_perfect_motion, StatusUpdater& status_updater)\r\n: RobotControllerModeBase(fake_perfect_motion),\r\n  status_updater_(status_updater),\r\n  traj_gen_(),\r\n  goal_point_(0,0,0),\r\n  current_point_(0,0,0),\r\n  current_target_(),\r\n  camera_tracker_(CameraTrackerFactory::getFactoryInstance()->get_camera_tracker()),\r\n  traj_done_timer_(),\r\n  kf_(3,3)\r\n{\r\n    tolerances_.trans_pos_err = cfg.lookup(\"motion.translation.position_threshold.vision\");\r\n    tolerances_.ang_pos_err = cfg.lookup(\"motion.rotation.position_threshold.vision\");\r\n    tolerances_.trans_vel_err = cfg.lookup(\"motion.translation.velocity_threshold.vision\");\r\n    tolerances_.ang_vel_err = cfg.lookup(\"motion.rotation.velocity_threshold.vision\");\r\n\r\n    PositionController::Gains position_gains;\r\n    position_gains.kp = cfg.lookup(\"motion.translation.gains_vision.kp\");\r\n    position_gains.ki = cfg.lookup(\"motion.translation.gains_vision.ki\");\r\n    position_gains.kd = cfg.lookup(\"motion.translation.gains_vision.kd\");\r\n    x_controller_ = PositionController(position_gains);\r\n    y_controller_ = PositionController(position_gains);\r\n\r\n    PositionController::Gains angle_gains;\r\n    angle_gains.kp = cfg.lookup(\"motion.rotation.gains_vision.kp\");\r\n    angle_gains.ki = cfg.lookup(\"motion.rotation.gains_vision.ki\");\r\n    angle_gains.kd = cfg.lookup(\"motion.rotation.gains_vision.kd\");\r\n    a_controller_ = PositionController(angle_gains);\r\n\r\n    Eigen::MatrixXf A = Eigen::MatrixXf::Identity(3,3);\r\n    Eigen::MatrixXf B = Eigen::MatrixXf::Identity(3,3);\r\n    Eigen::MatrixXf C = Eigen::MatrixXf::Identity(3,3);\r\n    Eigen::MatrixXf Q(3,3);\r\n    Q << cfg.lookup(\"vision_tracker.kf.predict_trans_cov\"),0,0, \r\n       
  0,cfg.lookup(\"vision_tracker.kf.predict_trans_cov\"),0,\r\n         0,0,cfg.lookup(\"vision_tracker.kf.predict_angle_cov\");\r\n    Eigen::MatrixXf R(3,3);\r\n    R << cfg.lookup(\"vision_tracker.kf.meas_trans_cov\"),0,0, \r\n         0,cfg.lookup(\"vision_tracker.kf.meas_trans_cov\"),0,\r\n         0,0,cfg.lookup(\"vision_tracker.kf.meas_angle_cov\");\r\n    kf_ = KalmanFilter(A,B,C,Q,R);\r\n}\r\n\r\nbool RobotControllerModeVision::startMove(Point target_point)\r\n{\r\n    CameraTrackerOutput tracker_output = camera_tracker_->getPoseFromCamera();\r\n    if(!tracker_output.ok) \r\n    {\r\n        PLOGE << \"Cannot start vision move, camera pose not ok\";\r\n        return false;\r\n    }\r\n    current_point_ = tracker_output.pose;\r\n    goal_point_ = target_point;\r\n    bool ok = traj_gen_.generatePointToPointTrajectory(current_point_, target_point, LIMITS_MODE::VISION);\r\n    if(ok) RobotControllerModeBase::startMove();\r\n    return ok;\r\n}\r\n\r\nVelocity RobotControllerModeVision::computeTargetVelocity(Point current_position, Velocity current_velocity, bool log_this_cycle)\r\n{   \r\n    // Get current target global position and global velocity according to the trajectory\r\n    float dt_from_traj_start = move_start_timer_.dt_s();\r\n    current_target_ = traj_gen_.lookup(dt_from_traj_start);\r\n\r\n    // Get previous loop time\r\n    float dt_since_last_loop = loop_timer_.dt_s();\r\n    loop_timer_.reset();\r\n\r\n    // Convert global velocity into local robot frame and use it to predict the filter\r\n    Eigen::Vector3f local_vel;\r\n    local_vel[0] = cos(current_position.a) * current_velocity.vx + sin(current_position.a) * current_velocity.vy;\r\n    local_vel[1] = - sin(current_position.a) * current_velocity.vx + cos(current_position.a) * current_velocity.vy;\r\n    local_vel[2] = current_velocity.va;\r\n    Eigen::Vector3f udt = local_vel * dt_since_last_loop;\r\n    if(udt.norm() > 0)\r\n    {\r\n        kf_.predict(udt);\r\n    }\r\n\r\n    
// Get latest pose from cameras and do update step if data is available\r\n    CameraTrackerOutput tracker_output = camera_tracker_->getPoseFromCamera();\r\n    if(tracker_output.ok && tracker_output.timestamp > last_vision_update_time_)\r\n    {\r\n        // Get the current distance measurements from the sensors and update the filter\r\n        Eigen::Vector3f point_vec = {tracker_output.pose.x,tracker_output.pose.y,tracker_output.pose.a};\r\n        kf_.update(point_vec);\r\n        last_vision_update_time_ = tracker_output.timestamp;\r\n    }\r\n    \r\n    // Update current point from state\r\n    Eigen::VectorXf state = kf_.state();\r\n    current_point_ = {state[0], state[1], state[2]};\r\n    status_updater_.updateVisionControllerPose(current_point_);\r\n\r\n    // Print motion estimates to log\r\n    PLOGD_IF_(MOTION_LOG_ID, log_this_cycle) << \"\\nTarget: \" << current_target_.toString();\r\n    PLOGD_IF_(MOTION_LOG_ID, log_this_cycle) << \"Est Vel: \" << current_velocity.toString();\r\n    PLOGD_IF_(MOTION_LOG_ID, log_this_cycle) << \"Est camera pose: \" << current_point_.toString();\r\n\r\n    // Compute control\r\n    Velocity output_local;\r\n    if(fake_perfect_motion_)\r\n    {\r\n        output_local = current_target_.velocity;\r\n    }\r\n    else\r\n    {\r\n        output_local.vx =  x_controller_.compute(current_target_.position.x, current_point_.x, current_target_.velocity.vx, local_vel[0], dt_since_last_loop);\r\n        output_local.vy =  y_controller_.compute(current_target_.position.y, current_point_.y, current_target_.velocity.vy, local_vel[1], dt_since_last_loop);\r\n        output_local.va =  a_controller_.compute(current_target_.position.a, current_point_.a, current_target_.velocity.va, local_vel[2], dt_since_last_loop);\r\n    }\r\n\r\n    // Rotate this commanded velocity back into global frame because currently it is in the local robot frame\r\n    Velocity output_global;\r\n    output_global.vx = cos(current_position.a) * 
output_local.vx - sin(current_position.a) * output_local.vy;\r\n    output_global.vy = sin(current_position.a) * output_local.vx + cos(current_position.a) * output_local.vy;\r\n    output_global.va = output_local.va;\r\n\r\n    PLOGI_(MOTION_CSV_LOG_ID).printf(\"time,%.4f\",dt_from_traj_start);\r\n    PLOGI_(MOTION_CSV_LOG_ID).printf(\"pos,%.4f,%.4f,%.4f\",current_point_.x,current_point_.y,current_point_.a);\r\n    PLOGI_(MOTION_CSV_LOG_ID).printf(\"target_pos,%.4f,%.4f,%.4f\",current_target_.position.x,current_target_.position.y,current_target_.position.a);\r\n    PLOGI_(MOTION_CSV_LOG_ID).printf(\"vel,%.4f,%.4f,%.4f\",local_vel[0],local_vel[1],local_vel[2]);\r\n    PLOGI_(MOTION_CSV_LOG_ID).printf(\"target_vel,%.4f,%.4f,%.4f\",current_target_.velocity.vx,current_target_.velocity.vy,current_target_.velocity.va);\r\n    PLOGI_(MOTION_CSV_LOG_ID).printf(\"control_vel,%.4f,%.4f,%.4f\",output_local.vx,output_local.vy,output_local.va);\r\n\r\n    return output_global;\r\n}\r\n\r\nbool RobotControllerModeVision::checkForMoveComplete(Point current_position, Velocity current_velocity)\r\n{\r\n    (void) current_position;\r\n\r\n    // Verify our target velocity is zero (i.e. 
we reached the end of the trajectory)\r\n    bool zero_cmd_vel = current_target_.velocity.nearZero();\r\n\r\n    // Verify actual translational and rotational positions are within tolerance\r\n    Eigen::Vector2f dp = {goal_point_.x - current_point_.x, goal_point_.y - current_point_.y};\r\n    bool pos_in_tolerance = dp.norm() < tolerances_.trans_pos_err &&\r\n                            fabs(angle_diff(goal_point_.a, current_point_.a)) < tolerances_.ang_pos_err; \r\n\r\n    // Verify actual translational and rotational velocities are within tolerance\r\n    Eigen::Vector2f dv = {current_velocity.vx, current_velocity.vy};\r\n    bool vel_in_tolerance = dv.norm() < tolerances_.trans_vel_err && fabs(current_velocity.va) < tolerances_.ang_vel_err;\r\n\r\n    // Trajectory is done when we aren't commanding any velocity, and both our actual velocity and\r\n    // position are within the correct tolerance.\r\n    bool maybe_traj_complete = zero_cmd_vel && vel_in_tolerance && pos_in_tolerance;\r\n\r\n    if (!maybe_traj_complete) traj_done_timer_.reset();\r\n    else PLOGI_(MOTION_LOG_ID) << \"Camera move maybe done\";\r\n\r\n    if(maybe_traj_complete && traj_done_timer_.dt_s() > 0.8) \r\n    {\r\n        move_running_ = false;\r\n        PLOGI_(MOTION_LOG_ID) << \"Camera move done\";\r\n        return true;  \r\n    }\r\n\r\n    return false;  \r\n}"
  },
  {
    "path": "src/robot/src/robot_controller_modes/RobotControllerModeVision.h",
    "content": "#ifndef RobotControllerModeVision_h\r\n#define RobotControllerModeVision_h\r\n\r\n#include \"RobotControllerModeBase.h\"\r\n#include \"SmoothTrajectoryGenerator.h\"\r\n#include \"utils.h\"\r\n#include \"camera_tracker/CameraTracker.h\"\r\n#include \"KalmanFilter.h\"\r\n#include \"StatusUpdater.h\"\r\n\r\nclass RobotControllerModeVision : public RobotControllerModeBase\r\n{\r\n\r\n  public:\r\n\r\n    RobotControllerModeVision(bool fake_perfect_motion, StatusUpdater& status_updater);\r\n\r\n    bool startMove(Point target_point);\r\n\r\n    virtual Velocity computeTargetVelocity(Point current_position, Velocity current_velocity, bool log_this_cycle) override;\r\n\r\n    virtual bool checkForMoveComplete(Point current_position, Velocity current_velocity) override;\r\n\r\n  protected:\r\n\r\n    StatusUpdater& status_updater_;\r\n    SmoothTrajectoryGenerator traj_gen_; \r\n    Point goal_point_;\r\n    Point current_point_;\r\n    PVTPoint current_target_;\r\n    CameraTrackerBase* camera_tracker_;\r\n    Timer traj_done_timer_;\r\n    ClockTimePoint last_vision_update_time_;\r\n\r\n    TrajectoryTolerances tolerances_;\r\n\r\n    PositionController x_controller_;\r\n    PositionController y_controller_;\r\n    PositionController a_controller_;    \r\n    KalmanFilter kf_;\r\n\r\n};\r\n\r\n#endif //RobotControllerModeVision_h"
  },
  {
    "path": "src/robot/src/serial/MockSerialComms.cpp",
    "content": "#include \"MockSerialComms.h\"\n\n#include <plog/Log.h> \n#include <iostream>\n\n\nMockSerialComms::MockSerialComms(std::string portName)\n: SerialCommsBase(),\n  send_base_data_(),\n  send_lift_data_(),\n  send_distance_data_(),\n  rcv_base_data_(),\n  rcv_lift_data_(),\n  rcv_distance_data_(),\n  port_(portName)\n{\n    connected_ = true;\n}\n\nMockSerialComms::~MockSerialComms()\n{\n}\n\nvoid MockSerialComms::send(std::string msg)\n{\n    if (msg.rfind(\"base:\", 0) == 0)\n    {\n        send_base_data_.push(msg.substr(5, std::string::npos));\n    }\n    else if (msg.rfind(\"lift:\", 0) == 0)\n    {\n        send_lift_data_.push(msg.substr(5, std::string::npos));\n    }\n    else\n    {\n        // Distance send from pi -> arduino doesn't have dist prefix\n        send_distance_data_.push(msg);\n    }\n}\n\nstd::string MockSerialComms::rcv_base()\n{\n    if(rcv_base_data_.empty())\n    {\n        return \"\";\n    }\n    std::string outdata = rcv_base_data_.front();\n    rcv_base_data_.pop();\n    return outdata;\n}\n\nstd::string MockSerialComms::rcv_lift()\n{\n    if(rcv_lift_data_.empty())\n    {\n        return \"\";\n    }\n    std::string outdata = rcv_lift_data_.front();\n    rcv_lift_data_.pop();\n    return outdata;\n}\n\nstd::string MockSerialComms::rcv_distance()\n{\n    if(rcv_distance_data_.empty())\n    {\n        return \"\";\n    }\n    std::string outdata = rcv_distance_data_.front();\n    rcv_distance_data_.pop();\n    return outdata;\n}\n\nvoid MockSerialComms::mock_send(std::string msg)\n{\n    if (msg.rfind(\"base:\", 0) == 0)\n    {\n        rcv_base_data_.push(msg.substr(5, std::string::npos));\n    }\n    else if (msg.rfind(\"lift:\", 0) == 0)\n    {\n        rcv_lift_data_.push(msg.substr(5, std::string::npos));\n    }\n    else if (msg.rfind(\"dist:\", 0) == 0)\n    {\n        rcv_distance_data_.push(msg.substr(5, std::string::npos));\n    }\n    else\n    {\n        PLOGE << \"Unknown message type, skipping: \" << 
msg;\n    }\n}\n\nstd::string MockSerialComms::mock_rcv_base()\n{\n    if(send_base_data_.empty())\n    {\n        return \"\";\n    }\n    std::string outdata = send_base_data_.front();\n    send_base_data_.pop();\n    return outdata;\n}\n\nstd::string MockSerialComms::mock_rcv_lift()\n{\n    if(send_lift_data_.empty())\n    {\n        return \"\";\n    }\n    std::string outdata = send_lift_data_.front();\n    send_lift_data_.pop();\n    return outdata;\n}\n\nstd::string MockSerialComms::mock_rcv_distance()\n{\n    if(send_distance_data_.empty())\n    {\n        return \"\";\n    }\n    std::string outdata = send_distance_data_.front();\n    send_distance_data_.pop();\n    return outdata;\n}\n\nvoid MockSerialComms::purge_data()\n{\n    while(!send_lift_data_.empty())\n    {\n        send_lift_data_.pop();\n    }\n    while(!send_base_data_.empty())\n    {\n        send_base_data_.pop();\n    }\n    while(!send_distance_data_.empty())\n    {\n        send_distance_data_.pop();\n    }\n    while(!rcv_base_data_.empty())\n    {\n        rcv_base_data_.pop();\n    }\n    while(!rcv_lift_data_.empty())\n    {\n        rcv_lift_data_.pop();\n    }\n    while(!rcv_distance_data_.empty())\n    {\n        rcv_distance_data_.pop();\n    }\n}"
  },
  {
    "path": "src/robot/src/serial/MockSerialComms.h",
    "content": "#ifndef MockSerialComms_h\n#define MockSerialComms_h\n\n#include <string>\n#include <queue>\n\n#include \"SerialCommsBase.h\"\n\nclass MockSerialComms : public SerialCommsBase\n{\n  public:\n    \n    // Constructor\n    MockSerialComms(std::string portName);\n\n    virtual ~MockSerialComms();\n\n    void send(std::string msg) override;\n\n    std::string rcv_base() override;\n\n    std::string rcv_lift() override;\n\n    std::string rcv_distance() override;\n\n    void mock_send(std::string msg);\n    \n    std::string mock_rcv_lift();\n    \n    std::string mock_rcv_base();\n\n    std::string mock_rcv_distance();\n\n    void purge_data();\n\n  protected:\n\n    std::queue<std::string> send_base_data_;\n    std::queue<std::string> send_lift_data_;\n    std::queue<std::string> send_distance_data_;\n    std::queue<std::string> rcv_base_data_;\n    std::queue<std::string> rcv_lift_data_;\n    std::queue<std::string> rcv_distance_data_;\n    std::string port_;\n    \n\n};\n\n#endif"
  },
  {
    "path": "src/robot/src/serial/SerialComms.cpp",
    "content": "#include \"SerialComms.h\"\n\n#include <plog/Log.h>\n\nSerialComms::SerialComms(std::string portName)\n: SerialCommsBase(),\n  serial_(portName),\n  recvInProgress_(false),\n  recvIdx_(0),\n  buffer_(\"\"),\n  base_data_(),\n  lift_data_(),\n  distance_data_()\n{\n    // If we get here, that means serial_ was constructed correctly which means \n    // we have a valid connection\n    connected_ = true;\n    serial_.SetBaudRate(LibSerial::BaudRate::BAUD_115200);    \n}\n\nSerialComms::~SerialComms()\n{\n}\n\nstd::string SerialComms::rcv_base()\n{\n    rcv();\n    if(base_data_.empty())\n    {\n        return \"\";\n    }\n    std::string outdata = base_data_.front();\n    base_data_.pop();\n    return outdata;\n}\n\nstd::string SerialComms::rcv_lift()\n{\n    rcv();\n    if(lift_data_.empty())\n    {\n        return \"\";\n    }\n    std::string outdata = lift_data_.front();\n    lift_data_.pop();\n    return outdata;\n}\n\nstd::string SerialComms::rcv_distance()\n{\n    rcv();\n    if(distance_data_.empty())\n    {\n        return \"\";\n    }\n    std::string outdata = distance_data_.front();\n    distance_data_.pop();\n    return outdata;\n}\n\nvoid SerialComms::rcv()\n{\n    if(!connected_)\n    {\n        PLOGE.printf(\"Cannot receive if port isn't connected\");\n        return;\n    }\n    \n    bool newData = false;\n    const int timeout_ms = 5;\n    std::string new_msg;\n    while (serial_.IsDataAvailable() && newData == false) \n    {\n        char rc = ' ';\n        try\n        {\n            serial_.ReadByte(rc, timeout_ms);\n        }\n        catch (LibSerial::ReadTimeout&)\n        {\n            PLOGI.printf(\"Serial timeout\");\n            break;\n        }\n\n        if (recvInProgress_ == true) \n        {\n            if (rc == START_CHAR)\n            {\n              buffer_ = \"\";\n            }\n            else if (rc != END_CHAR) \n            {\n                buffer_ += rc;\n            }\n            else \n            
{\n                recvInProgress_ = false;\n                newData = true;\n                new_msg = buffer_;\n                buffer_ = \"\";\n            }\n        }\n        else if (rc == START_CHAR) \n        {\n            recvInProgress_ = true;\n        }\n    }\n\n    // Figure out what to do with the message based on the identifier\n    if (new_msg.rfind(\"DEBUG\", 0) == 0)\n    {\n        PLOGI << new_msg;\n    }\n    else if (new_msg.rfind(\"base:\", 0) == 0)\n    {\n        base_data_.push(new_msg.substr(5, std::string::npos));\n    }\n    else if (new_msg.rfind(\"lift:\", 0) == 0)\n    {\n        lift_data_.push(new_msg.substr(5, std::string::npos));\n    }\n    else if (new_msg.rfind(\"dist:\", 0) == 0)\n    {\n        distance_data_.push(new_msg.substr(5, std::string::npos));\n    }\n    else if (new_msg.empty())\n    {\n        // Do nothing\n    }\n    else\n    {\n        PLOGE << \"Unknown message type, skipping: \" << new_msg;\n    }\n}\n\nvoid SerialComms::send(std::string msg)\n{\n    if(!connected_)\n    {\n        PLOGE.printf(\"Cannot send if port isn't connected\");\n        return;\n    }\n    \n    if (msg.length() > 0)\n    {\n      std::string toSend = START_CHAR + msg + END_CHAR;\n      PLOGD.printf(\"Serial send: %s\",toSend.c_str());\n      serial_.Write(toSend);\n    }\n}"
  },
  {
    "path": "src/robot/src/serial/SerialComms.h",
    "content": "#ifndef SerialComms_h\n#define SerialComms_h\n\n#include <libserial/SerialPort.h>\n#include <queue>\n\n#include \"SerialCommsBase.h\"\n\n// Factory method\nstd::unique_ptr<SerialCommsBase> buildSerialComms(std::string portName);\n\nclass SerialComms : public SerialCommsBase\n{\n  public:\n    \n    // Constructor\n    SerialComms(std::string portName);\n\n    virtual ~SerialComms();\n\n    void send(std::string msg) override;\n\n    std::string rcv_base() override;\n\n    std::string rcv_lift() override;\n\n    std::string rcv_distance() override;\n\n  protected:\n\n    void rcv();\n\n    LibSerial::SerialPort serial_;;\n\n    bool recvInProgress_;\n    int recvIdx_;\n    std::string buffer_;\n    \n    std::queue<std::string> base_data_;\n    std::queue<std::string> lift_data_;\n    std::queue<std::string> distance_data_;\n    \n\n};\n\n#endif"
  },
  {
    "path": "src/robot/src/serial/SerialCommsBase.cpp",
    "content": "#include \"SerialCommsBase.h\"\n#include <plog/Log.h>\n\nSerialCommsBase::SerialCommsBase() : connected_(false) {}\n\nSerialCommsBase::~SerialCommsBase() {}\n\nstd::string SerialCommsBase::rcv_base()\n{\n    return \"\";\n} \n\nstd::string SerialCommsBase::rcv_lift()\n{\n    return \"\";\n} \n\nstd::string SerialCommsBase::rcv_distance()\n{\n    return \"\";\n} \n\nvoid SerialCommsBase::send(std::string msg)\n{\n    (void) msg; // Silence warnings\n    return;\n}"
  },
  {
    "path": "src/robot/src/serial/SerialCommsBase.h",
    "content": "#ifndef SerialCommsBase_h\n#define SerialCommsBase_h\n\n#include <string>\n\n#define START_CHAR '<'\n#define END_CHAR '>'\n\nclass SerialCommsBase\n{\n  public:\n\n    SerialCommsBase();\n\n    virtual ~SerialCommsBase();\n\n    virtual void send(std::string msg);\n\n    virtual std::string rcv_base();\n\n    virtual std::string rcv_lift();\n\n    virtual std::string rcv_distance();\n\n    bool isConnected() {return connected_;};\n\n  protected:\n\n    bool connected_;\n\n};\n\n#endif"
  },
  {
    "path": "src/robot/src/serial/SerialCommsFactory.cpp",
    "content": "#include \"SerialCommsFactory.h\"\n#include \"SerialComms.h\"\n#include \"MockSerialComms.h\"\n\n#include <plog/Log.h> \n\nSerialCommsFactory* SerialCommsFactory::instance = NULL;\n\n\nSerialCommsFactory* SerialCommsFactory::getFactoryInstance()\n{\n    if(!instance)\n    {\n        instance = new SerialCommsFactory;\n    }\n    return instance;\n}\n\nvoid SerialCommsFactory::set_mode(SERIAL_FACTORY_MODE mode)\n{\n    mode_ = mode;\n}\n\nSerialCommsBase* SerialCommsFactory::get_serial_comms(std::string portName)\n{\n    if (comms_objects_.count(portName) == 0)\n    {\n        build_serial_comms(portName);\n    }\n    return comms_objects_[portName].get();\n}\n\n\nvoid SerialCommsFactory::build_serial_comms(std::string portName)\n{\n    std::unique_ptr<SerialCommsBase> serial_comms;\n    \n    if(mode_ == SERIAL_FACTORY_MODE::STANDARD)\n    {\n        try\n        {\n            serial_comms = std::make_unique<SerialComms>(portName);\n            PLOGI << \"Built SerialComms on port: \"<< portName;\n        }\n        catch (const LibSerial::OpenFailed&)\n        {\n            serial_comms = std::make_unique<SerialCommsBase>();\n            PLOGW << \"Could not open serial port: \" << portName;\n        }\n\n    }\n    else if (mode_ == SERIAL_FACTORY_MODE::MOCK)\n    {\n        serial_comms = std::make_unique<MockSerialComms>(portName);\n        PLOGI << \"Built MockSerialComms for \" << portName;\n    }\n\n    comms_objects_[portName] = std::move(serial_comms);\n}\n\n\n// Private constructor\nSerialCommsFactory::SerialCommsFactory()\n: mode_(SERIAL_FACTORY_MODE::STANDARD),\n  comms_objects_()\n{}\n  "
  },
  {
    "path": "src/robot/src/serial/SerialCommsFactory.h",
    "content": "#ifndef SerialCommsFactory_h\n#define SerialCommsFactory_h\n\n#include <memory>\n#include <map>\n\n#include \"SerialCommsBase.h\"\n\nenum class SERIAL_FACTORY_MODE\n{\n  STANDARD,\n  MOCK,\n};\n\nclass SerialCommsFactory\n{\n\n  public:\n\n    static SerialCommsFactory* getFactoryInstance();\n\n    void set_mode(SERIAL_FACTORY_MODE mode);\n\n    SerialCommsBase* get_serial_comms(std::string portName);\n\n    // Delete copy and assignment constructors\n    SerialCommsFactory(SerialCommsFactory const&) = delete;\n    SerialCommsFactory& operator= (SerialCommsFactory const&) = delete;\n\n  private:\n\n    // Make standard constructor private so it can't be created\n    SerialCommsFactory();\n\n    void build_serial_comms(std::string portName);\n\n    static SerialCommsFactory* instance;\n\n    SERIAL_FACTORY_MODE mode_;\n\n    std::map<std::string, std::unique_ptr<SerialCommsBase>> comms_objects_;\n\n};\n\n\n#endif"
  },
  {
    "path": "src/robot/src/sockets/ClientSocket.cpp",
    "content": "// Implementation of the ClientSocket class\n\n#include \"ClientSocket.h\"\n#include \"SocketException.h\"\n\n\nClientSocket::ClientSocket ( std::string host, int port )\n{\n  if ( ! Socket::create() )\n    {\n      throw SocketException ( \"Could not create client socket.\" );\n    }\n\n  if ( ! Socket::connect ( host, port ) )\n    {\n      throw SocketException ( \"Could not bind to port.\" );\n    }\n\n}\n\n\nconst ClientSocket& ClientSocket::operator << ( const std::string& s ) const\n{\n  if ( ! Socket::send ( s ) )\n    {\n      throw SocketException ( \"Could not write to socket.\" );\n    }\n\n  return *this;\n\n}\n\n\nconst ClientSocket& ClientSocket::operator >> ( std::string& s ) const\n{\n  if ( ! Socket::recv ( s ) )\n    {\n      throw SocketException ( \"Could not read from socket.\" );\n    }\n\n  return *this;\n}"
  },
  {
    "path": "src/robot/src/sockets/ClientSocket.h",
    "content": "// Definition of the ClientSocket class\n\n#ifndef ClientSocket_class\n#define ClientSocket_class\n\n#include \"Socket.h\"\n\n\nclass ClientSocket : private Socket\n{\n public:\n\n  ClientSocket ( std::string host, int port );\n  virtual ~ClientSocket(){};\n\n  const ClientSocket& operator << ( const std::string& ) const;\n  const ClientSocket& operator >> ( std::string& ) const;\n\n};\n\n\n#endif"
  },
  {
    "path": "src/robot/src/sockets/MockSocketMultiThreadWrapper.cpp",
    "content": "#include \"MockSocketMultiThreadWrapper.h\"\n\n#include <plog/Log.h>\n\nMockSocketMultiThreadWrapper::MockSocketMultiThreadWrapper()\n: SocketMultiThreadWrapperBase(),\n  send_data_(),\n  rcv_data_(),\n  ms_until_next_command_(-1),\n  send_immediate_(true),\n  timer_()\n{\n}\n\nvoid MockSocketMultiThreadWrapper::add_mock_data(std::string data)\n{\n    rcv_data_.push(data);\n}\n\nstd::string MockSocketMultiThreadWrapper::getData()\n{\n    if (rcv_data_.empty())\n    {\n        return \"\";\n    }\n    \n    std::string outdata = rcv_data_.front();\n    rcv_data_.pop();\n    PLOGI << \"Popped: \" << outdata << \" data left: \" << rcv_data_.size();\n\n    if(outdata[1] != '{')\n    {\n        ms_until_next_command_ = stoi(outdata);\n        timer_.reset();\n        PLOGI << \"Waiting \" << ms_until_next_command_ << \" ms to send next command\";\n        send_immediate_ = false;\n        outdata = \"\";\n    }\n\n    return outdata;\n}\n\nvoid MockSocketMultiThreadWrapper::sendData(std::string data)\n{\n    send_data_.push(data);\n}\n\nbool MockSocketMultiThreadWrapper::dataAvailableToRead()\n{\n    if(send_immediate_ || timer_.dt_ms() > ms_until_next_command_)\n    {\n        return !rcv_data_.empty();\n    }\n    else\n    {\n        return false;\n    }\n}\n\nstd::string MockSocketMultiThreadWrapper::getMockData()\n{\n    if (send_data_.empty())\n    {\n        return \"\";\n    }\n    \n    std::string outdata = send_data_.front();\n    send_data_.pop();\n    return outdata;\n}\n\nvoid MockSocketMultiThreadWrapper::sendMockData(std::string data)\n{\n    rcv_data_.push(data);\n}\n\nvoid MockSocketMultiThreadWrapper::purge_data()\n{\n    while(!send_data_.empty())\n    {\n        send_data_.pop();\n    }\n    while(!rcv_data_.empty())\n    {\n        rcv_data_.pop();\n    }\n}"
  },
  {
    "path": "src/robot/src/sockets/MockSocketMultiThreadWrapper.h",
    "content": "#ifndef MockSocketMultiThreadWrapper_h\n#define MockSocketMultiThreadWrapper_h\n\n#include <string>\n#include <queue>\n\n#include \"SocketMultiThreadWrapperBase.h\"\n#include \"utils.h\"\n\nclass MockSocketMultiThreadWrapper : public SocketMultiThreadWrapperBase\n{\n  public:\n    MockSocketMultiThreadWrapper();\n\n    std::string getData();\n    void sendData(std::string data);\n    bool dataAvailableToRead();\n\n    void add_mock_data(std::string data);\n    std::string getMockData();\n    void sendMockData(std::string data);\n\n    void purge_data();\n    void set_send_immediate(bool send_immediate) {send_immediate_ = send_immediate;};\n\n  private:\n    std::queue<std::string> send_data_;\n    std::queue<std::string> rcv_data_;\n    int ms_until_next_command_;\n    bool send_immediate_;\n    Timer timer_;\n\n};\n\n#endif"
  },
  {
    "path": "src/robot/src/sockets/README.md",
    "content": "The low level socket implimentation in this folder comes from Rob Tougher and is published here at https://tldp.org/LDP/LG/issue74/tougher.html under the Open Publication License (http://www.opencontent.org/openpub/) wich allows copying and modification as long as a disclosure is present. The following files are copied from the site specified and provided here with minor modifications:\n-`Socket.h/.cpp`\n-`ServerSocket.h/.cpp`\n-`ClientSocket.h/.cpp`\n-`SocketException.h`\n\nAll other files in this directory are written by Alex Baucom.\n"
  },
  {
    "path": "src/robot/src/sockets/ServerSocket.cpp",
    "content": "// Implementation of the ServerSocket class\n\n#include \"ServerSocket.h\"\n#include \"SocketException.h\"\n#include \"SocketTimeoutException.h\"\n#include <iostream>\n\n\nServerSocket::ServerSocket ( int port )\n{\n  if ( ! Socket::create() )\n    {\n      throw SocketException ( \"Could not create server socket.\" );\n    }\n\n  if ( ! Socket::bind ( port ) )\n    {\n      throw SocketException ( \"Could not bind to port.\" );\n    }\n\n  if ( ! Socket::listen() )\n    {\n      throw SocketException ( \"Could not listen to socket.\" );\n    }\n\n}\n\nServerSocket::~ServerSocket()\n{\n}\n\n\nconst ServerSocket& ServerSocket::operator << ( const std::string& s ) const\n{\n  if ( ! Socket::send ( s ) )\n    {\n      throw SocketException ( \"Could not write to socket.\" );\n    }\n\n  return *this;\n\n}\n\n\nconst ServerSocket& ServerSocket::operator >> ( std::string& s ) const\n{\n  int status = Socket::recv ( s );\n  if ( status == -1 )\n  {\n    throw SocketTimeoutException ( \"No data available in blocking time\" );\n  }\n  else if (status == 0)\n  {\n    throw SocketException ( \"Could not read from socket.\" );\n  }\n\n  return *this;\n}\n\nvoid ServerSocket::accept ( ServerSocket& sock )\n{\n  if ( ! Socket::accept ( sock ) )\n    {\n      throw SocketException ( \"Could not accept socket.\" );\n    }\n}\n\nvoid ServerSocket::set_non_blocking()\n{\n  Socket::set_non_blocking(true);\n}"
  },
  {
    "path": "src/robot/src/sockets/ServerSocket.h",
    "content": "// From https://tldp.org/LDP/LG/issue74/tougher.html\n\n// Definition of the ServerSocket class\n\n#ifndef ServerSocket_class\n#define ServerSocket_class\n\n#include \"Socket.h\"\n\n\nclass ServerSocket : private Socket\n{\n public:\n\n  ServerSocket ( int port );\n  ServerSocket (){};\n  virtual ~ServerSocket();\n\n  const ServerSocket& operator << ( const std::string& ) const;\n  const ServerSocket& operator >> ( std::string& ) const;\n\n  void accept ( ServerSocket& );\n  void set_non_blocking();\n\n};\n\n\n#endif"
  },
  {
    "path": "src/robot/src/sockets/Socket.cpp",
    "content": "// Implementation of the Socket class.\n\n\n#include \"Socket.h\"\n#include \"string.h\"\n#include <string.h>\n#include <errno.h>\n#include <fcntl.h>\n#include <iostream>\n\n\n\nSocket::Socket() :\n  m_sock ( -1 )\n{\n\n  memset ( &m_addr,\n\t   0,\n\t   sizeof ( m_addr ) );\n\n}\n\nSocket::~Socket()\n{\n  if ( is_valid() )\n    ::close ( m_sock );\n}\n\nbool Socket::create()\n{\n  m_sock = socket ( AF_INET,\n\t\t    SOCK_STREAM,\n\t\t    0 );\n\n  if ( ! is_valid() )\n    return false;\n\n\n  // TIME_WAIT - argh\n  int on = 1;\n  if ( setsockopt ( m_sock, SOL_SOCKET, SO_REUSEADDR, ( const char* ) &on, sizeof ( on ) ) == -1 )\n    return false;\n\n\n  return true;\n\n}\n\n\n\nbool Socket::bind ( const int port )\n{\n\n  if ( ! is_valid() )\n    {\n      return false;\n    }\n\n\n\n  m_addr.sin_family = AF_INET;\n  m_addr.sin_addr.s_addr = INADDR_ANY;\n  m_addr.sin_port = htons ( port );\n\n  int bind_return = ::bind ( m_sock,\n\t\t\t     ( struct sockaddr * ) &m_addr,\n\t\t\t     sizeof ( m_addr ) );\n\n\n  if ( bind_return == -1 )\n    {\n      return false;\n    }\n\n  return true;\n}\n\n\nbool Socket::listen() const\n{\n  if ( ! 
is_valid() )\n    {\n      return false;\n    }\n\n  int listen_return = ::listen ( m_sock, MAXCONNECTIONS );\n\n\n  if ( listen_return == -1 )\n    {\n      return false;\n    }\n\n  return true;\n}\n\n\nbool Socket::accept ( Socket& new_socket ) const\n{\n  int addr_length = sizeof ( m_addr );\n  new_socket.m_sock = ::accept ( m_sock, ( sockaddr * ) &m_addr, ( socklen_t * ) &addr_length );\n\n  if ( new_socket.m_sock <= 0 )\n    return false;\n  else\n    return true;\n}\n\n\nbool Socket::send ( const std::string s ) const\n{\n  int status = ::send ( m_sock, s.c_str(), s.size(), MSG_NOSIGNAL );\n  if ( status == -1 )\n    {\n      return false;\n    }\n  else\n    {\n      return true;\n    }\n}\n\n\nint Socket::recv ( std::string& s ) const\n{\n  char buf [ MAXRECV + 1 ];\n\n  s = \"\";\n\n  memset ( buf, 0, MAXRECV + 1 );\n\n  int status = ::recv ( m_sock, buf, MAXRECV, 0 );\n\n  if ( status == -1 )\n    {\n      if (errno == EAGAIN )\n      {\n        return -1;\n      }\n      else\n      {\n        std::cout << \"status == -1   errno == \" << errno << \"  in Socket::recv\\n\";\n        return 0;\n      }\n    }\n  else if ( status == 0 )\n    {\n      return 0;\n    }\n  else\n    {\n      s = buf;\n      return status;\n    }\n}\n\n\n\nbool Socket::connect ( const std::string host, const int port )\n{\n  if ( ! 
is_valid() ) return false;\n\n  m_addr.sin_family = AF_INET;\n  m_addr.sin_port = htons ( port );\n\n  int status = inet_pton ( AF_INET, host.c_str(), &m_addr.sin_addr );\n\n  if ( errno == EAFNOSUPPORT ) return false;\n\n  status = ::connect ( m_sock, ( sockaddr * ) &m_addr, sizeof ( m_addr ) );\n\n  if ( status == 0 )\n    return true;\n  else\n    return false;\n}\n\nvoid Socket::set_non_blocking ( const bool b )\n{\n\n  int opts;\n\n  opts = fcntl ( m_sock,\n\t\t F_GETFL );\n\n  if ( opts < 0 )\n    {\n      return;\n    }\n\n  if ( b )\n    opts = ( opts | O_NONBLOCK );\n  else\n    opts = ( opts & ~O_NONBLOCK );\n\n  fcntl ( m_sock,\n\t  F_SETFL,opts );\n\n}"
  },
  {
    "path": "src/robot/src/sockets/Socket.h",
    "content": "// Definition of the Socket class\n\n#ifndef Socket_class\n#define Socket_class\n\n\n#include <sys/types.h>\n#include <sys/socket.h>\n#include <netinet/in.h>\n#include <netdb.h>\n#include <unistd.h>\n#include <string>\n#include <arpa/inet.h>\n\n\nconst int MAXHOSTNAME = 200;\nconst int MAXCONNECTIONS = 5;\nconst int MAXRECV = 500;\n\nclass Socket\n{\n public:\n  Socket();\n  virtual ~Socket();\n\n  // Server initialization\n  bool create();\n  bool bind ( const int port );\n  bool listen() const;\n  bool accept ( Socket& ) const;\n\n  // Client initialization\n  bool connect ( const std::string host, const int port );\n\n  // Data Transimission\n  bool send ( const std::string ) const;\n  int recv ( std::string& ) const;\n\n\n  void set_non_blocking ( const bool );\n\n  bool is_valid() const { return m_sock != -1; }\n\n private:\n\n  int m_sock;\n  sockaddr_in m_addr;\n\n\n};\n\n\n#endif"
  },
  {
    "path": "src/robot/src/sockets/SocketException.h",
    "content": "// SocketException class\n\n\n#ifndef SocketException_class\n#define SocketException_class\n\n#include <string>\n\nclass SocketException\n{\n public:\n  SocketException ( std::string s ) : m_s ( s ) {};\n  ~SocketException (){};\n\n  std::string description() { return m_s; }\n\n private:\n\n  std::string m_s;\n\n};\n\n#endif"
  },
  {
    "path": "src/robot/src/sockets/SocketMultiThreadWrapper.cpp",
    "content": "#include \"SocketMultiThreadWrapper.h\"\n\n#include <plog/Log.h>\n#include \"sockets/ServerSocket.h\"\n#include \"sockets/SocketException.h\"\n#include \"sockets/SocketTimeoutException.h\"\n#include <mutex>\n\n#define PORT 8123\n\nstd::mutex read_mutex;\nstd::mutex send_mutex;\n\nSocketMultiThreadWrapper::SocketMultiThreadWrapper()\n: SocketMultiThreadWrapperBase(),\n  data_buffer(),\n  send_buffer()\n{\n    run_thread = std::thread(&SocketMultiThreadWrapper::socket_loop, this);\n    run_thread.detach();\n}\n\nbool SocketMultiThreadWrapper::dataAvailableToRead()\n{\n    std::lock_guard<std::mutex> read_lock(read_mutex);\n    return data_buffer.size() > 0;\n}\n\nstd::string SocketMultiThreadWrapper::getData()\n{\n    std::string outstring;\n    const std::lock_guard<std::mutex> lock(read_mutex);\n    while(!data_buffer.empty())\n    {\n        outstring.push_back(data_buffer.front());\n        data_buffer.pop();\n    }\n    return outstring;\n}\n\nvoid SocketMultiThreadWrapper::sendData(std::string data)\n{\n    const std::lock_guard<std::mutex> lock(send_mutex);\n    if(send_buffer.size() + data.size() >= BUFFER_SIZE)\n    {\n        PLOGE.printf(\"Send buffer overflow, dropping message: %s\", data.c_str());\n        return;\n    }\n    for(const char c : data)\n    {\n        send_buffer.push(c);\n    }\n}\n\nvoid SocketMultiThreadWrapper::socket_loop()\n{\n    ServerSocket server(PORT);\n\n    while(true)\n    {\n        // Wait for a client to connect\n        PLOGI.printf(\"Ready for client connection\");\n        ServerSocket socket;\n        server.accept(socket);\n        socket.set_non_blocking();\n        bool keep_connection = true;\n        PLOGI.printf(\"Client connected\");\n\n        // Stay connected to the client while the connection is valid\n        while(keep_connection)\n        {\n            // Try to read data from the client\n            std::string read_data;\n            try\n            {\n                socket >> 
read_data;\n            }\n            catch (SocketTimeoutException &e) {}\n            catch (SocketException& e)\n            {\n                //PLOGI.printf(\"Caught exception while reading: %s\", e.description().c_str());\n                // If read fails for something other than timeout, disconnect\n                keep_connection = false;\n                continue;\n            }\n\n            {\n                // Copy data into buffer for output if we got info from the client\n                std::lock_guard<std::mutex> read_lock(read_mutex);\n                if(data_buffer.size() + read_data.size() >= BUFFER_SIZE)\n                {\n                    PLOGE.printf(\"Data buffer overflow, dropping message\");\n                }\n                else\n                {\n                    for (uint i = 0; i < read_data.size(); i++)\n                    {\n                        //PLOGD << std::hex << std::to_integer<int>(read_buff[i]) << std::dec << ' ';\n                        data_buffer.push(read_data[i]);\n                    }\n                }\n            }\n\n            {\n                // If we have data ready to be sent, send it\n                std::lock_guard<std::mutex> send_lock(send_mutex);\n                if(send_buffer.size() > 0)\n                {\n                    PLOGD.printf(\"Length to send: %i\", send_buffer.size());\n                    std::string send_data;\n                    while(!send_buffer.empty())\n                    {\n                        send_data.push_back(send_buffer.front());\n                        send_buffer.pop();\n                    }\n                    socket << send_data;\n                }\n\n            }\n        }\n\n        PLOGI.printf(\"Closing socket connection\");\n\n    }\n}\n"
  },
  {
    "path": "src/robot/src/sockets/SocketMultiThreadWrapper.h",
    "content": "#ifndef SocketMultiThreadWrapper_h\n#define SocketMultiThreadWrapper_h\n\n#include <queue>\n#include <thread>\n#include \"SocketMultiThreadWrapperBase.h\"\n\n#define BUFFER_SIZE 2048\n\nclass SocketMultiThreadWrapper : public SocketMultiThreadWrapperBase\n{\n\n  public:\n    SocketMultiThreadWrapper();\n    std::string getData();\n    void sendData(std::string data);\n    bool dataAvailableToRead();\n\n  private:\n    \n    void socket_loop();\n\n    std::queue<char> data_buffer;\n    std::queue<char> send_buffer;\n    std::thread run_thread;\n\n};\n\n\n\n#endif"
  },
  {
    "path": "src/robot/src/sockets/SocketMultiThreadWrapperBase.h",
    "content": "#ifndef SocketMultiThreadWrapperBase_h\n#define SocketMultiThreadWrapperBase_h\n\n#include <string>\n\nclass SocketMultiThreadWrapperBase\n{\n  public:\n    virtual ~SocketMultiThreadWrapperBase() {};\n    virtual std::string getData() = 0;\n    virtual void sendData(std::string data) = 0;\n    virtual bool dataAvailableToRead() = 0;\n};\n\n\n\n#endif"
  },
  {
    "path": "src/robot/src/sockets/SocketMultiThreadWrapperFactory.cpp",
    "content": "#include \"SocketMultiThreadWrapperFactory.h\"\n#include \"SocketMultiThreadWrapper.h\"\n#include \"MockSocketMultiThreadWrapper.h\"\n\n#include <plog/Log.h> \n\nSocketMultiThreadWrapperFactory* SocketMultiThreadWrapperFactory::instance = NULL;\n\n\nSocketMultiThreadWrapperFactory* SocketMultiThreadWrapperFactory::getFactoryInstance()\n{\n    if(!instance)\n    {\n        instance = new SocketMultiThreadWrapperFactory;\n    }\n    return instance;\n}\n\nvoid SocketMultiThreadWrapperFactory::set_mode(SOCKET_FACTORY_MODE mode)\n{\n    mode_ = mode;\n}\n\nvoid SocketMultiThreadWrapperFactory::add_mock_data(std::string data)\n{\n    if(mode_ == SOCKET_FACTORY_MODE::MOCK && socket_)\n    {\n        // This is slightly dangerous, but should be safe as long as I don't do something silly like \n        // change the mode of the factory after building the socket\n        SocketMultiThreadWrapperBase* raw_ptr = socket_.get();\n        dynamic_cast<MockSocketMultiThreadWrapper*>(raw_ptr)->add_mock_data(data);\n        PLOGI << \"Added mock data: \" << data;\n    }\n}\n\nvoid SocketMultiThreadWrapperFactory::build_socket()\n{\n    if(socket_)\n    {\n        return;\n    }\n\n    if(mode_ == SOCKET_FACTORY_MODE::STANDARD)\n    {\n        socket_ = std::make_unique<SocketMultiThreadWrapper> ();\n        PLOGI << \"Built SocketMultiThreadWrapper\";\n    }\n    else if (mode_ == SOCKET_FACTORY_MODE::MOCK)\n    {\n        socket_ = std::make_unique<MockSocketMultiThreadWrapper> ();\n        PLOGI << \"Built MockSocketMultiThreadWrapper\";\n    }\n}\n\nSocketMultiThreadWrapperBase* SocketMultiThreadWrapperFactory::get_socket()\n{\n    if(!socket_)\n    {\n        build_socket();\n    }\n    return socket_.get();\n}\n\n\n// Private constructor\nSocketMultiThreadWrapperFactory::SocketMultiThreadWrapperFactory()\n: mode_(SOCKET_FACTORY_MODE::STANDARD)\n{}\n  "
  },
  {
    "path": "src/robot/src/sockets/SocketMultiThreadWrapperFactory.h",
    "content": "#ifndef SocketMultiThreadWrapperFactory_h\n#define SocketMultiThreadWrapperFactory_h\n\n#include <memory>\n\n#include \"SocketMultiThreadWrapperBase.h\"\n\nenum class SOCKET_FACTORY_MODE\n{\n    STANDARD,\n    MOCK,\n};\n\nclass SocketMultiThreadWrapperFactory\n{\n\n  public:\n\n    static SocketMultiThreadWrapperFactory* getFactoryInstance();\n\n    void set_mode(SOCKET_FACTORY_MODE mode);\n\n    void add_mock_data(std::string data);\n\n    void build_socket();\n\n    SocketMultiThreadWrapperBase* get_socket();\n\n    // Delete copy and assignment constructors\n    SocketMultiThreadWrapperFactory(SocketMultiThreadWrapperFactory const&) = delete;\n    SocketMultiThreadWrapperFactory& operator= (SocketMultiThreadWrapperFactory const&) = delete;\n\n  private:\n\n    // Make standard constructor private so it can't be created\n    SocketMultiThreadWrapperFactory();\n\n    static SocketMultiThreadWrapperFactory* instance;\n\n    SOCKET_FACTORY_MODE mode_;\n    std::unique_ptr<SocketMultiThreadWrapperBase> socket_;\n    \n\n};\n\n\n#endif"
  },
  {
    "path": "src/robot/src/sockets/SocketTimeoutException.h",
    "content": "// SocketTimeoutException class\n\n\n#ifndef SocketTimeoutException_class\n#define SocketTimeoutException_class\n\n#include <string>\n\nclass SocketTimeoutException\n{\n public:\n  SocketTimeoutException ( std::string s ) : m_s ( s ) {};\n  ~SocketTimeoutException (){};\n\n  std::string description() { return m_s; }\n\n private:\n\n  std::string m_s;\n\n};\n\n#endif"
  },
  {
    "path": "src/robot/src/utils.cpp",
    "content": "#define _USE_MATH_DEFINES\n \n#include \"utils.h\"\n\n#include <cmath>\n#include <chrono>\n#include <plog/Log.h>\n#include <iostream>\n#include <stdio.h>\n#include <sstream>\n#include <plog/Init.h>\n#include <plog/Formatters/CsvFormatter.h>\n#include <plog/Appenders/RollingFileAppender.h>\n\nfloat wrap_angle(float a)\n{\n  while(std::abs(a) > M_PI)\n  {\n    if(a > M_PI)\n    {\n      a -= 2*M_PI;\n    }\n    else if (a < -1*M_PI)\n    {\n      a += 2*M_PI;\n    }\n  }\n  return a;\n}\n\nfloat angle_diff(float a1, float a2)\n{\n  float outA = a1 - a2;\n  // Handle angle wrapping and compute the correct error amount\n  outA = wrap_angle(outA);\n  return outA;\n}\n\nRateController::RateController(int hz)\n: timer_(),\n  dt_us_(1000000 / hz),\n  always_ready_(cfg.lookup(\"motion.rate_always_ready\"))\n{\n}\n\nbool RateController::ready()\n{\n  if(always_ready_ || timer_.dt_us() > dt_us_)\n  {\n    timer_.reset();\n    return true;\n  }\n  return false;\n}\n\n\nTimeRunningAverage::TimeRunningAverage(int window_size)\n: buf_(),\n  buf_idx_(0),\n  window_size_(window_size),\n  filled_(false),\n  started_(false),\n  timer_()\n{\n    buf_.reserve(window_size_);\n}\n\nint TimeRunningAverage::get_ms()\n{\n    int sum_end_idx = buf_idx_;\n    if(filled_)\n    {\n        sum_end_idx = window_size_;\n    }\n\n    if (sum_end_idx == 0)\n    {\n        return 0;\n    }\n\n    int sum = 0;\n    for(int i = 0; i < sum_end_idx; i++)\n    {\n        sum += buf_[i];\n    }\n    return sum / sum_end_idx;\n}\n\nfloat TimeRunningAverage::get_sec()\n{\n    float ms = static_cast<float>(get_ms());\n    return ms / 1000.0;\n}\n\nvoid TimeRunningAverage::mark_point()\n{\n    if(!started_)\n    {\n        timer_.reset();\n        started_ = true;\n        return;\n    }\n\n    buf_[buf_idx_] = timer_.dt_ms();\n    timer_.reset();\n\n    buf_idx_++;\n    if (buf_idx_ >= window_size_)\n    {\n        filled_ = true;\n        buf_idx_ = 0;\n    }\n}\n\n\nTimer::Timer() \n: 
prev_time_(ClockFactory::getFactoryInstance()->get_clock()->now())\n{ }\n\nvoid Timer::reset()\n{\n    ClockWrapperBase* clock = ClockFactory::getFactoryInstance()->get_clock();\n    prev_time_ = clock->now();\n}\n\nint Timer::dt_us()\n{\n    ClockWrapperBase* clock = ClockFactory::getFactoryInstance()->get_clock();\n    return std::chrono::duration_cast<std::chrono::microseconds>(clock->now() - prev_time_).count();\n}\n\nint Timer::dt_ms()\n{\n    ClockWrapperBase* clock = ClockFactory::getFactoryInstance()->get_clock();\n    return std::chrono::duration_cast<std::chrono::milliseconds>(clock->now() - prev_time_).count();\n}\n\nfloat Timer::dt_s() \n{\n    ClockWrapperBase* clock = ClockFactory::getFactoryInstance()->get_clock();\n    return std::chrono::duration_cast<FpSeconds>(clock->now() - prev_time_).count();\n}\n\n\nClockTimePoint ClockWrapper::now()\n{\n    return std::chrono::steady_clock::now();\n}\n\n\nMockClockWrapper::MockClockWrapper()\n: ClockWrapperBase(),\n  internal_time_(std::chrono::steady_clock::now())\n{}\n\nClockTimePoint MockClockWrapper::now()\n{\n    return internal_time_;\n}\n\nvoid MockClockWrapper::set_now()\n{\n    set(std::chrono::steady_clock::now());\n}\n\nvoid MockClockWrapper::set(ClockTimePoint time_point)\n{\n    internal_time_ = time_point;\n}\n\nvoid MockClockWrapper::advance_us(int us)\n{\n    internal_time_ += std::chrono::microseconds(us);\n}\n\nvoid MockClockWrapper::advance_ms(int ms)\n{\n    internal_time_ += std::chrono::milliseconds(ms);\n}\n\nvoid MockClockWrapper::advance_sec(float sec)\n{\n    internal_time_ += std::chrono::duration_cast<std::chrono::microseconds>(FpSeconds(sec));\n}\n\n\n\nClockFactory* ClockFactory::factory_instance_ = NULL;\n\nClockFactory* ClockFactory::getFactoryInstance()\n{\n    if(!factory_instance_)\n    {\n        factory_instance_ = new ClockFactory;\n    }\n    return factory_instance_;\n}\n\nvoid ClockFactory::set_mode(CLOCK_FACTORY_MODE mode)\n{\n    mode_ = 
mode;\n}\n\nClockWrapperBase* ClockFactory::get_clock()\n{\n    if (!clock_instance_)\n    {\n        build_clock_instance();\n    }\n    return clock_instance_.get();\n}\n\n\nvoid ClockFactory::build_clock_instance()\n{\n    if(mode_ == CLOCK_FACTORY_MODE::STANDARD)\n    {\n        //PLOGI << \"Building STANDARD clock wrapper\";\n        clock_instance_ = std::make_unique<ClockWrapper>();\n    }\n    else if (mode_ == CLOCK_FACTORY_MODE::MOCK)\n    {\n        //PLOGI << \"Building MOCK clock wrapper\";\n        clock_instance_ = std::make_unique<MockClockWrapper>();\n    }\n}\n\n// Private constructor\nClockFactory::ClockFactory()\n: mode_(CLOCK_FACTORY_MODE::STANDARD),\n  clock_instance_()\n{}\n\n\nPositionController::PositionController(Gains gains) \n: gains_(gains),\n  error_sum_(0.0)\n{ }\n\nvoid PositionController::reset()\n{\n    error_sum_ = 0.0;\n}\n\nfloat PositionController::compute(float target_position, float actual_position, float target_velocity, float actual_velocity, float dt)\n{\n    float pos_err = target_position - actual_position;\n    float vel_err = target_velocity - actual_velocity;\n    error_sum_ += pos_err * dt;\n    float output_velocity = target_velocity + gains_.kp * pos_err + gains_.kd * vel_err + gains_.ki * error_sum_;\n    return output_velocity;\n}\n\n\nfloat vectorMean(const std::vector<float>& data)\n{\n    if(data.empty()) return 0;\n\n    float sum = 0.0;\n    for (const auto& val : data)\n    {\n        sum += val;\n    }\n    return sum/data.size();\n}\n\nfloat vectorStddev(const std::vector<float>& data, float mean)\n{\n    if(data.empty()) return 0;\n    \n    float variance = 0;\n    for (const auto& val : data)\n    {\n        variance += (val - mean) * (val - mean);\n    }\n    variance /= data.size();\n    return sqrt(variance);\n}\n\nfloat zScore(float mean, float stddev, float reading)\n{\n    if (stddev < 0.0001) return 0;\n    return fabs((reading - mean)/stddev);\n}\n\n\n// From: 
https://www.tutorialspoint.com/parsing-a-comma-delimited-std-string-in-cplusplus\nstd::vector<std::string> parseCommaDelimitedString(const std::string& str_in)\n{\n   std::vector<std::string> result;\n   std::stringstream s_stream(str_in); //create string stream from the string\n   while(s_stream.good()) \n   {\n      std::string substr;\n      getline(s_stream, substr, ','); //get first string delimited by comma\n      result.push_back(substr);\n   }\n   return result;\n}\n\nstd::vector<float> parseCommaDelimitedStringToFloat(const std::string& str_in)\n{\n    std::vector<std::string> str_parsed = parseCommaDelimitedString(str_in);\n    std::vector<float> result;\n    result.reserve(str_parsed.size());\n    for(const auto& val : str_parsed)\n    {\n        if(val.empty()) continue;\n        result.push_back(std::stof(val));\n    }\n    return result;\n}\n\n\n\nvoid reset_last_motion_logger()\n{\n    // Delete the loggers\n    g_logger.reset();\n    g_appender.reset();\n\n    // Delete the file\n    remove(\"log/last_motion_log.csv\");\n\n    // Re-initialize the logger\n    g_appender.reset(new plog::RollingFileAppender<plog::CsvFormatter>(\"log/last_motion_log.csv\"));\n    g_logger.reset(new plog::Logger<MOTION_CSV_LOG_ID>(plog::debug));\n    g_logger->addAppender(g_appender.get());\n}"
  },
  {
    "path": "src/robot/src/utils.h",
    "content": "#ifndef utils_h\n#define utils_h\n\n#include <chrono>\n#include <cstdio>\n#include <string>\n#include <vector>\n#include <memory>\n#include <math.h>\n#include <plog/Log.h> \n#include <plog/Logger.h> \n\n#include \"constants.h\"\n\nusing ClockTimePoint = std::chrono::time_point<std::chrono::steady_clock>;\nusing FpSeconds = std::chrono::duration<float, std::chrono::seconds::period>;\n\n// Utility for getting the sign of values\n// From: https://stackoverflow.com/questions/1903954/is-there-a-standard-sign-function-signum-sgn-in-c-c\ntemplate <typename T> int sgn(T val) {\n    return (T(0) < val) - (val < T(0));\n}\n\n\n//*******************************************\n//           Angle stuff\n//*******************************************\n\n// Wrap an angle to the range +/- pi\nfloat wrap_angle(float a);\n// Take the diff of two angles, ensuring the result is wrapped between +/- pi\nfloat angle_diff(float a1, float a2);\n\n\n//*******************************************\n//           Clock/time stuff\n//*******************************************\n\n// The ClockWrapperX classes provide wrappers around the system clock to simplify mocking the clock\nclass ClockWrapperBase\n{\n public:\n  // Virtual destructor so derived clocks are destroyed correctly through a base pointer\n  virtual ~ClockWrapperBase() = default;\n  virtual ClockTimePoint now() = 0;\n};\n\nclass ClockWrapper : public ClockWrapperBase\n{\n public:\n  ClockTimePoint now() override;\n};\n\nclass MockClockWrapper : public ClockWrapperBase\n{\n public:\n  MockClockWrapper();\n  ClockTimePoint now() override;\n  void advance_sec(float sec);\n  void advance_ms(int ms);\n  void advance_us(int us);\n  void set(ClockTimePoint time_point);\n  void set_now();\n private:\n  ClockTimePoint internal_time_;\n};\n\n\nenum class CLOCK_FACTORY_MODE\n{\n  STANDARD,\n  MOCK\n};\n\n// Provides a way to get the current clock\nclass ClockFactory\n{\n public:\n  static ClockFactory* getFactoryInstance();\n  void set_mode(CLOCK_FACTORY_MODE mode);\n  ClockWrapperBase* get_clock();\n\n  // Delete the copy constructor and assignment operator\n  ClockFactory(ClockFactory const&) = delete;\n  
ClockFactory& operator= (ClockFactory const&) = delete;\n \n private:\n  // Keep the default constructor private so instances can't be created directly\n  ClockFactory();\n  void build_clock_instance();\n  static ClockFactory* factory_instance_;\n  CLOCK_FACTORY_MODE mode_;\n  std::unique_ptr<ClockWrapperBase> clock_instance_;\n};\n\n\n// Simple timer class that can return the time since reset in various formats\nclass Timer\n{\n  public:\n    Timer();\n    void reset();\n    int dt_us();\n    int dt_ms();\n    float dt_s();\n\n  private:\n    ClockTimePoint prev_time_;\n};\n\n\n// Keeps a running average of the time delta between events\nclass TimeRunningAverage\n{\n  public:\n    TimeRunningAverage(int window_size);\n\n    int get_ms();\n\n    float get_sec();\n\n    void mark_point();\n\n  private:\n\n    std::vector<int> buf_;\n    int buf_idx_;\n    int window_size_;\n    bool filled_;\n    bool started_;\n    Timer timer_;\n\n};\n\n// Simple class to run a loop at a certain rate\nclass RateController\n{\n  public:\n    RateController(int hz);\n    bool ready(); \n  private:\n    Timer timer_;\n    int dt_us_;\n    bool always_ready_;\n};\n\n\n//*******************************************\n//           Useful data structures\n//*******************************************\n\nstruct Point\n{\n    float x;\n    float y;\n    float a;\n\n    Point(float x=0, float y=0, float a=0)\n    : x(x), y(y), a(a)\n    {}\n\n    std::string toString() const\n    {\n        char s[100];\n        snprintf(s, sizeof(s), \"[x: %.3f, y: %.3f, a: %.3f]\", x, y, a);\n        return static_cast<std::string>(s);\n    }\n\n    bool operator== (const Point& other) const\n    {\n        return x == other.x && y == other.y && a == other.a;\n    }\n};\n\nstruct Velocity\n{\n    float vx;\n    float vy;\n    float va;\n\n    Velocity(float vx=0, float vy=0, float va=0)\n    : vx(vx), vy(vy), va(va)\n    {}\n\n    std::string toString() const\n    {\n        char s[100];\n        snprintf(s, sizeof(s), \"[vx: %.3f, vy: %.3f, 
va: %.3f]\", vx, vy, va);\n        return static_cast<std::string>(s);\n    }\n\n    bool operator== (const Velocity& other) const \n    {\n        return vx == other.vx && vy == other.vy && va == other.va;\n    }\n\n    bool nearZero(float eps=1e-6) const\n    {\n        return fabs(vx) < eps && fabs(vy) < eps && fabs(va) < eps;\n    }\n};\n\nclass LatchedBool\n{\n  public:\n    LatchedBool(float seconds_to_latch)\n    : latch_time_(seconds_to_latch),\n      time_since_true_(),\n      value_(false)\n      {}\n\n    void add(bool value_in)\n    {\n        if(value_in)\n        {\n            value_ = true;\n            time_since_true_.reset();\n        }\n        else\n        {\n            if(time_since_true_.dt_s() > latch_time_)\n            {\n                value_ = false;\n            }\n        }\n    }\n\n    bool get() {return value_;}\n    bool update(bool value_in)\n    {\n      add(value_in);\n      return get();\n    }\n  \n  private:\n    float latch_time_;\n    Timer time_since_true_;\n    bool value_;\n};\n\n// Very simple circular buffer class\ntemplate<class T>\nclass CircularBuffer \n{\n  public:\n    CircularBuffer(int size)\n    : data_(std::make_unique<std::vector<T>>()),\n      size_(size),\n      idx_(0),\n      full_(false)\n    {\n        data_->reserve(size_);\n    }\n\n    void insert(T val)\n    {\n        if(!full_) \n        {\n            data_->push_back(val);\n            idx_++;\n            if (idx_ % size_ == 0)\n            {\n                full_ = true;\n                idx_ = 0;\n            }\n        }\n        else \n        {\n            data_->at(idx_) = val;\n            idx_ = (idx_ + 1) % size_;\n        }\n    }\n\n    void clear()\n    {\n        data_->clear();\n        idx_ = 0;\n        full_ = false;\n    }\n\n    std::vector<T> get_contents() const\n    {\n      if(!full_ || idx_ == 0)\n      {\n    
    return *data_;\n      }\n      else \n      {\n        std::vector<T> front(data_->begin() + idx_, data_->end());\n        std::vector<T> back(data_->begin(), data_->begin() + idx_);\n        front.insert(front.end(), back.begin(), back.end());\n        return front;\n      }\n    }\n\n    bool isFull() const { return full_;}\n    \n  private:\n    std::unique_ptr<std::vector<T>> data_;\n    const int size_;\n    int idx_;\n    bool full_;\n\n};\n\n\nclass PositionController\n{\n  public:\n\n    struct Gains\n    {\n        float kp;\n        float ki;\n        float kd;    \n    };\n\n    PositionController(Gains gains = {0,0,0});\n\n    // Resets error sum to avoid integral windup\n    void reset();\n\n    // Compute control velocity for target position with feedforward velocity\n    float compute(float target_position, float actual_position, float target_velocity, float actual_velocity, float dt);\n\n  private:\n\n    Gains gains_;\n    float error_sum_;\n};\n\nstruct LocalizationMetrics \n{\n    float last_position_uncertainty;\n    float confidence_x;\n    float confidence_y;\n    float confidence_a;\n    float total_confidence;\n};\n\nstruct CameraDebug\n{\n    bool side_ok = false;\n    bool rear_ok = false;\n    bool both_ok = false;\n    float side_u;\n    float side_v;\n    float rear_u;\n    float rear_v;\n    float side_x;\n    float side_y;\n    float rear_x;\n    float rear_y;\n    float pose_x;\n    float pose_y;\n    float pose_a;\n    int loop_ms = 0;\n};\n\n//*******************************************\n//           Misc math\n//*******************************************\n\nfloat vectorMean(const std::vector<float>& data);\n\nfloat vectorStddev(const std::vector<float>& data, float mean);\n\nfloat zScore(float mean, float stddev, float reading);\n\n\n//*******************************************\n//           Strings\n//*******************************************\n\nstd::vector<std::string> parseCommaDelimitedString(const std::string& 
str_in);\nstd::vector<float> parseCommaDelimitedStringToFloat(const std::string& str_in);\n\n\n//*******************************************\n//           Logging\n//*******************************************\n\nstatic std::unique_ptr<plog::IAppender> g_appender;\nstatic std::unique_ptr<plog::Logger<MOTION_CSV_LOG_ID>> g_logger;\nvoid reset_last_motion_logger();\n\n#endif\n"
  },
  {
    "path": "src/robot/test/CameraPipeline_Test.cpp",
    "content": "#include <Catch/catch.hpp>\n\n#include \"camera_tracker/CameraPipeline.h\"\n#include \"test-utils.h\"\n\n\nTEST_CASE(\"Robot coords from image coords\", \"[Camera]\")\n{\n    SafeConfigModifier<bool> config_modifier_1(\"vision_tracker.debug.use_debug_image\", true);\n    SafeConfigModifier<std::string> config_modifier_2(\"vision_tracker.side.debug_image\", \"/home/pi/images/test/TestImage2.jpg\");\n    SafeConfigModifier<std::string> config_modifier_3(\"vision_tracker.rear.debug_image\", \"/home/pi/images/test/TestImage2.jpg\");\n    \n    float side_offset_x = 0.0;\n    float side_offset_y = -1.5;\n    SafeConfigModifier<float> config_modifier_4(\"vision_tracker.physical.side.x_offset\", side_offset_x);\n    SafeConfigModifier<float> config_modifier_5(\"vision_tracker.physical.side.y_offset\", side_offset_y);\n    SafeConfigModifier<float> config_modifier_6(\"vision_tracker.physical.side.z_offset\", 0.55);\n\n    CameraPipeline c(CAMERA_ID::SIDE, /*start_thread=*/ false);\n\n    SECTION(\"Side cam, center point\")\n    {\n        float u = 320/2;\n        float v = 240/2;\n        Eigen::Vector2f xy_world = c.cameraToRobot({u,v});\n        CHECK(xy_world[0] == Approx(side_offset_x).margin(0.2));\n        CHECK(xy_world[1] == Approx(side_offset_y).margin(0.2));\n    }\n    \n    SECTION(\"Side cam, top corner\")\n    {\n        float u = 0;\n        float v = 0;\n        Eigen::Vector2f xy_world = c.cameraToRobot({u,v});\n        CHECK(xy_world[0] < side_offset_x);\n        CHECK(xy_world[1] > side_offset_y);\n    }\n}\n\n// Verify that camera params between actual and test config are synced if this test is giving trouble\nTEST_CASE(\"Marker Detection\", \"[Camera]\")\n{\n    struct TestImageMetadata\n    {\n        std::string name;\n        bool expected_detection;\n        CAMERA_ID id;\n        Eigen::Vector2f expected_point_px = {-1, -1};\n    };\n\n    std::vector<TestImageMetadata> test_images = {\n        {.name = 
\"20210712175430_side_img_raw.jpg\",\n        .expected_detection = true,\n        .id = CAMERA_ID::SIDE},\n        {.name = \"20210712175436_rear_img_raw.jpg\",\n        .expected_detection = true,\n        .id = CAMERA_ID::REAR},\n        {.name = \"20210712175519_side_img_raw.jpg\",\n        .expected_detection = true,\n        .id = CAMERA_ID::SIDE},\n        {.name = \"20210712175525_rear_img_raw.jpg\",\n        .expected_detection = true,\n        .id = CAMERA_ID::REAR},\n        {.name = \"20210712175606_side_img_raw.jpg\",\n        .expected_detection = false,\n        .id = CAMERA_ID::SIDE},\n        {.name = \"20210712175612_rear_img_raw.jpg\",\n        .expected_detection = false,\n        .id = CAMERA_ID::REAR},\n        {.name = \"20210712175719_side_img_raw.jpg\",\n        .expected_detection = false,\n        .id = CAMERA_ID::SIDE},\n        {.name = \"20210712175728_rear_img_raw.jpg\",\n        .expected_detection = false,\n        .id = CAMERA_ID::REAR},\n        {.name = \"20210712175942_rear_img_raw.jpg\",\n        .expected_detection = true,\n        .id = CAMERA_ID::REAR},\n        {.name = \"20210712180047_rear_img_raw.jpg\",\n        .expected_detection = true,\n        .id = CAMERA_ID::REAR},\n        {.name = \"20210712180208_side_img_raw.jpg\",\n        .expected_detection = true,\n        .id = CAMERA_ID::SIDE},\n        {.name = \"20210712180322_side_img_raw.jpg\",\n        .expected_detection = true,\n        .id = CAMERA_ID::SIDE},\n        {.name = \"20210712180446_side_img_raw.jpg\",\n        .expected_detection = true,\n        .id = CAMERA_ID::SIDE},\n        {.name = \"20210712180528_side_img_raw.jpg\",\n        .expected_detection = true,\n        .id = CAMERA_ID::SIDE},\n        {.name = \"20210712182429_side_img_raw.jpg\",\n        .expected_detection = true,\n        .id = CAMERA_ID::SIDE},\n    };\n\n    for (const auto& item : test_images) \n    {\n        std::string image_dir = 
\"/home/pi/DominoRobot/src/robot/test/testdata/new_images/\";\n        std::string image_path = image_dir + item.name;\n        SafeConfigModifier<bool> config_modifier_1(\"vision_tracker.debug.use_debug_image\", true);\n        SafeConfigModifier<std::string> config_modifier_2(\"vision_tracker.side.debug_image\", image_path);\n        SafeConfigModifier<std::string> config_modifier_3(\"vision_tracker.rear.debug_image\", image_path);\n       \n        CameraPipeline c(item.id, /*start_thread=*/ false);\n        c.oneLoop();\n        CameraPipelineOutput output = c.getData();\n\n        PLOGI << \"Checking detection in image \" << item.name;\n        CHECK(output.ok == item.expected_detection);\n        if(item.expected_point_px != Eigen::Vector2f({-1,-1}))\n        {\n            CHECK(output.uv == item.expected_point_px);\n        }\n    }\n}"
  },
  {
    "path": "src/robot/test/CameraTracker_Test.cpp",
    "content": "#include <Catch/catch.hpp>\r\n\r\n#include \"camera_tracker/CameraTracker.h\"\r\n#include \"test-utils.h\"\r\n#include <unistd.h>\r\n\r\nTEST_CASE(\"Nominal case\", \"[Camera]\")\r\n{\r\n    SafeConfigModifier<bool> config_modifier_1(\"vision_tracker.debug.use_debug_image\", true);\r\n    SafeConfigModifier<std::string> config_modifier_2(\"vision_tracker.side.debug_image\", \"/home/pi/images/test/TestImage2.jpg\");\r\n    SafeConfigModifier<std::string> config_modifier_3(\"vision_tracker.rear.debug_image\", \"/home/pi/images/test/TestImage2.jpg\");\r\n\r\n    CameraTracker c(/*start_thread=*/ true);\r\n    int counter = 0;\r\n    CameraTrackerOutput output;\r\n    output.ok = false;\r\n    while(!output.ok && counter < 50){\r\n        c.update();\r\n        output = c.getPoseFromCamera();\r\n        counter++;\r\n        usleep(100000);\r\n    }\r\n    REQUIRE(output.ok == true);\r\n    Point p = output.pose;\r\n    bool any_nonzero = p.x != 0 || p.y != 0 || p.a != 0;\r\n    CHECK(any_nonzero == true);\r\n}\r\n\r\nTEST_CASE(\"Robot pose from coords\", \"[Camera]\")\r\n{\r\n    SafeConfigModifier<bool> config_modifier_1(\"vision_tracker.debug.use_debug_image\", true);\r\n    SafeConfigModifier<std::string> config_modifier_2(\"vision_tracker.side.debug_image\", \"/home/pi/images/test/TestImage2.jpg\");\r\n    SafeConfigModifier<std::string> config_modifier_3(\"vision_tracker.rear.debug_image\", \"/home/pi/images/test/TestImage2.jpg\");\r\n    \r\n    float side_target_x = 0.0;\r\n    float side_target_y = -1.4;\r\n    SafeConfigModifier<float> config_modifier_4(\"vision_tracker.physical.side.x_offset\", side_target_x);\r\n    SafeConfigModifier<float> config_modifier_5(\"vision_tracker.physical.side.y_offset\", side_target_y);\r\n    SafeConfigModifier<float> config_modifier_6(\"vision_tracker.physical.side.z_offset\", 0.55);\r\n    SafeConfigModifier<float> config_modifier_7(\"vision_tracker.physical.side.target_x\", side_target_x);\r\n    
SafeConfigModifier<float> config_modifier_8(\"vision_tracker.physical.side.target_y\", side_target_y);\r\n    float rear_target_x = -1.5;\r\n    float rear_target_y = 0.0;\r\n    SafeConfigModifier<float> config_modifier_9(\"vision_tracker.physical.rear.x_offset\", rear_target_x);\r\n    SafeConfigModifier<float> config_modifier_10(\"vision_tracker.physical.rear.y_offset\", rear_target_y);\r\n    SafeConfigModifier<float> config_modifier_11(\"vision_tracker.physical.rear.z_offset\", 0.55);\r\n    SafeConfigModifier<float> config_modifier_12(\"vision_tracker.physical.rear.target_x\", rear_target_x);\r\n    SafeConfigModifier<float> config_modifier_13(\"vision_tracker.physical.rear.target_y\", rear_target_y);\r\n\r\n    SECTION(\"Exact match, target at center images\")\r\n    {\r\n        CameraTracker c(/*start_thread=*/ false);\r\n        Point p = c.computeRobotPoseFromImagePoints({side_target_x, side_target_y},{rear_target_x,rear_target_y});\r\n        CHECK(p.x == Approx(0).margin(0.0005));\r\n        CHECK(p.y == Approx(0).margin(0.0005));\r\n        CHECK(p.a == Approx(0).margin(0.0005));\r\n    }\r\n    SECTION(\"Offset translation, target at center images\")\r\n    {\r\n        CameraTracker c(/*start_thread=*/ false);\r\n        float x_offset = 0.1;\r\n        float y_offset = -0.2;\r\n        Point p = c.computeRobotPoseFromImagePoints({side_target_x + x_offset, side_target_y + y_offset},\r\n                                                    {rear_target_x + x_offset, rear_target_y + y_offset});\r\n        CHECK(p.x == Approx(-x_offset).margin(0.0005));\r\n        CHECK(p.y == Approx(-y_offset).margin(0.0005));\r\n        CHECK(p.a == Approx(0).margin(0.0005));\r\n    }\r\n    SECTION(\"Offset angle, target at center images\")\r\n    {\r\n        CameraTracker c(/*start_thread=*/ false);\r\n        float a_offset = 5 * M_PI / 180.0;\r\n        Point p = c.computeRobotPoseFromImagePoints({side_target_y * -sin(a_offset), side_target_y * cos(a_offset) 
},\r\n                                                    {rear_target_x * cos(a_offset), rear_target_x * sin(a_offset)});\r\n        CHECK(p.x == Approx(0).margin(0.0005));\r\n        CHECK(p.y == Approx(0).margin(0.0005));\r\n        CHECK(p.a == Approx(-a_offset).margin(0.0005));\r\n    }\r\n}\r\n"
  },
  {
    "path": "src/robot/test/Localization_Test.cpp",
    "content": "// #include <Catch/catch.hpp>\r\n\r\n// #include <random>\r\n\r\n// #include \"Localization.h\"\r\n// #include \"StatusUpdater.h\"\r\n// #include \"test-utils.h\"\r\n\r\n\r\n\r\n// TEST_CASE(\"Simple velocity only\", \"[Localization]\")\r\n// {\r\n//     // Setup x,y offset for mm to be 0 for this simple case\r\n//     SafeConfigModifier<float> mm_x_config_modifier(\"localization.mm_x_offset\", 0.0);\r\n//     SafeConfigModifier<float> mm_y_config_modifier(\"localization.mm_y_offset\", 0.0);\r\n    \r\n//     Localization L;\r\n\r\n//     L.updateVelocityReading({1,0,0}, 1);\r\n//     REQUIRE(L.getVelocity() == Velocity{1,0,0});\r\n//     REQUIRE(L.getPosition() == Point{1,0,0});\r\n// }\r\n\r\n// TEST_CASE(\"Simple position adjustment\", \"[Localization]\")\r\n// {\r\n//     // Use fake perfect motion for simplicity\r\n//     SafeConfigModifier<bool> config_modifier(\"motion.fake_perfect_motion\", true);\r\n\r\n//     SECTION(\"Zero Vel, Same Position, No offset\")\r\n//     {\r\n//         // Setup x,y offset for mm to be 0 for this simple case\r\n//         SafeConfigModifier<float> mm_x_config_modifier(\"localization.mm_x_offset\", 0.0);\r\n//         SafeConfigModifier<float> mm_y_config_modifier(\"localization.mm_y_offset\", 0.0);\r\n//         Localization L;\r\n\r\n//         // \"Move\" robot to target position\r\n//         float move_x = 1.0;\r\n//         float move_y = 0.0;\r\n//         float move_a = 0.0;\r\n//         Point move_target = {move_x, move_y, move_a};\r\n//         L.forceSetPosition(move_target);\r\n\r\n//         // Verify robot is in expected position\r\n//         Point position = L.getPosition();\r\n//         REQUIRE(position.x == Approx(move_x).margin(0.0005));\r\n//         REQUIRE(position.y == Approx(move_y).margin(0.0005));\r\n//         REQUIRE(position.a == Approx(move_a).margin(0.0005));\r\n\r\n//         // Send position update that should match current position\r\n//         
L.updatePositionReading(move_target);\r\n        \r\n//         // Verify position has not changed\r\n//         position = L.getPosition();\r\n//         REQUIRE(position.x == Approx(move_x).margin(0.0005));\r\n//         REQUIRE(position.y == Approx(move_y).margin(0.0005));\r\n//         REQUIRE(position.a == Approx(move_a).margin(0.0005));\r\n//     }\r\n\r\n//     SECTION(\"Zero Vel, Position adjustment, No offset\")\r\n//     {\r\n//         // Setup x,y offset for mm to be 0 for this simple case\r\n//         SafeConfigModifier<float> mm_x_config_modifier(\"localization.mm_x_offset\", 0.0);\r\n//         SafeConfigModifier<float> mm_y_config_modifier(\"localization.mm_y_offset\", 0.0);\r\n//         Localization L;\r\n\r\n//         // \"Move\" robot to target position\r\n//         float move_x = 1.0;\r\n//         float move_y = 0.0;\r\n//         float move_a = 0.0;\r\n//         Point move_target = {move_x, move_y, move_a};\r\n//         L.forceSetPosition(move_target);\r\n\r\n//         // Verify robot is in expected position\r\n//         Point position = L.getPosition();\r\n//         REQUIRE(position.x == Approx(move_x).margin(0.0005));\r\n//         REQUIRE(position.y == Approx(move_y).margin(0.0005));\r\n//         REQUIRE(position.a == Approx(move_a).margin(0.0005));\r\n\r\n//         // Send position update that should trigger adjustment\r\n//         float update_x = 2.0;\r\n//         float update_y = 0.0;\r\n//         float update_a = 0.0;\r\n//         Point update_target = {update_x, update_y, update_a};\r\n//         L.updatePositionReading(update_target);\r\n        \r\n//         // Verify position changed as expected\r\n//         float expected_x = 0.0;\r\n//         position = L.getPosition();\r\n//         REQUIRE(position.x == Approx(expected_x).margin(0.0005));\r\n//         REQUIRE(position.y == Approx(move_y).margin(0.0005));\r\n//         REQUIRE(position.a == Approx(move_a).margin(0.0005));\r\n//     }\r\n\r\n// }\r\n\r\n// 
TEST_CASE(\"Position update with MM offset\", \"[Localization]\")\r\n// {\r\n//     // Setup x,y offset for mm\r\n//     SafeConfigModifier<float> mm_x_config_modifier(\"localization.mm_x_offset\", -100.0);\r\n//     SafeConfigModifier<float> mm_y_config_modifier(\"localization.mm_y_offset\", 10.0);\r\n//     Localization L;\r\n\r\n//     // \"Move\" robot to target position\r\n//     float move_x = 1.0;\r\n//     float move_y = 0.0;\r\n//     float move_a = 3.14;\r\n//     Point move_target = {move_x, move_y, move_a};\r\n//     L.forceSetPosition(move_target);\r\n\r\n//     // Verify robot is in expected position\r\n//     Point position = L.getPosition();\r\n//     REQUIRE(position.x == Approx(move_x).margin(0.0005));\r\n//     REQUIRE(position.y == Approx(move_y).margin(0.0005));\r\n//     REQUIRE(position.a == Approx(move_a).margin(0.0005));\r\n\r\n//     // Send position update that should not trigger adjustment\r\n//     // Since there is an offset, this position update needs to account for that\r\n//     float update_x = 1.1;\r\n//     float update_y = -0.01;\r\n//     float update_a = 3.14;\r\n//     Point update_target = {update_x, update_y, update_a};\r\n//     L.updatePositionReading(update_target);\r\n    \r\n//     // Verify position did not change\r\n//     position = L.getPosition();\r\n//     REQUIRE(position.x == Approx(move_x).margin(0.0005));\r\n//     REQUIRE(position.y == Approx(move_y).margin(0.0005));\r\n//     REQUIRE(position.a == Approx(move_a).margin(0.0005));\r\n// }\r\n\r\n\r\n// // TODO: Re-enable once velocity adjustments are turned back on\r\n// // TEST_CASE(\"Position update with velocity adjustment\", \"[Localization]\") \r\n// // {\r\n\r\n// //     // Setup x,y offset for mm to be 0\r\n// //     SafeConfigModifier<float> mm_x_config_modifier(\"localization.mm_x_offset\", 0.0);\r\n// //     SafeConfigModifier<float> mm_y_config_modifier(\"localization.mm_y_offset\", 0.0);\r\n// //     // Fix these values for this test so that I 
know exactly what they are. This test shouldn't break\r\n// //     // because I modified those values in the config\r\n// //     SafeConfigModifier<float> frac_config_modifier(\"localization.update_fraction_at_zero_vel\", 1.0);\r\n// //     SafeConfigModifier<float> val_config_modifier(\"localization.val_for_zero_update\", 0.1);\r\n// //     Localization L;\r\n\r\n// //     // Start 'robot' moving at specific velocity so we can inject the test conditions\r\n// //     float target_vel_x = 0.05;\r\n// //     float target_vel_y = 0.0;\r\n// //     float target_vel_a = 0.0;\r\n// //     L.updateVelocityReading({target_vel_x,target_vel_y,target_vel_a}, 1);\r\n\r\n// //     // Send position update that should trigger adjustment, but will be adjusted due to velocity\r\n// //     Point position = L.getPosition();\r\n// //     float cur_x = position.x;\r\n// //     float cur_y = position.y;\r\n// //     float update_x = cur_x;  // This tests that an exact match of the position doesn't cause a change\r\n// //     float update_y = 1.0;    // This tests that a difference causes an adjustment modified by velocity\r\n// //     float update_a = 0.0;\r\n// //     L.updatePositionReading({update_x, update_y, update_a});\r\n    \r\n// //     // Verify position changed as expected\r\n// //     position = L.getPosition();\r\n// //     const float val_for_zero_update = cfg.lookup(\"localization.val_for_zero_update\");\r\n// //     float expected_y = (target_vel_x / val_for_zero_update) * (update_y - cur_y) + cur_y;\r\n// //     REQUIRE(position.x == Approx(cur_x).margin(0.0005));\r\n// //     REQUIRE(position.y == Approx(expected_y).margin(0.002)); // Some extra margin here because the robot is still moving\r\n// //     REQUIRE(position.a == Approx(update_a).margin(0.0005));\r\n// // }\r\n\r\n// TEST_CASE(\"Test reading reliability filter\", \"[Localization]\")\r\n// {\r\n//     // Setup x,y offset for mm to be 0\r\n//     SafeConfigModifier<float> 
mm_x_config_modifier(\"localization.mm_x_offset\", 0.0);\r\n//     SafeConfigModifier<float> mm_y_config_modifier(\"localization.mm_y_offset\", 0.0);\r\n//     // Fix these values for this test so that I know exactly what they are. This test shouldn't break\r\n//     // because I modified those values in the config\r\n//     std::default_random_engine generator;\r\n\r\n//     SECTION(\"Filter spurious value\")\r\n//     {\r\n//         Localization L;\r\n//         float update_x = 0.0;\r\n//         float update_y = 0.0; \r\n//         float update_a = 0.0;\r\n//         std::normal_distribution<float> distribution(0.0,0.0001);\r\n\r\n//         // Load in a number of reliable position updates\r\n//         const int n_loops = 10;\r\n//         for (int i = 0; i < n_loops ; i++)\r\n//         {\r\n//             float noise = distribution(generator);\r\n//             L.updatePositionReading({update_x+noise, update_y+noise, update_a+noise});\r\n//         }\r\n\r\n//         // Give a very wrong update value\r\n//         L.updatePositionReading({1000.0, -123.0, 0.0});\r\n\r\n//         // Check that position didn't change\r\n//         Point position = L.getPosition();\r\n//         REQUIRE(position.x == Approx(update_x).margin(0.0005));\r\n//         REQUIRE(position.y == Approx(update_y).margin(0.0005));\r\n//         REQUIRE(position.a == Approx(update_a).margin(0.0005));\r\n//     }\r\n//     SECTION(\"Position updates after shift\")\r\n//     {\r\n//         Localization L;\r\n//         float update_x = 1.0;\r\n//         float update_y = 0.0; \r\n//         float update_a = 0.0;\r\n//         std::normal_distribution<float> distribution(0.0,0.0001);\r\n\r\n//         // Load in a number of reliable position updates\r\n//         const int n_loops = 10;\r\n//         for (int i = 0; i < n_loops ; i++)\r\n//         {\r\n//             float noise = distribution(generator);\r\n//             L.updatePositionReading({update_x+noise, update_y+noise, 
update_a+noise});\r\n//         }\r\n\r\n//         // Shift the position to a new reliable position\r\n//         float new_update_x = 50.0;\r\n//         float new_update_y = 10.0;\r\n//         float new_update_a = -3.0;\r\n//         L.updatePositionReading({new_update_x, new_update_y, new_update_a});\r\n\r\n//         // After first update, position shouldn't change\r\n//         Point position = L.getPosition();\r\n//         REQUIRE(position.x == Approx(update_x).margin(0.0005));\r\n//         REQUIRE(position.y == Approx(update_y).margin(0.0005));\r\n//         REQUIRE(position.a == Approx(update_a).margin(0.0005));\r\n\r\n//         // After many updates, the position should change\r\n//         for (int i = 0; i < 2*n_loops ; i++)\r\n//         {\r\n//             float noise = distribution(generator);\r\n//             L.updatePositionReading({new_update_x+noise, new_update_y+noise, new_update_a+noise});\r\n//         }\r\n//         position = L.getPosition();\r\n//         REQUIRE(position.x == Approx(new_update_x).margin(0.0005));\r\n//         REQUIRE(position.y == Approx(new_update_y).margin(0.0005));\r\n//         REQUIRE(position.a == Approx(new_update_a).margin(0.0005));\r\n//     }\r\n//     SECTION(\"High variance sampling doesn't affect position\")\r\n//     {\r\n//         Localization L;\r\n//         float update_x = 1.0;\r\n//         float update_y = 0.0; \r\n//         float update_a = 0.0;\r\n//         std::normal_distribution<float> distribution(0.0,0.0001);\r\n\r\n//         // Load in a number of reliable position updates\r\n//         const int n_loops = 10;\r\n//         for (int i = 0; i < n_loops ; i++)\r\n//         {\r\n//             float noise = distribution(generator);\r\n//             L.updatePositionReading({update_x+noise, update_y+noise, update_a+noise});\r\n//         }\r\n\r\n//         // Verify position is as expected\r\n//         Point position = L.getPosition();\r\n//         REQUIRE(position.x == 
Approx(update_x).margin(0.0005));\r\n//         REQUIRE(position.y == Approx(update_y).margin(0.0005));\r\n//         REQUIRE(position.a == Approx(update_a).margin(0.0005));\r\n\r\n//         // Now generate very noisy data around new position\r\n//         float new_update_x = 50.0;\r\n//         float new_update_y = 10.0;\r\n//         float new_update_a = -3.0;\r\n//         std::normal_distribution<float> noisy_distribution(0.0,1);\r\n\r\n//         // After many noisy updates, the position should not change\r\n//         for (int i = 0; i < n_loops ; i++)\r\n//         {\r\n//             float noise = noisy_distribution(generator);\r\n//             L.updatePositionReading({new_update_x+noise, new_update_y+noise, new_update_a+noise});\r\n//         }\r\n//         position = L.getPosition();\r\n//         REQUIRE(position.x == Approx(update_x).margin(0.0005));\r\n//         REQUIRE(position.y == Approx(update_y).margin(0.0005));\r\n//         REQUIRE(position.a == Approx(update_a).margin(0.0005));\r\n//     }\r\n\r\n// }"
  },
  {
    "path": "src/robot/test/RobotController_Test.cpp",
    "content": "#include <Catch/catch.hpp>\n\n#include \"RobotController.h\"\n#include \"StatusUpdater.h\"\n#include \"test-utils.h\"\n\n\n// Helper that commands the robot to move to the given position\n// Returns the number of loops run before the robot either stopped or max_loops was hit\nint moveToPositionHelper(RobotController& r, float x, float y, float a, int max_loops)\n{\n    MockSerialComms* mock_serial = build_and_get_mock_serial(CLEARCORE_USB);;\n    MockClockWrapper* mock_clock = get_mock_clock();\n    \n    mock_serial->purge_data();\n    r.moveToPosition(x,y,a);\n    REQUIRE(mock_serial->mock_rcv_base() == \"Power:ON\");\n\n    int count = 0;\n    while(r.isTrajectoryRunning() && count < max_loops)\n    {\n        count++;\n\n        r.update();\n        // Report back the exact velocity commanded\n        std::string cmd_vel = mock_serial->mock_rcv_base();\n        mock_serial->mock_send(\"base:\" + cmd_vel);\n\n        mock_clock->advance_ms(1);\n    }\n    return count;\n}\n\n// A simple position test that checks if the target position is reached within\n// a set amount of time\nvoid coarsePositionTest(float x, float y, float a)\n{\n    // Force all control parameters to 0 for perfect velocity tracking (due to ffd)\n    SafeConfigModifier<float> pos_kp_config_modifier(\"motion.translation.gains.kp\", 0.0);\n    SafeConfigModifier<float> pos_ki_config_modifier(\"motion.translation.gains.ki\", 0.0);\n    SafeConfigModifier<float> pos_kd_config_modifier(\"motion.translation.gains.kd\", 0.0);\n    SafeConfigModifier<float> ang_kp_config_modifier(\"motion.rotation.gains.kp\", 0.0);\n    SafeConfigModifier<float> ang_ki_config_modifier(\"motion.rotation.gains.ki\", 0.0);\n    SafeConfigModifier<float> ang_kd_config_modifier(\"motion.rotation.gains.kd\", 0.0);\n    SafeConfigModifier<float> limit_config_modifier(\"motion.limit_max_fraction\", 0.8);\n    \n    int max_loop_counts =  100000;\n    reset_mock_clock(); // Need to reset before creating 
RobotController which instantiates timers\n    StatusUpdater s;\n    RobotController r = RobotController(s);\n\n    int num_loops = moveToPositionHelper(r, x, y, a, max_loop_counts);\n\n    CHECK(num_loops != max_loop_counts);\n    CHECK(r.isTrajectoryRunning() == false);\n\n    StatusUpdater::Status status = s.getStatus();\n    REQUIRE(status.pos_x == Approx(x).margin(0.0005));\n    REQUIRE(status.pos_y == Approx(y).margin(0.0005));\n    REQUIRE(status.pos_a == Approx(a).margin(0.0005));\n}\n\nTEST_CASE(\"RobotController class\", \"[RobotController]\")\n{\n    StatusUpdater s;\n    RobotController r = RobotController(s);\n\n    r.moveToPosition(1,1,1);\n    r.update();\n\n    REQUIRE(r.isTrajectoryRunning() == true);    \n}\n\nTEST_CASE(\"Motor enable and disable\", \"[RobotController]\")\n{\n    MockSerialComms* mock_serial = build_and_get_mock_serial(CLEARCORE_USB);;\n    StatusUpdater s;\n    RobotController r = RobotController(s);\n\n    mock_serial->purge_data();\n    r.enableAllMotors();\n    REQUIRE(mock_serial->mock_rcv_base() == \"Power:ON\");\n\n    mock_serial->purge_data();\n    r.disableAllMotors();\n    REQUIRE(mock_serial->mock_rcv_base() == \"Power:OFF\");\n}\n\nTEST_CASE(\"Simple coarse motion\", \"[RobotController]\")\n{\n    \n    SECTION(\"Straight forward\")\n    {\n        coarsePositionTest(0.5,0,0);\n    }\n    SECTION(\"Diagonal\")\n    {\n        coarsePositionTest(0.5,0.5,0);\n    }\n    SECTION(\"Positive rotation\")\n    {\n        coarsePositionTest(0,0,0.5);\n    }\n    SECTION(\"Negative rotation\")\n    {\n        coarsePositionTest(0,0,-0.5);\n    }\n    SECTION(\"All directions\")\n    {\n        coarsePositionTest(0.1,0.1,-0.5);\n    }\n}\n\nvoid fakeMotionHelper(float x, float y, float a, int max_loops, StatusUpdater& s)\n{\n    MockClockWrapper* mock_clock = get_mock_clock_and_reset();\n    MockSerialComms* mock_serial = build_and_get_mock_serial(CLEARCORE_USB);;\n    mock_serial->purge_data();\n    RobotController r = 
RobotController(s);\n    r.moveToPosition(x,y,a);\n\n    int count = 0;\n    while(r.isTrajectoryRunning() && count < max_loops)\n    {\n        count++;\n        r.update();\n        mock_clock->advance_us(1000);\n    }\n}\n\nTEST_CASE(\"Validate fake_perfect_motion option\", \"[RobotController]\")\n{\n    StatusUpdater s;\n    float x = 0.5;\n    float y = 0.5;\n    float a = 0.5;\n    int test_counts = 10000;\n\n    // Prior to fake motion enabled, the robot should not move\n    fakeMotionHelper(x,y,a,test_counts,s);\n    StatusUpdater::Status status = s.getStatus();\n    REQUIRE(status.pos_x != Approx(x).margin(0.0005));\n    REQUIRE(status.pos_y != Approx(y).margin(0.0005));\n    REQUIRE(status.pos_a != Approx(a).margin(0.0005));\n\n    {\n        // Enable fake_perfect_motion with config modifier, which should reset when going out of scope\n        SafeConfigModifier<bool> config_modifier(\"motion.fake_perfect_motion\", true);\n        // Now this should work\n        fakeMotionHelper(x,y,a,test_counts,s);\n        status = s.getStatus();\n        REQUIRE(status.pos_x == Approx(x).margin(0.0005));\n        REQUIRE(status.pos_y == Approx(y).margin(0.0005));\n        REQUIRE(status.pos_a == Approx(a).margin(0.0005));\n    }\n\n    // Fake perfect motion should be reset, now it shouldn't move again\n    fakeMotionHelper(x,y,a,test_counts,s);\n    status = s.getStatus();\n    REQUIRE(status.pos_x != Approx(x).margin(0.0005));\n    REQUIRE(status.pos_y != Approx(y).margin(0.0005));\n    REQUIRE(status.pos_a != Approx(a).margin(0.0005));\n}\n\n\n\n\n// Trajectory generation can no longer trigger an error. 
Re-enable this test once something can trigger errors again\n// TEST_CASE(\"Block on error\", \"[RobotController]\")\n// {\n//     MockSerialComms* mock_serial = build_and_get_mock_serial(CLEARCORE_USB);\n//     StatusUpdater s;\n//     RobotController r = RobotController(s);\n\n//     mock_serial->purge_data();\n//     r.moveToPosition(0.00001,10000,0.0000001);\n//     REQUIRE(s.getErrorStatus() == true);\n//     REQUIRE(s.getInProgress() == false);\n//     REQUIRE(r.isTrajectoryRunning() == false); \n//     r.update();\n//     REQUIRE(mock_serial->mock_rcv_base() != \"Power:ON\");\n//     REQUIRE(s.getErrorStatus() == true);\n//     REQUIRE(s.getInProgress() == false);\n//     REQUIRE(r.isTrajectoryRunning() == false); \n// }\n\n// TEST_CASE(\"Benchmarking motion\", \"[RobotController]\")\n// {\n//     MockSerialComms* mock_serial = build_and_get_mock_serial(CLEARCORE_USB);\n//     StatusUpdater s;\n//     RobotController r = RobotController(s);\n\n//     mock_serial->purge_data();\n//     r.moveToPosition(10,0,0);\n//     REQUIRE(mock_serial->mock_rcv_base() == \"Power:ON\");\n\n//     BENCHMARK(\"RobotController benchmark\")\n//     {\n//         r.update();\n//         // Report back the exact velocity commanded\n//         std::string cmd_vel = mock_serial->mock_rcv_base();\n//         mock_serial->mock_send(cmd_vel);\n//     };\n// }"
  },
  {
    "path": "src/robot/test/RobotServer_Test.cpp",
    "content": "#include <Catch/catch.hpp>\n\n#include \"RobotServer.h\"\n#include \"StatusUpdater.h\"\n#include \"test-utils.h\"\n#include \"sockets/MockSocketMultiThreadWrapper.h\"\n\n\nvoid testSimpleCommand(RobotServer& r, std::string msg, std::string expected_resp, COMMAND expected_command)\n{\n    MockSocketMultiThreadWrapper* mock_socket = build_and_get_mock_socket();\n    mock_socket->sendMockData(msg);\n    \n    COMMAND c = r.oneLoop();\n\n    std::string resp = mock_socket->getMockData();\n    REQUIRE(resp == expected_resp);\n    REQUIRE(c == expected_command);\n}\n\n\nTEST_CASE(\"Move\", \"[RobotServer]\")\n{\n    std::string msg = \"<{'type':'move','data':{'x':1,'y':2,'a':3}}>\";\n    std::string expected_response = \"<{\\\"type\\\":\\\"ack\\\",\\\"data\\\":\\\"move\\\"}>\";\n    COMMAND expected_command = COMMAND::MOVE;\n\n    StatusUpdater s;\n    RobotServer r = RobotServer(s);\n    testSimpleCommand(r, msg, expected_response, expected_command);\n\n    RobotServer::PositionData data = r.getMoveData();\n    REQUIRE(data.x == 1);\n    REQUIRE(data.y == 2);\n    REQUIRE(data.a == 3);\n}\n\nTEST_CASE(\"Move rel\", \"[RobotServer]\")\n{\n    std::string msg = \"<{'type':'move_rel','data':{'x':1,'y':2,'a':3}}>\";\n    std::string expected_response = \"<{\\\"type\\\":\\\"ack\\\",\\\"data\\\":\\\"move_rel\\\"}>\";\n    COMMAND expected_command = COMMAND::MOVE_REL;\n\n    StatusUpdater s;\n    RobotServer r = RobotServer(s);\n    testSimpleCommand(r, msg, expected_response, expected_command);\n\n    RobotServer::PositionData data = r.getMoveData();\n    REQUIRE(data.x == 1);\n    REQUIRE(data.y == 2);\n    REQUIRE(data.a == 3);\n}\n\nTEST_CASE(\"Move fine\", \"[RobotServer]\")\n{\n    std::string msg = \"<{'type':'move_fine','data':{'x':1,'y':2,'a':3}}>\";\n    std::string expected_response = \"<{\\\"type\\\":\\\"ack\\\",\\\"data\\\":\\\"move_fine\\\"}>\";\n    COMMAND expected_command = COMMAND::MOVE_FINE;\n\n    StatusUpdater s;\n    RobotServer r = 
RobotServer(s);\n    testSimpleCommand(r, msg, expected_response, expected_command);\n\n    RobotServer::PositionData data = r.getMoveData();\n    REQUIRE(data.x == 1);\n    REQUIRE(data.y == 2);\n    REQUIRE(data.a == 3);\n}\n\n\nTEST_CASE(\"Move const vel\", \"[RobotServer]\")\n{\n    std::string msg = \"<{'type':'move_const_vel','data':{'vx':1,'vy':2,'va':3,'t':4}}>\";\n    std::string expected_response = \"<{\\\"type\\\":\\\"ack\\\",\\\"data\\\":\\\"move_const_vel\\\"}>\";\n    COMMAND expected_command = COMMAND::MOVE_CONST_VEL;\n\n    StatusUpdater s;\n    RobotServer r = RobotServer(s);\n    testSimpleCommand(r, msg, expected_response, expected_command);\n\n    RobotServer::VelocityData data = r.getVelocityData();\n    REQUIRE(data.vx == 1);\n    REQUIRE(data.vy == 2);\n    REQUIRE(data.va == 3);\n    REQUIRE(data.t == 4);\n}\n\nTEST_CASE(\"Place\", \"[RobotServer]\")\n{\n    std::string msg = \"<{'type':'place'}>\";\n    std::string expected_response = \"<{\\\"type\\\":\\\"ack\\\",\\\"data\\\":\\\"place\\\"}>\";\n    COMMAND expected_command = COMMAND::PLACE_TRAY;\n\n    StatusUpdater s;\n    RobotServer r = RobotServer(s);\n    testSimpleCommand(r, msg, expected_response, expected_command);\n}\n\nTEST_CASE(\"Load\", \"[RobotServer]\")\n{\n    std::string msg = \"<{'type':'load'}>\";\n    std::string expected_response = \"<{\\\"type\\\":\\\"ack\\\",\\\"data\\\":\\\"load\\\"}>\";\n    COMMAND expected_command = COMMAND::LOAD_TRAY;\n\n    StatusUpdater s;\n    RobotServer r = RobotServer(s);\n    testSimpleCommand(r, msg, expected_response, expected_command);\n}\n\n\nTEST_CASE(\"Init tray\", \"[RobotServer]\")\n{\n    std::string msg = \"<{'type':'init'}>\";\n    std::string expected_response = \"<{\\\"type\\\":\\\"ack\\\",\\\"data\\\":\\\"init\\\"}>\";\n    COMMAND expected_command = COMMAND::INITIALIZE_TRAY;\n\n    StatusUpdater s;\n    RobotServer r = RobotServer(s);\n    testSimpleCommand(r, msg, expected_response, expected_command);\n}\n\nTEST_CASE(\"Send 
position\", \"[RobotServer]\")\n{\n    std::string msg = \"<{'type':'p','data':{'x':1,'y':2,'a':3}}>\";\n    std::string expected_response = \"<{\\\"type\\\":\\\"ack\\\",\\\"data\\\":\\\"p\\\"}>\";\n    COMMAND expected_command = COMMAND::POSITION;\n\n    StatusUpdater s;\n    RobotServer r = RobotServer(s);\n    testSimpleCommand(r, msg, expected_response, expected_command);\n\n    RobotServer::PositionData data = r.getPositionData();\n    REQUIRE(data.x == 1);\n    REQUIRE(data.y == 2);\n    REQUIRE(data.a == 3);\n}\n\nTEST_CASE(\"Estop\", \"[RobotServer]\")\n{\n    std::string msg = \"<{'type':'estop'}>\";\n    std::string expected_response = \"<{\\\"type\\\":\\\"ack\\\",\\\"data\\\":\\\"estop\\\"}>\";\n    COMMAND expected_command = COMMAND::ESTOP;\n\n    StatusUpdater s;\n    RobotServer r = RobotServer(s);\n    testSimpleCommand(r, msg, expected_response, expected_command);\n}\n\n\nTEST_CASE(\"Load complete\", \"[RobotServer]\")\n{\n    std::string msg = \"<{'type':'lc'}>\";\n    std::string expected_response = \"<{\\\"type\\\":\\\"ack\\\",\\\"data\\\":\\\"lc\\\"}>\";\n    COMMAND expected_command = COMMAND::LOAD_COMPLETE;\n\n    StatusUpdater s;\n    RobotServer r = RobotServer(s);\n    testSimpleCommand(r, msg, expected_response, expected_command);\n}"
  },
  {
    "path": "src/robot/test/Robot_Test.cpp",
    "content": "#include <Catch/catch.hpp>\n\n#include \"robot.h\"\n\n#include \"test-utils.h\"\n\nTEST_CASE(\"Robot move\", \"[Robot]\")\n{\n    std::string msg = \"<{'type':'move','data':{'x':0.5,'y':0.4,'a':0.3}}>\";\n    SafeConfigModifier<bool> config_modifier(\"motion.fake_perfect_motion\", true);\n    \n    MockClockWrapper* mock_clock = get_mock_clock_and_reset();\n    MockSocketMultiThreadWrapper* mock_socket = build_and_get_mock_socket();\n    Robot r = Robot();\n\n    mock_socket->sendMockData(msg);\n    mock_clock->advance_ms(1);\n\n    r.runOnce();\n    REQUIRE(r.getCurrentCommand() == COMMAND::MOVE);\n    REQUIRE(r.getStatus().in_progress == true);\n\n    for (int i = 0; i < 10000; i++) \n    {\n        r.runOnce();\n        mock_clock->advance_ms(1);\n        if(r.getStatus().in_progress == false) {break;}\n    }\n\n    StatusUpdater::Status status = r.getStatus();\n    REQUIRE(status.pos_x == Approx(0.5).margin(0.0005));\n    REQUIRE(status.pos_y == Approx(0.4).margin(0.0005));\n    REQUIRE(status.pos_a == Approx(0.3).margin(0.0005));\n\n}"
  },
  {
    "path": "src/robot/test/SanityCheckTest.cpp",
    "content": "#include <Catch/catch.hpp>\n\n#include \"constants.h\"\n\n// TEST_CASE(\"Mock socket config disabled\", \"[SanityCheck]\")\n// {\n//     bool mock_socket_enabled = cfg.lookup(\"mock_socket.enabled\");\n//     REQUIRE(mock_socket_enabled == false);\n// }\n\nTEST_CASE(\"Fake perfect motion disabled\", \"[SanityCheck]\")\n{\n    bool fake_perfect_motion_enabled = cfg.lookup(\"motion.fake_perfect_motion\");\n    REQUIRE(fake_perfect_motion_enabled == false);\n}"
  },
  {
    "path": "src/robot/test/SmoothTrajectoryGenerator_Test.cpp",
    "content": "#include <Catch/catch.hpp>\n\n#include \"SmoothTrajectoryGenerator.h\"\n#include \"constants.h\"\n\n// This stuff is needed to correctly print a custom type in Catch for debug\nnamespace Catch {\ntemplate<>\nstruct StringMaker<Point> \n{\n    static std::string convert(Point const& p) \n    {\n        return p.toString();\n    }\n};\ntemplate<>\nstruct StringMaker<Velocity> \n{\n    static std::string convert(Velocity const& p) \n    {\n        return p.toString();\n    }\n};\n}\n\nTEST_CASE(\"SmoothTrajectoryGenerator class\", \"[trajectory]\")\n{\n    SECTION(\"Smoke test - don't crash\")\n    {\n        SmoothTrajectoryGenerator stg;\n        Point p1 = {3,4,5};\n        Point p2 = {1,2,3};\n        LIMITS_MODE limits_mode = LIMITS_MODE::COARSE;\n\n        bool ok = stg.generatePointToPointTrajectory(p1, p2, limits_mode);\n        REQUIRE(ok == true);\n\n        PVTPoint output = stg.lookup(1.0);\n        REQUIRE(output.time == 1.0);\n\n        limits_mode = LIMITS_MODE::FINE;\n        ok = stg.generatePointToPointTrajectory(p1, p2, limits_mode);\n        REQUIRE(ok == true);\n\n        output = stg.lookup(1.0);\n        REQUIRE(output.time == 1.0);\n    }\n\n    SECTION(\"Final position\")\n    {\n        SmoothTrajectoryGenerator stg;\n        Point p1 = {0,0,0};\n        Point p2 = {1,2,3};\n        LIMITS_MODE limits_mode = LIMITS_MODE::COARSE;\n\n        bool ok = stg.generatePointToPointTrajectory(p1, p2, limits_mode);\n        REQUIRE(ok == true);\n\n        //Way in the future should return the final point\n        PVTPoint output = stg.lookup(60);\n        REQUIRE(output.time == 60);\n        CHECK(output.position.x == Approx(p2.x));\n        CHECK(output.position.y == Approx(p2.y));\n        CHECK(output.position.a == Approx(p2.a));\n        CHECK(output.velocity.vx == Approx(0.0).margin(0.0001));\n        CHECK(output.velocity.vy == Approx(0.0).margin(0.0001));\n        CHECK(output.velocity.va == Approx(0.0f).margin(0.0001));\n\n      
  limits_mode = LIMITS_MODE::FINE;\n        ok = stg.generatePointToPointTrajectory(p1, p2, limits_mode);\n        REQUIRE(ok == true);\n\n        //Way in the future should return the final point\n        output = stg.lookup(60);\n        REQUIRE(output.time == 60);\n        CHECK(output.position.x == Approx(p2.x));\n        CHECK(output.position.y == Approx(p2.y));\n        CHECK(output.position.a == Approx(p2.a));\n        CHECK(output.velocity.vx == Approx(0.0).margin(0.0001));\n        CHECK(output.velocity.vy == Approx(0.0).margin(0.0001));\n        CHECK(output.velocity.va == Approx(0.0).margin(0.0001));\n    }\n\n    SECTION(\"Zeros\")\n    {\n        SmoothTrajectoryGenerator stg;\n        Point p1 = {0,0,0};\n        Point p2 = {10,0,0};\n        LIMITS_MODE limits_mode = LIMITS_MODE::COARSE;\n\n        bool ok = stg.generatePointToPointTrajectory(p1, p2, limits_mode);\n        REQUIRE(ok == true);\n\n        //Way in the future should return the final point\n        PVTPoint output = stg.lookup(60);\n        REQUIRE(output.time == 60);\n        CHECK(output.position.x == Approx(p2.x));\n        CHECK(output.position.y == Approx(p2.y));\n        CHECK(output.position.a == Approx(p2.a));\n        CHECK(output.velocity.vx == Approx(0));\n        CHECK(output.velocity.vy == Approx(0));\n        CHECK(output.velocity.va == Approx(0).margin(0.0001));\n    }\n\n    SECTION(\"Lookup too early\")\n    {\n        SmoothTrajectoryGenerator stg;\n        Point p1 = {5,4,3};\n        Point p2 = {10,0,0};\n        LIMITS_MODE limits_mode = LIMITS_MODE::COARSE;\n\n        bool ok = stg.generatePointToPointTrajectory(p1, p2, limits_mode);\n        REQUIRE(ok == true);\n\n        PVTPoint output = stg.lookup(0);\n        REQUIRE(output.time == 0);\n        CHECK(output.position.x == Approx(p1.x));\n        CHECK(output.position.y == Approx(p1.y));\n        CHECK(output.position.a == Approx(p1.a));\n        CHECK(output.velocity.vx == Approx(0));\n        
CHECK(output.velocity.vy == Approx(0));\n        CHECK(output.velocity.va == Approx(0).margin(0.0001));\n\n        output = stg.lookup(-1);\n        REQUIRE(output.time == -1);\n        CHECK(output.position.x == Approx(p1.x));\n        CHECK(output.position.y == Approx(p1.y));\n        CHECK(output.position.a == Approx(p1.a));\n        CHECK(output.velocity.vx == Approx(0));\n        CHECK(output.velocity.vy == Approx(0));\n        CHECK(output.velocity.va == Approx(0).margin(0.0001));\n    }\n\n    SECTION(\"Angle wrap\")\n    {\n        SmoothTrajectoryGenerator stg;\n        Point p1 = {0,0,3};\n        Point p2 = {0,0,-3};\n        LIMITS_MODE limits_mode = LIMITS_MODE::COARSE;\n\n        bool ok = stg.generatePointToPointTrajectory(p1, p2, limits_mode);\n        REQUIRE(ok == true);\n\n        //Way in the future should return the final point\n        PVTPoint output = stg.lookup(60);\n        REQUIRE(output.time == 60);\n        CHECK(output.position.x == Approx(p2.x));\n        CHECK(output.position.y == Approx(p2.y));\n        CHECK(output.position.a == Approx(p2.a));\n        CHECK(output.velocity.vx == Approx(0));\n        CHECK(output.velocity.vy == Approx(0));\n        CHECK(output.velocity.va == Approx(0).margin(0.0001));\n\n        for (float t = 0; t < 10; t+=0.1)\n        {\n            PVTPoint output = stg.lookup(t);\n            REQUIRE(fabs(output.position.a) < 3.14);\n            REQUIRE(fabs(output.position.a) > 2.99);\n        }\n    }\n\n    SECTION(\"Angle non-wrap\")\n    {\n        SmoothTrajectoryGenerator stg;\n        Point p1 = {0,0,-1};\n        Point p2 = {0,0,1};\n        LIMITS_MODE limits_mode = LIMITS_MODE::COARSE;\n\n        bool ok = stg.generatePointToPointTrajectory(p1, p2, limits_mode);\n        REQUIRE(ok == true);\n\n        //Way in the future should return the final point\n        PVTPoint output = stg.lookup(60);\n        REQUIRE(output.time == 60);\n        CHECK(output.position.x == Approx(p2.x));\n        
CHECK(output.position.y == Approx(p2.y));\n        CHECK(output.position.a == Approx(p2.a));\n        CHECK(output.velocity.vx == Approx(0));\n        CHECK(output.velocity.vy == Approx(0));\n        CHECK(output.velocity.va == Approx(0).margin(0.0001));\n\n        for (float t = 0; t < 10; t+=0.1)\n        {\n            PVTPoint output = stg.lookup(t);\n            REQUIRE(fabs(output.position.a) < 1.01);\n        }\n    }\n}\n\nTEST_CASE(\"BuildMotionPlanningProblem\", \"[trajectory]\")\n{\n    Point p1 = {0,0,0};\n    Point p2 = {10,0,-3};\n    LIMITS_MODE limits_mode;\n    SolverParameters solver;\n    solver.num_loops = 10;\n    solver.alpha_decay = 0.8;\n    solver.beta_decay = 0.8;\n    solver.exponent_decay = 0.1;\n\n    Eigen::Vector3f expected_p1 = {0,0,0};\n    Eigen::Vector3f expected_p2 = {10,0,-3};\n\n    REQUIRE(static_cast<std::string>(cfg.lookup(\"name\")) == \"Test constants\");\n    REQUIRE(static_cast<float>(cfg.lookup(\"motion.limit_max_fraction\")) == 1.0);\n\n    SECTION (\"Coarse mode\")\n    {\n        limits_mode = LIMITS_MODE::COARSE;\n        MotionPlanningProblem mpp = buildMotionPlanningProblem(p1, p2, limits_mode, solver);\n        REQUIRE(mpp.initialPoint == expected_p1);\n        REQUIRE(mpp.targetPoint == expected_p2);\n        REQUIRE(mpp.translationalLimits.max_vel == static_cast<float>(cfg.lookup(\"motion.translation.max_vel.coarse\")));\n        REQUIRE(mpp.translationalLimits.max_acc == static_cast<float>(cfg.lookup(\"motion.translation.max_acc.coarse\")));\n        REQUIRE(mpp.translationalLimits.max_jerk == static_cast<float>(cfg.lookup(\"motion.translation.max_jerk.coarse\")));\n        REQUIRE(mpp.rotationalLimits.max_vel ==static_cast<float>( cfg.lookup(\"motion.rotation.max_vel.coarse\")));\n        REQUIRE(mpp.rotationalLimits.max_acc == static_cast<float>(cfg.lookup(\"motion.rotation.max_acc.coarse\")));\n        REQUIRE(mpp.rotationalLimits.max_jerk == 
static_cast<float>(cfg.lookup(\"motion.rotation.max_jerk.coarse\")));\n        REQUIRE(mpp.solver_params.num_loops == solver.num_loops);\n        REQUIRE(mpp.solver_params.alpha_decay == solver.alpha_decay);\n        REQUIRE(mpp.solver_params.beta_decay == solver.beta_decay);\n        REQUIRE(mpp.solver_params.exponent_decay == solver.exponent_decay);\n    }\n\n    SECTION (\"Fine mode\")\n    {\n        limits_mode = LIMITS_MODE::FINE;\n        MotionPlanningProblem mpp = buildMotionPlanningProblem(p1, p2, limits_mode, solver);\n        REQUIRE(mpp.initialPoint == expected_p1);\n        REQUIRE(mpp.targetPoint == expected_p2);\n        REQUIRE(mpp.translationalLimits.max_vel == static_cast<float>(cfg.lookup(\"motion.translation.max_vel.fine\")));\n        REQUIRE(mpp.translationalLimits.max_acc == static_cast<float>(cfg.lookup(\"motion.translation.max_acc.fine\")));\n        REQUIRE(mpp.translationalLimits.max_jerk == static_cast<float>(cfg.lookup(\"motion.translation.max_jerk.fine\")));\n        REQUIRE(mpp.rotationalLimits.max_vel ==static_cast<float>( cfg.lookup(\"motion.rotation.max_vel.fine\")));\n        REQUIRE(mpp.rotationalLimits.max_acc == static_cast<float>(cfg.lookup(\"motion.rotation.max_acc.fine\")));\n        REQUIRE(mpp.rotationalLimits.max_jerk == static_cast<float>(cfg.lookup(\"motion.rotation.max_jerk.fine\")));\n        REQUIRE(mpp.solver_params.num_loops == solver.num_loops);\n        REQUIRE(mpp.solver_params.alpha_decay == solver.alpha_decay);\n        REQUIRE(mpp.solver_params.beta_decay == solver.beta_decay);\n        REQUIRE(mpp.solver_params.exponent_decay == solver.exponent_decay);\n    }\n}\n\nTEST_CASE(\"generateTrajectory\", \"[trajectory]\")\n{\n    Point p1 = {0,0,0};\n    Point p2 = {10,0,-3};\n    LIMITS_MODE limits_mode = LIMITS_MODE::COARSE;\n    SolverParameters solver = {25, 0.8, 0.8, 0.1};\n    MotionPlanningProblem mpp = buildMotionPlanningProblem(p1, p2, limits_mode, solver);\n\n    Trajectory traj = 
generateTrajectory(mpp);\n\n    REQUIRE(traj.complete == true);\n    CHECK(traj.initialPoint == p1);\n    CHECK(traj.trans_direction(0) == 1);\n    CHECK(traj.trans_direction(1) == 0);\n    CHECK(traj.rot_direction == -1);\n}\n\nTEST_CASE(\"generateSCurve\", \"[trajectory]\")\n{\n    float dist = 10;\n    DynamicLimits limits = {1, 2, 8};\n    SolverParameters solver = {25, 0.8, 0.8, 0.1};\n    SCurveParameters params;\n    bool ok = generateSCurve(dist, limits, solver, &params);\n    REQUIRE(ok == true);\n    REQUIRE(params.v_lim == 1.0);\n    REQUIRE(params.a_lim == 2.0);\n    REQUIRE(params.j_lim == 8.0);\n    float dt_v = 9.25; // Expected values\n    float dt_a = 0.25;\n    float dt_j = 0.25;\n    CHECK(params.switch_points[0].t == 0);\n    CHECK(params.switch_points[1].t == dt_j);\n    CHECK(params.switch_points[2].t == dt_j + dt_a);\n    CHECK(params.switch_points[3].t == 2*dt_j + dt_a);\n    CHECK(params.switch_points[4].t == 2*dt_j + dt_a + dt_v);\n    CHECK(params.switch_points[5].t == 3*dt_j + dt_a + dt_v);\n    CHECK(params.switch_points[6].t == 3*dt_j + 2*dt_a + dt_v);\n    CHECK(params.switch_points[7].t == 4*dt_j + 2*dt_a + dt_v);\n\n    SECTION(\"Zero\")\n    {\n        float dist = 0;\n        DynamicLimits limits = {1, 2, 8};\n        SolverParameters solver = {25, 0.8, 0.8, 0.1};\n        SCurveParameters params;\n        bool ok = generateSCurve(dist, limits, solver, &params);\n        REQUIRE(ok == true);\n        REQUIRE(params.v_lim == 0);\n        REQUIRE(params.a_lim == 0);\n        REQUIRE(params.j_lim == 0);\n        for (int i = 0; i < 8; i++)\n        {\n            CHECK(params.switch_points[i].t == 0);\n            CHECK(params.switch_points[i].p == 0);\n            CHECK(params.switch_points[i].v == 0);\n            CHECK(params.switch_points[i].a == 0);\n        }\n    }\n\n    SECTION(\"Very close to zero\")\n    {\n        float dist = 0.000001;\n        DynamicLimits limits = {1, 2, 8};\n        SolverParameters solver = {25, 
0.8, 0.8, 0.1};\n        SCurveParameters params;\n        bool ok = generateSCurve(dist, limits, solver, &params);\n        REQUIRE(ok == true);\n        REQUIRE(params.v_lim == 0);\n        REQUIRE(params.a_lim == 0);\n        REQUIRE(params.j_lim == 0);\n        for (int i = 0; i < 8; i++)\n        {\n            CHECK(params.switch_points[i].t == 0);\n            CHECK(params.switch_points[i].p == 0);\n            CHECK(params.switch_points[i].v == 0);\n            CHECK(params.switch_points[i].a == 0);\n        }\n    }\n\n    SECTION(\"Small dist that is above min_dist_limit\")\n    {\n        float dist = 0.0001;\n        DynamicLimits limits = {4, 2, 1};\n        SolverParameters solver = {30, 0.8, 0.8, 0.1};\n        SCurveParameters params;\n        bool ok = generateSCurve(dist, limits, solver, &params);\n        REQUIRE(ok == true);\n        REQUIRE(params.v_lim < 4);\n        REQUIRE(params.a_lim < 2);\n        REQUIRE(params.j_lim == 1);\n    }\n\n    SECTION(\"Modified v limit\")\n    {\n        float dist = 10;\n        DynamicLimits limits = {3, 1, 1};\n        SolverParameters solver = {25, 0.8, 0.8, 0.1};\n        SCurveParameters params;\n        bool ok = generateSCurve(dist, limits, solver, &params);\n        REQUIRE(ok == true);\n        REQUIRE(params.v_lim < 3);\n        REQUIRE(params.a_lim == 1);\n        REQUIRE(params.j_lim == 1);\n    }\n\n    SECTION(\"Modified a limit\")\n    {\n        float dist = 10;\n        DynamicLimits limits = {1, 2, 1};\n        SolverParameters solver = {25, 0.8, 0.8, 0.1};\n        SCurveParameters params;\n        bool ok = generateSCurve(dist, limits, solver, &params);\n        REQUIRE(ok == true);\n        REQUIRE(params.v_lim == 1);\n        REQUIRE(params.a_lim < 2);\n        REQUIRE(params.j_lim == 1);\n    }\n\n    SECTION(\"Modified both limits\")\n    {\n        float dist = 10;\n        DynamicLimits limits = {4, 2, 1};\n        SolverParameters solver = {25, 0.8, 0.8, 0.1};\n        
SCurveParameters params;\n        bool ok = generateSCurve(dist, limits, solver, &params);\n        REQUIRE(ok == true);\n        REQUIRE(params.v_lim < 4);\n        REQUIRE(params.a_lim < 2);\n        REQUIRE(params.j_lim == 1);\n    }\n}\n\nTEST_CASE(\"populateSwitchTimeParameters\", \"[trajectory]\")\n{\n    SCurveParameters params;\n    params.v_lim = 1.0;\n    params.a_lim = 2.0;\n    params.j_lim = 8.0;\n    float dt_v = 9.25;\n    float dt_a = 0.25;\n    float dt_j = 0.25;\n\n    populateSwitchTimeParameters(&params, dt_j, dt_a, dt_v);\n\n    REQUIRE(params.switch_points[0].t == Approx(0));\n    REQUIRE(params.switch_points[0].p == Approx(0));\n    REQUIRE(params.switch_points[0].v == Approx(0));\n    REQUIRE(params.switch_points[0].a == Approx(0));\n\n    CHECK(params.switch_points[1].t == Approx(0.25).margin(0.001));\n    CHECK(params.switch_points[1].p == Approx(0.02083).margin(0.001));\n    CHECK(params.switch_points[1].v == Approx(0.25).margin(0.001));\n    CHECK(params.switch_points[1].a == Approx(2.0).margin(0.001));\n\n    CHECK(params.switch_points[2].t == Approx(0.5).margin(0.001));\n    CHECK(params.switch_points[2].p == Approx(0.1458).margin(0.001));\n    CHECK(params.switch_points[2].v == Approx(0.75).margin(0.001));\n    CHECK(params.switch_points[2].a == Approx(2.0).margin(0.001));\n\n    CHECK(params.switch_points[3].t == Approx(0.75).margin(0.001));\n    CHECK(params.switch_points[3].p == Approx(0.375).margin(0.001));\n    CHECK(params.switch_points[3].v == Approx(1.0).margin(0.001));\n    CHECK(params.switch_points[3].a == Approx(0).margin(0.001));\n\n    CHECK(params.switch_points[4].t == Approx(10.0).margin(0.001));\n    CHECK(params.switch_points[4].p == Approx(9.625).margin(0.001));\n    CHECK(params.switch_points[4].v == Approx(1.0).margin(0.001));\n    CHECK(params.switch_points[4].a == Approx(0).margin(0.001));\n\n    CHECK(params.switch_points[5].t == Approx(10.25).margin(0.001));\n    CHECK(params.switch_points[5].p == 
Approx(9.854).margin(0.001));\n    CHECK(params.switch_points[5].v == Approx(0.75).margin(0.001));\n    CHECK(params.switch_points[5].a == Approx(-2.0).margin(0.001));\n\n    CHECK(params.switch_points[6].t == Approx(10.5).margin(0.001));\n    CHECK(params.switch_points[6].p == Approx(9.98).margin(0.001));\n    CHECK(params.switch_points[6].v == Approx(0.25).margin(0.001));\n    CHECK(params.switch_points[6].a == Approx(-2.0).margin(0.001));\n\n    CHECK(params.switch_points[7].t == Approx(10.75).margin(0.001));\n    CHECK(params.switch_points[7].p == Approx(10).margin(0.001));\n    CHECK(params.switch_points[7].v == Approx(0).margin(0.001));\n    CHECK(params.switch_points[7].a == Approx(0).margin(0.001));\n\n}\n\nTEST_CASE(\"lookup_1D\", \"[trajectory]\")\n{\n    SCurveParameters params;\n    params.v_lim = 1.0;\n    params.a_lim = 1.0;\n    params.j_lim = 1.0;\n    for (int i = 0; i < 8; i ++)\n    {\n        params.switch_points[i].a = 1.0;\n        params.switch_points[i].v = 1.0;\n        params.switch_points[i].p = 1.0;\n        params.switch_points[i].t = i;\n    }\n    // Need to manually set a for const vel region\n    params.switch_points[3].a = 0;\n\n    SECTION(\"Region 1 - Const Positive Jerk\")\n    {\n        int region = 1;\n        float time = static_cast<float>(region-1) + 0.5;\n        std::vector<float> test_vec = lookup_1D(time, params);\n        float expected_v = 1.625;\n        float expected_p = 1.645;\n        REQUIRE(test_vec[1] == Approx(expected_v).margin(0.001));\n        REQUIRE(test_vec[0] == Approx(expected_p).margin(0.001));\n    }\n    SECTION(\"Region 2 - Const Positive Acc\")\n    {\n        int region = 2;\n        float time = static_cast<float>(region-1) + 0.5;\n        std::vector<float> test_vec = lookup_1D(time, params);\n        float expected_v = 1.5;\n        float expected_p = 1.625;\n        REQUIRE(test_vec[1] == Approx(expected_v).margin(0.001));\n        REQUIRE(test_vec[0] == Approx(expected_p).margin(0.001));\n 
   }\n    SECTION(\"Region 3 - Const Negative Jerk\")\n    {\n        int region = 3;\n        float time = static_cast<float>(region-1) + 0.5;\n        std::vector<float> test_vec = lookup_1D(time, params);\n        float expected_v = 1.375;\n        float expected_p = 1.604;\n        REQUIRE(test_vec[1] == Approx(expected_v).margin(0.001));\n        REQUIRE(test_vec[0] == Approx(expected_p).margin(0.001));\n    }\n    SECTION(\"Region 4 - Const Vel\")\n    {\n        int region = 4;\n        float time = static_cast<float>(region-1) + 0.5;\n        std::vector<float> test_vec = lookup_1D(time, params);\n        float expected_v = 1.0;\n        float expected_p = 1.5;\n        REQUIRE(test_vec[1] == Approx(expected_v).margin(0.001));\n        REQUIRE(test_vec[0] == Approx(expected_p).margin(0.001));\n    }\n    SECTION(\"Region 5 - Const Negative Jerk\")\n    {\n        int region = 5;\n        float time = static_cast<float>(region-1) + 0.5;\n        std::vector<float> test_vec = lookup_1D(time, params);\n        float expected_v = 1.375;\n        float expected_p = 1.604;\n        REQUIRE(test_vec[1] == Approx(expected_v).margin(0.001));\n        REQUIRE(test_vec[0] == Approx(expected_p).margin(0.001));\n    }\n    SECTION(\"Region 6 - Const Negative Acc\")\n    {\n        int region = 6;\n        float time = static_cast<float>(region-1) + 0.5;\n        std::vector<float> test_vec = lookup_1D(time, params);\n        float expected_v = 1.5;\n        float expected_p = 1.625;\n        REQUIRE(test_vec[1] == Approx(expected_v).margin(0.001));\n        REQUIRE(test_vec[0] == Approx(expected_p).margin(0.001));\n    }\n    SECTION(\"Region 7 - Const Positive Jerk\")\n    {\n        int region = 7;\n        float time = static_cast<float>(region-1) + 0.5;\n        std::vector<float> test_vec = lookup_1D(time, params);\n        float expected_v = 1.625;\n        float expected_p = 1.645;\n        REQUIRE(test_vec[1] == Approx(expected_v).margin(0.001));\n        
REQUIRE(test_vec[0] == Approx(expected_p).margin(0.001));\n    }\n}\n\nTEST_CASE(\"computeKinematicsBasedOnRegion\", \"[trajectory]\")\n{\n    SCurveParameters params;\n    params.v_lim = 1.0;\n    params.a_lim = 1.0;\n    params.j_lim = 1.0;\n    for (int i = 0; i < 8; i ++)\n    {\n        params.switch_points[i].a = 1.0;\n        params.switch_points[i].v = 1.0;\n        params.switch_points[i].p = 1.0;\n        params.switch_points[i].t = i;\n    }\n    // Need to manually set a for const vel region\n    params.switch_points[3].a = 0;\n\n    SECTION(\"Region 1 - Const Positive Jerk\")\n    {\n        int region = 1;\n        float dt = 0.5;\n        std::vector<float> test_vec = computeKinematicsBasedOnRegion(params, region, dt);\n        float expected_a = 1.5;\n        float expected_v = 1.625;\n        float expected_p = 1.645;\n        REQUIRE(test_vec[0] == Approx(expected_a).margin(0.001));\n        REQUIRE(test_vec[1] == Approx(expected_v).margin(0.001));\n        REQUIRE(test_vec[2] == Approx(expected_p).margin(0.001));\n    }\n    SECTION(\"Region 2 - Const Positive Acc\")\n    {\n        int region = 2;\n        float dt = 0.5;\n        std::vector<float> test_vec = computeKinematicsBasedOnRegion(params, region, dt);\n        float expected_a = 1.0;\n        float expected_v = 1.5;\n        float expected_p = 1.625;\n        REQUIRE(test_vec[0] == Approx(expected_a).margin(0.001));\n        REQUIRE(test_vec[1] == Approx(expected_v).margin(0.001));\n        REQUIRE(test_vec[2] == Approx(expected_p).margin(0.001));\n    }\n    SECTION(\"Region 3 - Const Negative Jerk\")\n    {\n        int region = 3;\n        float dt = 0.5;\n        std::vector<float> test_vec = computeKinematicsBasedOnRegion(params, region, dt);\n        float expected_a = 0.5;\n        float expected_v = 1.375;\n        float expected_p = 1.604;\n        REQUIRE(test_vec[0] == Approx(expected_a).margin(0.001));\n        REQUIRE(test_vec[1] == Approx(expected_v).margin(0.001));\n     
   REQUIRE(test_vec[2] == Approx(expected_p).margin(0.001));\n    }\n    SECTION(\"Region 4 - Const Vel\")\n    {\n        int region = 4;\n        float dt = 0.5;\n        std::vector<float> test_vec = computeKinematicsBasedOnRegion(params, region, dt);\n        float expected_a = 0;\n        float expected_v = 1.0;\n        float expected_p = 1.5;\n        REQUIRE(test_vec[0] == Approx(expected_a).margin(0.001));\n        REQUIRE(test_vec[1] == Approx(expected_v).margin(0.001));\n        REQUIRE(test_vec[2] == Approx(expected_p).margin(0.001));\n    }\n    SECTION(\"Region 5 - Const Negative Jerk\")\n    {\n        int region = 5;\n        float dt = 0.5;\n        std::vector<float> test_vec = computeKinematicsBasedOnRegion(params, region, dt);\n        float expected_a = 0.5;\n        float expected_v = 1.375;\n        float expected_p = 1.604;\n        REQUIRE(test_vec[0] == Approx(expected_a).margin(0.001));\n        REQUIRE(test_vec[1] == Approx(expected_v).margin(0.001));\n        REQUIRE(test_vec[2] == Approx(expected_p).margin(0.001));\n    }\n    SECTION(\"Region 6 - Const Negative Acc\")\n    {\n        int region = 6;\n        float dt = 0.5;\n        std::vector<float> test_vec = computeKinematicsBasedOnRegion(params, region, dt);\n        float expected_a = -1.0;\n        float expected_v = 1.5;\n        float expected_p = 1.625;\n        REQUIRE(test_vec[0] == Approx(expected_a).margin(0.001));\n        REQUIRE(test_vec[1] == Approx(expected_v).margin(0.001));\n        REQUIRE(test_vec[2] == Approx(expected_p).margin(0.001));\n    }\n    SECTION(\"Region 7 - Const Positive Jerk\")\n    {\n        int region = 7;\n        float dt = 0.5;\n        std::vector<float> test_vec = computeKinematicsBasedOnRegion(params, region, dt);\n        float expected_a = 1.5;\n        float expected_v = 1.625;\n        float expected_p = 1.645;\n        REQUIRE(test_vec[0] == Approx(expected_a).margin(0.001));\n        REQUIRE(test_vec[1] == 
Approx(expected_v).margin(0.001));\n        REQUIRE(test_vec[2] == Approx(expected_p).margin(0.001));\n    }\n}\n\nTEST_CASE(\"synchronizeParameters\", \"[trajectory]\")\n{\n    // These values should all be valid together\n    std::vector<float> switch_times {0.000, 0.200, 0.500, 0.700, 6.000, 6.200, 6.500, 6.700};\n    float dt_v = 5.3;\n    float dt_a = 0.3;\n    float dt_j = 0.2;\n    float dp = 3.0;\n    SCurveParameters valid_params;\n    valid_params.v_lim = 0.5;\n    valid_params.a_lim = 1.0;\n    valid_params.j_lim = 5.0;\n    populateSwitchTimeParameters(&valid_params, dt_j, dt_a, dt_v);\n    for (int i = 0; i < 8; i++)\n    {\n        REQUIRE(valid_params.switch_points[i].t == switch_times[i]);\n    }\n    REQUIRE(valid_params.switch_points[7].p == dp);\n    \n    SCurveParameters params_to_test = valid_params;\n    params_to_test.v_lim = 0.0;\n    params_to_test.a_lim = 0.0;\n    params_to_test.j_lim = 0.0;\n\n    SECTION(\"solveInverse\")\n    {\n        bool ok = solveInverse(&params_to_test);\n        CHECK(ok == true);\n        CHECK(params_to_test.v_lim == Approx(valid_params.v_lim));\n        CHECK(params_to_test.a_lim == Approx(valid_params.a_lim));\n        CHECK(params_to_test.j_lim == Approx(valid_params.j_lim));\n    }\n    SECTION(\"slowDownParamsToMatchTime\")\n    {\n        for (int i = 0; i < 8; i++)\n        {\n            params_to_test.switch_points[i].t /= 2.0;\n        }\n        bool ok = slowDownParamsToMatchTime(&params_to_test, switch_times[7]);\n        CHECK(ok == true);\n        CHECK(params_to_test.switch_points[7].t == switch_times[7]);\n        CHECK(params_to_test.switch_points[7].p == Approx(dp));\n    }\n    SECTION(\"synchronize\")\n    {\n        for (int i = 0; i < 8; i++)\n        {\n            params_to_test.switch_points[i].t /= 2.0;\n        }\n        SCurveParameters params_to_test2 = valid_params;\n        bool ok = synchronizeParameters(&params_to_test, &params_to_test2);\n        CHECK(ok == true);\n        
CHECK(params_to_test.switch_points[7].t == switch_times[7]);\n        CHECK(params_to_test.switch_points[7].p == Approx(dp));\n        CHECK(params_to_test2.switch_points[7].t == switch_times[7]);\n        CHECK(params_to_test2.switch_points[7].p == Approx(dp));\n        for (int i = 0; i < 8; i++)\n        {\n            CHECK(params_to_test2.switch_points[i].t == switch_times[i]);\n        }\n    }\n}\n"
  },
  {
    "path": "src/robot/test/SocketWrapper_Test.cpp",
    "content": "#include <Catch/catch.hpp>\n#include <unistd.h>\n\n#include \"sockets/ClientSocket.h\"\n#include \"sockets/SocketException.h\"\n#include \"sockets/SocketMultiThreadWrapper.h\"\n\n\n\nvoid SimpleSocketSend(std::string data_to_send)\n{\n    usleep(1000);\n    \n    ClientSocket test_socket ( \"localhost\", 8123 );\n\n    //Send message\n    test_socket << data_to_send;\n\n    // Brief wait to ensure thread processed the sent message\n    usleep(1000);\n}\n\nTEST_CASE(\"Socket send test\", \"[socket]\") \n{\n    \n    SocketMultiThreadWrapper s;\n    std::string test_msg = \"This is a test\";\n    for(int i = 0; i < 10; i++)\n    {\n        test_msg += std::to_string(i);\n        SimpleSocketSend(test_msg);\n        REQUIRE(s.dataAvailableToRead());\n        REQUIRE(s.getData() == test_msg);\n    }   \n}\n\n// TEST_CASE(\"Socket recv test\", \"[socket]\") \n// {\n    \n//     SocketMultiThreadWrapper s;\n//     std::string test_msg = \"This is a test\";\n//     for(int i = 0; i < 10; i++)\n//     {\n//         test_msg += std::to_string(i);\n\n//         usleep(1000);\n//         ClientSocket test_socket ( \"localhost\", 8123 );\n//         s.sendData(test_msg);\n\n//         std::string data;\n//         test_socket >> data;\n//         REQUIRE(data == test_msg);\n//     }   \n// }"
  },
  {
    "path": "src/robot/test/StatusUpdater_Test.cpp",
    "content": "#include <Catch/catch.hpp>\n\n#include \"StatusUpdater.h\"\n\nusing Catch::Matchers::StartsWith;\nusing Catch::Matchers::EndsWith;\nusing Catch::Matchers::Contains;\n\nTEST_CASE(\"Position\", \"[StatusUpdater]\")\n{\n    StatusUpdater s;\n    s.updatePosition(1,2,3);\n\n    StatusUpdater::Status status = s.getStatus();\n    REQUIRE(status.pos_x == 1);\n    REQUIRE(status.pos_y == 2);\n    REQUIRE(status.pos_a == 3);\n}\n\nTEST_CASE(\"Velocity\", \"[StatusUpdater]\")\n{\n    StatusUpdater s;\n    s.updateVelocity(1,2,3);\n\n    StatusUpdater::Status status = s.getStatus();\n    REQUIRE(status.vel_x == 1);\n    REQUIRE(status.vel_y == 2);\n    REQUIRE(status.vel_a == 3);\n}\n\nTEST_CASE(\"LoopTimes\", \"[StatusUpdater]\")\n{\n    StatusUpdater s;\n    s.updateControlLoopTime(1);\n    s.updatePositionLoopTime(2);\n\n    StatusUpdater::Status status = s.getStatus();\n    REQUIRE(status.controller_loop_ms == 1);\n    REQUIRE(status.position_loop_ms == 2);\n}\n\nTEST_CASE(\"Progress\", \"[StatusUpdater]\")\n{\n    StatusUpdater s;\n\n    StatusUpdater::Status status = s.getStatus();\n    REQUIRE(status.in_progress == false);\n    REQUIRE(s.getInProgress() == false);\n\n    s.updateInProgress(true);\n\n    status = s.getStatus();\n    REQUIRE(status.in_progress == true);\n    REQUIRE(s.getInProgress() == true);\n}\n\nTEST_CASE(\"JSON\", \"[StatusUpdater]\")\n{\n    StatusUpdater s;\n    s.updatePosition(1,2,3);\n    s.updateVelocity(4,5,6);\n    s.updateControlLoopTime(7);\n    s.updatePositionLoopTime(8);\n    s.updateInProgress(true);\n\n    std::string json_string = s.getStatusJsonString();\n\n    REQUIRE_THAT(json_string, StartsWith(\"{\"));\n    REQUIRE_THAT(json_string, Contains(\"\\\"pos_x\\\":1\"));\n    REQUIRE_THAT(json_string, Contains(\"\\\"pos_y\\\":2\"));\n    REQUIRE_THAT(json_string, Contains(\"\\\"pos_a\\\":3\"));\n    REQUIRE_THAT(json_string, Contains(\"\\\"vel_x\\\":4\"));\n    REQUIRE_THAT(json_string, Contains(\"\\\"vel_y\\\":5\"));\n 
   REQUIRE_THAT(json_string, Contains(\"\\\"vel_a\\\":6\"));\n    REQUIRE_THAT(json_string, Contains(\"\\\"controller_loop_ms\\\":7\"));\n    REQUIRE_THAT(json_string, Contains(\"\\\"position_loop_ms\\\":8\"));\n    REQUIRE_THAT(json_string, EndsWith(\"}\"));\n}\n"
  },
  {
    "path": "src/robot/test/TrayController_Test.cpp",
    "content": "#include <Catch/catch.hpp>\n\n#include \"TrayController.h\"\n#include \"test-utils.h\"\n\nTEST_CASE(\"Send and waiting\", \"[TrayController]\")\n{\n    MockSerialComms* mock_serial = build_and_get_mock_serial(CLEARCORE_USB);\n    MockClockWrapper* mock_clock = get_mock_clock_and_reset();\n    TrayController t;\n\n    t.initialize();\n    REQUIRE(mock_serial->mock_rcv_lift() == \"\");\n    REQUIRE(t.isActionRunning() == true);\n\n    // Expect to receive close command once and then start status requests\n    mock_clock->advance_ms(1);\n    t.update();\n    REQUIRE(mock_serial->mock_rcv_lift() == \"close\");\n    \n    // Expect to receive status request command while lifter reports closing\n    for (int i = 0; i < 5; i ++)\n    {\n        mock_clock->advance_ms(1);\n        mock_serial->mock_send(\"lift:close\");\n        t.update();\n        REQUIRE(mock_serial->mock_rcv_lift() == \"status_req\");\n    }\n\n    // Expect to receive none response once lifter reports no action\n    mock_clock->advance_ms(1000);\n    mock_serial->mock_send(\"lift:none\");\n    t.update();\n    REQUIRE(mock_serial->mock_rcv_lift() == \"status_req\");\n    REQUIRE(mock_serial->mock_rcv_lift() == \"home\");\n}\n\nTEST_CASE(\"Initialize tray\", \"[TrayController]\")\n{\n    MockSerialComms* mock_serial = build_and_get_mock_serial(CLEARCORE_USB);\n    MockClockWrapper* mock_clock = get_mock_clock_and_reset();\n    TrayController t;\n\n    t.initialize();\n    REQUIRE(mock_serial->mock_rcv_lift() == \"\");\n    REQUIRE(t.isActionRunning() == true);\n\n    mock_clock->advance_ms(1500);\n    t.update();\n    REQUIRE(mock_serial->mock_rcv_lift() == \"close\");\n    mock_serial->mock_send(\"lift:none\");\n\n    mock_clock->advance_ms(1500);\n    t.update();\n    REQUIRE(mock_serial->mock_rcv_lift() == \"status_req\");\n    REQUIRE(mock_serial->mock_rcv_lift() == \"home\");\n    mock_serial->mock_send(\"lift:none\");\n\n    mock_clock->advance_ms(1500);\n    t.update();\n    
float revs = cfg.lookup(\"tray.default_pos_revs\");\n    int steps_per_rev = cfg.lookup(\"tray.steps_per_rev\");\n    int steps = steps_per_rev * revs;\n    REQUIRE(mock_serial->mock_rcv_lift() == \"status_req\");\n    REQUIRE(mock_serial->mock_rcv_lift() == \"pos:\"+std::to_string(steps));\n    mock_serial->mock_send(\"lift:none\");\n\n    mock_clock->advance_ms(1500);\n    t.update();\n    REQUIRE(t.isActionRunning() == false);\n}\n\nTEST_CASE(\"Place tray\", \"[TrayController]\")\n{\n    MockSerialComms* mock_serial = build_and_get_mock_serial(CLEARCORE_USB);\n    MockClockWrapper* mock_clock = get_mock_clock_and_reset();\n    TrayController t;\n    t.setTrayInitialized(true);\n\n    bool status = t.place();\n    REQUIRE(status == true);\n    REQUIRE(mock_serial->mock_rcv_lift() == \"\");\n    REQUIRE(t.isActionRunning() == true);\n\n    mock_clock->advance_ms(1500);\n    t.update();\n    float revs = cfg.lookup(\"tray.place_pos_revs\");\n    int steps_per_rev = cfg.lookup(\"tray.steps_per_rev\");\n    int steps = steps_per_rev * revs;\n    REQUIRE(mock_serial->mock_rcv_lift() == \"pos:\"+std::to_string(steps));\n    mock_serial->mock_send(\"lift:none\");\n\n    mock_clock->advance_ms(1500);\n    t.update();\n    REQUIRE(mock_serial->mock_rcv_lift() == \"status_req\");\n    REQUIRE(mock_serial->mock_rcv_lift() == \"open\");\n    mock_serial->mock_send(\"lift:none\");\n\n    mock_clock->advance_ms(1500);\n    t.update();\n    revs = cfg.lookup(\"tray.default_pos_revs\");\n    steps = steps_per_rev * revs;\n    REQUIRE(mock_serial->mock_rcv_lift() == \"status_req\");\n    REQUIRE(mock_serial->mock_rcv_lift() == \"pos:\"+std::to_string(steps));\n    mock_serial->mock_send(\"lift:none\");\n\n    mock_clock->advance_ms(1500);\n    t.update();\n    REQUIRE(mock_serial->mock_rcv_lift() == \"status_req\");\n    REQUIRE(mock_serial->mock_rcv_lift() == \"close\");\n    mock_serial->mock_send(\"lift:none\");\n\n    mock_clock->advance_ms(1500);\n    t.update();\n    
REQUIRE(t.isActionRunning() == false);\n}\n\nTEST_CASE(\"Load tray\", \"[TrayController]\")\n{\n    MockSerialComms* mock_serial = build_and_get_mock_serial(CLEARCORE_USB);\n    MockClockWrapper* mock_clock = get_mock_clock_and_reset();\n    TrayController t;\n    t.setTrayInitialized(true);\n\n    bool status = t.load();\n    REQUIRE(status == true);\n    REQUIRE(mock_serial->mock_rcv_lift() == \"\");\n    REQUIRE(t.isActionRunning() == true);\n\n    mock_clock->advance_ms(1500);\n    t.update();\n    float revs = cfg.lookup(\"tray.load_pos_revs\");\n    int steps_per_rev = cfg.lookup(\"tray.steps_per_rev\");\n    int steps = steps_per_rev * revs;\n    REQUIRE(mock_serial->mock_rcv_lift() == \"pos:\"+std::to_string(steps));\n    mock_serial->mock_send(\"lift:none\");\n\n    mock_clock->advance_ms(1500);\n    t.update();\n    REQUIRE(mock_serial->mock_rcv_lift() == \"status_req\");\n    REQUIRE(mock_serial->mock_rcv_lift() == \"\");\n    t.setLoadComplete();\n\n    mock_clock->advance_ms(1500);\n    t.update();\n    revs = cfg.lookup(\"tray.default_pos_revs\");\n    steps = steps_per_rev * revs;\n    REQUIRE(mock_serial->mock_rcv_lift() == \"pos:\"+std::to_string(steps));\n    mock_serial->mock_send(\"lift:none\");\n\n    mock_clock->advance_ms(1500);\n    t.update();\n    REQUIRE(t.isActionRunning() == false);\n}\n\nTEST_CASE(\"Errors when tray not initialized\", \"[TrayController]\")\n{\n    TrayController t;\n    t.setTrayInitialized(false);\n\n    REQUIRE(t.place() == false);\n    REQUIRE(t.load() == false);\n}"
  },
  {
    "path": "src/robot/test/test-main.cpp",
    "content": "#define CATCH_CONFIG_RUNNER\n#include <Catch/catch.hpp>\n#include \"constants.h\"\n#include \"serial/SerialCommsFactory.h\"\n#include \"sockets/SocketMultiThreadWrapperFactory.h\"\n#include \"camera_tracker/CameraTrackerFactory.h\"\n#include \"utils.h\"\n\n#include <plog/Log.h>\n#include <plog/Init.h>\n#include <plog/Formatters/TxtFormatter.h>\n#include <plog/Formatters/MessageOnlyFormatter.h>\n#include <plog/Appenders/ColorConsoleAppender.h>\n#include <plog/Appenders/RollingFileAppender.h>\n\nlibconfig::Config cfg = libconfig::Config();\n\nvoid configure_logger()\n{\n    // Make file names\n    std::string test_log_file_name = std::string(\"log/test_log.txt\");\n\n    // Initialize test logs to go to file and console\n    std::string log_level = cfg.lookup(\"log_level\");\n    plog::Severity severity = plog::severityFromString(log_level.c_str());\n    static plog::RollingFileAppender<plog::TxtFormatter> fileAppender(test_log_file_name.c_str(), 1000000, 5);\n    static plog::ColorConsoleAppender<plog::TxtFormatter> consoleAppender;\n    plog::init(severity, &fileAppender).addAppender(&consoleAppender);\n\n    PLOGI << \"Logger ready\";\n}\n\nint main( int argc, char* argv[] )\n{\n    cfg.readFile(TEST_CONSTANTS_FILE);\n    configure_logger();\n    SerialCommsFactory::getFactoryInstance()->set_mode(SERIAL_FACTORY_MODE::MOCK);\n    SocketMultiThreadWrapperFactory::getFactoryInstance()->set_mode(SOCKET_FACTORY_MODE::MOCK);\n    ClockFactory::getFactoryInstance()->set_mode(CLOCK_FACTORY_MODE::MOCK);\n    CameraTrackerFactory::getFactoryInstance()->set_mode(CAMERA_TRACKER_FACTORY_MODE::MOCK);\n\n    int result = Catch::Session().run(argc, argv);\n\n    return result;\n}"
  },
  {
    "path": "src/robot/test/test-utils.h",
    "content": "#ifndef testutils_h\n#define testutils_h\n\n#include \"serial/MockSerialComms.h\"\n#include \"serial/SerialCommsFactory.h\"\n#include \"constants.h\"\n#include \"sockets/SocketMultiThreadWrapperFactory.h\"\n#include \"sockets/MockSocketMultiThreadWrapper.h\"\n#include \"utils.h\"\n#include <variant>\n\ninline MockSocketMultiThreadWrapper* build_and_get_mock_socket() \n{\n    SocketMultiThreadWrapperBase* base_socket = SocketMultiThreadWrapperFactory::getFactoryInstance()->get_socket();\n    // Slightly dangerous....\n    MockSocketMultiThreadWrapper* mock_socket = dynamic_cast<MockSocketMultiThreadWrapper*>(base_socket);\n    mock_socket->purge_data();\n    return mock_socket;\n}\n\ninline MockSerialComms* build_and_get_mock_serial(const std::string& portName)\n{\n    SerialCommsBase* base_serial = SerialCommsFactory::getFactoryInstance()->get_serial_comms(portName);\n    // Slightly dangerous....\n    MockSerialComms* mock_serial = dynamic_cast<MockSerialComms*>(base_serial);\n    mock_serial->purge_data();\n    return mock_serial;\n}\n\ninline MockClockWrapper* get_mock_clock()\n{\n    ClockWrapperBase* base_clock = ClockFactory::getFactoryInstance()->get_clock();\n    MockClockWrapper* mock_clock = dynamic_cast<MockClockWrapper*>(base_clock);\n    return mock_clock;\n}\n\ninline MockClockWrapper* get_mock_clock_and_reset()\n{\n    MockClockWrapper* mock_clock = get_mock_clock();\n    mock_clock->set_now();\n    return mock_clock;\n}\n\ninline void reset_mock_clock()\n{\n    MockClockWrapper* mock_clock = get_mock_clock();\n    mock_clock->set_now();\n}\n\n// Helper class to modify a global setting which will revert the setting change when going out of scope\ntemplate <typename T>\nclass SafeConfigModifier\n{\n  public:\n    SafeConfigModifier(std::string config_path, T value)\n     : cur_val_(cfg.lookup(config_path)),\n       old_val_(static_cast<T>(cur_val_))\n    {\n        cur_val_ = value;\n    }\n    ~SafeConfigModifier()\n    {\n        
cur_val_ = old_val_;\n    }\n\n  private:\n    libconfig::Setting& cur_val_;\n    T old_val_;\n};\n\n\n#endif"
  },
  {
    "path": "src/robot/test/test_constants.cfg",
    "content": "name = \"Test constants\";\nlog_level = \"info\";   // verbose, debug, info, warning, error, fatal, none\n\nmotion = \n{\n  limit_max_fraction  = 1.0;   // Only generate a trajectory to this fraction of max speed to give motors headroom to compensate\n  controller_frequency  = 40;   // Hz for RobotController\n  log_frequency         = 20 ;   // HZ for logging to motion log\n  fake_perfect_motion   = false;   // Enable or disable bypassing clearcore to fake perfect motion for testing\n  rate_always_ready     = true;   // Bypasses rate limiter if set to true\n  translation = \n  {\n    max_vel = \n    {\n      vision = 1.0; // m/s\n      fine   = 1.0; // m/s\n      coarse = 1.0;  // m/s\n    };\n    max_acc = \n    {\n      vision = 2.0;  // m/s^2\n      fine   = 2.0;  // m/s^2\n      coarse = 2.0;  // m/s^2\n    };\n    max_jerk = \n    {\n      vision = 8.0;  // m/s^3\n      fine   = 8.0;  // m/s^3\n      coarse = 8.0;  // m/s^3\n    };\n    position_threshold = \n    {\n      vision = 0.01;   // m\n      fine   = 0.01;   // m\n      coarse = 0.10;   // m\n    };\n    velocity_threshold = \n    {\n      vision = 0.01;   // m/s\n      fine   = 0.01;   // m/s\n      coarse = 0.05;   // m/s\n    };\n    gains =\n    {\n      kp = 1.0;\n      ki = 0.0;\n      kd = 0.0;\n    };\n    gains_vision =\n    {\n      kp = 1.0;\n      ki = 0.0;\n      kd = 0.0;\n    };\n  };\n\n  rotation = \n  {\n    max_vel = \n    {\n      vision = 0.5;  // rad/s\n      fine   = 0.5;  // rad/s\n      coarse = 0.5;  // rad/s\n    };\n    max_acc = \n    {\n      vision = 1.0;  // rad/s^2\n      fine   = 1.0;  // rad/s^2\n      coarse = 1.0;  // rad/s^2\n    };\n    max_jerk = \n    {\n      vision = 5.0;  // rad/s^3\n      fine   = 5.0;  // rad/s^3\n      coarse = 5.0;  // rad/s^3\n    };\n    position_threshold = \n    {\n      vision = 0.02;   // rad\n      fine   = 0.02;   // rad\n      coarse = 0.08;   // rad\n    };\n    velocity_threshold = \n    {\n      vision = 0.01; 
  // rad/s\n      fine   = 0.01;   // rad/s\n      coarse = 0.05;   // rad/s\n    };\n    gains =\n    {\n      kp = 1.0;\n      ki = 0.0;\n      kd = 0.0;\n    };\n    gains_vision =\n    {\n      kp = 1.0;\n      ki = 0.0;\n      kd = 0.0;\n    };\n  };\n};\n\nphysical = \n{\n  wheel_diameter = 1.0;  // m\n  wheel_dist_from_center = 0.5; // m\n};\n\ntrajectory_generation = \n{\n  solver_max_loops   = 30;     // Only let the solver loop this many times before giving up\n  solver_alpha_decay = 0.8;    // Decay for velocity limit\n  solver_beta_decay  = 0.8;    // Decay for acceleration limit\n  solver_exponent_decay = 0.1; // Decay exponent to apply each loop\n  min_dist_limit    = 0.0001;  // Smallest value solver will attempt to solve for\n};\n\ntray = \n{\n  default_pos_revs = 1.2;     // Default position for driving in revs from home\n  load_pos_revs    = 5.0;     // Loading position in revs from home\n  place_pos_revs   = 67.0;    // Placing position in revs from home\n  steps_per_rev    = 800;     // Number of steps per motor rev\n  controller_frequency  = 20000;    // Hz for controller rate - large so that tests don't skip updates\n  fake_tray_motions = false;   // Flag to fake tray motions for testing\n};\n\nlocalization = \n{\n  mm_x_offset = -55.0;                      // X offset of mm pair center from center of rotation (in millimeters, + is in front, - is behind)\n  mm_y_offset = 0.0;                        // Y offset of mm pair center from center of rotation (in millimeters, + is to left, - is to right)\n  max_wait_time = 10.0;                     // Max time to wait (s) when waiting for good localization\n  confidence_for_wait = 0.95;               // Confidence level required for waiting for localization\n  kf_predict_trans_cov = 0.1;               // How much noise is expected in the prediction step for position, lower is less noise\n  kf_predict_angle_cov = 0.1;               // How much noise is expected in the prediction step for angle, lower 
is less noise\n  kf_meas_trans_cov = 0.1;                  // How much noise is expected in the marvelmind measurements of position, lower is less noise\n  kf_meas_angle_cov = 0.1;                  // How much noise is expected in the marvelmind measurements of angle, lower is less noise\n  kf_uncertainty_scale = 1.0;               // Scale factor for how much extra noise to add based on position confidence\n  variance_ref_trans = 1.0;                 // Max position variance for 0 confidence\n  variance_ref_angle = 1.57;                // Max angle variance for 0 confidence\n  min_vel_uncertainty = 0.5;                // Minimum uncertainty for any nonzero velocity\n  vel_uncertainty_slope = 0.5;              // Uncertainty/velocity scale\n  max_vel_uncetainty = 1.0;                 // Uncertainty cap\n  vel_uncertainty_decay_time = 2.0;         // Number of seconds after completing motion until uncertainty goes to 0\n};\n\n// USB ports on pi mapping to v4l path num\n//  USB2 - 3  |   USB3 - 1   |   Eth\n//  USB2 - 4  |   USB3 - 2   |\nvision_tracker = {\n  side = {\n    camera_path = \"/dev/v4l/by-path/platform-fd500000.pcie-pci-0000:01:00.0-usb-0:1.1:1.0-video-index0\"\n    calibration_file = \"/home/pi/IR_calibration_2.yml\"; // Calibration data\n    debug_output_path = \"/home/pi/images/debug/side/\"; // Path to output debug images\n    debug_image = \"/home/pi/images/dev/side_raw_1.jpg\"; // Debug image path\n    resolution_scale_x = 0.25;                             // Scale factor from calibration resolution to onboard resolution\n    resolution_scale_y = 0.333333;                         // Scale factor from calibration resolution to onboard resolution\n  }\n  rear = {\n    camera_path = \"/dev/v4l/by-path/platform-fd500000.pcie-pci-0000:01:00.0-usb-0:1.2:1.0-video-index0\"\n    calibration_file = \"/home/pi/IR_calibration_1.yml\"; // Calibration data\n    debug_output_path = \"/home/pi/images/debug/rear/\"; // Path to output debug images\n    debug_image 
= \"/home/pi/images/dev/rear_raw_1.jpg\"; // Debug image path\n    resolution_scale_x = 0.25;                             // Scale factor from calibration resolution to onboard resolution\n    resolution_scale_y = 0.333333;                         // Scale factor from calibration resolution to onboard resolution\n  }\n  debug = {\n    use_debug_image = true;                   // Use debug image instead of loading camera\n    save_camera_debug = false;\n  }\n  detection = {\n    threshold = 235;                         // Threshold value for image processing\n    blob = {\n      use_area = true;                      // Use area for blob detection\n      min_area = 80.0;                      // Min area for blob detection\n      max_area = 500.0;                    // Max area for blob detection\n      use_circularity = true;               // Use circularity for blob detection\n      min_circularity = 0.6;                // Min circularity for blob detection\n      max_circularity = 1.0;                  // Max circularity for blob detection\n    }\n  }\n  physical = {\n    pixels_per_meter_u = 1362.0;              // Hack to try and get something working\n    pixels_per_meter_v = 1411.0;\n    side = {\n      x_offset = 0.0;                       // x offset (meters) from robot frame\n      y_offset = -1.5;                      // y offset (meters) from robot frame\n      z_offset = 0.55;                      // z offset (meters) from robot frame\n      target_x = 0.0;                       // target x coord (meters) in robot frame           \n      target_y = -1.4;                      // target y coord (meters) in robot frame\n    }\n    rear = {\n      x_offset = -1.5;                      // x offset (meters) from robot frame\n      y_offset = 0.0;                       // y offset (meters) from robot frame\n      z_offset = 0.55;                      // z offset (meters) from robot frame\n      target_x = -1.4;                      // target x coord (meters) in 
robot frame\n      target_y = 0.0;                       // target y coord (meters) in robot frame\n    }\n  }\n  kf = \n  {\n    predict_trans_cov = 0.0001;               // How much noise is expected in the prediction step (each timestep) for position, lower is less noise\n    predict_angle_cov = 0.001;                // How much noise is expected in the prediction step (each timestep) for angle, lower is less noise\n    meas_trans_cov = 0.01;                    // How much noise is expected in the update step for position, lower is less noise\n    meas_angle_cov = 1.0;                     // How much noise is expected in the update step for angle, lower is less noise\n  }\n};\n"
  },
  {
    "path": "src/robot/test/utils_Test.cpp",
    "content": "#include <Catch/catch.hpp>\n#include <unistd.h>\n\n#include \"utils.h\"\n#include \"test-utils.h\"\n\nTEST_CASE(\"Sign util test\", \"[utils]\") \n{\n    REQUIRE(sgn(1) == 1);\n    REQUIRE(sgn(-1) == -1);\n    REQUIRE(sgn(0) == 0);\n    REQUIRE(sgn(1.234) == 1);\n    REQUIRE(sgn(-2.854) == -1);\n    REQUIRE(sgn(-0.0000001) == -1);\n}\n\nTEST_CASE(\"Angle wrap test\", \"[utils]\")\n{\n    REQUIRE(Approx(wrap_angle(M_PI-0.1)) == M_PI-0.1);\n    REQUIRE(Approx(wrap_angle(-M_PI+0.1)) == -M_PI+0.1);\n    REQUIRE(Approx(wrap_angle(0)) == 0);\n    REQUIRE(Approx(wrap_angle(2*M_PI)).margin(0.0001) == 0);\n    REQUIRE(Approx(wrap_angle(1.5*M_PI)) == -0.5*M_PI);\n    REQUIRE(Approx(wrap_angle(-1.5*M_PI)) == 0.5*M_PI);\n    REQUIRE(Approx(wrap_angle(4.5*M_PI)) == 0.5*M_PI);\n}\n\nTEST_CASE(\"Angle diff test\", \"[utils]\")\n{\n    REQUIRE(angle_diff(0,0) == 0);\n    REQUIRE(angle_diff(M_PI,M_PI) == 0);\n    REQUIRE(Approx(angle_diff(0,2*M_PI)).margin(0.0001) == 0);\n    REQUIRE(Approx(angle_diff(1.5*M_PI,0.5*M_PI)) == M_PI);\n    REQUIRE(Approx(angle_diff(1.5*M_PI,3*M_PI)) == 0.5*M_PI);\n    REQUIRE(Approx(angle_diff(-1.5*M_PI,3*M_PI)) == -0.5*M_PI);\n}\n\nTEST_CASE(\"TimeRunningAverage - Test constant time pauses\", \"[utils]\") \n{\n    int window = 10;\n    int sleep_time_us = 2000;\n    TimeRunningAverage T = TimeRunningAverage(window);\n    MockClockWrapper* mock_clock = get_mock_clock_and_reset();\n\n    for(int i = 0; i < window+1; i++)\n    {\n        T.mark_point();\n        mock_clock->advance_us(sleep_time_us);\n    }\n\n    REQUIRE(T.get_ms() == sleep_time_us/1000);\n    REQUIRE(T.get_sec() == Approx(sleep_time_us/1000000.0));\n\n    SECTION(\"Wrap buffer\")\n    {\n        for(int i = 0; i < window+1; i++)\n        {\n            T.mark_point();\n            mock_clock->advance_us(sleep_time_us);\n        }\n\n        REQUIRE(T.get_ms() == sleep_time_us/1000);\n        REQUIRE(T.get_sec() == Approx(sleep_time_us/1000000.0));\n    
}\n}\n\nTEST_CASE(\"TimeRunningAverage - Test larger buffer\", \"[utils]\") \n{\n    int window = 100;\n    int sleep_time_us = 1000;\n    TimeRunningAverage T = TimeRunningAverage(window);\n    MockClockWrapper* mock_clock = get_mock_clock_and_reset();\n\n    for(int i = 0; i < window+1; i++)\n    {\n        T.mark_point();\n        mock_clock->advance_us(sleep_time_us);\n    }\n\n    REQUIRE(T.get_ms() == sleep_time_us/1000);\n    REQUIRE(T.get_sec() == Approx(sleep_time_us/1000000.0));\n\n    SECTION(\"Wrap buffer\")\n    {\n        for(int i = 0; i < window+1; i++)\n        {\n            T.mark_point();\n            mock_clock->advance_us(sleep_time_us);\n        }\n\n        REQUIRE(T.get_ms() == sleep_time_us/1000);\n        REQUIRE(T.get_sec() == Approx(sleep_time_us/1000000.0));\n    }\n}\n\nTEST_CASE(\"TimeRunningAverage - Test no data\", \"[utils]\")\n{\n    TimeRunningAverage T = TimeRunningAverage(100);\n    REQUIRE(T.get_ms() == 0);\n    REQUIRE(T.get_sec() == 0);\n}\n\nTEST_CASE(\"TimeRunningAverage - Variable timing\", \"[utils]\")\n{\n    int window = 10;\n    int sleep_time_us = 2000;\n    TimeRunningAverage T = TimeRunningAverage(window);\n    MockClockWrapper* mock_clock = get_mock_clock_and_reset();\n\n    SECTION(\"Partial buffer\")\n    {\n\n        for(int i = 0; i < 6; i++)\n        {\n            T.mark_point();\n            mock_clock->advance_us(sleep_time_us);\n            sleep_time_us += 2000;\n        }\n        REQUIRE(T.get_ms() == 6);\n\n        SECTION(\"Full buffer\")\n        {\n\n            for(int i = 0; i < 5; i++)\n            {\n                T.mark_point();\n                mock_clock->advance_us(sleep_time_us);\n                sleep_time_us += 2000;\n            }\n            REQUIRE(T.get_ms() == 11);\n        }\n    }\n}\n\nTEST_CASE(\"Rate controller - slow\", \"[utils]\")\n{\n    SafeConfigModifier<bool> config_modifier(\"motion.rate_always_ready\", false);\n    SECTION(\"Slow\")\n    {\n        
MockClockWrapper* mock_clock = get_mock_clock_and_reset();\n        RateController r(1);\n        REQUIRE(r.ready() == false);\n        mock_clock->advance_sec(0.5);\n        REQUIRE(r.ready() == false);\n        mock_clock->advance_sec(0.55);\n        REQUIRE(r.ready() == true);\n        REQUIRE(r.ready() == false);\n    }\n    SECTION(\"Fast\")\n    {\n        MockClockWrapper* mock_clock = get_mock_clock_and_reset();\n        RateController r(1000);\n        REQUIRE(r.ready() == false);\n        mock_clock->advance_us(500);\n        REQUIRE(r.ready() == false);\n        mock_clock->advance_us(600);\n        REQUIRE(r.ready() == true);\n        REQUIRE(r.ready() == false);\n        mock_clock->advance_us(1100);\n        REQUIRE(r.ready() == true);\n    }\n}\n\n\nTEST_CASE(\"MockClockWrapper\", \"[utils]\")\n{\n    MockClockWrapper* mock_clock = get_mock_clock_and_reset();\n    SECTION(\"Microseconds\")\n    {\n        int count = 100;\n        ClockTimePoint point = mock_clock->now();\n        mock_clock->advance_us(count);\n        ClockTimePoint new_point = mock_clock->now();\n        std::chrono::microseconds test_duration = \n            std::chrono::duration_cast<std::chrono::microseconds>(new_point - point);\n        REQUIRE(test_duration.count() == count);\n    }\n    SECTION(\"Milliseconds\")\n    {\n        int count = 100;\n        ClockTimePoint point = mock_clock->now();\n        mock_clock->advance_ms(count);\n        ClockTimePoint new_point = mock_clock->now();\n        std::chrono::milliseconds test_duration = \n            std::chrono::duration_cast<std::chrono::milliseconds>(new_point - point);\n        REQUIRE(test_duration.count() == count);\n    }\n    SECTION(\"Seconds\")\n    {\n        float count = 5.2;\n        ClockTimePoint point = mock_clock->now();\n        mock_clock->advance_sec(count);\n        ClockTimePoint new_point = mock_clock->now();\n        FpSeconds test_duration = \n            
std::chrono::duration_cast<FpSeconds>(new_point - point);\n        REQUIRE(test_duration.count() == count);\n    }\n    SECTION(\"Set\")\n    {\n        ClockTimePoint point = std::chrono::steady_clock::now();\n        mock_clock->set(point);\n        REQUIRE(mock_clock->now() == point);\n    }\n}\n\nTEST_CASE(\"ClockWrapper\", \"[utils]\")\n{\n    ClockWrapper clock;\n    ClockTimePoint p1 = std::chrono::steady_clock::now();\n    usleep(1);\n    ClockTimePoint p2 = clock.now();\n    std::chrono::microseconds test_duration = \n            std::chrono::duration_cast<std::chrono::microseconds>(p2 - p1);\n    REQUIRE(test_duration.count() > 0);\n    REQUIRE(test_duration.count() < 1000);\n}\n\n\nTEST_CASE(\"Timer\", \"[utils]\")\n{\n    MockClockWrapper* mock_clock = get_mock_clock_and_reset();\n\n    Timer t;\n    REQUIRE(t.dt_s() == 0);\n    REQUIRE(t.dt_ms() == 0);\n    REQUIRE(t.dt_us() == 0);\n\n    mock_clock->advance_sec(0.5);\n    REQUIRE(t.dt_s() == 0.5f);\n    REQUIRE(t.dt_ms() == 500);\n    REQUIRE(t.dt_us() == 500000);\n\n    t.reset();\n    REQUIRE(t.dt_s() == 0);\n    REQUIRE(t.dt_ms() == 0);\n    REQUIRE(t.dt_us() == 0);\n\n    mock_clock->advance_ms(100);\n    REQUIRE(t.dt_s() == 0.1f);\n    REQUIRE(t.dt_ms() == 100);\n    REQUIRE(t.dt_us() == 100000);\n\n    t.reset();\n    REQUIRE(t.dt_s() == 0);\n    REQUIRE(t.dt_ms() == 0);\n    REQUIRE(t.dt_us() == 0);\n\n    mock_clock->advance_us(100);\n    REQUIRE(t.dt_s() == 0.0001f);\n    REQUIRE(t.dt_ms() == 0);\n    REQUIRE(t.dt_us() == 100);\n\n    t.reset();\n    REQUIRE(t.dt_s() == 0);\n    REQUIRE(t.dt_ms() == 0);\n    REQUIRE(t.dt_us() == 0);\n}\n\nTEST_CASE(\"CircularBuffer\", \"[utils]\")\n{\n    SECTION(\"CheckFull\")\n    {\n        CircularBuffer<int> buf(5);\n\n        buf.insert(1);\n        buf.insert(2);\n        buf.insert(3);\n        buf.insert(4);\n        REQUIRE(buf.isFull() == false);\n\n        buf.insert(1);\n        REQUIRE(buf.isFull() == true);\n\n        buf.insert(2);\n        
REQUIRE(buf.isFull() == true);\n\n        buf.clear();\n        REQUIRE(buf.isFull() == false);\n    }\n    SECTION(\"CheckContents\")\n    {\n        CircularBuffer<int> buf(5);\n\n        buf.insert(1);\n        buf.insert(2);\n        buf.insert(3);\n        buf.insert(4);\n        REQUIRE(buf.get_contents() == std::vector<int>{1,2,3,4});\n\n        buf.insert(5);\n        REQUIRE(buf.get_contents() == std::vector<int>{1,2,3,4,5});\n\n        buf.insert(6);\n        REQUIRE(buf.get_contents() == std::vector<int>{2,3,4,5,6});\n\n        buf.clear();\n        REQUIRE(buf.get_contents() == std::vector<int>{});\n\n    }\n    \n}\n\n\nTEST_CASE(\"PositionController\", \"[utils]\")\n{\n\n    SECTION(\"P only\")\n    {\n        PositionController::Gains gains {1,0,0};\n        PositionController controller(gains);\n\n        float target_pos = 1.0;\n        float actual_pos = 0.0;\n        float target_vel = 0.0;\n        float actual_vel = 0.0;\n        float dt = 0.1;\n        float cmd_vel = controller.compute(target_pos, actual_pos, target_vel, actual_vel, dt);\n\n        REQUIRE(cmd_vel == 1.0);\n    }\n\n    SECTION(\"PD\")\n    {\n        PositionController::Gains gains {1,0,0.5};\n        PositionController controller(gains);\n\n        float target_pos = 1.0;\n        float actual_pos = 0.0;\n        float target_vel = 1.0;\n        float actual_vel = 0.0;\n        float dt = 0.1;\n        float cmd_vel = controller.compute(target_pos, actual_pos, target_vel, actual_vel, dt);\n\n        REQUIRE(cmd_vel == 2.5);\n    }\n\n    SECTION(\"PID\")\n    {\n        PositionController::Gains gains {1,1,0.5};\n        PositionController controller(gains);\n\n        float target_pos = 1.0;\n        float actual_pos = 0.0;\n        float target_vel = 1.0;\n        float actual_vel = 0.0;\n        float dt = 0.1;\n        float cmd_vel = controller.compute(target_pos, actual_pos, target_vel, actual_vel, dt);\n\n        \n        // Compute again to get additive I gain\n   
     cmd_vel = controller.compute(target_pos, actual_pos, target_vel, actual_vel, dt);\n        REQUIRE(cmd_vel == 2.7f);\n\n        // Reset gain\n        controller.reset();\n        cmd_vel = controller.compute(target_pos, actual_pos, target_vel, actual_vel, dt);\n        REQUIRE(cmd_vel == 2.6f);\n    }\n\n    SECTION(\"PID step response no noise\")\n    {\n        PositionController::Gains gains {2,1,0.01};\n        PositionController controller(gains);\n\n        float target_pos = 1.0;\n        float actual_pos = 0.0;\n        float target_vel = 0.0;\n        float actual_vel = 0.0;\n        float dt = 0.1;\n\n        for (int i = 0; i < 100; i++) \n        {\n            actual_vel = controller.compute(target_pos, actual_pos, target_vel, actual_vel, dt);\n            actual_pos += actual_vel*dt;\n        }\n\n        REQUIRE(actual_vel == Approx(target_vel).margin(0.001));\n        REQUIRE(actual_pos == Approx(target_pos).margin(0.001));\n    }\n\n}\n\n\nTEST_CASE(\"VectorMath\", \"[utils]\")\n{\n    SECTION(\"Simple data\")\n    {\n        std::vector<float> data = {1,2,3,4,5,6,7,8,9,10};\n        float mean = vectorMean(data);\n        float stddev = vectorStddev(data, mean);\n        float zscore = zScore(mean, stddev, 11);\n        REQUIRE(mean == 5.5);\n        REQUIRE(stddev == Approx(2.8723).margin(0.001));\n        REQUIRE(zscore == Approx(1.9148).margin(0.001));\n    }\n    SECTION(\"No data\")\n    {\n        std::vector<float> data = {};\n        float mean = vectorMean(data);\n        float stddev = vectorStddev(data, mean);\n        float zscore = zScore(mean, stddev, 11);\n        REQUIRE(mean == 0);\n        REQUIRE(stddev == 0);\n        REQUIRE(zscore == 0);\n    }\n}\n\nTEST_CASE(\"StringParse\", \"[utils]\")\n{\n    SECTION(\"Empty string\")\n    {\n        std::string test_string = \"\";\n        std::vector<std::string> result = parseCommaDelimitedString(test_string);\n        REQUIRE(result.size() == 1);\n        
REQUIRE(result[0].empty());\n    }\n    SECTION(\"Normal string, no trailing comma\")\n    {\n        std::string test_string = \"asdf,1234,hello\";\n        std::vector<std::string> result = parseCommaDelimitedString(test_string);\n        REQUIRE(result.size() == 3);\n        REQUIRE_THAT(result, Catch::Matchers::Equals(std::vector<std::string>{\"asdf\", \"1234\", \"hello\"}));\n    }\n    SECTION(\"Normal string, with trailing comma\")\n    {\n        std::string test_string = \"asdf,1234,hello,\";\n        std::vector<std::string> result = parseCommaDelimitedString(test_string);\n        REQUIRE(result.size() == 4);\n        REQUIRE_THAT(result, Catch::Matchers::Equals(std::vector<std::string>{\"asdf\", \"1234\", \"hello\", \"\"}));\n    }\n    SECTION(\"String with whitespace\")\n    {\n        std::string test_string = \"as df,1234  ,  hello\";\n        std::vector<std::string> result = parseCommaDelimitedString(test_string);\n        REQUIRE(result.size() == 3);\n        REQUIRE_THAT(result, Catch::Matchers::Equals(std::vector<std::string>{\"as df\", \"1234  \", \"  hello\"}));\n    }\n}\n\nTEST_CASE(\"StringToFloatParse\", \"[utils]\")\n{\n    SECTION(\"Empty string\")\n    {\n        std::string test_string = \"\";\n        std::vector<float> result = parseCommaDelimitedStringToFloat(test_string);\n        REQUIRE(result.empty());\n    }\n    SECTION(\"Normal string, no trailing comma\")\n    {\n        std::string test_string = \"1.0,2.345,-1.2363\";\n        std::vector<float> result = parseCommaDelimitedStringToFloat(test_string);\n        REQUIRE(result.size() == 3);\n        REQUIRE_THAT(result, Catch::Matchers::Equals(std::vector<float>{1.0, 2.345, -1.2363}));\n    }\n    SECTION(\"Normal string, with trailing comma\")\n    {\n        std::string test_string = \"1.0,2.345,-1.2363,\";\n        std::vector<float> result = parseCommaDelimitedStringToFloat(test_string);\n        REQUIRE(result.size() == 3);\n        REQUIRE_THAT(result, 
Catch::Matchers::Equals(std::vector<float>{1.0, 2.345, -1.2363}));\n    }\n    SECTION(\"String with whitespace\")\n    {\n        std::string test_string = \"1.0   ,2.345  ,  -1.2363\";\n        std::vector<float> result = parseCommaDelimitedStringToFloat(test_string);\n        REQUIRE(result.size() == 3);\n        REQUIRE_THAT(result, Catch::Matchers::Equals(std::vector<float>{1.0, 2.345, -1.2363}));\n    }\n}\n\nTEST_CASE(\"LatchedBool\", \"[utils]\")\n{\n    SECTION(\"All true\")\n    {\n        MockClockWrapper* mock_clock = get_mock_clock_and_reset();\n        LatchedBool lb(1.0);\n        CHECK(lb.update(true) == true);\n        mock_clock->advance_sec(0.5);\n        CHECK(lb.update(true) == true);\n        mock_clock->advance_sec(0.6);\n        CHECK(lb.update(true) == true);\n    }\n    SECTION(\"All false\")\n    {\n        MockClockWrapper* mock_clock = get_mock_clock_and_reset();\n        LatchedBool lb(1.0);\n        CHECK(lb.update(false) == false);\n        mock_clock->advance_sec(0.5);\n        CHECK(lb.update(false) == false);\n        mock_clock->advance_sec(0.6);\n        CHECK(lb.update(false) == false);\n    }\n    SECTION(\"True then false then true\")\n    {\n        MockClockWrapper* mock_clock = get_mock_clock_and_reset();\n        LatchedBool lb(1.0);\n        CHECK(lb.update(true) == true);\n        mock_clock->advance_sec(0.5);\n        CHECK(lb.update(false) == true);\n        CHECK(lb.update(false) == true);\n        CHECK(lb.update(false) == true);\n        mock_clock->advance_sec(0.6);\n        CHECK(lb.update(false) == false);\n        CHECK(lb.update(true) == true);\n        mock_clock->advance_sec(0.5);\n        CHECK(lb.update(false) == true);\n        mock_clock->advance_sec(0.6);\n        CHECK(lb.update(false) == false);\n    }\n    SECTION(\"False then true then false\")\n    {\n        MockClockWrapper* mock_clock = get_mock_clock_and_reset();\n        LatchedBool lb(1.0);\n        CHECK(lb.update(false) == false);\n        
mock_clock->advance_sec(0.5);\n        CHECK(lb.update(false) == false);\n        mock_clock->advance_sec(0.6);\n        CHECK(lb.update(true) == true);\n        mock_clock->advance_sec(0.6);\n        CHECK(lb.update(false) == true);\n    }\n}"
  },
  {
    "path": "src/robot_motor_driver/SerialComms.cpp",
    "content": "#include \"SerialComms.h\"\n\nSerialComms::SerialComms(HardwareSerial& serial)\n: serial_(serial),\n  recvInProgress_(false),\n  recvIdx_(0),\n  buffer_(\"\")\n{\n}\n\nSerialComms::~SerialComms()\n{\n}\n\nString SerialComms::rcv()\n{\n    bool newData = false;\n    String new_msg;\n    while (serial_.available() > 0 && newData == false) \n    {\n        char rc = serial_.read();\n        if (recvInProgress_ == true) \n        {\n            if (rc == START_CHAR)\n            {\n              buffer_ = \"\";\n            }\n            else if (rc != END_CHAR) \n            {\n                buffer_ += rc;\n            }\n            else \n            {\n                recvInProgress_ = false;\n                newData = true;\n                new_msg = buffer_;\n                buffer_ = \"\";\n            }\n        }\n        else if (rc == START_CHAR) \n        {\n            recvInProgress_ = true;\n        }\n    }\n    return new_msg;\n}\n\nvoid SerialComms::send(String msg)\n{\n    if (msg.length() > 0)\n    {\n      serial_.print(START_CHAR);\n      serial_.print(msg);\n      serial_.print(END_CHAR);\n    }\n}\n"
  },
  {
    "path": "src/robot_motor_driver/SerialComms.h",
    "content": "#ifndef SerialComms_h\n#define SerialComms_h\n\n#include <HardwareSerial.h>\n\n#define START_CHAR '<'\n#define END_CHAR '>'\n\nclass SerialComms\n{\n  public:\n    \n    // Constructor\n    SerialComms(HardwareSerial& serial);\n\n    virtual ~SerialComms();\n\n    void send(String msg);\n\n    String rcv();\n\n  protected:\n\n    HardwareSerial& serial_;\n\n    bool recvInProgress_;\n    int recvIdx_;\n    String buffer_;\n\n};\n\n#endif"
  },
  {
    "path": "src/robot_motor_driver/robot_motor_driver.ino",
    "content": "#include \"ClearCore.h\"\n#include \"SerialComms.h\"\n#include <math.h>\n\n// From https://teknic-inc.github.io/ClearCore-library/ArduinoRef.html\n// Serial = USB\n// Serial0 + Serial1 = COM ports\n\n#define PRINT_DEBUG false\n#define USE_FAKE_MOTOR false\n\n// Globals\nSerialComms comm(Serial);\n\n// --------------------------------------------------\n//                Base parameters\n// --------------------------------------------------\n\n// Constants\n#define WHEEL_RADIUS 0.075\n#define WHEEL_DIST_FROM_CENTER 0.405\n#define BELT_RATIO 4\n#define STEPS_PER_REV 800\n#define MOTOR_MAX_VEL_STEPS_PER_SECOND 10000\n#define MOTOR_MAX_ACC_STEPS_PER_SECOND_SQUARED 2600\n#define MAX_BASE_MSG_TIMEOUT 200\nconst float sq3 = sqrt(3.0);\nconst float rOver3 = WHEEL_RADIUS / 3.0;\nconst float radsPerSecondToStepsPerSecond = STEPS_PER_REV / (2 * PI);\n\n#if USE_FAKE_MOTOR\nfloat MOTOR_FRONT_LEFT_FAKE = 0;\nfloat MOTOR_FRONT_RIGHT_FAKE = 0;\nfloat MOTOR_REAR_CENTER_FAKE = 0;\n#else\n// Motor mapping\n#define MOTOR_FRONT_LEFT ConnectorM1\n#define MOTOR_FRONT_RIGHT ConnectorM2\n#define MOTOR_REAR_CENTER ConnectorM0\n#endif\n\n// --------------------------------------------------\n//                Lifter parameters\n// --------------------------------------------------\n\n#define LIFTER_MOTOR ConnectorM3\n#define INCREMENTAL_UP_PIN DI7\n#define INCREMENTAL_DOWN_PIN DI6\n#define SERVO_TOGGLE_PIN DI8\n#define LATCH_SERVO_PIN IO0 // Only IO0 does pwm\n#define HOMING_SWITCH_PIN IO4\n\n#define LIFTER_STEPS_PER_REV 800\n\n#define LIFTER_MAX_VEL 7  // revs/sec\n#define LIFTER_MAX_ACC 10  // rev/sec^2\n\n#define LIFTER_HOMING_VEL 3 // revs/sec\n\n#define SAFETY_MAX_POS 70*LIFTER_STEPS_PER_REV  // Revs, Sanity check on desired position to make sure it isn't larger than this\n#define SAFETY_MIN_POS 0 // Revs, Sanity check on desired position to make sure it isn't less than this\n\n#define LATCH_ACTIVE_MS 1000\n#define LATCH_OPEN_DUTY_CYCLE 45\n#define LATCH_CLOSE_DUTY_CYCLE 
120\n#define MANUAL_LATCH_BUTTON_DELAY 200 // Poor man's debounce\n\n// --------------------------------------------------\n//                Helper structs\n// --------------------------------------------------\n\nenum MODE\n{\n    NONE,\n    MANUAL_VEL,\n    AUTO_POS,\n    HOMING,\n    LATCH_OPEN,\n    LATCH_CLOSE,\n};\n\nstruct Command\n{\n    int abs_pos;\n    bool home;\n    bool valid;\n    bool stop;\n    bool latch_open;\n    bool latch_close;\n    bool status_req;\n};\n\nstruct CartVelocity\n{\n    float vx;\n    float vy;\n    float va;\n\n    String toString() const\n    {\n        char s[100];\n        sprintf(s, \"[vx: %.4f, vy: %.4f, va: %.4f]\", vx, vy, va);\n        return static_cast<String>(s);\n    }\n};\n\nstruct MotorVelocity\n{\n    float v0;\n    float v1;\n    float v2;\n\n    String toString() const\n    {\n        char s[100];\n        sprintf(s, \"[v0: %.4f, v1: %.4f, v2: %.4f]\", v0, v1, v2);\n        return static_cast<String>(s);\n    }\n\n    // True if any motor velocity is nonzero (used by the base command timeout)\n    bool nonzero() const\n    {\n      return v0 != 0 || v1 != 0 || v2 != 0;\n    }\n};\n\n\n// --------------------------------------------------\n//                Lifter control functions\n// --------------------------------------------------\n\nMODE activeMode = MODE::NONE;\nunsigned long prevLatchMillis = millis();\nunsigned long prevLatchManualTriggerMillis = millis();\nbool prev_latch_state_open = false;\nbool prev_button_state = false;\n\nvoid lifter_setup() \n{\n    \n    pinMode(INCREMENTAL_UP_PIN, INPUT);\n    pinMode(INCREMENTAL_DOWN_PIN, INPUT);\n    pinMode(SERVO_TOGGLE_PIN, INPUT);\n    pinMode(HOMING_SWITCH_PIN, INPUT);\n    pinMode(LATCH_SERVO_PIN, OUTPUT);\n\n    LIFTER_MOTOR.HlfbMode(MotorDriver::HLFB_MODE_STATIC);\n    LIFTER_MOTOR.PolarityInvertSDEnable(true);\n    LIFTER_MOTOR.PolarityInvertSDDirection(true);\n    LIFTER_MOTOR.VelMax(LIFTER_MAX_VEL*LIFTER_STEPS_PER_REV);\n    LIFTER_MOTOR.AccelMax(LIFTER_MAX_ACC*LIFTER_STEPS_PER_REV);\n    
LIFTER_MOTOR.EnableRequest(false);\n}\n\nbool checkForLifterButtonFallingEdge(bool button_state) \n{\n    if (button_state != prev_button_state &&\n        millis() - prevLatchManualTriggerMillis > MANUAL_LATCH_BUTTON_DELAY)\n    {\n        prev_button_state = button_state;\n        prevLatchManualTriggerMillis = millis();\n        if (!button_state)\n        {\n            return true;\n        }\n    }\n    return false;\n}\n\n\n// Possible inputs are\n// \"home\", \"stop\", or an integer representing position to move to\n// \"open\" or \"close\" controls the latch servo\n// \"status_req\" will request a status of the mode\n// Note that this assumes the message has already been validated\nCommand decodeLifterMsg(String msg_in)\n{\n    // Strip identifier\n    String msg = msg_in.substring(5);\n    Command c = {0, false, false, false, false, false, false};\n\n#if PRINT_DEBUG\n    comm.send(\"DEBUG Incoming lifter message: \" + msg);\n#endif\n\n    if(msg == \"home\")\n    {\n        c.home = true;\n        c.valid = true;\n    }\n    else if(msg == \"stop\")\n    {\n        c.stop = true;\n        c.valid = true;\n    }\n    else if (msg == \"open\")\n    {\n        c.latch_open = true;\n        c.valid = true;\n    }\n    else if (msg == \"close\")\n    { \n        c.latch_close = true;\n        c.valid = true;\n    }\n    else if (msg == \"status_req\")\n    {\n      c.valid = true;\n      c.status_req = true;\n    }\n    else if (msg.startsWith(\"pos:\"))\n    {\n        String pos_msg = msg.substring(4);\n        c.abs_pos = pos_msg.toInt();\n        c.valid = true;\n    }\n\n    return c;\n}\n\nvoid lifter_update(String msg)\n{\n    // Verify if incoming message is for lifter\n    Command inputCommand = {0, false, false, false, false, false};\n    bool valid_msg = false;\n    if(msg.length() > 0 && msg.startsWith(\"lift:\")) \n    { \n        valid_msg = true; \n    }\n\n    // Parse command if it was valid\n    if(valid_msg) \n    {\n        inputCommand = 
decodeLifterMsg(msg);\n    }\n    \n    // Read inputs for manual mode\n    bool vel_up = digitalRead(INCREMENTAL_UP_PIN);\n    bool vel_down = digitalRead(INCREMENTAL_DOWN_PIN);\n    bool servo_button_state = digitalRead(SERVO_TOGGLE_PIN);\n\n    bool manual_servo_toggle = false;\n    if (checkForLifterButtonFallingEdge(servo_button_state))\n    {\n        manual_servo_toggle = true;\n    }\n\n    // For debugging only\n    // Serial.print(\"Vel up: \");\n    // Serial.print(vel_up);\n    // Serial.print(\" Vel dn: \");\n    // Serial.println(vel_down);\n\n    // If we got a stop command, make sure to handle it immediately\n    if (inputCommand.valid && inputCommand.stop)\n    {\n        activeMode = MODE::NONE;\n//        LIFTER_MOTOR.EnableRequest(false);\n    }\n    // For any other command, we will only handle it when there are no other active commands\n    else if (activeMode == MODE::NONE)\n    {\n        // Figure out what mode we are in\n        if(vel_up || vel_down)\n        {\n            activeMode = MODE::MANUAL_VEL;\n            LIFTER_MOTOR.EnableRequest(true);\n            if(vel_up)\n            {\n                LIFTER_MOTOR.MoveVelocity(-1*LIFTER_MAX_VEL*LIFTER_STEPS_PER_REV);\n            }\n            else if(vel_down)\n            {\n                LIFTER_MOTOR.MoveVelocity(LIFTER_MAX_VEL*LIFTER_STEPS_PER_REV);\n            }   \n        }\n        else if(inputCommand.valid && inputCommand.home)\n        {\n            activeMode = MODE::HOMING;\n            LIFTER_MOTOR.EnableRequest(true);\n            LIFTER_MOTOR.MoveVelocity(-1*LIFTER_HOMING_VEL*LIFTER_STEPS_PER_REV);\n        }\n        else if((inputCommand.valid && inputCommand.latch_open) || \n                 (manual_servo_toggle && !prev_latch_state_open))\n        {\n            activeMode = MODE::LATCH_OPEN;\n            analogWrite(LATCH_SERVO_PIN, LATCH_OPEN_DUTY_CYCLE);\n            prev_latch_state_open = true;\n            prevLatchMillis = millis();\n        }\n        
else if((inputCommand.valid && inputCommand.latch_close) || \n                 (manual_servo_toggle && prev_latch_state_open))\n        {\n            activeMode = MODE::LATCH_CLOSE;\n            analogWrite(LATCH_SERVO_PIN, LATCH_CLOSE_DUTY_CYCLE);\n            prev_latch_state_open = false;\n            prevLatchMillis = millis();\n        }\n        else if (inputCommand.valid && inputCommand.status_req)\n        {\n          #if PRINT_DEBUG\n            comm.send(\"DEBUG Lifter position: \" + String(LIFTER_MOTOR.PositionRefCommanded()));\n          #endif\n        }\n        else if(inputCommand.valid)\n        {\n            if(inputCommand.abs_pos <= SAFETY_MAX_POS && inputCommand.abs_pos >= SAFETY_MIN_POS)\n            {\n                activeMode = MODE::AUTO_POS;\n                long target = inputCommand.abs_pos;\n                LIFTER_MOTOR.EnableRequest(true);\n                LIFTER_MOTOR.Move(target, StepGenerator::MOVE_TARGET_ABSOLUTE);\n            }\n        }\n    }    \n\n\n    // Handle continuous updates for each mode\n    String status_str = \"lift:none\";\n    if(activeMode == MODE::MANUAL_VEL)\n    {\n        status_str = \"lift:manual\";\n        if(!(vel_up || vel_down))\n        {\n            activeMode = MODE::NONE;\n        }\n    }\n    else if(activeMode == MODE::AUTO_POS)\n    {\n        status_str = \"lift:pos\";\n        if(LIFTER_MOTOR.StepsComplete())\n        {\n            activeMode = MODE::NONE;\n        }\n    }\n    else if(activeMode == MODE::HOMING)\n    {\n        if(!digitalRead(HOMING_SWITCH_PIN))\n        {\n            LIFTER_MOTOR.MoveStopAbrupt();\n            LIFTER_MOTOR.PositionRefSet(0);\n            activeMode = MODE::NONE;\n        }\n        status_str = \"lift:homing\";\n    }\n    else if (activeMode == MODE::LATCH_CLOSE)\n    {\n        status_str = \"lift:close\";\n        if(millis() - prevLatchMillis > LATCH_ACTIVE_MS)\n        {\n            activeMode = MODE::NONE;\n        }\n    }\n    else if 
(activeMode == MODE::LATCH_OPEN)\n    {\n        status_str = \"lift:open\";\n        if(millis() - prevLatchMillis > LATCH_ACTIVE_MS)\n        {\n            activeMode = MODE::NONE;\n        }\n    }\n    else if (activeMode == MODE::NONE)\n    {\n        LIFTER_MOTOR.MoveStopAbrupt();\n//        LIFTER_MOTOR.EnableRequest(false);\n    }\n\n    // Will send back one of [none, manual, pos, homing, open, close]\n    if (valid_msg) \n    {\n        comm.send(status_str);\n    }\n    // For debugging only\n    // Serial.println(\"\");\n}\n\n\n\n// --------------------------------------------------\n//                Base control functions\n// --------------------------------------------------\n\nunsigned long last_base_msg_millis = millis();\n\nCartVelocity decodeBaseMsg(String msg)\n{\n#if PRINT_DEBUG\n    comm.send(\"DEBUG Incoming base message: \" + msg);\n#endif\n    float vals[3];\n    int prev_idx = 0;\n    int j = 0;\n    for(int i = 0; i < msg.length(); i++)\n    {\n      if(msg[i] == ',')\n      {\n        vals[j] = msg.substring(prev_idx, i).toFloat();\n        j++;\n        prev_idx = i+1;\n      }\n\n      if (i == msg.length()-1)\n      {\n        vals[j] = msg.substring(prev_idx).toFloat();\n      }\n    }\n\n    CartVelocity cv;\n    cv.vx = vals[0];\n    cv.vy = vals[1];\n    cv.va = vals[2];\n    \n#if PRINT_DEBUG\n    comm.send(\"DEBUG Decoded message: \" + cv.toString());\n#endif\n\n    return cv;\n}\n\n// https://www.wolframalpha.com/input/?i=%5B%5B-cosd%2830%29%2C+sind%2830%29%2C+d%5D%2C%5Bcosd%2830%29%2C+sind%2830%29%2C+d%5D%2C%5B0%2C-1%2Cd%5D%5D\nMotorVelocity doIK(CartVelocity cmd)\n{\n    MotorVelocity motors;\n    motors.v0 = -1/WHEEL_RADIUS * (-1 * sq3 / 2 * cmd.vx  + 0.5 * cmd.vy + WHEEL_DIST_FROM_CENTER * cmd.va) * BELT_RATIO;\n    motors.v1 = -1/WHEEL_RADIUS * (     sq3 / 2 * cmd.vx  + 0.5 * cmd.vy + WHEEL_DIST_FROM_CENTER * cmd.va) * BELT_RATIO;\n    motors.v2 = -1/WHEEL_RADIUS * (                       - 1   * cmd.vy + 
WHEEL_DIST_FROM_CENTER * cmd.va) * BELT_RATIO;\n#if PRINT_DEBUG\n    comm.send(\"DEBUG After IK: \" + motors.toString());\n#endif\n    return motors;\n}\n\nvoid SendCommandsToMotors(MotorVelocity motors)\n{\n#if PRINT_DEBUG\n    comm.send(\"DEBUG Send to motor: \" + String(motors.v0 * radsPerSecondToStepsPerSecond));\n#endif\n\n#if USE_FAKE_MOTOR\n    MOTOR_FRONT_LEFT_FAKE  = motors.v0 * radsPerSecondToStepsPerSecond;\n    MOTOR_FRONT_RIGHT_FAKE = motors.v1 * radsPerSecondToStepsPerSecond;\n    MOTOR_REAR_CENTER_FAKE = motors.v2 * radsPerSecondToStepsPerSecond;\n#else\n    MOTOR_FRONT_LEFT.MoveVelocity(motors.v0 * radsPerSecondToStepsPerSecond);\n    MOTOR_FRONT_RIGHT.MoveVelocity(motors.v1 * radsPerSecondToStepsPerSecond);\n    MOTOR_REAR_CENTER.MoveVelocity(motors.v2 * radsPerSecondToStepsPerSecond);\n#endif\n}\n\nMotorVelocity ReadMotorSpeeds()\n{\n    MotorVelocity measured;\n\n#if USE_FAKE_MOTOR\n     measured.v0 = MOTOR_FRONT_LEFT_FAKE / radsPerSecondToStepsPerSecond;\n     measured.v1 = MOTOR_FRONT_RIGHT_FAKE / radsPerSecondToStepsPerSecond;\n     measured.v2 = MOTOR_REAR_CENTER_FAKE / radsPerSecondToStepsPerSecond;\n#else\n    measured.v0 = MOTOR_FRONT_LEFT.VelocityRefCommanded() / radsPerSecondToStepsPerSecond;\n    measured.v1 = MOTOR_FRONT_RIGHT.VelocityRefCommanded() / radsPerSecondToStepsPerSecond;\n    measured.v2 = MOTOR_REAR_CENTER.VelocityRefCommanded() / radsPerSecondToStepsPerSecond;\n#endif\n\n#if PRINT_DEBUG\n    comm.send(\"DEBUG Motor measured: \" + measured.toString());\n#endif\n\n    return measured;\n}\n\n// https://www.wolframalpha.com/input/?i=inv%28%5B%5B-cosd%2830%29%2C+sind%2830%29%2C+d%5D%2C%5Bcosd%2830%29%2C+sind%2830%29%2C+d%5D%2C%5B0%2C-1%2Cd%5D%5D%29\nCartVelocity doFK(MotorVelocity motor_measured)\n{\n    MotorVelocity wheel_speed;\n    wheel_speed.v0 = motor_measured.v0 / float(BELT_RATIO);\n    wheel_speed.v1 = motor_measured.v1 / float(BELT_RATIO);\n    wheel_speed.v2 = motor_measured.v2 / float(BELT_RATIO);\n    \n    
CartVelocity robot_measured;\n    robot_measured.vx = -rOver3 * (-1 * sq3 * wheel_speed.v0 + sq3 * wheel_speed.v1);\n    robot_measured.vy = -rOver3 * ( wheel_speed.v0 + wheel_speed.v1 - 2.0 * wheel_speed.v2); \n    robot_measured.va = -rOver3 / WHEEL_DIST_FROM_CENTER * (wheel_speed.v0 + wheel_speed.v1 + wheel_speed.v2);\n#if PRINT_DEBUG\n    comm.send(\"DEBUG After FK: \" + robot_measured.toString());\n#endif\n    return robot_measured;\n\n}\n\nvoid ReportRobotVelocity(CartVelocity robot_v_measured)\n{\n    // snprintf guards against overflow if the formatted velocities grow large\n    char msg[48];\n    snprintf(msg, sizeof(msg), \"base:%.3f,%.3f,%.3f\", robot_v_measured.vx, robot_v_measured.vy, robot_v_measured.va);\n    comm.send(msg);\n}\n\n\n// Checks for any motor on/off requests before trying to parse for commanded speeds\nbool handlePowerRequests(String msg)\n{\n    if(msg == \"Power:ON\")\n    {\n        MOTOR_FRONT_RIGHT.EnableRequest(true);\n        MOTOR_REAR_CENTER.EnableRequest(true);\n        MOTOR_FRONT_LEFT.EnableRequest(true);\n#if PRINT_DEBUG\n        comm.send(\"DEBUG Motors enabled\");\n#endif\n        return true;\n    }\n    else if (msg == \"Power:OFF\")\n    {\n        MOTOR_FRONT_RIGHT.EnableRequest(false);\n        MOTOR_REAR_CENTER.EnableRequest(false);\n        MOTOR_FRONT_LEFT.EnableRequest(false);\n#if PRINT_DEBUG\n        comm.send(\"DEBUG Motors disabled\");\n#endif\n        return true;\n    }\n    return false;\n}\n\nvoid base_setup()\n{    \n    MOTOR_FRONT_RIGHT.EnableRequest(false);\n    MOTOR_FRONT_RIGHT.VelMax(MOTOR_MAX_VEL_STEPS_PER_SECOND);\n    MOTOR_FRONT_RIGHT.AccelMax(MOTOR_MAX_ACC_STEPS_PER_SECOND_SQUARED);\n    MOTOR_REAR_CENTER.EnableRequest(false);\n    MOTOR_REAR_CENTER.VelMax(MOTOR_MAX_VEL_STEPS_PER_SECOND);\n    MOTOR_REAR_CENTER.AccelMax(MOTOR_MAX_ACC_STEPS_PER_SECOND_SQUARED);\n    MOTOR_FRONT_LEFT.EnableRequest(false);\n    MOTOR_FRONT_LEFT.VelMax(MOTOR_MAX_VEL_STEPS_PER_SECOND);\n    MOTOR_FRONT_LEFT.AccelMax(MOTOR_MAX_ACC_STEPS_PER_SECOND_SQUARED);\n}\n\nbool isValidBaseMessage(String msg)\n{\n    
bool result = false;\n    if(msg.length() > 0 && msg.startsWith(\"base:\")) \n    { \n        result = true; \n    }\n    return result;\n}\n\n\nvoid base_update(String msg)\n{    \n    MotorVelocity cur_motor_v = ReadMotorSpeeds();\n    if(millis() - last_base_msg_millis > MAX_BASE_MSG_TIMEOUT && cur_motor_v.nonzero()) \n    {\n      SendCommandsToMotors({0,0,0});\n      handlePowerRequests(\"Power:OFF\");\n    }\n    \n    if(!isValidBaseMessage(msg)) { return; }\n    last_base_msg_millis = millis();\n\n    // Strip base identifier\n    String new_msg = msg.substring(5);\n    \n    if(!handlePowerRequests(new_msg))\n    {\n        CartVelocity cmd_v = decodeBaseMsg(new_msg);\n        MotorVelocity motor_v = doIK(cmd_v);\n        SendCommandsToMotors(motor_v);\n        MotorVelocity motor_v_measured = ReadMotorSpeeds();\n        CartVelocity robot_v_measured = doFK(motor_v_measured);\n        ReportRobotVelocity(robot_v_measured);\n    }\n}\n\n\n// --------------------------------------------------\n//                Top level functions\n// --------------------------------------------------\n\nvoid setup()\n{\n    Serial.begin(115200);\n\n    // Sets the input clocking rate.\n    MotorMgr.MotorInputClocking(MotorManager::CLOCK_RATE_LOW);\n    // Sets all motor connectors into step and direction mode.\n    MotorMgr.MotorModeSet(MotorManager::MOTOR_ALL, Connector::CPM_MODE_STEP_AND_DIR);\n\n    base_setup();\n    lifter_setup();\n}\n\nvoid loop()\n{\n    String msg = comm.rcv();\n    base_update(msg);\n    lifter_update(msg);\n}\n"
  },
  {
    "path": "src/tools/80-domino-bot.rules",
    "content": "SUBSYSTEM==\"tty\", ATTRS{idVendor}==\"2890\", ATTRS{idProduct}==\"8022\", SYMLINK+=\"clearcore\"\nSUBSYSTEM==\"tty\", ATTRS{idVendor}==\"0483\", ATTRS{idProduct}==\"5740\", SYMLINK+=\"marvelmind%n\"\nSUBSYSTEM==\"tty\", ATTRS{idVendor}==\"2341\", ATTRS{idProduct}==\"0043\", SYMLINK+=\"arduino\"\nSUBSYSTEM==\"tty\", ATTRS{idVendor}==\"2341\", ATTRS{idProduct}==\"0042\", SYMLINK+=\"arduino\""
  },
  {
    "path": "src/tools/IR_calibration_1.yml",
    "content": "%YAML:1.0\n---\nK: !!opencv-matrix\n   rows: 3\n   cols: 3\n   dt: d\n   data: [ 1.2117719731192783e+03, 0., 6.4569867933163880e+02, 0.,\n       1.2132218864815018e+03, 3.0315790346461841e+02, 0., 0., 1. ]\nD: !!opencv-matrix\n   rows: 1\n   cols: 5\n   dt: d\n   data: [ -4.6874878599433956e-01, 7.8637235803345207e-01,\n       -5.0536502302069442e-04, 5.8119887567900750e-04,\n       -1.6010067485682662e+00 ]\n"
  },
  {
    "path": "src/tools/IR_calibration_2.yml",
    "content": "%YAML:1.0\n---\nK: !!opencv-matrix\n   rows: 3\n   cols: 3\n   dt: d\n   data: [ 1.2054254220822804e+03, 0., 6.7330983848794142e+02, 0.,\n       1.2054587537926946e+03, 3.4257042557877725e+02, 0., 0., 1. ]\nD: !!opencv-matrix\n   rows: 1\n   cols: 5\n   dt: d\n   data: [ -4.0579303514821868e-01, 2.1496514338793393e-01,\n       5.1830677921908667e-04, 1.2914137465804134e-04,\n       -1.6468164001861275e-02 ]\n"
  },
  {
    "path": "src/tools/camera_calibration.py",
    "content": "import numpy as np\nimport cv2\nimport glob\nimport argparse\nimport os\n\n# termination criteria\ncriteria = (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER, 30, 0.001)\n\n\ndef calibrate(dirpath, prefix, image_format, square_size, width=9, height=6):\n    \"\"\" Apply camera calibration operation for images in the given directory path. \"\"\"\n    # prepare object points, like (0,0,0), (1,0,0), (2,0,0) ....,(8,6,0)\n    objp = np.zeros((height*width, 3), np.float32)\n    objp[:, :2] = np.mgrid[0:width, 0:height].T.reshape(-1, 2)\n\n    objp = objp * square_size\n\n    # Arrays to store object points and image points from all the images.\n    objpoints = []  # 3d point in real world space\n    imgpoints = []  # 2d points in image plane.\n\n    glob_str = os.path.join(dirpath, prefix + '*.' + image_format)\n    print(\"Looking for images that match: {}\".format(glob_str))\n    images = glob.glob(glob_str)\n\n    for fname in images:\n        img = cv2.imread(fname)\n        print(\"Loading image from {}\".format(fname))\n        gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)\n\n        # Find the chess board corners\n        ret, corners = cv2.findChessboardCorners(gray, (width, height), None)\n\n        # If found, add object points, image points (after refining them)\n        if ret:\n            objpoints.append(objp)\n\n            corners2 = cv2.cornerSubPix(gray, corners, (11, 11), (-1, -1), criteria)\n            imgpoints.append(corners2)\n\n            # Draw and display the corners\n            img = cv2.drawChessboardCorners(img, (width, height), corners2, ret)\n            cv2.imshow('img', img)\n            cv2.waitKey(500)\n\n    cv2.destroyAllWindows()\n\n    ret, mtx, dist, rvecs, tvecs = cv2.calibrateCamera(objpoints, imgpoints, gray.shape[::-1], None, None)\n\n    # Show calibrated images using last image\n    h,  w = img.shape[:2]\n    newcameramtx, roi = cv2.getOptimalNewCameraMatrix(mtx, dist, (w,h), 1, (w,h))\n    # 
undistort\n    dst = cv2.undistort(img, mtx, dist, None, newcameramtx)\n    # crop the image\n    x, y, w, h = roi\n    dst = dst[y:y+h, x:x+w]\n    cv2.imshow('raw', img)\n    cv2.waitKey(1000)\n    cv2.imshow('calibrated', dst)\n    cv2.waitKey(1000)\n    cv2.destroyAllWindows()\n\n    # reprojection error\n    mean_error = 0\n    for i in range(len(objpoints)):\n        imgpoints2, _ = cv2.projectPoints(objpoints[i], rvecs[i], tvecs[i], mtx, dist)\n        error = cv2.norm(imgpoints[i], imgpoints2, cv2.NORM_L2)/len(imgpoints2)\n        mean_error += error\n    print( \"total reprojection error: {}\".format(mean_error/len(objpoints)) )\n\n\n    return [ret, mtx, dist, rvecs, tvecs]\n\n\ndef save_coefficients(mtx, dist, path):\n    \"\"\" Save the camera matrix and the distortion coefficients to given path/file. \"\"\"\n    cv_file = cv2.FileStorage(path, cv2.FILE_STORAGE_WRITE)\n    cv_file.write(\"K\", mtx)\n    cv_file.write(\"D\", dist)\n    # note: a FileStorage object is release()'d, not close()'d\n    cv_file.release()\n    print(\"Wrote calibration file to {}\".format(path))\n\n\ndef load_coefficients(path):\n    \"\"\" Loads camera matrix and distortion coefficients. 
\"\"\"\n    # FILE_STORAGE_READ\n    cv_file = cv2.FileStorage(path, cv2.FILE_STORAGE_READ)\n\n    # note we also have to specify the type to retrieve other wise we only get a\n    # FileNode object back instead of a matrix\n    camera_matrix = cv_file.getNode(\"K\").mat()\n    dist_matrix = cv_file.getNode(\"D\").mat()\n\n    cv_file.release()\n    return [camera_matrix, dist_matrix]\n\n\nif __name__ == '__main__':\n    parser = argparse.ArgumentParser(description='Camera calibration')\n    parser.add_argument('--image_dir', type=str, default='C:\\\\Users\\\\alexb\\\\Pictures\\\\DominoRobotIRCalibration\\\\Camera2', help='image directory path')\n    parser.add_argument('--image_format', type=str, default='jpg',  help='image format, png/jpg')\n    parser.add_argument('--prefix', type=str, default='', help='image prefix')\n    parser.add_argument('--square_size', type=float, default=0.023, help='chessboard square size')\n    parser.add_argument('--width', type=int, default=9, help='chessboard width size, default is 9')\n    parser.add_argument('--height', type=int, default=6, help='chessboard height size, default is 6')\n    parser.add_argument('--save_file', type=str, default='IR_calibration_2.yml', help='YML file to save calibration matrices')\n\n    args = parser.parse_args()\n    ret, mtx, dist, rvecs, tvecs = calibrate(args.image_dir, args.prefix, args.image_format, args.square_size, args.width, args.height)\n    save_coefficients(mtx, dist, args.save_file)\n    print(\"Calibration is finished. RMS: \", ret)"
  },
  {
    "path": "src/tools/motor_test_script.py",
    "content": "import serial\nimport time\nimport signal\nimport sys\n\nSERIAL_PORT = 'COM7'\nSERIAL_BAUD = 115200\nSERIAL_TIMEOUT = 1\n\nSTART_CHAR = \"<\"\nEND_CHAR = \">\"\n\n# Copied and modified from RobotClient.TcpClient\nclass SerialClient:\n\n    def __init__(self, port, baud, timeout):\n        self.ser = serial.Serial(port=port, baudrate=baud, timeout=timeout)\n        if not self.ser:\n            return None\n\n    def send(self, msg):\n        msg = START_CHAR + msg + END_CHAR\n        msg_bytes = msg.encode()\n        self.ser.write(msg_bytes)\n\n    def recieve(self, timeout=0.1):\n\n        new_msg = \"\"\n        new_msg_ready = False\n        start_time = time.time()\n        while not new_msg_ready and time.time() - start_time < timeout:\n            \n            # Get new data\n            data = self.ser.read()\n            if data == b'':\n                break\n\n            # Decode data and parse into message\n            new_str = data.decode(encoding='UTF-8',errors='strict')\n\n            start_idx = new_str.find(START_CHAR)\n            end_idx = new_str.find(END_CHAR)\n\n            if start_idx != -1 and end_idx != -1:\n                new_msg = new_str[start_idx+1:end_idx]\n                new_msg_ready = True\n            elif start_idx != -1:\n                new_msg += new_str[start_idx+1:]\n                msg_rcv_in_progress = True\n            elif end_idx != -1:\n                new_msg += new_str[:end_idx]\n                new_msg_ready = True\n            else:\n                new_msg += new_str\n\n        if not new_msg_ready:\n            new_msg = \"\"\n\n        return new_msg\n\ndef power_on(s):\n    msg = \"base:Power:ON\"\n    s.send(msg)\n    check_response(s)\n\ndef power_off(s):\n    msg = \"base:Power:OFF\"\n    s.send(msg)\n    check_response(s)\n\ndef send_vel(s, vel):\n    if len(vel) != 3:\n        raise ValueError(\"Velocity should be 3 elements long\")\n\n    msg = 
\"base:{:.3f},{:.3f},{:.3f}\".format(vel[0], vel[1], vel[2])\n    print(\"SND: {}\".format(msg))\n    s.send(msg)\n\ndef check_response(s):\n    msg = s.recieve()\n    while msg:\n        print(\"RCV: {}\".format(msg))\n        msg = s.recieve()\n\ndef move_vel_with_pause(s, vel, move_time):\n    start_time = time.time()\n    while time.time() - start_time < move_time:\n        send_vel(s, vel)\n        check_response(s)\n        time.sleep(0.2)\n\ndef timed_move(s, vel, move_time, pause_time):\n    print(\"Starting move\")\n    move_vel_with_pause(s, vel, move_time)\n    print(\"Done with move\")\n    stop_vel = [0,0,0]\n    move_vel_with_pause(s, stop_vel, pause_time)\n\n\nif __name__ == '__main__':\n    \n    ser = SerialClient(SERIAL_PORT, SERIAL_BAUD, SERIAL_TIMEOUT)\n\n    def signal_handler(sig, frame):\n        print('Gracefully powering down motors and exiting')\n        send_vel(ser, [0,0,0])\n        power_off(ser)\n        sys.exit(0)\n    signal.signal(signal.SIGINT, signal_handler)\n\n    fwd_vel =  [ 0.2, 0, 0]\n    bkwd_vel = [-0.2, 0, 0]\n    left_vel = [0, 0.2, 0]\n    right_vel = [0, -0.2, 0]\n    ccw_spin_vel = [0, 0, 0.2]\n    cw_spin_vel = [0, 0, -0.2]\n    diag_fwd_vel = [0.1, 0.173, 0]\n    diag_rev_vel = [-0.1, -0.173, 0]\n    move_time = 2\n    pause_time = 1\n\n    print(\"Motors on\")\n    power_on(ser)\n    while(True):\n\n        print(\"Move forward\")\n        timed_move(ser, fwd_vel, move_time, pause_time)\n        print(\"Move backward\")\n        timed_move(ser, bkwd_vel, move_time, pause_time)\n        # print(\"Move left\")\n        # timed_move(ser, left_vel, move_time, pause_time)\n        # print(\"Move right\")\n        # timed_move(ser, right_vel, move_time, pause_time)\n        # print(\"Move CCW\")\n        # timed_move(ser, ccw_spin_vel, move_time, pause_time)\n        # print(\"Move CW\")\n        # timed_move(ser, cw_spin_vel, move_time, pause_time)\n        # print(\"Move Diag Fwd\")\n        # timed_move(ser, 
diag_fwd_vel, move_time, pause_time)\n        # print(\"Move Diag Rev\")\n        # timed_move(ser, diag_rev_vel, move_time, pause_time)\n\n    print(\"Motors off\")\n    power_off(ser)\n\n"
  },
  {
    "path": "src/tools/plot_logs.py",
    "content": "import argparse\nimport os\nimport numpy as np\nimport json\nimport matplotlib.pyplot as plt\nimport PySimpleGUI as sg\n\nDEFAULT_LOG_PATH = \"C:\\\\Users\\\\alexb\\\\Documents\\\\Github\\\\DominoRobot\\\\log\"\n\n\ndef get_value(line, path):\n    idx = 0\n    for p in path:\n        idx = line.find(p, idx)\n        idx += len(p)\n\n    data = \"\"\n    for c in line[idx:]:\n        if c in ['[',']',',','\\n']:\n            break\n        elif c in [':',' ']:\n            continue\n        else:\n            data += c        \n\n    return float(data)\n\ndef get_array(line):\n    start_idx = line.find('[')\n    end_idx = line.find(']')\n    array_txt = line[start_idx+1:end_idx]\n    vals = array_txt.split(',')\n    return [float(v.strip()) for v in vals]\n\nclass LogParser:\n\n    def __init__(self, path):\n\n        self.path = path\n        self.time = np.zeros((1,1))\n        self.target_pos = np.zeros((3,1))\n        self.target_vel = np.zeros((3,1))\n        self.cmd_vel = np.zeros((3,1))\n        self.est_vel = np.zeros((3,1))\n        self.est_pos = np.zeros((3,1))\n        self.motor_cmd_vel = np.zeros((4,1))\n        self.motor_est_vel = np.zeros((4,1))\n        self.motor_counts = np.zeros((4,1))\n        self.cartesian_controller = np.zeros((3,1))\n        self.motor_info = [{\"deltaRads\": [0], \"deltaMicros\":[0], \"pidOut\": [0], \"outputCmd\": [0]} for i in range(4)]\n        self.t_offset = 0\n\n    def parse_logs(self):\n\n        with open(self.path) as f:\n            for line in f:\n\n                if line.startswith(\"Target:\"):\n                    # time\n                    self.time = np.append(self.time, get_value(line, ['Target', 'T']) + self.t_offset)\n                    # target position\n                    data = []\n                    data.append(get_value(line, ['Target','Position','X']))\n                    data.append(get_value(line, ['Target','Position','Y']))\n                    data.append(get_value(line, 
['Target','Position','A']))\n                    arr = np.asarray(data).reshape((3,1))\n                    self.target_pos = np.hstack((self.target_pos, arr))\n                    # target velocity\n                    data = []\n                    data.append(get_value(line, ['Target','Velocity','X']))\n                    data.append(get_value(line, ['Target','Velocity','Y']))\n                    data.append(get_value(line, ['Target','Velocity','A']))\n                    arr = np.asarray(data).reshape((3,1))\n                    self.target_vel = np.hstack((self.target_vel, arr))\n                elif line.startswith(\"CartesianControlX:\"):\n                    # control values\n                    data = []\n                    data.append(get_value(line, ['CartesianControlX','PosErr']))\n                    data.append(get_value(line, ['CartesianControlX','VelErr']))\n                    data.append(get_value(line, ['CartesianControlX','ErrSum']))\n                    arr = np.asarray(data).reshape((3,1))\n                    self.cartesian_controller = np.hstack((self.cartesian_controller, arr))\n                elif line.startswith(\"CartVelCmd:\"):\n                    # command velocity\n                    data = []\n                    data.append(get_value(line, ['CartVelCmd','vx']))\n                    data.append(get_value(line, ['CartVelCmd','vy']))\n                    data.append(get_value(line, ['CartVelCmd','va']))\n                    arr = np.asarray(data).reshape((3,1))\n                    self.cmd_vel = np.hstack((self.cmd_vel, arr))\n                elif line.startswith(\"MotorCommands:\"):\n                    # motor command vel\n                    data = get_array(line)\n                    arr = np.asarray(data).reshape((4,1))\n                    self.motor_cmd_vel = np.hstack((self.motor_cmd_vel, arr))\n                elif line.startswith(\"MotorMeasured:\"):\n                    # motor est vel\n                    data = 
get_array(line)\n                    arr = np.asarray(data).reshape((4,1))\n                    self.motor_est_vel = np.hstack((self.motor_est_vel, arr))\n                elif line.startswith(\"MotorCounts:\"):\n                    # motor encoder counts\n                    data = get_array(line)\n                    arr = np.asarray(data).reshape((4,1))\n                    self.motor_counts = np.hstack((self.motor_counts, arr))\n                elif line.startswith(\"Est Vel:\"):\n                    # est velocity\n                    data = []\n                    data.append(get_value(line, ['Est Vel','X']))\n                    data.append(get_value(line, ['Est Vel','Y']))\n                    data.append(get_value(line, ['Est Vel','A']))\n                    arr = np.asarray(data).reshape((3,1))\n                    self.est_vel = np.hstack((self.est_vel, arr))\n                elif line.startswith(\"Est Pos:\"):\n                    # est position\n                    data = []\n                    data.append(get_value(line, ['Est Pos','X']))\n                    data.append(get_value(line, ['Est Pos','Y']))\n                    data.append(get_value(line, ['Est Pos','A']))\n                    arr = np.asarray(data).reshape((3,1))\n                    self.est_pos = np.hstack((self.est_pos, arr))\n                elif line.startswith(\"Reached goal\"):\n                    # Drop extra target line if needed\n                    if self.time.shape[0] > self.est_pos.shape[1]:\n                        self.time = self.time[:-1]\n                        self.target_pos = self.target_pos[:,:-1]\n                        self.target_vel = self.target_vel[:,:-1]\n\n                    # Update time offset\n                    self.t_offset += self.time[-1] + 0.5 #some extra spacing\n\n                elif line.startswith(\"Motor0:\") or line.startswith(\"Motor1:\") or \\\n                     line.startswith(\"Motor2:\") or line.startswith(\"Motor3:\"):\n        
            n = int(line[5])\n                    name = \"Motor{}\".format(n)\n                    if line[7:].strip():\n                        self.motor_info[n][\"deltaRads\"].append(get_value(line, [name, \"deltaRads\"]))\n                        self.motor_info[n][\"deltaMicros\"].append(get_value(line, [name, \"deltaMicros\"]))\n                        self.motor_info[n][\"pidOut\"].append(get_value(line, [name, \"pidOut\"]))\n                        self.motor_info[n][\"outputCmd\"].append(get_value(line, [name, \"outputCmd\"]))\n                    else:\n                        self.motor_info[n][\"deltaRads\"].append(0.00123)\n                        self.motor_info[n][\"deltaMicros\"].append(0.00123)\n                        self.motor_info[n][\"pidOut\"].append(0.00123)\n                        self.motor_info[n][\"outputCmd\"].append(0.00123)\n\n                else:\n                    continue\n        \n    \n    def plot_pos(self):\n        fig = plt.figure()\n        fig.canvas.set_window_title('Cartesian Position')\n\n        labels = ['x', 'y', 'a']\n        main_ax = None\n        for i in range(3):\n            if not main_ax:\n                main_ax = fig.add_subplot(3,1,i+1)\n                ax = main_ax\n            else:\n                ax = fig.add_subplot(3,1,i+1, sharex=main_ax)\n            ax.plot(self.time,self.target_pos[i,:], '-b.', self.time,self.est_pos[i,:], '-r.')\n            ax.legend(['Target', 'Estimate'])\n            ax.set_ylabel(\"Position: {}\".format(labels[i]))\n        \n        ax.set_xlabel('Time')\n\n    def plot_x_control(self):\n        fig = plt.figure()\n        fig.canvas.set_window_title('Cartesian X Control')\n        \n        # X position\n        main_ax = fig.add_subplot(3,1,1)\n        ax = main_ax\n        ax.plot(self.time,self.target_pos[0,:], '-b.', self.time,self.est_pos[0,:], '-r.')\n        ax.legend(['Target', 'Estimate'])\n        ax.set_ylabel(\"Position\")\n\n        # X velocity\n      
  ax = fig.add_subplot(3,1,2, sharex=main_ax)\n        ax.plot(self.time,self.target_vel[0,:], '-b.', self.time,self.est_vel[0,:], '-r.', self.time,self.cmd_vel[0,:], '-g.')\n        ax.legend(['Target', 'Estimate', 'Command'])\n        ax.set_ylabel(\"Velocity\")\n        \n        # X control - gains just copied from constants.h\n        kp = 2\n        ki = 0.1\n        kd = 0\n        ax = fig.add_subplot(3,1,3, sharex=main_ax)\n        ax.plot(self.time,self.cartesian_controller[0,:], '-b.', self.time,self.cartesian_controller[1,:], '-r.', self.time,self.cartesian_controller[2,:], '-g.')\n        ax.plot(self.time,kp*self.cartesian_controller[0,:], '--b', self.time, kd*self.cartesian_controller[1,:], '--r', self.time, ki*self.cartesian_controller[2,:], '--g')\n        ax.legend(['PosErr', 'VelErr', 'ErrSum'])\n        ax.set_ylabel(\"Controller\")\n\n    def plot_vel(self):\n        fig = plt.figure()\n        fig.canvas.set_window_title('Cartesian Velocity')\n\n        labels = ['x', 'y', 'a']\n        main_ax = None\n        for i in range(3):\n            if not main_ax:\n                main_ax = fig.add_subplot(3,1,i+1)\n                ax = main_ax\n            else:\n                ax = fig.add_subplot(3,1,i+1, sharex=main_ax)\n            ax.plot(self.time,self.target_vel[i,:], '-b.', self.time,self.est_vel[i,:], '-r.', self.time,self.cmd_vel[i,:], '-g.')\n            ax.legend(['Target', 'Estimate', 'Command'])\n            ax.set_ylabel(\"Velocity: {}\".format(labels[i]))\n        \n        ax.set_xlabel('Time')\n\n    def plot_motors(self):\n        fig = plt.figure()\n        fig.canvas.set_window_title('Motor Velocity')\n\n        labels = ['0', '1', '2', '3']\n        main_ax = None\n        for i in range(4):\n            if not main_ax:\n                main_ax = fig.add_subplot(4,1,i+1)\n                ax = main_ax\n            else:\n                ax = fig.add_subplot(4,1,i+1, sharex=main_ax)\n            
ax.plot(self.time,self.motor_cmd_vel[i,:], '-b.', self.time,self.motor_est_vel[i,:], '-r.')\n            ax.legend(['Target', 'Estimate'])\n            ax.set_ylabel(\"Motor Velocity: {}\".format(labels[i]))\n        \n        ax.set_xlabel('Time')\n\n    def plot_motor_counts(self):\n        fig = plt.figure()\n        fig.canvas.set_window_title('Motor Counts')\n        ax = fig.add_subplot(1,1,1)\n        colors = ['r','g','b','k']\n        for i in range(4):\n            ax.plot(self.time,self.motor_counts[i,:], '-{}.'.format(colors[i]))\n        labels = ['0', '1', '2', '3']\n        ax.legend(labels)\n        ax.set_ylabel(\"Motor counts\")\n        ax.set_xlabel('Time')\n\n    def plot_motor_info(self):\n        fig = plt.figure()\n        fig.canvas.set_window_title('Motor Info')\n\n        labels = ['0', '1', '2', '3']\n        main_ax = None\n        for i in range(4):\n            if not main_ax:\n                main_ax = fig.add_subplot(4,1,i+1)\n                ax = main_ax\n            else:\n                ax = fig.add_subplot(4,1,i+1, sharex=main_ax)\n\n            ax.plot(self.time,self.motor_info[i][\"pidOut\"], '-b.', self.time,self.motor_info[i][\"outputCmd\"], '-r.')\n            ax.legend(['pidOut', 'outputCmd'])\n            ax.set_ylabel(\"Motor Info: {}\".format(labels[i]))\n        \n        ax.set_xlabel('Time')\n\n    \n    def plot_logs(self):\n        self.plot_pos()\n        self.plot_vel()\n        self.plot_motors()\n        plt.show()\n\n\ndef get_all_log_files():\n    files = [f for f in os.listdir(DEFAULT_LOG_PATH) if os.path.isfile(os.path.join(DEFAULT_LOG_PATH, f))]\n    files.reverse()\n    return files\n\nclass PlottingGui:\n\n    def __init__(self):\n        sg.change_look_and_feel('DarkBlue')\n\n        all_files = get_all_log_files()\n        left_side = sg.Column( [[sg.Listbox(values=all_files, size=(50, 10), key='_FILES_')]] )\n\n        pos = [sg.Checkbox('Plot Positions', default=True, key='_POS_')]\n        vel = 
[sg.Checkbox('Plot Velocities', default=True, key='_VEL_')]\n        motors = [sg.Checkbox('Plot Motors', default=True, key='_MOTOR_')]\n        motor_counts = [sg.Checkbox('Plot Motor Counts', default=False, key='_MOTORCOUNTS_')]\n        x_control = [sg.Checkbox('Plot X Controller', default=False, key='_XCONTROL_')]\n        motor_info = [sg.Checkbox('Plot Motor Info', default=False, key='_MOTORINFO_')]\n        plot_button = [sg.Button('PLOT')]\n        right_side = sg.Column([ pos, vel, motors, motor_counts, x_control, motor_info, plot_button ])\n\n        layout = [[left_side, right_side]]\n        self.window = sg.Window('Plotting Utility', layout)\n        self.window.finalize()\n\n    def loop(self):\n\n        while True:  # Event Loop\n\n            # Keep files in list updated\n            all_files = get_all_log_files()\n            self.window['_FILES_'].update(values=all_files)\n\n            # Handle buttons\n            event, values = self.window.read()\n            if event is None or event == 'Exit':\n                break\n            if event == 'PLOT':\n                # Get file\n                vals = self.window['_FILES_'].get_list_values()\n                idx = self.window['_FILES_'].get_indexes()\n                fname = vals[idx[0]]\n                fpath = os.path.join(DEFAULT_LOG_PATH, fname)\n                print(\"Loading logs from {}\".format(fpath))\n\n                # Parse logs and generate plots for desired items\n                lp = LogParser(fpath)\n                lp.parse_logs()\n                if self.window['_POS_'].Get():\n                    lp.plot_pos()\n                if self.window['_VEL_'].Get():\n                    lp.plot_vel()\n                if self.window['_MOTOR_'].Get():\n                    lp.plot_motors()\n                if self.window['_MOTORCOUNTS_'].Get():\n                    lp.plot_motor_counts()\n                if self.window['_XCONTROL_'].Get():\n                    lp.plot_x_control()\n  
              if self.window['_MOTORINFO_'].Get():\n                    lp.plot_motor_info()\n\n                # Show figures\n                plt.show(block=False)\n        \n        self.window.close()\n\n\nif __name__ == '__main__':\n\n    pg = PlottingGui()\n    pg.loop()\n"
  }
]