[
  {
    "path": "LICENSE",
    "content": "                    GNU GENERAL PUBLIC LICENSE\n                       Version 3, 29 June 2007\n\n Copyright (C) 2007 Free Software Foundation, Inc. <https://fsf.org/>\n Everyone is permitted to copy and distribute verbatim copies\n of this license document, but changing it is not allowed.\n\n                            Preamble\n\n  The GNU General Public License is a free, copyleft license for\nsoftware and other kinds of works.\n\n  The licenses for most software and other practical works are designed\nto take away your freedom to share and change the works.  By contrast,\nthe GNU General Public License is intended to guarantee your freedom to\nshare and change all versions of a program--to make sure it remains free\nsoftware for all its users.  We, the Free Software Foundation, use the\nGNU General Public License for most of our software; it applies also to\nany other work released this way by its authors.  You can apply it to\nyour programs, too.\n\n  When we speak of free software, we are referring to freedom, not\nprice.  Our General Public Licenses are designed to make sure that you\nhave the freedom to distribute copies of free software (and charge for\nthem if you wish), that you receive source code or can get it if you\nwant it, that you can change the software or use pieces of it in new\nfree programs, and that you know you can do these things.\n\n  To protect your rights, we need to prevent others from denying you\nthese rights or asking you to surrender the rights.  Therefore, you have\ncertain responsibilities if you distribute copies of the software, or if\nyou modify it: responsibilities to respect the freedom of others.\n\n  For example, if you distribute copies of such a program, whether\ngratis or for a fee, you must pass on to the recipients the same\nfreedoms that you received.  You must make sure that they, too, receive\nor can get the source code.  And you must show them these terms so they\nknow their rights.\n\n  Developers that use the GNU GPL protect your rights with two steps:\n(1) assert copyright on the software, and (2) offer you this License\ngiving you legal permission to copy, distribute and/or modify it.\n\n  For the developers' and authors' protection, the GPL clearly explains\nthat there is no warranty for this free software.  For both users' and\nauthors' sake, the GPL requires that modified versions be marked as\nchanged, so that their problems will not be attributed erroneously to\nauthors of previous versions.\n\n  Some devices are designed to deny users access to install or run\nmodified versions of the software inside them, although the manufacturer\ncan do so.  This is fundamentally incompatible with the aim of\nprotecting users' freedom to change the software.  The systematic\npattern of such abuse occurs in the area of products for individuals to\nuse, which is precisely where it is most unacceptable.  Therefore, we\nhave designed this version of the GPL to prohibit the practice for those\nproducts.  If such problems arise substantially in other domains, we\nstand ready to extend this provision to those domains in future versions\nof the GPL, as needed to protect the freedom of users.\n\n  Finally, every program is threatened constantly by software patents.\nStates should not allow patents to restrict development and use of\nsoftware on general-purpose computers, but in those that do, we wish to\navoid the special danger that patents applied to a free program could\nmake it effectively proprietary.  
To prevent this, the GPL assures that\npatents cannot be used to render the program non-free.\n\n  The precise terms and conditions for copying, distribution and\nmodification follow.\n\n                       TERMS AND CONDITIONS\n\n  0. Definitions.\n\n  \"This License\" refers to version 3 of the GNU General Public License.\n\n  \"Copyright\" also means copyright-like laws that apply to other kinds of\nworks, such as semiconductor masks.\n\n  \"The Program\" refers to any copyrightable work licensed under this\nLicense.  Each licensee is addressed as \"you\".  \"Licensees\" and\n\"recipients\" may be individuals or organizations.\n\n  To \"modify\" a work means to copy from or adapt all or part of the work\nin a fashion requiring copyright permission, other than the making of an\nexact copy.  The resulting work is called a \"modified version\" of the\nearlier work or a work \"based on\" the earlier work.\n\n  A \"covered work\" means either the unmodified Program or a work based\non the Program.\n\n  To \"propagate\" a work means to do anything with it that, without\npermission, would make you directly or secondarily liable for\ninfringement under applicable copyright law, except executing it on a\ncomputer or modifying a private copy.  Propagation includes copying,\ndistribution (with or without modification), making available to the\npublic, and in some countries other activities as well.\n\n  To \"convey\" a work means any kind of propagation that enables other\nparties to make or receive copies.  Mere interaction with a user through\na computer network, with no transfer of a copy, is not conveying.\n\n  An interactive user interface displays \"Appropriate Legal Notices\"\nto the extent that it includes a convenient and prominently visible\nfeature that (1) displays an appropriate copyright notice, and (2)\ntells the user that there is no warranty for the work (except to the\nextent that warranties are provided), that licensees may convey the\nwork under this License, and how to view a copy of this License.  If\nthe interface presents a list of user commands or options, such as a\nmenu, a prominent item in the list meets this criterion.\n\n  1. Source Code.\n\n  The \"source code\" for a work means the preferred form of the work\nfor making modifications to it.  \"Object code\" means any non-source\nform of a work.\n\n  A \"Standard Interface\" means an interface that either is an official\nstandard defined by a recognized standards body, or, in the case of\ninterfaces specified for a particular programming language, one that\nis widely used among developers working in that language.\n\n  The \"System Libraries\" of an executable work include anything, other\nthan the work as a whole, that (a) is included in the normal form of\npackaging a Major Component, but which is not part of that Major\nComponent, and (b) serves only to enable use of the work with that\nMajor Component, or to implement a Standard Interface for which an\nimplementation is available to the public in source code form.  
A\n\"Major Component\", in this context, means a major essential component\n(kernel, window system, and so on) of the specific operating system\n(if any) on which the executable work runs, or a compiler used to\nproduce the work, or an object code interpreter used to run it.\n\n  The \"Corresponding Source\" for a work in object code form means all\nthe source code needed to generate, install, and (for an executable\nwork) run the object code and to modify the work, including scripts to\ncontrol those activities.  However, it does not include the work's\nSystem Libraries, or general-purpose tools or generally available free\nprograms which are used unmodified in performing those activities but\nwhich are not part of the work.  For example, Corresponding Source\nincludes interface definition files associated with source files for\nthe work, and the source code for shared libraries and dynamically\nlinked subprograms that the work is specifically designed to require,\nsuch as by intimate data communication or control flow between those\nsubprograms and other parts of the work.\n\n  The Corresponding Source need not include anything that users\ncan regenerate automatically from other parts of the Corresponding\nSource.\n\n  The Corresponding Source for a work in source code form is that\nsame work.\n\n  2. Basic Permissions.\n\n  All rights granted under this License are granted for the term of\ncopyright on the Program, and are irrevocable provided the stated\nconditions are met.  This License explicitly affirms your unlimited\npermission to run the unmodified Program.  The output from running a\ncovered work is covered by this License only if the output, given its\ncontent, constitutes a covered work.  This License acknowledges your\nrights of fair use or other equivalent, as provided by copyright law.\n\n  You may make, run and propagate covered works that you do not\nconvey, without conditions so long as your license otherwise remains\nin force.  You may convey covered works to others for the sole purpose\nof having them make modifications exclusively for you, or provide you\nwith facilities for running those works, provided that you comply with\nthe terms of this License in conveying all material for which you do\nnot control copyright.  Those thus making or running the covered works\nfor you must do so exclusively on your behalf, under your direction\nand control, on terms that prohibit them from making any copies of\nyour copyrighted material outside their relationship with you.\n\n  Conveying under any other circumstances is permitted solely under\nthe conditions stated below.  Sublicensing is not allowed; section 10\nmakes it unnecessary.\n\n  3. Protecting Users' Legal Rights From Anti-Circumvention Law.\n\n  No covered work shall be deemed part of an effective technological\nmeasure under any applicable law fulfilling obligations under article\n11 of the WIPO copyright treaty adopted on 20 December 1996, or\nsimilar laws prohibiting or restricting circumvention of such\nmeasures.\n\n  When you convey a covered work, you waive any legal power to forbid\ncircumvention of technological measures to the extent such circumvention\nis effected by exercising rights under this License with respect to\nthe covered work, and you disclaim any intention to limit operation or\nmodification of the work as a means of enforcing, against the work's\nusers, your or third parties' legal rights to forbid circumvention of\ntechnological measures.\n\n  4. 
Conveying Verbatim Copies.\n\n  You may convey verbatim copies of the Program's source code as you\nreceive it, in any medium, provided that you conspicuously and\nappropriately publish on each copy an appropriate copyright notice;\nkeep intact all notices stating that this License and any\nnon-permissive terms added in accord with section 7 apply to the code;\nkeep intact all notices of the absence of any warranty; and give all\nrecipients a copy of this License along with the Program.\n\n  You may charge any price or no price for each copy that you convey,\nand you may offer support or warranty protection for a fee.\n\n  5. Conveying Modified Source Versions.\n\n  You may convey a work based on the Program, or the modifications to\nproduce it from the Program, in the form of source code under the\nterms of section 4, provided that you also meet all of these conditions:\n\n    a) The work must carry prominent notices stating that you modified\n    it, and giving a relevant date.\n\n    b) The work must carry prominent notices stating that it is\n    released under this License and any conditions added under section\n    7.  This requirement modifies the requirement in section 4 to\n    \"keep intact all notices\".\n\n    c) You must license the entire work, as a whole, under this\n    License to anyone who comes into possession of a copy.  This\n    License will therefore apply, along with any applicable section 7\n    additional terms, to the whole of the work, and all its parts,\n    regardless of how they are packaged.  This License gives no\n    permission to license the work in any other way, but it does not\n    invalidate such permission if you have separately received it.\n\n    d) If the work has interactive user interfaces, each must display\n    Appropriate Legal Notices; however, if the Program has interactive\n    interfaces that do not display Appropriate Legal Notices, your\n    work need not make them do so.\n\n  A compilation of a covered work with other separate and independent\nworks, which are not by their nature extensions of the covered work,\nand which are not combined with it such as to form a larger program,\nin or on a volume of a storage or distribution medium, is called an\n\"aggregate\" if the compilation and its resulting copyright are not\nused to limit the access or legal rights of the compilation's users\nbeyond what the individual works permit.  Inclusion of a covered work\nin an aggregate does not cause this License to apply to the other\nparts of the aggregate.\n\n  6. 
Conveying Non-Source Forms.\n\n  You may convey a covered work in object code form under the terms\nof sections 4 and 5, provided that you also convey the\nmachine-readable Corresponding Source under the terms of this License,\nin one of these ways:\n\n    a) Convey the object code in, or embodied in, a physical product\n    (including a physical distribution medium), accompanied by the\n    Corresponding Source fixed on a durable physical medium\n    customarily used for software interchange.\n\n    b) Convey the object code in, or embodied in, a physical product\n    (including a physical distribution medium), accompanied by a\n    written offer, valid for at least three years and valid for as\n    long as you offer spare parts or customer support for that product\n    model, to give anyone who possesses the object code either (1) a\n    copy of the Corresponding Source for all the software in the\n    product that is covered by this License, on a durable physical\n    medium customarily used for software interchange, for a price no\n    more than your reasonable cost of physically performing this\n    conveying of source, or (2) access to copy the\n    Corresponding Source from a network server at no charge.\n\n    c) Convey individual copies of the object code with a copy of the\n    written offer to provide the Corresponding Source.  This\n    alternative is allowed only occasionally and noncommercially, and\n    only if you received the object code with such an offer, in accord\n    with subsection 6b.\n\n    d) Convey the object code by offering access from a designated\n    place (gratis or for a charge), and offer equivalent access to the\n    Corresponding Source in the same way through the same place at no\n    further charge.  You need not require recipients to copy the\n    Corresponding Source along with the object code.  If the place to\n    copy the object code is a network server, the Corresponding Source\n    may be on a different server (operated by you or a third party)\n    that supports equivalent copying facilities, provided you maintain\n    clear directions next to the object code saying where to find the\n    Corresponding Source.  Regardless of what server hosts the\n    Corresponding Source, you remain obligated to ensure that it is\n    available for as long as needed to satisfy these requirements.\n\n    e) Convey the object code using peer-to-peer transmission, provided\n    you inform other peers where the object code and Corresponding\n    Source of the work are being offered to the general public at no\n    charge under subsection 6d.\n\n  A separable portion of the object code, whose source code is excluded\nfrom the Corresponding Source as a System Library, need not be\nincluded in conveying the object code work.\n\n  A \"User Product\" is either (1) a \"consumer product\", which means any\ntangible personal property which is normally used for personal, family,\nor household purposes, or (2) anything designed or sold for incorporation\ninto a dwelling.  In determining whether a product is a consumer product,\ndoubtful cases shall be resolved in favor of coverage.  For a particular\nproduct received by a particular user, \"normally used\" refers to a\ntypical or common use of that class of product, regardless of the status\nof the particular user or of the way in which the particular user\nactually uses, or expects or is expected to use, the product.  
A product\nis a consumer product regardless of whether the product has substantial\ncommercial, industrial or non-consumer uses, unless such uses represent\nthe only significant mode of use of the product.\n\n  \"Installation Information\" for a User Product means any methods,\nprocedures, authorization keys, or other information required to install\nand execute modified versions of a covered work in that User Product from\na modified version of its Corresponding Source.  The information must\nsuffice to ensure that the continued functioning of the modified object\ncode is in no case prevented or interfered with solely because\nmodification has been made.\n\n  If you convey an object code work under this section in, or with, or\nspecifically for use in, a User Product, and the conveying occurs as\npart of a transaction in which the right of possession and use of the\nUser Product is transferred to the recipient in perpetuity or for a\nfixed term (regardless of how the transaction is characterized), the\nCorresponding Source conveyed under this section must be accompanied\nby the Installation Information.  But this requirement does not apply\nif neither you nor any third party retains the ability to install\nmodified object code on the User Product (for example, the work has\nbeen installed in ROM).\n\n  The requirement to provide Installation Information does not include a\nrequirement to continue to provide support service, warranty, or updates\nfor a work that has been modified or installed by the recipient, or for\nthe User Product in which it has been modified or installed.  Access to a\nnetwork may be denied when the modification itself materially and\nadversely affects the operation of the network or violates the rules and\nprotocols for communication across the network.\n\n  Corresponding Source conveyed, and Installation Information provided,\nin accord with this section must be in a format that is publicly\ndocumented (and with an implementation available to the public in\nsource code form), and must require no special password or key for\nunpacking, reading or copying.\n\n  7. Additional Terms.\n\n  \"Additional permissions\" are terms that supplement the terms of this\nLicense by making exceptions from one or more of its conditions.\nAdditional permissions that are applicable to the entire Program shall\nbe treated as though they were included in this License, to the extent\nthat they are valid under applicable law.  If additional permissions\napply only to part of the Program, that part may be used separately\nunder those permissions, but the entire Program remains governed by\nthis License without regard to the additional permissions.\n\n  When you convey a copy of a covered work, you may at your option\nremove any additional permissions from that copy, or from any part of\nit.  (Additional permissions may be written to require their own\nremoval in certain cases when you modify the work.)  
You may place\nadditional permissions on material, added by you to a covered work,\nfor which you have or can give appropriate copyright permission.\n\n  Notwithstanding any other provision of this License, for material you\nadd to a covered work, you may (if authorized by the copyright holders of\nthat material) supplement the terms of this License with terms:\n\n    a) Disclaiming warranty or limiting liability differently from the\n    terms of sections 15 and 16 of this License; or\n\n    b) Requiring preservation of specified reasonable legal notices or\n    author attributions in that material or in the Appropriate Legal\n    Notices displayed by works containing it; or\n\n    c) Prohibiting misrepresentation of the origin of that material, or\n    requiring that modified versions of such material be marked in\n    reasonable ways as different from the original version; or\n\n    d) Limiting the use for publicity purposes of names of licensors or\n    authors of the material; or\n\n    e) Declining to grant rights under trademark law for use of some\n    trade names, trademarks, or service marks; or\n\n    f) Requiring indemnification of licensors and authors of that\n    material by anyone who conveys the material (or modified versions of\n    it) with contractual assumptions of liability to the recipient, for\n    any liability that these contractual assumptions directly impose on\n    those licensors and authors.\n\n  All other non-permissive additional terms are considered \"further\nrestrictions\" within the meaning of section 10.  If the Program as you\nreceived it, or any part of it, contains a notice stating that it is\ngoverned by this License along with a term that is a further\nrestriction, you may remove that term.  If a license document contains\na further restriction but permits relicensing or conveying under this\nLicense, you may add to a covered work material governed by the terms\nof that license document, provided that the further restriction does\nnot survive such relicensing or conveying.\n\n  If you add terms to a covered work in accord with this section, you\nmust place, in the relevant source files, a statement of the\nadditional terms that apply to those files, or a notice indicating\nwhere to find the applicable terms.\n\n  Additional terms, permissive or non-permissive, may be stated in the\nform of a separately written license, or stated as exceptions;\nthe above requirements apply either way.\n\n  8. Termination.\n\n  You may not propagate or modify a covered work except as expressly\nprovided under this License.  
Any attempt otherwise to propagate or\nmodify it is void, and will automatically terminate your rights under\nthis License (including any patent licenses granted under the third\nparagraph of section 11).\n\n  However, if you cease all violation of this License, then your\nlicense from a particular copyright holder is reinstated (a)\nprovisionally, unless and until the copyright holder explicitly and\nfinally terminates your license, and (b) permanently, if the copyright\nholder fails to notify you of the violation by some reasonable means\nprior to 60 days after the cessation.\n\n  Moreover, your license from a particular copyright holder is\nreinstated permanently if the copyright holder notifies you of the\nviolation by some reasonable means, this is the first time you have\nreceived notice of violation of this License (for any work) from that\ncopyright holder, and you cure the violation prior to 30 days after\nyour receipt of the notice.\n\n  Termination of your rights under this section does not terminate the\nlicenses of parties who have received copies or rights from you under\nthis License.  If your rights have been terminated and not permanently\nreinstated, you do not qualify to receive new licenses for the same\nmaterial under section 10.\n\n  9. Acceptance Not Required for Having Copies.\n\n  You are not required to accept this License in order to receive or\nrun a copy of the Program.  Ancillary propagation of a covered work\noccurring solely as a consequence of using peer-to-peer transmission\nto receive a copy likewise does not require acceptance.  However,\nnothing other than this License grants you permission to propagate or\nmodify any covered work.  These actions infringe copyright if you do\nnot accept this License.  Therefore, by modifying or propagating a\ncovered work, you indicate your acceptance of this License to do so.\n\n  10. Automatic Licensing of Downstream Recipients.\n\n  Each time you convey a covered work, the recipient automatically\nreceives a license from the original licensors, to run, modify and\npropagate that work, subject to this License.  You are not responsible\nfor enforcing compliance by third parties with this License.\n\n  An \"entity transaction\" is a transaction transferring control of an\norganization, or substantially all assets of one, or subdividing an\norganization, or merging organizations.  If propagation of a covered\nwork results from an entity transaction, each party to that\ntransaction who receives a copy of the work also receives whatever\nlicenses to the work the party's predecessor in interest had or could\ngive under the previous paragraph, plus a right to possession of the\nCorresponding Source of the work from the predecessor in interest, if\nthe predecessor has it or can get it with reasonable efforts.\n\n  You may not impose any further restrictions on the exercise of the\nrights granted or affirmed under this License.  For example, you may\nnot impose a license fee, royalty, or other charge for exercise of\nrights granted under this License, and you may not initiate litigation\n(including a cross-claim or counterclaim in a lawsuit) alleging that\nany patent claim is infringed by making, using, selling, offering for\nsale, or importing the Program or any portion of it.\n\n  11. Patents.\n\n  A \"contributor\" is a copyright holder who authorizes use under this\nLicense of the Program or a work on which the Program is based.  
The\nwork thus licensed is called the contributor's \"contributor version\".\n\n  A contributor's \"essential patent claims\" are all patent claims\nowned or controlled by the contributor, whether already acquired or\nhereafter acquired, that would be infringed by some manner, permitted\nby this License, of making, using, or selling its contributor version,\nbut do not include claims that would be infringed only as a\nconsequence of further modification of the contributor version.  For\npurposes of this definition, \"control\" includes the right to grant\npatent sublicenses in a manner consistent with the requirements of\nthis License.\n\n  Each contributor grants you a non-exclusive, worldwide, royalty-free\npatent license under the contributor's essential patent claims, to\nmake, use, sell, offer for sale, import and otherwise run, modify and\npropagate the contents of its contributor version.\n\n  In the following three paragraphs, a \"patent license\" is any express\nagreement or commitment, however denominated, not to enforce a patent\n(such as an express permission to practice a patent or covenant not to\nsue for patent infringement).  To \"grant\" such a patent license to a\nparty means to make such an agreement or commitment not to enforce a\npatent against the party.\n\n  If you convey a covered work, knowingly relying on a patent license,\nand the Corresponding Source of the work is not available for anyone\nto copy, free of charge and under the terms of this License, through a\npublicly available network server or other readily accessible means,\nthen you must either (1) cause the Corresponding Source to be so\navailable, or (2) arrange to deprive yourself of the benefit of the\npatent license for this particular work, or (3) arrange, in a manner\nconsistent with the requirements of this License, to extend the patent\nlicense to downstream recipients.  \"Knowingly relying\" means you have\nactual knowledge that, but for the patent license, your conveying the\ncovered work in a country, or your recipient's use of the covered work\nin a country, would infringe one or more identifiable patents in that\ncountry that you have reason to believe are valid.\n\n  If, pursuant to or in connection with a single transaction or\narrangement, you convey, or propagate by procuring conveyance of, a\ncovered work, and grant a patent license to some of the parties\nreceiving the covered work authorizing them to use, propagate, modify\nor convey a specific copy of the covered work, then the patent license\nyou grant is automatically extended to all recipients of the covered\nwork and works based on it.\n\n  A patent license is \"discriminatory\" if it does not include within\nthe scope of its coverage, prohibits the exercise of, or is\nconditioned on the non-exercise of one or more of the rights that are\nspecifically granted under this License.  
You may not convey a covered\nwork if you are a party to an arrangement with a third party that is\nin the business of distributing software, under which you make payment\nto the third party based on the extent of your activity of conveying\nthe work, and under which the third party grants, to any of the\nparties who would receive the covered work from you, a discriminatory\npatent license (a) in connection with copies of the covered work\nconveyed by you (or copies made from those copies), or (b) primarily\nfor and in connection with specific products or compilations that\ncontain the covered work, unless you entered into that arrangement,\nor that patent license was granted, prior to 28 March 2007.\n\n  Nothing in this License shall be construed as excluding or limiting\nany implied license or other defenses to infringement that may\notherwise be available to you under applicable patent law.\n\n  12. No Surrender of Others' Freedom.\n\n  If conditions are imposed on you (whether by court order, agreement or\notherwise) that contradict the conditions of this License, they do not\nexcuse you from the conditions of this License.  If you cannot convey a\ncovered work so as to satisfy simultaneously your obligations under this\nLicense and any other pertinent obligations, then as a consequence you may\nnot convey it at all.  For example, if you agree to terms that obligate you\nto collect a royalty for further conveying from those to whom you convey\nthe Program, the only way you could satisfy both those terms and this\nLicense would be to refrain entirely from conveying the Program.\n\n  13. Use with the GNU Affero General Public License.\n\n  Notwithstanding any other provision of this License, you have\npermission to link or combine any covered work with a work licensed\nunder version 3 of the GNU Affero General Public License into a single\ncombined work, and to convey the resulting work.  The terms of this\nLicense will continue to apply to the part which is the covered work,\nbut the special requirements of the GNU Affero General Public License,\nsection 13, concerning interaction through a network will apply to the\ncombination as such.\n\n  14. Revised Versions of this License.\n\n  The Free Software Foundation may publish revised and/or new versions of\nthe GNU General Public License from time to time.  Such new versions will\nbe similar in spirit to the present version, but may differ in detail to\naddress new problems or concerns.\n\n  Each version is given a distinguishing version number.  If the\nProgram specifies that a certain numbered version of the GNU General\nPublic License \"or any later version\" applies to it, you have the\noption of following the terms and conditions either of that numbered\nversion or of any later version published by the Free Software\nFoundation.  If the Program does not specify a version number of the\nGNU General Public License, you may choose any version ever published\nby the Free Software Foundation.\n\n  If the Program specifies that a proxy can decide which future\nversions of the GNU General Public License can be used, that proxy's\npublic statement of acceptance of a version permanently authorizes you\nto choose that version for the Program.\n\n  Later license versions may give you additional or different\npermissions.  However, no additional obligations are imposed on any\nauthor or copyright holder as a result of your choosing to follow a\nlater version.\n\n  15. 
Disclaimer of Warranty.\n\n  THERE IS NO WARRANTY FOR THE PROGRAM, TO THE EXTENT PERMITTED BY\nAPPLICABLE LAW.  EXCEPT WHEN OTHERWISE STATED IN WRITING THE COPYRIGHT\nHOLDERS AND/OR OTHER PARTIES PROVIDE THE PROGRAM \"AS IS\" WITHOUT WARRANTY\nOF ANY KIND, EITHER EXPRESSED OR IMPLIED, INCLUDING, BUT NOT LIMITED TO,\nTHE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR\nPURPOSE.  THE ENTIRE RISK AS TO THE QUALITY AND PERFORMANCE OF THE PROGRAM\nIS WITH YOU.  SHOULD THE PROGRAM PROVE DEFECTIVE, YOU ASSUME THE COST OF\nALL NECESSARY SERVICING, REPAIR OR CORRECTION.\n\n  16. Limitation of Liability.\n\n  IN NO EVENT UNLESS REQUIRED BY APPLICABLE LAW OR AGREED TO IN WRITING\nWILL ANY COPYRIGHT HOLDER, OR ANY OTHER PARTY WHO MODIFIES AND/OR CONVEYS\nTHE PROGRAM AS PERMITTED ABOVE, BE LIABLE TO YOU FOR DAMAGES, INCLUDING ANY\nGENERAL, SPECIAL, INCIDENTAL OR CONSEQUENTIAL DAMAGES ARISING OUT OF THE\nUSE OR INABILITY TO USE THE PROGRAM (INCLUDING BUT NOT LIMITED TO LOSS OF\nDATA OR DATA BEING RENDERED INACCURATE OR LOSSES SUSTAINED BY YOU OR THIRD\nPARTIES OR A FAILURE OF THE PROGRAM TO OPERATE WITH ANY OTHER PROGRAMS),\nEVEN IF SUCH HOLDER OR OTHER PARTY HAS BEEN ADVISED OF THE POSSIBILITY OF\nSUCH DAMAGES.\n\n  17. Interpretation of Sections 15 and 16.\n\n  If the disclaimer of warranty and limitation of liability provided\nabove cannot be given local legal effect according to their terms,\nreviewing courts shall apply local law that most closely approximates\nan absolute waiver of all civil liability in connection with the\nProgram, unless a warranty or assumption of liability accompanies a\ncopy of the Program in return for a fee.\n\n                     END OF TERMS AND CONDITIONS\n\n            How to Apply These Terms to Your New Programs\n\n  If you develop a new program, and you want it to be of the greatest\npossible use to the public, the best way to achieve this is to make it\nfree software which everyone can redistribute and change under these terms.\n\n  To do so, attach the following notices to the program.  It is safest\nto attach them to the start of each source file to most effectively\nstate the exclusion of warranty; and each file should have at least\nthe \"copyright\" line and a pointer to where the full notice is found.\n\n    <one line to give the program's name and a brief idea of what it does.>\n    Copyright (C) <year>  <name of author>\n\n    This program is free software: you can redistribute it and/or modify\n    it under the terms of the GNU General Public License as published by\n    the Free Software Foundation, either version 3 of the License, or\n    (at your option) any later version.\n\n    This program is distributed in the hope that it will be useful,\n    but WITHOUT ANY WARRANTY; without even the implied warranty of\n    MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the\n    GNU General Public License for more details.\n\n    You should have received a copy of the GNU General Public License\n    along with this program.  
If not, see <https://www.gnu.org/licenses/>.\n\nAlso add information on how to contact you by electronic and paper mail.\n\n  If the program does terminal interaction, make it output a short\nnotice like this when it starts in an interactive mode:\n\n    <program>  Copyright (C) <year>  <name of author>\n    This program comes with ABSOLUTELY NO WARRANTY; for details type `show w'.\n    This is free software, and you are welcome to redistribute it\n    under certain conditions; type `show c' for details.\n\nThe hypothetical commands `show w' and `show c' should show the appropriate\nparts of the General Public License.  Of course, your program's commands\nmight be different; for a GUI interface, you would use an \"about box\".\n\n  You should also get your employer (if you work as a programmer) or school,\nif any, to sign a \"copyright disclaimer\" for the program, if necessary.\nFor more information on this, and how to apply and follow the GNU GPL, see\n<https://www.gnu.org/licenses/>.\n\n  The GNU General Public License does not permit incorporating your program\ninto proprietary programs.  If your program is a subroutine library, you\nmay consider it more useful to permit linking proprietary applications with\nthe library.  If this is what you want to do, use the GNU Lesser General\nPublic License instead of this License.  But first, please read\n<https://www.gnu.org/licenses/why-not-lgpl.html>.\n"
  },
  {
    "path": "general-data-science/similarities-measures/pyproject.toml",
    "content": "[tool.poetry]\nname = \"similarities-measures\"\nversion = \"0.1.0\"\ndescription = \"\"\nauthors = [\"Shashank Kapadia <shashank.kapadia@randstadusa.com>\"]\nreadme = \"README.md\"\npackages = [{include = \"similarities_measures\"}]\n\n[tool.poetry.dependencies]\npython = \">=3.9,<3.10\"\njupyterlab = \"^3.5.2\"\nscikit-learn = \"^1.2.0\"\nmatplotlib = \"^3.6.2\"\nseaborn = \"^0.12.1\"\n\n\n[build-system]\nrequires = [\"poetry-core\"]\nbuild-backend = \"poetry.core.masonry.api\"\n"
  },
  {
    "path": "general-data-science/similarities-measures/similarity-measures.ipynb",
    "content": "{\n \"cells\": [\n  {\n   \"cell_type\": \"markdown\",\n   \"id\": \"1d654a7e-fe50-495a-9235-303b16100d51\",\n   \"metadata\": {},\n   \"source\": [\n    \"## Comparing 5 Data Similarity Measures\\n\",\n    \"##### Understanding Similarity Measures in Data Analysis and Machine Learning: A Comprehensive Guide\\n\",\n    \"** **\\n\",\n    \"*Preface: This article presents a summary of information about the given topic. It should not be considered original research. The information and code included in this article have may be influenced by things I have read or seen in the past from various online articles, research papers, books, and open-source code.*\\n\",\n    \"\\n\",\n    \"#### Introduction\\n\",\n    \"Similarity measures are a vital tool in many data analysis and machine learning tasks, allowing us to compare and evaluate the similarity between different pieces of data. Many different measures are available, each with pros and cons and suitable for different data types and tasks. \\n\",\n    \"\\n\",\n    \"This article will explore some of the most common similarity measures and compare their strengths and weaknesses. By understanding the characteristics and limitations of these measures, we can choose the most appropriate one for our specific needs and ensure the accuracy and relevance of our results.\\n\",\n    \"\\n\",\n    \"** **\\n\",\n    \"1. #### Euclidean Distance\\n\",\n    \"\\n\",\n    \"This measure calculates the straight-line distance between two points in n-dimensional space. It is often used for continuous numerical data and is easy to understand and implement. However, it can be sensitive to outliers and does not account for the relative importance of different features.\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": null,\n   \"id\": \"662570b5-f57b-426f-9e4e-7c7ed13ee897\",\n   \"metadata\": {},\n   \"outputs\": [],\n   \"source\": [\n    \"from scipy.spatial import distance\\n\",\n    \"\\n\",\n    \"# Calculate Euclidean distance between two points\\n\",\n    \"point1 = [1, 2, 3]\\n\",\n    \"point2 = [4, 5, 6]\\n\",\n    \"\\n\",\n    \"# Use the euclidean function from scipy's distance module to calculate the Euclidean distance\\n\",\n    \"euclidean_distance = distance.euclidean(point1, point2)\"\n   ]\n  },\n  {\n   \"cell_type\": \"markdown\",\n   \"id\": \"251a9a8b-1399-43b6-8d5d-312eaa9d87c1\",\n   \"metadata\": {},\n   \"source\": [\n    \"#### Manhattan Distance\\n\",\n    \"\\n\",\n    \"This measure calculates the distance between two points by considering the absolute differences of their coordinates in each dimension and summing them. 
It is less sensitive to outliers than Euclidean distance, but it may not accurately reflect the actual distance between points in some cases.\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": null,\n   \"id\": \"9a535f95-56a2-4960-880f-dc5d8c003949\",\n   \"metadata\": {},\n   \"outputs\": [],\n   \"source\": [\n    \"from scipy.spatial import distance\\n\",\n    \"\\n\",\n    \"# Calculate Manhattan distance between two points\\n\",\n    \"point1 = [1, 2, 3]\\n\",\n    \"point2 = [4, 5, 6]\\n\",\n    \"\\n\",\n    \"# Use the cityblock function from scipy's distance module to calculate the Manhattan distance\\n\",\n    \"manhattan_distance = distance.cityblock(point1, point2)\\n\",\n    \"\\n\",\n    \"# Print the result\\n\",\n    \"print(\\\"Manhattan Distance between the given two points: \\\" + \\\\\\n\",\n    \"      str(manhattan_distance))\"\n   ]\n  },\n  {\n   \"cell_type\": \"markdown\",\n   \"id\": \"20eee091-e3f0-4b67-87e6-165d5c36f0dd\",\n   \"metadata\": {},\n   \"source\": [\n    \"#### Cosine Similarity\\n\",\n    \"\\n\",\n    \"This measure calculates the similarity between two vectors by considering their angle. It is often used for text data and is resistant to changes in the magnitude of the vectors. However, it does not consider the relative importance of different features.\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": null,\n   \"id\": \"fb9fa0b3-86a1-4079-99f7-33891f93f0b6\",\n   \"metadata\": {},\n   \"outputs\": [],\n   \"source\": [\n    \"from sklearn.metrics.pairwise import cosine_similarity\\n\",\n    \"\\n\",\n    \"# Calculate cosine similarity between two vectors\\n\",\n    \"vector1 = [1, 2, 3]\\n\",\n    \"vector2 = [4, 5, 6]\\n\",\n    \"\\n\",\n    \"# Use the cosine_similarity function from scikit-learn to calculate the similarity\\n\",\n    \"cosine_sim = cosine_similarity([vector1], [vector2])[0][0]\\n\",\n    \"\\n\",\n    \"# Print the result\\n\",\n    \"print(\\\"Cosine Similarity between the given two vectors: \\\" + \\\\\\n\",\n    \"      str(cosine_sim))\"\n   ]\n  },\n  {\n   \"cell_type\": \"markdown\",\n   \"id\": \"22288d67-7006-4025-9185-491f3100dfe0\",\n   \"metadata\": {},\n   \"source\": [\n    \"#### Jaccard Similarity\\n\",\n    \"\\n\",\n    \"This measure calculates the similarity between two sets by considering the size of their intersection and union. It is often used for categorical data and is resistant to changes in the size of the sets. 
However, it does not consider the sets' order or frequency of elements.\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": null,\n   \"id\": \"3d3339eb-60cb-4bab-8ca4-3ffc40ead500\",\n   \"metadata\": {},\n   \"outputs\": [],\n   \"source\": [\n    \"def jaccard_similarity(list1, list2):\\n\",\n    \"    \\\"\\\"\\\"\\n\",\n    \"    Calculates the Jaccard similarity between two lists.\\n\",\n    \"    \\n\",\n    \"    Parameters:\\n\",\n    \"    list1 (list): The first list to compare.\\n\",\n    \"    list2 (list): The second list to compare.\\n\",\n    \"    \\n\",\n    \"    Returns:\\n\",\n    \"    float: The Jaccard similarity between the two lists.\\n\",\n    \"    \\\"\\\"\\\"\\n\",\n    \"    # Convert the lists to sets for easier comparison\\n\",\n    \"    s1 = set(list1)\\n\",\n    \"    s2 = set(list2)\\n\",\n    \"    \\n\",\n    \"    # Calculate the Jaccard similarity by taking the length of the intersection of the sets\\n\",\n    \"    # and dividing it by the length of the union of the sets\\n\",\n    \"    return float(len(s1.intersection(s2)) / len(s1.union(s2)))\\n\",\n    \"\\n\",\n    \"# Calculate Jaccard similarity between two sets\\n\",\n    \"set1 = [1, 2, 3]\\n\",\n    \"set2 = [2, 3, 4]\\n\",\n    \"jaccard_sim = jaccard_similarity(set1, set2)\\n\",\n    \"\\n\",\n    \"# Print the result\\n\",\n    \"print(\\\"Jaccard Similarity between the given two sets: \\\" + \\\\\\n\",\n    \"      str(jaccard_sim))\\n\"\n   ]\n  },\n  {\n   \"cell_type\": \"markdown\",\n   \"id\": \"1c0bff22-35f0-4a3f-bb1b-dab41fa87843\",\n   \"metadata\": {},\n   \"source\": [\n    \"#### Pearson Correlation Coefficient\\n\",\n    \"\\n\",\n    \"This measure calculates the linear correlation between two variables. It is often used for continuous numerical data and considers the relative importance of different features. 
However, it may not accurately reflect non-linear relationships.\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": null,\n   \"id\": \"66c09410-c3fd-46cb-bcc6-8adf715105b7\",\n   \"metadata\": {},\n   \"outputs\": [],\n   \"source\": [\n    \"import numpy as np\\n\",\n    \"\\n\",\n    \"# Calculate Pearson correlation coefficient between two variables\\n\",\n    \"x = [1, 2, 3, 4]\\n\",\n    \"y = [2, 3, 4, 5]\\n\",\n    \"\\n\",\n    \"# Use numpy's corrcoef function to calculate the Pearson correlation coefficient\\n\",\n    \"pearson_corr = np.corrcoef(x, y)[0][1]\\n\",\n    \"\\n\",\n    \"# Print the result\\n\",\n    \"print(\\\"Pearson Correlation between the given two variables: \\\" + \\\\\\n\",\n    \"      str(pearson_corr))\"\n   ]\n  },\n  {\n   \"cell_type\": \"markdown\",\n   \"id\": \"7a5e24a8-b8cc-41fc-a582-b5134d55f07b\",\n   \"metadata\": {},\n   \"source\": [\n    \"** **\\n\",\n    \"### Practical Scenario\\n\",\n    \"\\n\",\n    \"Suppose we have 5 products with numerical attributes, and we want to compare the similarities between these products in order to facilitate applications such as clustering, classification, or recommendation.\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": null,\n   \"id\": \"bdec85a6-0f50-413d-857a-6b90c5bb8b04\",\n   \"metadata\": {},\n   \"outputs\": [],\n   \"source\": [\n    \"import numpy as np\\n\",\n    \"import seaborn as sns\\n\",\n    \"import random\\n\",\n    \"import matplotlib.pyplot as plt\\n\",\n    \"import pprint\\n\",\n    \"\\n\",\n    \"def calculate_similarities(products):\\n\",\n    \"    \\\"\\\"\\\"Calculate the similarity measures between all pairs of products.\\n\",\n    \"    \\n\",\n    \"    Parameters\\n\",\n    \"    ----------\\n\",\n    \"    products : list\\n\",\n    \"        A list of dictionaries containing the attributes of the products.\\n\",\n    \"    \\n\",\n    \"    Returns\\n\",\n    \"    -------\\n\",\n    \"    euclidean_similarities : numpy array\\n\",\n    \"        An array containing the Euclidean distance between each pair of products.\\n\",\n    \"    manhattan_distances : numpy array\\n\",\n    \"        An array containing the Manhattan distance between each pair of products.\\n\",\n    \"    cosine_similarities : numpy array\\n\",\n    \"        An array containing the cosine similarity between each pair of products.\\n\",\n    \"    jaccard_similarities : numpy array\\n\",\n    \"        An array containing the Jaccard index between each pair of products.\\n\",\n    \"    pearson_similarities : numpy array\\n\",\n    \"        An array containing the Pearson correlation coefficient between each pair of products.\\n\",\n    \"    \\\"\\\"\\\"\\n\",\n    \"    # Initialize arrays to store the similarity measures\\n\",\n    \"    euclidean_similarities = np.zeros((len(products), len(products)))\\n\",\n    \"    manhattan_distances = np.zeros((len(products), len(products)))\\n\",\n    \"    cosine_similarities = np.zeros((len(products), len(products)))\\n\",\n    \"    jaccard_similarities = np.zeros((len(products), len(products)))\\n\",\n    \"    pearson_similarities = np.zeros((len(products), len(products)))\\n\",\n    \"\\n\",\n    \"    # Calculate all the similarity measures in a single loop\\n\",\n    \"    for i in range(len(products)):\\n\",\n    \"        for j in range(i+1, len(products)):\\n\",\n    \"            p1 = products[i]['attributes']\\n\",\n    \"            p2 = products[j]['attributes']\\n\",\n    
\"\\n\",\n    \"            # Calculate Euclidean distance\\n\",\n    \"            euclidean_similarities[i][j] = distance.euclidean(p1, p2)\\n\",\n    \"            euclidean_similarities[j][i] = euclidean_similarities[i][j]\\n\",\n    \"\\n\",\n    \"            # Calculate Manhattan distance\\n\",\n    \"            manhattan_distances[i][j] = distance.cityblock(p1, p2)\\n\",\n    \"            manhattan_distances[j][i] = manhattan_distances[i][j]\\n\",\n    \"\\n\",\n    \"            # Calculate cosine similarity\\n\",\n    \"            cosine_similarities[i][j] = cosine_similarity([p1], [p2])[0][0]\\n\",\n    \"            cosine_similarities[j][i] = cosine_similarities[i][j]\\n\",\n    \"\\n\",\n    \"            # Calculate Jaccard index\\n\",\n    \"            jaccard_similarities[i][j] = jaccard_similarity(p1, p2)\\n\",\n    \"            jaccard_similarities[j][i] = jaccard_similarities[i][j]\\n\",\n    \"\\n\",\n    \"            # Calculate Pearson correlation coefficient\\n\",\n    \"            pearson_similarities[i][j] = np.corrcoef(p1, p2)[0][1]\\n\",\n    \"            pearson_similarities[j][i] = pearson_similarities[i][j]\\n\",\n    \"            \\n\",\n    \"    return euclidean_similarities, manhattan_distances, cosine_similarities, jaccard_similarities, pearson_similarities\\n\",\n    \"\\n\",\n    \"def plot_similarities(similarities_list, labels, titles):\\n\",\n    \"    \\\"\\\"\\\"Plot the given similarities as heatmaps in subplots.\\n\",\n    \"    \\n\",\n    \"    Parameters\\n\",\n    \"    ----------\\n\",\n    \"    similarities_list : list of numpy arrays\\n\",\n    \"        A list of arrays containing the similarities between the products.\\n\",\n    \"    labels : list\\n\",\n    \"        A list of strings containing the labels for the products.\\n\",\n    \"    titles : list\\n\",\n    \"        A list of strings containing the titles for each plot.\\n\",\n    \"    \\n\",\n    \"    Returns\\n\",\n    \"    -------\\n\",\n    \"    None\\n\",\n    \"        This function does not return any values. 
It only plots the heatmaps.\\n\",\n    \"    \\\"\\\"\\\"\\n\",\n    \"    # Set up the plot\\n\",\n    \"    fig, ax = plt.subplots(nrows=1, \\n\",\n    \"                           ncols=len(similarities_list), figsize=(6*len(similarities_list), 6/1.680))\\n\",\n    \"\\n\",\n    \"    for i, similarities in enumerate(similarities_list):\\n\",\n    \"        # Plot the heatmap\\n\",\n    \"        sns.heatmap(similarities, xticklabels=labels, yticklabels=labels, ax=ax[i])\\n\",\n    \"        ax[i].set_title(titles[i])\\n\",\n    \"        ax[i].set_xlabel(\\\"Product\\\")\\n\",\n    \"        ax[i].set_ylabel(\\\"Product\\\")\\n\",\n    \"    \\n\",\n    \"    # Show the plot\\n\",\n    \"    plt.show()\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": null,\n   \"id\": \"17961dbd-b85d-4ce1-889d-fb5e2d118080\",\n   \"metadata\": {},\n   \"outputs\": [],\n   \"source\": [\n    \"# Define the products and their attributes\\n\",\n    \"products = [\\n\",\n    \"    {'name': 'Product 1', 'attributes': random.sample(range(1, 11), 5)},\\n\",\n    \"    {'name': 'Product 2', 'attributes': random.sample(range(1, 11), 5)},\\n\",\n    \"    {'name': 'Product 3', 'attributes': random.sample(range(1, 11), 5)},\\n\",\n    \"    {'name': 'Product 4', 'attributes': random.sample(range(1, 11), 5)},\\n\",\n    \"    {'name': 'Product 5', 'attributes': random.sample(range(1, 11), 5)}\\n\",\n    \"]\\n\",\n    \"\\n\",\n    \"pprint.pprint(products)\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": null,\n   \"id\": \"9bafc61f-f777-447d-a422-f8ac5bdf2079\",\n   \"metadata\": {},\n   \"outputs\": [],\n   \"source\": [\n    \"euclidean_similarities, manhattan_distances, \\\\\\n\",\n    \"cosine_similarities, jaccard_similarities, \\\\\\n\",\n    \"pearson_similarities = calculate_similarities(products)\\n\",\n    \"\\n\",\n    \"# Set the labels for the x-axis and y-axis\\n\",\n    \"product_labels = [product['name'] for product in products]\\n\",\n    \"\\n\",\n    \"# List of similarity measures and their titles\\n\",\n    \"similarities_list = [euclidean_similarities, cosine_similarities, pearson_similarities, \\n\",\n    \"                     jaccard_similarities, manhattan_distances]\\n\",\n    \"titles = [\\\"Euclidean Distance\\\", \\\"Cosine Similarity\\\", \\\"Pearson Correlation Coefficient\\\", \\n\",\n    \"          \\\"Jaccard Index\\\", \\\"Manhattan Distance\\\"]\\n\",\n    \"\\n\",\n    \"# Plot the heatmaps\\n\",\n    \"plot_similarities(similarities_list, product_labels, titles)\"\n   ]\n  },\n  {\n   \"cell_type\": \"markdown\",\n   \"id\": \"9a507819-e4f2-4b28-b1f4-fd215cfff40d\",\n   \"metadata\": {},\n   \"source\": [\n    \"As we can see from the charts, each distance metric produces a heat map that represents different similarities between the products, and on a different scale. While each distance metric can be used to interpret whether two products are similar or not based on the metric's value, it is difficult to determine a true measure of similarity when comparing the results across different distance metrics.\"\n   ]\n  },\n  {\n   \"cell_type\": \"markdown\",\n   \"id\": \"f5314124-5a7e-4865-9cd8-3b523d257a99\",\n   \"metadata\": {},\n   \"source\": [\n    \"** **\\n\",\n    \"### How to choose the metric?\\n\",\n    \"\\n\",\n    \"There is no single \\\"true\\\" answer when it comes to choosing a distance metric, as different distance metrics are better suited for different types of data and different analysis goals. 
However, there are some factors that can help narrow down the possible distance metrics that might be appropriate for a given situation. Some things to consider when choosing a distance metric include:\\n\",\n    \"\\n\",\n    \"- The type of data you are working with: Some distance metrics are more appropriate for continuous data, while others are better suited for categorical or binary data.\\n\",\n    \"- The characteristics of the data: Different distance metrics are sensitive to different aspects of the data, such as the magnitudes of differences between attributes or the angles between attributes. Consider which characteristics of the data are most important to your analysis and choose a distance metric that is sensitive to these characteristics.\\n\",\n    \"- The goals of your analysis: Different distance metrics can highlight different patterns or relationships in the data, so consider what you are trying to learn from your analysis and choose a distance metric that is well-suited to this purpose.\\n\",\n    \"\\n\",\n    \"Personally, I often use the following chart as a starting point when choosing a distance metric.\\n\",\n    \"\\n\",\n    \"![flowchart](similaritymeasures.png)\\n\",\n    \"\\n\",\n    \"Again, it is important to carefully consider the data type and characteristics when selecting a similarity metric, as well as the specific goals of the analysis.\"\n   ]\n  }\n ],\n \"metadata\": {\n  \"kernelspec\": {\n   \"display_name\": \"Python 3 (ipykernel)\",\n   \"language\": \"python\",\n   \"name\": \"python3\"\n  },\n  \"language_info\": {\n   \"codemirror_mode\": {\n    \"name\": \"ipython\",\n    \"version\": 3\n   },\n   \"file_extension\": \".py\",\n   \"mimetype\": \"text/x-python\",\n   \"name\": \"python\",\n   \"nbconvert_exporter\": \"python\",\n   \"pygments_lexer\": \"ipython3\",\n   \"version\": \"3.9.16\"\n  }\n },\n \"nbformat\": 4,\n \"nbformat_minor\": 5\n}\n"
  },
  {
    "path": "natural-language-processing/embedding-models/data/training_data.csv",
    "content": "base,ref,similarity\ndresser,data scientist,0.5\nclubhost,data scientist,0.5\nco-pilot,data analyst,0.5\ndata analyst,textile quality manager,0.5\ndata analyst,tracer powder blender,0.5\ndata analyst,pottery and porcelain caster,0.5\ndata analyst,equine dental technician,0.5\ndata analyst,\" Equine dental technicians provide routine equine dental care, using appropriate equipment in accordance with national legislation. \n\",0.5\ndata analyst,vessel assembly inspector,0.5\ndata analyst,bacteriology technician,0.5\ndata analyst,mover,0.5\ndata scientist,manager di fondi pensione,0.5\ndata scientist,takelaar,0.5\ndata scientist,tapijtknoper,0.5\ndata scientist,technicus waterleidingssystemen,0.5\ndata scientist,textile quality manager,0.5\ndata scientist,nailing machine operator,0.5\ndata scientist,hot foil operator,0.5\ndata scientist,I manager di fondi pensione coordinano i fondi pensione al fine di fornire prestazioni pensionistiche a individui o organizzazioni. Assicurano la gestione giornaliera dei fondi pensione e definiscono la politica strategica per lo sviluppo di nuovi pacchetti pensionistici.,0.5\ndata scientist,reisagent,0.5\ndata scientist,matrassenmaker,0.5\ndata scientist,emailleur,0.5\ndata scientist,meester-koffiebrander,0.5\ndata scientist,operatore di macchine per la produzione cartotecnica/operatrice su macchine per la produzione cartotecnica,0.5\ndata scientist,watch and clock repairer,0.5\ndata scientist,tapijtlegger,0.5\ndata scientist,credit union manager,0.5\ndata scientist,\"Textile quality managers implement, manage and promote quality systems. They make sure that the textile products adhere to the quality standards of the organisation. Textile quality managers therefore inspect textile production lines and products.\",0.5\ndata scientist,assemblagetechnicus accusystemen,0.5\ndata scientist,yeast distiller,0.5\ndata scientist,\"Hot foil operators tend machines which apply a metallic foil on other materials using pressure cylinders and heating. They also mix colors, set up the appropriate machinery equipment and monitor printing.\",0.5\ndata scientist,addetto alle operazioni fiscali/addetta alle operazioni fiscali,0.5\ndata scientist,viskooitechnicus,0.5\ndata scientist,teamleider in een mijn,0.5\nrental manager,data scientist,0.5\ntrain preparer,data scientist,0.5\nceiling installer,data analyst,0.5\nigienista dentale,data scientist,0.5\ncabin crew manager,data scientist,0.5\nincisore su metallo,data scientist,0.5\njustice of the peace,data analyst,0.5\nimport export manager,data scientist,0.5\nconducente di autocarri,data scientist,0.5\nconducente di autocarri,\"I data scientist scoprono e interpretano fonti ricche di dati, gestiscono grandi quantità di dati, ne aggregano le fonti, garantiscono la coerenza degli insiemi di dati e creano visualizzazioni per contribuire alla loro comprensione. 
Costruiscono modelli matematici che utilizzano dati, presentano e comunicano informazioni e conoscenze sui dati agli specialisti e agli scienziati nella loro squadra e, se necessario, a un pubblico non specializzato e raccomandano modalità di applicazione dei dati.\",0.5\nfootwear CAD patternmaker,data analyst,0.5\nufficiale di stato civile,data scientist,0.5\ndental instrument assembler,data scientist,0.5\nwaste management supervisor,data analyst,0.5\ncarrosserie- en voertuigbouwer,data scientist,0.5\nfuel station specialised seller,data analyst,0.5\nproductieleider chemische industrie,data scientist,0.5\nmarinaio addetto al servizio di coperta,data scientist,0.5\nallenatore sportivo/allenatrice sportiva,data scientist,0.5\nallenatore sportivo/allenatrice sportiva,\"I data scientist scoprono e interpretano fonti ricche di dati, gestiscono grandi quantità di dati, ne aggregano le fonti, garantiscono la coerenza degli insiemi di dati e creano visualizzazioni per contribuire alla loro comprensione. Costruiscono modelli matematici che utilizzano dati, presentano e comunicano informazioni e conoscenze sui dati agli specialisti e agli scienziati nella loro squadra e, se necessario, a un pubblico non specializzato e raccomandano modalità di applicazione dei dati.\",0.5\nwater conservation technician supervisor,data scientist,0.5\nsupervisore di assemblaggio di veicoli a motore,data scientist,0.5\ntecnico della qualità dei prodotti di pelletteria,data scientist,0.5\nresponsabile della distribuzione di macchine e attrezzature agricole,data scientist,0.5\n\"wholesale merchant in agricultural raw materials, seeds and animal feeds\",data analyst,0.5\nconfezionatore caseario artigianale /confezionatrice casearia artigianale,data scientist,0.5\nconfezionatore caseario artigianale /confezionatrice casearia artigianale,\"I data scientist scoprono e interpretano fonti ricche di dati, gestiscono grandi quantità di dati, ne aggregano le fonti, garantiscono la coerenza degli insiemi di dati e creano visualizzazioni per contribuire alla loro comprensione. Costruiscono modelli matematici che utilizzano dati, presentano e comunicano informazioni e conoscenze sui dati agli specialisti e agli scienziati nella loro squadra e, se necessario, a un pubblico non specializzato e raccomandano modalità di applicazione dei dati.\",0.5\nimport-exportmanager ijzerwaren en producten voor loodgieterij en verwarming,data scientist,0.5\naddetto all’assemblaggio di strumenti di precisione/addetta all’assemblaggio di strumenti di precisione,data scientist,0.5\ndata analyst,\"Data analysts import, inspect, clean, transform, validate, model, or interpret collections of data with regard to the business goals of the company. They ensure that the data sources and repositories provide consistent and reliable data. Data analysts use different algorithms and IT tools as demanded by the situation and the current data. They might prepare reports in the form of visualisations such as graphs, charts, and dashboards.\",1.0\ndata scientist,\"Data scientists find and interpret rich data sources, manage large amounts of data, merge data sources, ensure consistency of data-sets, and create visualisations to aid in understanding data. 
They build mathematical models using data, present and communicate data insights and findings to specialists and scientists in their team and if required, to a non-expert audience, and recommend ways to apply the data.\",1.0\ndata scientist,\"Data scientists zoeken en interpreteren rijke gegevensbronnen, beheren grote hoeveelheden gegevens, voegen gegevensbronnen samen, zorgen voor de consistentie van datasets en creëren visualisaties om te helpen gegevens te begrijpen. Zij bouwen wiskundige modellen op basis van data, presenteren en communiceren gegevensinzichten en bevindingen aan specialisten en wetenschappers in hun team en, indien nodig, aan een niet-deskundig publiek, en bevelen manieren aan om de data toe te passen.\",1.0\ndata scientist,\"I data scientist scoprono e interpretano fonti ricche di dati, gestiscono grandi quantità di dati, ne aggregano le fonti, garantiscono la coerenza degli insiemi di dati e creano visualizzazioni per contribuire alla loro comprensione. Costruiscono modelli matematici che utilizzano dati, presentano e comunicano informazioni e conoscenze sui dati agli specialisti e agli scienziati nella loro squadra e, se necessario, a un pubblico non specializzato e raccomandano modalità di applicazione dei dati.\",1.0\ncosmologist,cosmology data scientist,0.74681668889336494\ndata analyst,data storage analyst,0.74681668889336494\ndata analyst,data warehouse analyst,0.74681668889336494\ndata analyst,data warehousing analyst,0.74681668889336494\nmeter reader,metering data analyst,0.74681668889336494\nstatistician,statistical data analyst,0.74681668889336494\ndata scientist,esperta di dati,0.74681668889336494\ndata scientist,data engineer,0.74681668889336494\ndata scientist,data-scientist,0.74681668889336494\ndata scientist,data research scientist,0.74681668889336494\ndata scientist,esperto di dati,0.74681668889336494\ndata scientist,data expert,0.74681668889336494\ndata scientist,data analyst,0.74681668889336494\ndata scientist,research data scientist,0.74681668889336494\ndata scientist,research analist,0.74681668889336494\ndata scientist,research data scientist,0.74681668889336494\ndata scientist,analista dei dati di ricerca,0.74681668889336494\ndata scientist,business data scientist,0.74681668889336494\nanalista dei dati,data analyst,0.74681668889336494\ncall centre analyst,sales data analyst,0.74681668889336494\ncall centre analyst,CRM data analyst,0.74681668889336494\ncall centre analyst,customer data analyst,0.74681668889336494\ncall centre analyst,senior data analyst,0.74681668889336494\ncall centre analyst,assistant data analyst,0.74681668889336494\ncall centre analyst,trainee data analyst,0.74681668889336494\ncall centre analyst,graduate data analyst,0.74681668889336494\ncall centre analyst,marketing data analyst,0.74681668889336494\ncall centre analyst,IT data analyst,0.74681668889336494\nanalyste de données,data analyst,0.74681668889336494\nbioinformatics scientist,data scientist,0.74681668889336494\nscientifique des données,data scientist,0.74681668889336494"
  },
  {
    "path": "natural-language-processing/embedding-models/domain_adaption_fine_tune_nlp_model.ipynb",
    "content": "{\n \"cells\": [\n  {\n   \"cell_type\": \"markdown\",\n   \"id\": \"cba912be-879a-4a9b-a7e9-180f42555cb9\",\n   \"metadata\": {},\n   \"source\": [\n    \"## Domain Adaption: Fine-Tune Pre-Trained NLP Models\\n\",\n    \"\\n\",\n    \"### Introduction\\n\",\n    \"In today's world, the availability of pre-trained NLP models has greatly simplified the interpretation of textual data using deep learning techniques. However, while these models excel in general tasks, they often lack adaptability to specific domains. This comprehensive guide aims to walk you through the process of fine-tuning pre-trained NLP models to achieve improved performance in a particular domain.\\n\",\n    \"\\n\",\n    \"#### Motivation\\n\",\n    \"Although pre-trained NLP models like BERT and the Universal Sentence Encoder (USE) are effective in capturing linguistic intricacies, their performance in domain-specific applications can be limited due to the diverse range of datasets they are trained on. This limitation becomes evident when analyzing relationships within a specific domain. \\n\",\n    \"\\n\",\n    \"For example, when working with employment data, we expect the model to recognize the closer proximity between the roles of 'Data Scientist' and 'Machine Learning Engineer', or the stronger association between 'Python' and 'TensorFlow'. Unfortunately, general-purpose models often miss these nuanced relationships.\\n\",\n    \"\\n\",\n    \"To address this issue, we can fine-tune pre-trained models with high-quality, domain-specific datasets. This adaptation process significantly enhances the model's performance and precision, fully unlocking the potential of the NLP model.\\n\",\n    \"\\n\",\n    \"When dealing with large pre-trained NLP models, it is advisable to initially deploy the base model and consider fine-tuning only if its performance falls short for the specific problem at hand.\\n\",\n    \" \\n\",\n    \"This tutorial focuses on fine-tuning the Universal Sentence Encoder (USE) model using easily accessible open-source data.\\n\",\n    \"\\n\",\n    \"### Theoretical Overview\\n\",\n    \"Fine-tuning an ML model can be achieved through various strategies, such as supervised learning and reinforcement learning. In this tutorial, we will concentrate on a one(few)-shot learning approach combined with a siamese architecture for the fine-tuning process.\\n\",\n    \"\\n\",\n    \"#### Methodology\\n\",\n    \"In this tutorial, we utilize a siamese neural network, which is a specific type of Artificial Neural Network. This network leverages shared weights while simultaneously processing two distinct input vectors to compute comparable output vectors. Inspired by one-shot learning, this approach has proven to be particularly effective in capturing semantic similarity, although it may require longer training times and lack probabilistic output.\\n\",\n    \"\\n\",\n    \"A Siamese Neural Network creates an 'embedding space' where related concepts are positioned closely, enabling the model to better discern semantic relations.\\n\",\n    \"- Twin Branches and Shared Weights: The architecture consists of two identical branches, each containing an embedding layer with shared weights. These dual branches handle two inputs simultaneously, either similar or dissimilar.\\n\",\n    \"- Similarity and Transformation: The inputs are transformed into vector embeddings using the pre-trained NLP model. The architecture then calculates the similarity between the vectors. 
The similarity score, ranging between -1 and 1, quantifies the angular distance between the two vectors, serving as a metric for their semantic similarity.\\n\",\n    \"- Contrastive Loss and Learning: The model's learning is guided by the \\\"Contrastive Loss,\\\" which is the difference between the expected output (similarity score from the training data) and the computed similarity. This loss guides the adjustment of the model's weights to minimize the loss and enhance the quality of the learned embeddings.\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": null,\n   \"id\": \"13a3c7e3-4b5f-4369-b6b0-e0ea5b43f425\",\n   \"metadata\": {},\n   \"outputs\": [],\n   \"source\": [\n    \"import pandas as pd\\n\",\n    \"import math\\n\",\n    \"import tensorflow as tf\\n\",\n    \"import tensorflow_hub as hub\\n\",\n    \"from tensorflow import keras\\n\",\n    \"from tensorflow_text import SentencepieceTokenizer\\n\",\n    \"import os\\n\",\n    \"from datetime import datetime\\n\",\n    \"import numpy as np\\n\",\n    \"from utils import *\"\n   ]\n  },\n  {\n   \"cell_type\": \"markdown\",\n   \"id\": \"9c770e7f-5a4e-42d7-89d6-a8eb0eec8210\",\n   \"metadata\": {},\n   \"source\": [\n    \"** **\\n\",\n    \"#### Data Overview\\n\",\n    \"\\n\",\n    \"For the fine-tuning of pre-trained NLP models using this method, the training data should consist of pairs of text strings accompanied by similarity scores between them. \\n\",\n    \"\\n\",\n    \"In this tutorial, we use a dataset sourced from the ESCO classification dataset, which has been transformed to generate similarity scores based on the relationships between different data elements.\\n\",\n    \"\\n\",\n    \"Preparing the training data is a crucial step in the fine-tuning process. It is assumed that you have access to the required data and a method to transform it into the specified format. Since the focus of this article is to demonstrate the fine-tuning process, we will omit the details of how the data was generated using the ESCO dataset.\\n\",\n    \"\\n\",\n    \"Let's start by examining the training data:\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": null,\n   \"id\": \"94c76e58-475b-49b4-a97e-5c7b1e6e2630\",\n   \"metadata\": {},\n   \"outputs\": [],\n   \"source\": [\n    \"# The data from this file is stored in the variable \\\"data\\\".\\n\",\n    \"data = pd.read_csv(\\\"./data/training_data.csv\\\")\\n\",\n    \"\\n\",\n    \"# Use the head function on the DataFrame to display its first 5 rows.\\n\",\n    \"data.head()\\n\"\n   ]\n  },\n  {\n   \"cell_type\": \"markdown\",\n   \"id\": \"20dc8c3c-2835-4472-894f-12239de8173b\",\n   \"metadata\": {},\n   \"source\": [\n    \"** **\\n\",\n    \"#### Baseline Model\\n\",\n    \"To begin, we establish the multilingual universal sentence encoder as our baseline model. It is essential to set this baseline before proceeding with the fine-tuning process.\\n\",\n    \"\\n\",\n    \"For this tutorial, we will use the STS benchmark and a sample similarity visualization as metrics to evaluate the changes and improvements achieved through the fine-tuning process.\\n\",\n    \"\\n\",\n    \"The STS Benchmark dataset consists of English sentence pairs, each associated with a similarity score. During the model training process, we evaluate the model's performance on this benchmark set. 
The persisted scores for each training run are the Pearson correlation between the predicted similarity scores and the actual similarity scores in the dataset. \\n\",\n    \"\\n\",\n    \"These scores ensure that as the model is fine-tuned with our context-specific training data, it maintains some level of generalizability.\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": null,\n   \"id\": \"f643d10f-716e-4982-984e-48f0bbf2e3df\",\n   \"metadata\": {},\n   \"outputs\": [],\n   \"source\": [\n    \"# Loads the Universal Sentence Encoder Multilingual module from TensorFlow Hub.\\n\",\n    \"base_model_url = \\\"https://tfhub.dev/google/universal-sentence-encoder-multilingual/3\\\"\\n\",\n    \"base_model = tf.keras.Sequential([\\n\",\n    \"    hub.KerasLayer(base_model_url,\\n\",\n    \"                   input_shape=[],\\n\",\n    \"                   dtype=tf.string,\\n\",\n    \"                   trainable=False)\\n\",\n    \"])\\n\",\n    \"\\n\",\n    \"# Defines a list of test sentences. These sentences represent various job titles.\\n\",\n    \"test_text = ['Data Scientist', 'Data Analyst', 'Data Engineer',\\n\",\n    \"             'Nurse Practitioner', 'Registered Nurse', 'Medical Assistant',\\n\",\n    \"             'Social Media Manager', 'Marketing Strategist', 'Product Marketing Manager']\\n\",\n    \"\\n\",\n    \"# Creates embeddings for the sentences in the test_text list. \\n\",\n    \"# The np.array() function is used to convert the result into a numpy array.\\n\",\n    \"# The .tolist() function is used to convert the numpy array into a list, which might be easier to work with.\\n\",\n    \"vectors = np.array(base_model.predict(test_text)).tolist()\\n\",\n    \"\\n\",\n    \"# Calls the plot_similarity function to create a similarity plot.\\n\",\n    \"plot_similarity(test_text, vectors, 90, \\\"base model\\\")\\n\",\n    \"\\n\",\n    \"# Computes STS benchmark score for the base model\\n\",\n    \"pearsonr = sts_benchmark(base_model)\\n\",\n    \"print(\\\"STS Benachmark: \\\" + str(pearsonr))\"\n   ]\n  },\n  {\n   \"cell_type\": \"markdown\",\n   \"id\": \"51a178d6-760d-4c0b-8994-0bac82118a98\",\n   \"metadata\": {},\n   \"source\": [\n    \"** **\\n\",\n    \"#### Fine Tuning the Model\\n\",\n    \"The next step involves constructing the siamese model architecture using the baseline model and fine-tuning it with our domain-specific data.\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": null,\n   \"id\": \"f65eca14-f5c4-41f5-8048-ce7480759a33\",\n   \"metadata\": {},\n   \"outputs\": [],\n   \"source\": [\n    \"# Load the pre-trained word embedding model\\n\",\n    \"embedding_layer = hub.load(base_model_url)\\n\",\n    \"\\n\",\n    \"# Create a Keras layer from the loaded embedding model\\n\",\n    \"shared_embedding_layer = hub.KerasLayer(embedding_layer, trainable=True)\\n\",\n    \"\\n\",\n    \"# Define the inputs to the model\\n\",\n    \"left_input = keras.Input(shape=(), dtype=tf.string)\\n\",\n    \"right_input = keras.Input(shape=(), dtype=tf.string)\\n\",\n    \"\\n\",\n    \"# Pass the inputs through the shared embedding layer\\n\",\n    \"embedding_left_output = shared_embedding_layer(left_input)\\n\",\n    \"embedding_right_output = shared_embedding_layer(right_input)\\n\",\n    \"\\n\",\n    \"# Compute the cosine similarity between the embedding vectors\\n\",\n    \"cosine_similarity = tf.keras.layers.Dot(axes=-1, normalize=True)(\\n\",\n    \"    [embedding_left_output, 
embedding_right_output]\\n\",\n    \")\\n\",\n    \"\\n\",\n    \"# Convert the cosine similarity to angular distance\\n\",\n    \"pi = tf.constant(math.pi, dtype=tf.float32)\\n\",\n    \"clip_cosine_similarities = tf.clip_by_value(\\n\",\n    \"    cosine_similarity, -0.99999, 0.99999\\n\",\n    \")\\n\",\n    \"acos_distance = 1.0 - (tf.acos(clip_cosine_similarities) / pi)\\n\",\n    \"\\n\",\n    \"# Package the model\\n\",\n    \"encoder = tf.keras.Model([left_input, right_input], acos_distance)\\n\",\n    \"\\n\",\n    \"# Compile the model\\n\",\n    \"encoder.compile(\\n\",\n    \"    optimizer=tf.keras.optimizers.Adam(\\n\",\n    \"        learning_rate=0.00001,\\n\",\n    \"        beta_1=0.9,\\n\",\n    \"        beta_2=0.9999,\\n\",\n    \"        epsilon=0.0000001,\\n\",\n    \"        amsgrad=False,\\n\",\n    \"        clipnorm=1.0,\\n\",\n    \"        name=\\\"Adam\\\",\\n\",\n    \"    ),\\n\",\n    \"    loss=tf.keras.losses.MeanSquaredError(\\n\",\n    \"        reduction=keras.losses.Reduction.AUTO, name=\\\"mean_squared_error\\\"\\n\",\n    \"    ),\\n\",\n    \"    metrics=[\\n\",\n    \"        tf.keras.metrics.MeanAbsoluteError(),\\n\",\n    \"        tf.keras.metrics.MeanAbsolutePercentageError(),\\n\",\n    \"    ],\\n\",\n    \")\\n\",\n    \"\\n\",\n    \"# Print the model summary\\n\",\n    \"encoder.summary()\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": null,\n   \"id\": \"250e87af-306a-47b9-8349-ca64c54f06ac\",\n   \"metadata\": {\n    \"scrolled\": true\n   },\n   \"outputs\": [],\n   \"source\": [\n    \"early_stop = keras.callbacks.EarlyStopping(\\n\",\n    \"                monitor=\\\"loss\\\", patience=3, min_delta=0.001\\n\",\n    \"            )\\n\",\n    \"logdir = os.path.join(\\n\",\n    \"                \\\".\\\",\\n\",\n    \"                \\\"logs/fit/\\\" + datetime.now().strftime(\\\"%Y%m%d-%H%M%S\\\"),\\n\",\n    \"            )\\n\",\n    \"tensorboard_callback = keras.callbacks.TensorBoard(log_dir=logdir)\\n\",\n    \"\\n\",\n    \"# Model Input\\n\",\n    \"left_inputs, right_inputs, similarity = process_model_input(data)\\n\",\n    \"\\n\",\n    \"history = encoder.fit(\\n\",\n    \"                [left_inputs, right_inputs],\\n\",\n    \"                similarity,\\n\",\n    \"                batch_size=8,\\n\",\n    \"                epochs=20,\\n\",\n    \"                validation_split=0.2,\\n\",\n    \"                callbacks=[early_stop, tensorboard_callback],\\n\",\n    \"            )\\n\",\n    \"\\n\",\n    \"inputs = keras.Input(shape=[], dtype=tf.string)\\n\",\n    \"embedding = hub.KerasLayer(embedding_layer)(inputs)\\n\",\n    \"\\n\",\n    \"tuned_model = keras.Model(inputs=inputs, outputs=embedding)\"\n   ]\n  },\n  {\n   \"cell_type\": \"markdown\",\n   \"id\": \"06975f60-e3a1-4c46-b5fa-3bc2512061a6\",\n   \"metadata\": {},\n   \"source\": [\n    \"** **\\n\",\n    \"#### Evaluation\\n\",\n    \"\\n\",\n    \"Now that we have the fine-tuned model, let's re-evaluate it and compare the results to those of the base model.\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": null,\n   \"id\": \"3260ff05-07aa-4c55-9509-4ca60fada326\",\n   \"metadata\": {},\n   \"outputs\": [],\n   \"source\": [\n    \"# Creates embeddings for the sentences in the test_text list. 
\\n\",\n    \"# The np.array() function is used to convert the result into a numpy array.\\n\",\n    \"# The .tolist() function is used to convert the numpy array into a list, which might be easier to work with.\\n\",\n    \"vectors = np.array(tuned_model.predict(test_text)).tolist()\\n\",\n    \"\\n\",\n    \"# Calls the plot_similarity function to create a similarity plot.\\n\",\n    \"plot_similarity(test_text, vectors, 90, \\\"tuned model\\\")\\n\",\n    \"\\n\",\n    \"# Computes STS benchmark score for the tuned model\\n\",\n    \"pearsonr = sts_benchmark(tuned_model)\\n\",\n    \"print(\\\"STS Benachmark: \\\" + str(pearsonr))\"\n   ]\n  },\n  {\n   \"cell_type\": \"markdown\",\n   \"id\": \"8807399d-3360-4d7b-8413-4eb0c033e922\",\n   \"metadata\": {},\n   \"source\": [\n    \"Based on fine-tuning the model on the relatively small dataset, the STS benchmark score is comparable to that of the baseline model, indicating that the tuned model still exhibits generalizability. However, the similarity visualization demonstrates strengthened similarity scores between similar titles and a reduction in scores for dissimilar ones.\"\n   ]\n  },\n  {\n   \"cell_type\": \"markdown\",\n   \"id\": \"114c50ab-9d12-4c3b-8ea9-f4ca2725efd3\",\n   \"metadata\": {},\n   \"source\": [\n    \"** **\\n\",\n    \"### Closing Thoughts\\n\",\n    \"\\n\",\n    \"Fine-tuning pre-trained NLP models for domain adaptation is a powerful technique to improve their performance and precision in specific contexts. By utilizing quality, domain-specific datasets and leveraging siamese neural networks, we can enhance the model's ability to capture semantic similarity.\\n\",\n    \"\\n\",\n    \"This tutorial provided a step-by-step guide to the fine-tuning process, using the Universal Sentence Encoder (USE) model as an example. We explored the theoretical framework, data preparation, baseline model evaluation, and the actual fine-tuning process. The results demonstrated the effectiveness of fine-tuning in strengthening similarity scores within a domain.\\n\",\n    \"\\n\",\n    \"By following this approach and adapting it to your specific domain, you can unlock the full potential of pre-trained NLP models and achieve better results in your natural language processing tasks\"\n   ]\n  }\n ],\n \"metadata\": {\n  \"kernelspec\": {\n   \"display_name\": \"Python 3 (ipykernel)\",\n   \"language\": \"python\",\n   \"name\": \"python3\"\n  },\n  \"language_info\": {\n   \"codemirror_mode\": {\n    \"name\": \"ipython\",\n    \"version\": 3\n   },\n   \"file_extension\": \".py\",\n   \"mimetype\": \"text/x-python\",\n   \"name\": \"python\",\n   \"nbconvert_exporter\": \"python\",\n   \"pygments_lexer\": \"ipython3\",\n   \"version\": \"3.9.16\"\n  }\n },\n \"nbformat\": 4,\n \"nbformat_minor\": 5\n}\n"
  },
  {
    "path": "natural-language-processing/embedding-models/pyproject.toml",
    "content": "[tool.poetry]\nname = \"embedding-models\"\nversion = \"0.1.0\"\ndescription = \"\"\nauthors = [\"Shashank Kapadia <smhkapadia@gmail.com>\"]\nreadme = \"README.md\"\n\n[tool.poetry.dependencies]\npython = \">=3.9,<3.10\"\njupyterlab = \"^4.0.2\"\ntensorflow-hub = \"^0.13.0\"\ntensorflow = [\n    { version = \"2.10.0\", platform = \"linux\" },\n]\ntensorflow-macos = [\n    { version = \"2.10.0\", platform = \"darwin\" },\n]\n#tensorflow-text = \"^2.10.0\"\ntensorflow-text = [\n    {file = \"packages/tensorflow_text-2.10.0-cp39-cp39-macosx_11_0_arm64.whl\", platform = \"darwin\"},\n]\npandas = \"^2.0.3\"\nseaborn = \"^0.12.2\"\ntqdm = \"^4.65.0\"\n\n\n[build-system]\nrequires = [\"poetry-core\"]\nbuild-backend = \"poetry.core.masonry.api\"\n"
  },
  {
    "path": "natural-language-processing/embedding-models/utils.py",
    "content": "import numpy as np\nimport seaborn as sns\nimport matplotlib.pyplot as plt\nimport tensorflow as tf\nimport tqdm\nfrom scipy import stats\nimport pandas as pd\nimport os\nimport math\n\n\ndef plot_similarity(labels, features, rotation, version):\n    corr = np.inner(features, features)\n    sns.set(font_scale=1.2)\n    g = sns.heatmap(\n        corr,\n        xticklabels=labels,\n        yticklabels=labels,\n        vmin=0,\n        vmax=1,\n        cmap=\"YlOrRd\")\n    g.set_xticklabels(labels, rotation=rotation)\n    g.set_title(version)\n\ndef cosine_similarity(vector_1, vector_2):\n    \"\"\"\n    Compute cosine similarity between two vectors.\n\n    Args:\n        vector_1 (List[float]): A list of float values representing the first vector.\n        vector_2 (List[float]): A list of float values representing the second vector.\n\n    Returns:\n        float: The cosine similarity between the two vectors, which is the dot product of\n        the two vectors divided by the product of their magnitudes.\n    \"\"\"\n    sumxx, sumxy, sumyy = 0, 0, 0\n    for i in range(len(vector_1)):\n        x = vector_1[i]\n        y = vector_2[i]\n        sumxx += x * x\n        sumyy += y * y\n        sumxy += x * y\n    return sumxy / math.sqrt(sumxx * sumyy)\n\n\ndef sts_benchmark(model, type=\"dev\"):\n    \"\"\"\n    Compute the Pearson correlation between predicted cosine similarity scores and human-labeled similarity scores\n    on the STS benchmark dataset.\n\n    Args:\n        model (tensorflow.keras.Model): A trained sentence embedding model that takes in an input sentence and\n                                        outputs a corresponding sentence embedding.\n        type (str): The type of STS benchmark dataset to use. Either \"dev\" for the development dataset or \"test\"\n                    for the test dataset. 
Default is \"dev\".\n\n    Returns:\n        tuple: A tuple containing the Pearson correlation coefficient and the p-value of the correlation test.\n    \"\"\"\n\n    def _get_sts_dataset(type=\"test\"):\n        \"\"\"\n\n        :param type:\n        :return:\n        \"\"\"\n\n        sts_dataset = tf.keras.utils.get_file(\n            fname=\"Stsbenchmark.tar.gz\",\n            origin=\"http://ixa2.si.ehu.es/stswiki/images/4/48/Stsbenchmark.tar.gz\",\n            extract=True,\n        )\n        if type == \"dev\":\n            data = pd.read_table(\n                os.path.join(\n                    os.path.dirname(sts_dataset), \"stsbenchmark\", \"sts-dev.csv\"\n                ),\n                on_bad_lines=\"skip\",\n                engine=\"python\",\n                skip_blank_lines=True,\n                usecols=[4, 5, 6],\n                names=[\"sim\", \"sent_1\", \"sent_2\"],\n            )\n        else:\n            data = pd.read_table(\n                os.path.join(\n                    os.path.dirname(sts_dataset), \"stsbenchmark\", \"sts-test.csv\"\n                ),\n                on_bad_lines=\"skip\",\n                engine=\"python\",\n                skip_blank_lines=True,\n                usecols=[4, 5, 6],\n                names=[\"sim\", \"sent_1\", \"sent_2\"],\n            )\n\n        return data\n\n    data = _get_sts_dataset(type=type)\n    data = data[[isinstance(s, str) for s in data[\"sent_2\"]]].reset_index()\n\n    # prepare data\n    base_text = [data[\"sent_1\"][i] for i in range(len(data))]\n    ref_text = [data[\"sent_2\"][i] for i in range(len(data))]\n    scores = data[\"sim\"].tolist()\n\n    base_vectors = []\n    ref_vectors = []\n\n    # get text vectors from base, tuned, pre-tuned models\n    for i in range(len(base_text)):\n        base_vectors.append(list(model.predict([base_text[i]])[0]))\n        ref_vectors.append(list(model.predict([ref_text[i]])[0]))\n\n    base_cosine_similarity = [\n        cosine_similarity(base_vectors[i], ref_vectors[i])\n        for i in range(len(base_text))\n    ]\n    return stats.pearsonr(scores, base_cosine_similarity)\n\ndef process_model_input(data):\n    \"\"\"\n    Processes the input data by reshaping the left and right inputs and converting the similarity values.\n\n    Args:\n        data: (dict) A dictionary containing the keys \"base\", \"ref\", and \"similarity\",\n            with values corresponding to the input base text, reference text, and similarity values respectively.\n\n    Returns:\n        tuple: A tuple of three numpy arrays containing the preprocessed left inputs, right inputs,\n            and similarity values respectively.\n    \"\"\"\n    text_list = [list(data[\"base\"].values), list(data[\"ref\"].values)]\n    left_inputs = np.asarray(text_list[0])\n    right_inputs = np.asarray(text_list[1])\n    left_inputs = left_inputs.reshape(\n        left_inputs.shape[0],\n    )\n    right_inputs = right_inputs.reshape(\n        right_inputs.shape[0],\n    )\n\n    # 1 if we inputs are semantically similiar, 0 if not.\n    # Check the distance function defined as 1-arccos(similiarity)/pi which has range between 1,0 for domain 0 to 1\n    similarity = np.asarray(list(data[\"similarity\"].values))\n\n    return left_inputs, right_inputs, similarity\n\n"
  },
  {
    "path": "natural-language-processing/text-processing/Building Blocks Text Pre-Processing.ipynb",
    "content": "{\n \"cells\": [\n  {\n   \"cell_type\": \"markdown\",\n   \"metadata\": {},\n   \"source\": [\n    \"## Building Blocks: Text Pre-Processing\\n\",\n    \"\\n\",\n    \"This article is the second of more to come articles on Natural Language Processing. The purpose of this series of articles is to document my journey as I learn about this subject, as well as help others gain efficiency from it.\\n\",\n    \"\\n\",\n    \"In the last article of our series, we introduced the concept of Natural Language Processing, you can read it here, and now you probably want to try it yourself, right? Great! Without further ado, let's dive in to the building blocks for statistical natural language processing. \\n\",\n    \"\\n\",\n    \"In this article, we'll introduce the key concepts, along with practical implementation in Python and the challenges to keep in mind at the time of application.\\n\",\n    \"\\n\",\n    \"**References:**\\n\",\n    \"- A General Approach to Preprocessing Text Data — KDnuggets. https://www.kdnuggets.com/2017/12/general-approach-preprocessing-text-data.html\\n\",\n    \"- Tokenization — Stanford NLP Group. https://nlp.stanford.edu/IR-book/html/htmledition/tokenization-1.html\\n\",\n    \"- Text Mining in Bovine Diseases — ijcaonline.org. https://www.ijcaonline.org/volume6/number10/pxc3871454.pdf\\n\",\n    \"- Stemming and lemmatization — Stanford NLP Group. https://nlp.stanford.edu/IR-book/html/htmledition/stemming-and-lemmatization-1.html\\n\",\n    \"** **\"\n   ]\n  },\n  {\n   \"cell_type\": \"markdown\",\n   \"metadata\": {},\n   \"source\": [\n    \"### Text Normalization\\n\",\n    \"\\n\",\n    \"Normalizing the text means converting it to a more convenient, standard form before performing turning it to features for higher level modeling. Think of this step as converting human readable language into a form that is machine readable.\\n\",\n    \"\\n\",\n    \"The standard framework to normalize the text includes:\\n\",\n    \"1. Tokenization\\n\",\n    \"2. Stop Words Removal\\n\",\n    \"3. Morphological Normalization\\n\",\n    \"4. Collocation\\n\",\n    \"\\n\",\n    \"Data preprocessing consists of a number of steps, any number of which may or not apply to a given task. More generally, in this article we'll discuss some predetermined body of text, and perform some basic transformative analysis that can be used for performing further, more meaningful natural language processing\\n\",\n    \"\\n\",\n    \"** **\\n\",\n    \"#### Tokenization\\n\",\n    \"\\n\",\n    \"Given a character sequence and a defined document unit (blurb of texts), tokenization is the task of chopping it up into pieces, called tokens, perhaps at the same time throwing away certain characters/words, such as punctuation. Ordinarily, there are two types of tokenization:\\n\",\n    \"\\n\",\n    \"1. Word Tokenization: Used to separate words via unique space character. Depending on the application, word tokenization may also tokenize multi-word expressions like New York. This is often times is closely tied to a process called Named Entity Recognition. Later in this tutorial, we will look at Collocation (Phrase) Modeling that helps address part of this challenge\\n\",\n    \"\\n\",\n    \"2. Sentence Tokenization/Segmentation: Along with word tokenization, sentence segmentation is a crucial step in text processing. 
This is usually performed based on punctuations such as \\\".\\\", \\\"?\\\", \\\"!\\\" as they tend to mark the sentence boundaries\\n\",\n    \"\\n\",\n    \"**Challenges:**\\n\",\n    \"- The use of abbreviations may prompt the tokenizer to detect a sentence boundary where there is none. \\n\",\n    \"- Numbers, special characters, hyphenation, and capitalization. In the expressions \\\"don't,\\\" \\\"I'd,\\\" \\\"John's\\\" do we have one, two or three tokens?\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": null,\n   \"metadata\": {},\n   \"outputs\": [],\n   \"source\": [\n    \"import nltk\\n\",\n    \"\\n\",\n    \"nltk.download('punkt')\\n\",\n    \"nltk.download('stopwords')\\n\",\n    \"nltk.download('wordnet')\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": null,\n   \"metadata\": {},\n   \"outputs\": [],\n   \"source\": [\n    \"from nltk.tokenize import sent_tokenize, word_tokenize\\n\",\n    \"\\n\",\n    \"#Sentence Tokenization\\n\",\n    \"print ('Following is the list of sentences tokenized from the sample review\\\\n')\\n\",\n    \"\\n\",\n    \"sample_text = \\\"\\\"\\\"The first time I ate here I honestly was not that impressed. I decided to wait a bit and give it another chance. \\n\",\n    \"I have recently eaten there a couple of times and although I am not convinced that the pricing is particularly on point the two mushroom and \\n\",\n    \"swiss burgers I had were honestly very good. The shakes were also tasty. Although Mad Mikes is still my favorite burger around, \\n\",\n    \"you can do a heck of a lot worse than Smashburger if you get a craving\\\"\\\"\\\"\\n\",\n    \"\\n\",\n    \"tokenize_sentence = sent_tokenize(sample_text)\\n\",\n    \"\\n\",\n    \"print (tokenize_sentence)\\n\",\n    \"print ('---------------------------------------------------------\\\\n')\\n\",\n    \"print ('Following is the list of words tokenized from the sample review sentence\\\\n')\\n\",\n    \"tokenize_words = word_tokenize(tokenize_sentence[1])\\n\",\n    \"print (tokenize_words)\"\n   ]\n  },\n  {\n   \"cell_type\": \"markdown\",\n   \"metadata\": {},\n   \"source\": [\n    \"** **\\n\",\n    \"#### Stop Words Removal\\n\",\n    \"Often, there are a few ubiquitous words which would appear to be of little value in helping the purpose of analysis but increases the dimensionality of feature set, are excluded from the vocabulary entirely as the part of stop words removal process. There are two considerations usually that motivate this removal.\\n\",\n    \"\\n\",\n    \"1. Irrelevance: Allows one to analyze only on content-bearing words. Stopwords, also called empty words because they generally do not bear much meaning, introduce noise in the analysis/modeling process\\n\",\n    \"2. Dimension: Removing the stopwords also allows one to reduce the tokens in documents significantly, and thereby decreasing feature dimension\\n\",\n    \"\\n\",\n    \"**Challenges:**\\n\",\n    \"\\n\",\n    \"Converting all characters into lowercase letters before stopwords removal process can introduce ambiguity in the text, and sometimes entirely changing the meaning of it. For example, with the expressions \\\"US citizen\\\" will be viewed as \\\"us citizen\\\" or \\\"IT scientist\\\" as \\\"it scientist\\\". Since both *us* and *it* are normally considered stop words, it would result in an inaccurate outcome. 
The strategy regarding the treatment of stopwords can thus be refined by identifying that \\\"US\\\" and \\\"IT\\\" are not pronouns in the above examples, through a part-of-speech tagging step.\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": null,\n   \"metadata\": {},\n   \"outputs\": [],\n   \"source\": [\n    \"from nltk.corpus import stopwords\\n\",\n    \"from nltk.tokenize import word_tokenize\\n\",\n    \"\\n\",\n    \"# define the language for stopwords removal\\n\",\n    \"stopwords = set(stopwords.words(\\\"english\\\"))\\n\",\n    \"print (\\\"\\\"\\\"{0} stop words\\\"\\\"\\\".format(len(stopwords)))\\n\",\n    \"\\n\",\n    \"tokenize_words = word_tokenize(sample_text)\\n\",\n    \"filtered_sample_text = [w for w in tokenize_words if not w in stopwords]\\n\",\n    \"\\n\",\n    \"print ('\\\\nOriginal Text:')\\n\",\n    \"print ('------------------\\\\n')\\n\",\n    \"print (sample_text)\\n\",\n    \"print ('\\\\n Filtered Text:')\\n\",\n    \"print ('------------------\\\\n')\\n\",\n    \"print (' '.join(str(token) for token in filtered_sample_text))\"\n   ]\n  },\n  {\n   \"cell_type\": \"markdown\",\n   \"metadata\": {},\n   \"source\": [\n    \"** **\\n\",\n    \"#### Morphological Normalization\\n\",\n    \"Morphology, in general, is the study of the way words are built up from smaller meaning-bearing units, morphomes. For example, dogs consists of two morphemes: dog and s\\n\",\n    \"\\n\",\n    \"Two commonly used techniques for text normalization are:\\n\",\n    \"\\n\",\n    \"1. Stemming: The procedure aims to identify the stem of a word and use it in lieu of the word itself. The most popular algorithm for stemming English, and one that has repeatedly been shown to be empirically very effective, is Porter's algorithm. The entire algorithm is too long and intricate to present here, but you can find details here\\n\",\n    \"2. 
Lemmatization: This process refers to doing things correctly with the use of vocabulary and morphological analysis of words, typically aiming to remove inflectional endings only and to return the base or dictionary form of a word, which is known as the lemma.\\n\",\n    \"\\n\",\n    \"If confronted with the token saw, stemming might return just s, whereas lemmatization would attempt to return either see or saw depending on whether the use of the token was as a verb or a noun\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": null,\n   \"metadata\": {},\n   \"outputs\": [],\n   \"source\": [\n    \"from nltk.stem import PorterStemmer\\n\",\n    \"from nltk.stem import WordNetLemmatizer\\n\",\n    \"from nltk.tokenize import word_tokenize\\n\",\n    \"\\n\",\n    \"ps = PorterStemmer()\\n\",\n    \"lemmatizer = WordNetLemmatizer()\\n\",\n    \"\\n\",\n    \"tokenize_words = word_tokenize(sample_text)\\n\",\n    \"\\n\",\n    \"stemmed_sample_text = []\\n\",\n    \"for token in tokenize_words:\\n\",\n    \"    stemmed_sample_text.append(ps.stem(token))\\n\",\n    \"\\n\",\n    \"lemma_sample_text = []\\n\",\n    \"for token in tokenize_words:\\n\",\n    \"    lemma_sample_text.append(lemmatizer.lemmatize(token))\\n\",\n    \"    \\n\",\n    \"print ('\\\\nOriginal Text:')\\n\",\n    \"print ('------------------\\\\n')\\n\",\n    \"print (sample_text)\\n\",\n    \"\\n\",\n    \"print ('\\\\nFiltered Text: Stemming')\\n\",\n    \"print ('------------------\\\\n')\\n\",\n    \"print (' '.join(str(token) for token in stemmed_sample_text))\\n\",\n    \"\\n\",\n    \"print ('\\\\nFiltered Text: Lemmatization')\\n\",\n    \"print ('--------------------------------\\\\n')\\n\",\n    \"print (' '.join(str(token) for token in lemma_sample_text))\"\n   ]\n  },\n  {\n   \"cell_type\": \"markdown\",\n   \"metadata\": {},\n   \"source\": [\n    \"** **\\n\",\n    \"**Challenges:**\\n\",\n    \"\\n\",\n    \"Often, full morphological analysis produces at most very modest benefits for analysis. Neither form of normalization improve language information performance in aggregate, both from relevance and dimensionality reduction standpoint - at least not for the following situations:\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": null,\n   \"metadata\": {},\n   \"outputs\": [],\n   \"source\": [\n    \"from nltk.stem import PorterStemmer\\n\",\n    \"words = [\\\"operate\\\", \\\"operating\\\", \\\"operates\\\", \\\"operation\\\", \\\"operative\\\", \\\"operatives\\\", \\\"operational\\\"]\\n\",\n    \"\\n\",\n    \"ps = PorterStemmer()\\n\",\n    \"\\n\",\n    \"for token in words:\\n\",\n    \"    print (ps.stem(token))\"\n   ]\n  },\n  {\n   \"cell_type\": \"markdown\",\n   \"metadata\": {},\n   \"source\": [\n    \"** **\\n\",\n    \"As an example of what can go wrong, note that the Porter stemmer stems all of the following words to oper\\n\",\n    \"However, since operate in its various forms is a common verb, we would expect to lose considerable precision:\\n\",\n    \"- operational AND research\\n\",\n    \"- operating AND system\\n\",\n    \"- operative AND dentistry\\n\",\n    \"\\n\",\n    \"For cases like these, moving to using a lemmatizer would not completely fix the problem because particular inflectional forms are used in specific collocations. 
Getting better value from term normalization depends more on pragmatic issues of word use than on formal issues of linguistic morphology\"\n   ]\n  }\n ],\n \"metadata\": {\n  \"environment\": {\n   \"name\": \"common-cpu.m49\",\n   \"type\": \"gcloud\",\n   \"uri\": \"gcr.io/deeplearning-platform-release/base-cpu:m49\"\n  },\n  \"kernelspec\": {\n   \"display_name\": \"Python 3 (ipykernel)\",\n   \"language\": \"python\",\n   \"name\": \"python3\"\n  },\n  \"language_info\": {\n   \"codemirror_mode\": {\n    \"name\": \"ipython\",\n    \"version\": 3\n   },\n   \"file_extension\": \".py\",\n   \"mimetype\": \"text/x-python\",\n   \"name\": \"python\",\n   \"nbconvert_exporter\": \"python\",\n   \"pygments_lexer\": \"ipython3\",\n   \"version\": \"3.10.6\"\n  }\n },\n \"nbformat\": 4,\n \"nbformat_minor\": 4\n}\n"
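Referring back to the "US citizen" / "IT scientist" ambiguity discussed under the stop words removal challenges in this notebook, the suggested part-of-speech refinement could look roughly like the sketch below (an illustrative, hypothetical snippet; exact tagger output may vary):

```python
import nltk
from nltk.corpus import stopwords
from nltk.tokenize import word_tokenize

nltk.download('punkt')
nltk.download('stopwords')
nltk.download('averaged_perceptron_tagger')

stop_words = set(stopwords.words('english'))
text = "The US citizen interviewed an IT scientist."

# Tag tokens *before* lowercasing so 'US' and 'IT' can surface as proper nouns (NNP)
tagged_tokens = nltk.pos_tag(word_tokenize(text))

# Keep proper nouns even when their lowercase form is a stop word
filtered = [
    token for token, tag in tagged_tokens
    if tag.startswith('NNP') or token.lower() not in stop_words
]
print(filtered)
# Expected to keep 'US' and 'IT' (depending on the tagger), whereas lowercasing
# first would have dropped both as the stop words 'us' and 'it'.
```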
  },
  {
    "path": "natural-language-processing/text-processing/pyproject.toml",
    "content": "[tool.poetry]\nname = \"text-processing\"\nversion = \"0.1.0\"\ndescription = \"\"\nauthors = [\"Shashank Kapadia <smhkapadia@gmail.com>\"]\nreadme = \"README.md\"\npackages = [{include = \"text_processing\"}]\n\n[tool.poetry.dependencies]\npython = \"^3.10\"\njupyterlab = \"^3.5.2\"\nnltk = \"^3.8.1\"\n\n\n[build-system]\nrequires = [\"poetry-core\"]\nbuild-backend = \"poetry.core.masonry.api\"\n"
  },
  {
    "path": "natural-language-processing/topic-modeling/Evaluate Topic Models.ipynb",
    "content": "{\n \"cells\": [\n  {\n   \"cell_type\": \"markdown\",\n   \"metadata\": {},\n   \"source\": [\n    \"### Evaluate Topic Model in Python: Latent Dirichlet Allocation (LDA)\\\\\\n\",\n    \"##### A step-by-step guide to building interpretable topic models\\n\",\n    \"\\n\",\n    \"** **\\n\",\n    \"*Preface: This article aims to provide consolidated information on the underlying topic and is not to be considered as the original work. The information and the code are repurposed through several online articles, research papers, books, and open-source code*\\n\",\n    \"** **\\n\",\n    \"\\n\",\n    \"In the previous [article](https://towardsdatascience.com/end-to-end-topic-modeling-in-python-latent-dirichlet-allocation-lda-35ce4ed6b3e0), I introduced the concept of topic modeling and walked through the code for developing your first topic model using Latent Dirichlet Allocation (LDA) method in the python using Gensim implementation.\\n\",\n    \"\\n\",\n    \"Pursuing on that understanding, in this article, we’ll go a few steps deeper by outlining the framework to quantitatively evaluate topic models through the measure of topic coherence and share the code template in python using Gensim implementation to allow for end-to-end model development.\\n\",\n    \"\\n\",\n    \"### Why evaluate topic models?\\n\",\n    \"\\n\",\n    \"![img](https://tinyurl.com/y3xznjwq)\\n\",\n    \"\\n\",\n    \"We know probabilistic topic models, such as LDA, are popular tools for text analysis, providing both a predictive and latent topic representation of the corpus. However, there is a longstanding assumption that the latent space discovered by these models is generally meaningful and useful, and that evaluating such assumptions is challenging due to its unsupervised training process. Besides, there is a no-gold standard list of topics to compare against every corpus.\\n\",\n    \"\\n\",\n    \"Nevertheless, it is equally important to identify if a trained model is objectively good or bad, as well have an ability to compare different models/methods. To do so, one would require an objective measure for the quality. Traditionally, and still for many practical applications, to evaluate if “the correct thing” has been learned about the corpus, an implicit knowledge and “eyeballing” approaches are used. Ideally, we’d like to capture this information in a single metric that can be maximized, and compared.\\n\",\n    \"\\n\",\n    \"Let’s take a look at roughly what approaches are commonly used for the evaluation:\\n\",\n    \"\\n\",\n    \"**Eye Balling Models**\\n\",\n    \"- Top N words\\n\",\n    \"- Topics / Documents\\n\",\n    \"\\n\",\n    \"**Intrinsic Evaluation Metrics**\\n\",\n    \"- Capturing model semantics\\n\",\n    \"- Topics interpretability\\n\",\n    \"\\n\",\n    \"**Human Judgements**\\n\",\n    \"- What is a topic\\n\",\n    \"\\n\",\n    \"**Extrinsic Evaluation Metrics/Evaluation at task**\\n\",\n    \"- Is model good at performing predefined tasks, such as classification\\n\",\n    \"\\n\",\n    \"Natural language is messy, ambiguous and full of subjective interpretation, and sometimes trying to cleanse ambiguity reduces the language to an unnatural form. 
In this article, we’ll explore more about topic coherence, an intrinsic evaluation metric, and how you can use it to quantitatively justify the model selection.\\n\",\n    \"\\n\",\n    \"### What is Topic Coherence?\\n\",\n    \"\\n\",\n    \"Before we understand topic coherence, let’s briefly look at the perplexity measure. Perplexity as well is one of the intrinsic evaluation metric, and is widely used for language model evaluation. It captures how surprised a model is of new data it has not seen before, and is measured as the normalized log-likelihood of a held-out test set. \\n\",\n    \"\\n\",\n    \"Focussing on the log-likelihood part, you can think of the perplexity metric as measuring how probable some new unseen data is given the model that was learned earlier. That is to say, how well does the model represent or reproduce the statistics of the held-out data.\\n\",\n    \"\\n\",\n    \"However, recent studies have shown that predictive likelihood (or equivalently, perplexity) and human judgment are often not correlated, and even sometimes slightly anti-correlated.\\n\",\n    \"\\n\",\n    \"*Optimizing for perplexity may not yield human interpretable topics*\\n\",\n    \"\\n\",\n    \"This limitation of perplexity measure served as a motivation for more work trying to model the human judgment, and thus *Topic Coherence*.\\n\",\n    \"\\n\",\n    \"The concept of topic coherence combines a number of measures into a framework to evaluate the coherence between topics inferred by a model. But before that…\\n\",\n    \"\\n\",\n    \"#### What is topic coherence?\\n\",\n    \"Topic Coherence measures score a single topic by measuring the degree of semantic similarity between high scoring words in the topic. These measurements help distinguish between topics that are semantically interpretable topics and topics that are artifacts of statistical inference. But,\\n\",\n    \"\\n\",\n    \"#### What is coherence?\\n\",\n    \"Topic Coherence measures score a single topic by measuring the degree of semantic similarity between high scoring words in the topic. These measurements help distinguish between topics that are semantically interpretable topics and topics that are artifacts of statistical inference. But …\\n\",\n    \"\\n\",\n    \"### Coherence Measures\\n\",\n    \"Let’s take quick look at different coherence measures, and how they are calculated:\\n\",\n    \"\\n\",\n    \"1. `C_v` measure is based on a sliding window, one-set segmentation of the top words and an indirect confirmation measure that uses normalized pointwise mutual information (NPMI) and the cosine similarity\\n\",\n    \"2. `C_p` is based on a sliding window, one-preceding segmentation of the top words and the confirmation measure of Fitelson's coherence\\n\",\n    \"3. `C_uci` measure is based on a sliding window and the pointwise mutual information (PMI) of all word pairs of the given top words\\n\",\n    \"4. `C_umass` is based on document cooccurrence counts, a one-preceding segmentation and a logarithmic conditional probability as confirmation measure\\n\",\n    \"5. `C_npmi` is an enhanced version of the C_uci coherence using the normalized pointwise mutual information (NPMI)\\n\",\n    \"6. `C_a` is based on a context window, a pairwise comparison of the top words and an indirect confirmation measure that uses normalized pointwise mutual information (NPMI) and the cosine similarity\\n\",\n    \"\\n\",\n    \"There is, of course, a lot more to the concept of topic model evaluation, and the coherence measure. 
However, keeping in mind the length, and purpose of this article, let’s apply these concepts into developing a model that is at least better than with the default parameters. Also, we’ll be re-purposing already available online pieces of code to support this exercise instead of re-inventing the wheel.\"\n   ]\n  },\n  {\n   \"cell_type\": \"markdown\",\n   \"metadata\": {},\n   \"source\": [\n    \"### Model Implementation\\n\",\n    \"1. Loading Data\\n\",\n    \"2. Data Cleaning\\n\",\n    \"3. Phrase Modeling: Bi-grams and Tri-grams\\n\",\n    \"4. Data Transformation: Corpus and Dictionary\\n\",\n    \"5. Base Model\\n\",\n    \"6. Hyper-parameter Tuning\\n\",\n    \"7. Final model\\n\",\n    \"8. Visualize Results\\n\",\n    \"\\n\",\n    \"** **\\n\",\n    \"\\n\",\n    \"For this tutorial, we’ll use the dataset of papers published in NeurIPS (NIPS) conference which is one of the most prestigious yearly events in the machine learning community. The CSV data file contains information on the different NeurIPS papers that were published from 1987 until 2016 (29 years!). These papers discuss a wide variety of topics in machine learning, from neural networks to optimization methods, and many more.\\n\",\n    \"\\n\",\n    \"<img src=\\\"https://s3.amazonaws.com/assets.datacamp.com/production/project_158/img/nips_logo.png\\\" alt=\\\"The logo of NIPS (Neural Information Processing Systems)\\\">\\n\",\n    \"\\n\",\n    \"Let’s start by looking at the content of the file\\n\",\n    \"\\n\",\n    \"** **\\n\",\n    \"#### Step 1: Loading Data\\n\",\n    \"** **\\n\",\n    \"\\n\",\n    \"For this tutorial, we’ll use the dataset of papers published in NIPS conference. The NIPS conference (Neural Information Processing Systems) is one of the most prestigious yearly events in the machine learning community. The CSV data file contains information on the different NIPS papers that were published from 1987 until 2016 (29 years!). 
These papers discuss a wide variety of topics in machine learning, from neural networks to optimization methods, and many more.\\n\",\n    \"\\n\",\n    \"Let’s start by looking at the content of the file\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": null,\n   \"metadata\": {},\n   \"outputs\": [],\n   \"source\": [\n    \"import zipfile\\n\",\n    \"import pandas as pd\\n\",\n    \"import os\\n\",\n    \"\\n\",\n    \"# Open the zip file\\n\",\n    \"with zipfile.ZipFile(\\\"./data/NIPS Papers.zip\\\", \\\"r\\\") as zip_ref:\\n\",\n    \"    # Extract the file to a temporary directory\\n\",\n    \"    zip_ref.extractall(\\\"temp\\\")\\n\",\n    \"\\n\",\n    \"# Read the CSV file into a pandas DataFrame\\n\",\n    \"papers = pd.read_csv(\\\"temp/NIPS Papers/papers.csv\\\")\\n\",\n    \"\\n\",\n    \"# Print head\\n\",\n    \"papers.head()\"\n   ]\n  },\n  {\n   \"cell_type\": \"markdown\",\n   \"metadata\": {},\n   \"source\": [\n    \"** **\\n\",\n    \"#### Step 2: Data Cleaning\\n\",\n    \"** **\\n\",\n    \"\\n\",\n    \"Since the goal of this analysis is to perform topic modeling, we will solely focus on the text data from each paper, and drop other metadata columns\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": null,\n   \"metadata\": {},\n   \"outputs\": [],\n   \"source\": [\n    \"# Remove the columns\\n\",\n    \"papers = papers.drop(columns=['id', 'title', 'abstract', \\n\",\n    \"                              'event_type', 'pdf_name', 'year'], axis=1)\\n\",\n    \"\\n\",\n    \"# sample only 100 papers\\n\",\n    \"papers = papers.sample(100)\\n\",\n    \"\\n\",\n    \"# Print out the first rows of papers\\n\",\n    \"papers.head()\"\n   ]\n  },\n  {\n   \"cell_type\": \"markdown\",\n   \"metadata\": {},\n   \"source\": [\n    \"##### Remove punctuation/lower casing\\n\",\n    \"\\n\",\n    \"Next, let’s perform a simple preprocessing on the content of paper_text column to make them more amenable for analysis, and reliable results. 
To do that, we’ll use a regular expression to remove any punctuation, and then lowercase the text\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": null,\n   \"metadata\": {},\n   \"outputs\": [],\n   \"source\": [\n    \"# Load the regular expression library\\n\",\n    \"import re\\n\",\n    \"\\n\",\n    \"# Remove punctuation\\n\",\n    \"papers['paper_text_processed'] = papers['paper_text'].map(lambda x: re.sub('[,\\\\.!?]', '', x))\\n\",\n    \"\\n\",\n    \"# Convert the titles to lowercase\\n\",\n    \"papers['paper_text_processed'] = papers['paper_text_processed'].map(lambda x: x.lower())\\n\",\n    \"\\n\",\n    \"# Print out the first rows of papers\\n\",\n    \"papers['paper_text_processed'].head()\"\n   ]\n  },\n  {\n   \"cell_type\": \"markdown\",\n   \"metadata\": {},\n   \"source\": [\n    \"##### Tokenize words and further clean-up text\\n\",\n    \"\\n\",\n    \"Let’s tokenize each sentence into a list of words, removing punctuations and unnecessary characters altogether.\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": null,\n   \"metadata\": {},\n   \"outputs\": [],\n   \"source\": [\n    \"import gensim\\n\",\n    \"from gensim.utils import simple_preprocess\\n\",\n    \"\\n\",\n    \"def sent_to_words(sentences):\\n\",\n    \"    for sentence in sentences:\\n\",\n    \"        yield(gensim.utils.simple_preprocess(str(sentence), deacc=True))  # deacc=True removes punctuations\\n\",\n    \"\\n\",\n    \"data = papers.paper_text_processed.values.tolist()\\n\",\n    \"data_words = list(sent_to_words(data))\\n\",\n    \"\\n\",\n    \"print(data_words[:1][0][:30])\"\n   ]\n  },\n  {\n   \"cell_type\": \"markdown\",\n   \"metadata\": {},\n   \"source\": [\n    \"** **\\n\",\n    \"#### Step 3: Phrase Modeling: Bigram and Trigram Models\\n\",\n    \"** **\\n\",\n    \"\\n\",\n    \"Bigrams are two words frequently occurring together in the document. Trigrams are 3 words frequently occurring. Some examples in our example are: 'back_bumper', 'oil_leakage', 'maryland_college_park' etc.\\n\",\n    \"\\n\",\n    \"Gensim's Phrases model can build and implement the bigrams, trigrams, quadgrams and more. The two important arguments to Phrases are min_count and threshold.\\n\",\n    \"\\n\",\n    \"*The higher the values of these param, the harder it is for words to be combined.*\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": null,\n   \"metadata\": {},\n   \"outputs\": [],\n   \"source\": [\n    \"# Build the bigram and trigram models\\n\",\n    \"bigram = gensim.models.Phrases(data_words, min_count=5, threshold=100) # higher threshold fewer phrases.\\n\",\n    \"trigram = gensim.models.Phrases(bigram[data_words], threshold=100)  \\n\",\n    \"\\n\",\n    \"# Faster way to get a sentence clubbed as a trigram/bigram\\n\",\n    \"bigram_mod = gensim.models.phrases.Phraser(bigram)\\n\",\n    \"trigram_mod = gensim.models.phrases.Phraser(trigram)\"\n   ]\n  },\n  {\n   \"cell_type\": \"markdown\",\n   \"metadata\": {},\n   \"source\": [\n    \"#### Remove Stopwords, Make Bigrams and Lemmatize\\n\",\n    \"\\n\",\n    \"The phrase models are ready. 
Let’s define the functions to remove the stopwords, make trigrams and lemmatization and call them sequentially.\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": null,\n   \"metadata\": {},\n   \"outputs\": [],\n   \"source\": [\n    \"# NLTK Stop words\\n\",\n    \"import nltk\\n\",\n    \"nltk.download('stopwords')\\n\",\n    \"from nltk.corpus import stopwords\\n\",\n    \"\\n\",\n    \"stop_words = stopwords.words('english')\\n\",\n    \"stop_words.extend(['from', 'subject', 're', 'edu', 'use'])\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": null,\n   \"metadata\": {},\n   \"outputs\": [],\n   \"source\": [\n    \"# Define functions for stopwords, bigrams, trigrams and lemmatization\\n\",\n    \"def remove_stopwords(texts):\\n\",\n    \"    return [[word for word in simple_preprocess(str(doc)) if word not in stop_words] for doc in texts]\\n\",\n    \"\\n\",\n    \"def make_bigrams(texts):\\n\",\n    \"    return [bigram_mod[doc] for doc in texts]\\n\",\n    \"\\n\",\n    \"def make_trigrams(texts):\\n\",\n    \"    return [trigram_mod[bigram_mod[doc]] for doc in texts]\\n\",\n    \"\\n\",\n    \"def lemmatization(texts, allowed_postags=['NOUN', 'ADJ', 'VERB', 'ADV']):\\n\",\n    \"    \\\"\\\"\\\"https://spacy.io/api/annotation\\\"\\\"\\\"\\n\",\n    \"    texts_out = []\\n\",\n    \"    for sent in texts:\\n\",\n    \"        doc = nlp(\\\" \\\".join(sent)) \\n\",\n    \"        texts_out.append([token.lemma_ for token in doc if token.pos_ in allowed_postags])\\n\",\n    \"    return texts_out\"\n   ]\n  },\n  {\n   \"cell_type\": \"markdown\",\n   \"metadata\": {},\n   \"source\": [\n    \"Let's call the functions in order.\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": null,\n   \"metadata\": {},\n   \"outputs\": [],\n   \"source\": [\n    \"!python -m spacy download en_core_web_sm\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": null,\n   \"metadata\": {},\n   \"outputs\": [],\n   \"source\": [\n    \"import spacy\\n\",\n    \"\\n\",\n    \"# Remove Stop Words\\n\",\n    \"data_words_nostops = remove_stopwords(data_words)\\n\",\n    \"\\n\",\n    \"# Form Bigrams\\n\",\n    \"data_words_bigrams = make_bigrams(data_words_nostops)\\n\",\n    \"\\n\",\n    \"# Initialize spacy 'en' model, keeping only tagger component (for efficiency)\\n\",\n    \"nlp = spacy.load(\\\"en_core_web_sm\\\", disable=['parser', 'ner'])\\n\",\n    \"\\n\",\n    \"# Do lemmatization keeping only noun, adj, vb, adv\\n\",\n    \"data_lemmatized = lemmatization(data_words_bigrams, allowed_postags=['NOUN', 'ADJ', 'VERB', 'ADV'])\\n\",\n    \"\\n\",\n    \"print(data_lemmatized[:1][0][:30])\"\n   ]\n  },\n  {\n   \"cell_type\": \"markdown\",\n   \"metadata\": {},\n   \"source\": [\n    \"** **\\n\",\n    \"#### Step 4: Data transformation: Corpus and Dictionary\\n\",\n    \"** **\\n\",\n    \"\\n\",\n    \"The two main inputs to the LDA topic model are the dictionary(id2word) and the corpus. 
Let’s create them.\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": null,\n   \"metadata\": {},\n   \"outputs\": [],\n   \"source\": [\n    \"import gensim.corpora as corpora\\n\",\n    \"\\n\",\n    \"# Create Dictionary\\n\",\n    \"id2word = corpora.Dictionary(data_lemmatized)\\n\",\n    \"\\n\",\n    \"# Create Corpus\\n\",\n    \"texts = data_lemmatized\\n\",\n    \"\\n\",\n    \"# Term Document Frequency\\n\",\n    \"corpus = [id2word.doc2bow(text) for text in texts]\\n\",\n    \"\\n\",\n    \"# View\\n\",\n    \"print(corpus[:1][0][:30])\"\n   ]\n  },\n  {\n   \"cell_type\": \"markdown\",\n   \"metadata\": {},\n   \"source\": [\n    \"** **\\n\",\n    \"#### Step 5: Base Model \\n\",\n    \"** **\\n\",\n    \"\\n\",\n    \"We have everything required to train the base LDA model. In addition to the corpus and dictionary, you need to provide the number of topics as well. Apart from that, alpha and eta are hyperparameters that affect the sparsity of the topics. According to the Gensim docs, both default to a 1.0/num_topics prior (we'll use the defaults for the base model).\\n\",\n    \"\\n\",\n    \"chunksize controls how many documents are processed at a time in the training algorithm. Increasing chunksize will speed up training, at least as long as the chunk of documents easily fits into memory.\\n\",\n    \"\\n\",\n    \"passes controls how often we train the model on the entire corpus (set to 10). Another word for passes might be \\\"epochs\\\". iterations is somewhat technical, but essentially it controls how often we repeat a particular loop over each document. It is important to set the number of \\\"passes\\\" and \\\"iterations\\\" high enough.\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": null,\n   \"metadata\": {},\n   \"outputs\": [],\n   \"source\": [\n    \"# Build LDA model\\n\",\n    \"lda_model = gensim.models.LdaMulticore(corpus=corpus,\\n\",\n    \"                                       id2word=id2word,\\n\",\n    \"                                       num_topics=10, \\n\",\n    \"                                       random_state=100,\\n\",\n    \"                                       chunksize=100,\\n\",\n    \"                                       passes=10,\\n\",\n    \"                                       per_word_topics=True)\"\n   ]\n  },\n  {\n   \"cell_type\": \"markdown\",\n   \"metadata\": {},\n   \"source\": [\n    \"** **\\n\",\n    \"The above LDA model is built with 10 different topics where each topic is a combination of keywords and each keyword contributes a certain weightage to the topic.\\n\",\n    \"\\n\",\n    \"You can see the keywords for each topic and the weightage (importance) of each keyword using `lda_model.print_topics()`\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": null,\n   \"metadata\": {},\n   \"outputs\": [],\n   \"source\": [\n    \"from pprint import pprint\\n\",\n    \"\\n\",\n    \"# Print the keywords in the 10 topics\\n\",\n    \"pprint(lda_model.print_topics())\\n\",\n    \"doc_lda = lda_model[corpus]\"\n   ]\n  },\n  {\n   \"cell_type\": \"markdown\",\n   \"metadata\": {},\n   \"source\": [\n    \"#### Compute Model Perplexity and Coherence Score\\n\",\n    \"\\n\",\n    \"Let's calculate the baseline coherence score.\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": null,\n   \"metadata\": {},\n   \"outputs\": [],\n   \"source\": [\n    \"from gensim.models import CoherenceModel\\n\",\n    \"\\n\",\n    \"# Compute Coherence 
Score\\n\",\n    \"coherence_model_lda = CoherenceModel(model=lda_model, texts=data_lemmatized, dictionary=id2word, coherence='c_v')\\n\",\n    \"coherence_lda = coherence_model_lda.get_coherence()\\n\",\n    \"print('Coherence Score: ', coherence_lda)\"\n   ]\n  },\n  {\n   \"cell_type\": \"markdown\",\n   \"metadata\": {},\n   \"source\": [\n    \"** **\\n\",\n    \"#### Step 6: Hyperparameter tuning\\n\",\n    \"** **\\n\",\n    \"First, let's differentiate between model hyperparameters and model parameters :\\n\",\n    \"\\n\",\n    \"- `Model hyperparameters` can be thought of as settings for a machine learning algorithm that are tuned by the data scientist before training. Examples would be the number of trees in the random forest, or in our case, number of topics K\\n\",\n    \"\\n\",\n    \"- `Model parameters` can be thought of as what the model learns during training, such as the weights for each word in a given topic.\\n\",\n    \"\\n\",\n    \"Now that we have the baseline coherence score for the default LDA model, let's perform a series of sensitivity tests to help determine the following model hyperparameters: \\n\",\n    \"- Number of Topics (K)\\n\",\n    \"- Dirichlet hyperparameter alpha: Document-Topic Density\\n\",\n    \"- Dirichlet hyperparameter beta: Word-Topic Density\\n\",\n    \"\\n\",\n    \"We'll perform these tests in sequence, one parameter at a time by keeping others constant and run them over the two difference validation corpus sets. We'll use `C_v` as our choice of metric for performance comparison \"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": null,\n   \"metadata\": {},\n   \"outputs\": [],\n   \"source\": [\n    \"# supporting function\\n\",\n    \"def compute_coherence_values(corpus, dictionary, k, a, b):\\n\",\n    \"    \\n\",\n    \"    lda_model = gensim.models.LdaMulticore(corpus=corpus,\\n\",\n    \"                                           id2word=dictionary,\\n\",\n    \"                                           num_topics=k, \\n\",\n    \"                                           random_state=100,\\n\",\n    \"                                           chunksize=100,\\n\",\n    \"                                           passes=10,\\n\",\n    \"                                           alpha=a,\\n\",\n    \"                                           eta=b)\\n\",\n    \"    \\n\",\n    \"    coherence_model_lda = CoherenceModel(model=lda_model, texts=data_lemmatized, dictionary=id2word, coherence='c_v')\\n\",\n    \"    \\n\",\n    \"    return coherence_model_lda.get_coherence()\"\n   ]\n  },\n  {\n   \"cell_type\": \"markdown\",\n   \"metadata\": {},\n   \"source\": [\n    \"Let's call the function, and iterate it over the range of topics, alpha, and beta parameter values\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": null,\n   \"metadata\": {},\n   \"outputs\": [],\n   \"source\": [\n    \"import numpy as np\\n\",\n    \"import tqdm\\n\",\n    \"\\n\",\n    \"grid = {}\\n\",\n    \"grid['Validation_Set'] = {}\\n\",\n    \"\\n\",\n    \"# Topics range\\n\",\n    \"min_topics = 2\\n\",\n    \"max_topics = 11\\n\",\n    \"step_size = 1\\n\",\n    \"topics_range = range(min_topics, max_topics, step_size)\\n\",\n    \"\\n\",\n    \"# Alpha parameter\\n\",\n    \"alpha = list(np.arange(0.01, 1, 0.3))\\n\",\n    \"alpha.append('symmetric')\\n\",\n    \"alpha.append('asymmetric')\\n\",\n    \"\\n\",\n    \"# Beta parameter\\n\",\n    \"beta = list(np.arange(0.01, 1, 0.3))\\n\",\n    
\"beta.append('symmetric')\\n\",\n    \"\\n\",\n    \"# Validation sets\\n\",\n    \"num_of_docs = len(corpus)\\n\",\n    \"corpus_sets = [gensim.utils.ClippedCorpus(corpus, int(num_of_docs*0.75)), \\n\",\n    \"               corpus]\\n\",\n    \"\\n\",\n    \"corpus_title = ['75% Corpus', '100% Corpus']\\n\",\n    \"\\n\",\n    \"model_results = {'Validation_Set': [],\\n\",\n    \"                 'Topics': [],\\n\",\n    \"                 'Alpha': [],\\n\",\n    \"                 'Beta': [],\\n\",\n    \"                 'Coherence': []\\n\",\n    \"                }\\n\",\n    \"\\n\",\n    \"# Can take a long time to run\\n\",\n    \"if 1 == 1:\\n\",\n    \"    pbar = tqdm.tqdm(total=(len(beta)*len(alpha)*len(topics_range)*len(corpus_title)))\\n\",\n    \"    \\n\",\n    \"    # iterate through validation corpuses\\n\",\n    \"    for i in range(len(corpus_sets)):\\n\",\n    \"        # iterate through number of topics\\n\",\n    \"        for k in topics_range:\\n\",\n    \"            # iterate through alpha values\\n\",\n    \"            for a in alpha:\\n\",\n    \"                # iterare through beta values\\n\",\n    \"                for b in beta:\\n\",\n    \"                    # get the coherence score for the given parameters\\n\",\n    \"                    cv = compute_coherence_values(corpus=corpus_sets[i], dictionary=id2word, \\n\",\n    \"                                                  k=k, a=a, b=b)\\n\",\n    \"                    # Save the model results\\n\",\n    \"                    model_results['Validation_Set'].append(corpus_title[i])\\n\",\n    \"                    model_results['Topics'].append(k)\\n\",\n    \"                    model_results['Alpha'].append(a)\\n\",\n    \"                    model_results['Beta'].append(b)\\n\",\n    \"                    model_results['Coherence'].append(cv)\\n\",\n    \"                    \\n\",\n    \"                    pbar.update(1)\\n\",\n    \"    pd.DataFrame(model_results).to_csv('./results/lda_tuning_results.csv', index=False)\\n\",\n    \"    pbar.close()\"\n   ]\n  },\n  {\n   \"cell_type\": \"markdown\",\n   \"metadata\": {},\n   \"source\": [\n    \"** **\\n\",\n    \"#### Step 7: Final Model\\n\",\n    \"** **\\n\",\n    \"\\n\",\n    \"Based on external evaluation (Code to be added from Excel based analysis), let's train the final model with parameters yielding highest coherence score\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": null,\n   \"metadata\": {},\n   \"outputs\": [],\n   \"source\": [\n    \"num_topics = 8\\n\",\n    \"\\n\",\n    \"lda_model = gensim.models.LdaMulticore(corpus=corpus,\\n\",\n    \"                                           id2word=id2word,\\n\",\n    \"                                           num_topics=num_topics, \\n\",\n    \"                                           random_state=100,\\n\",\n    \"                                           chunksize=100,\\n\",\n    \"                                           passes=10,\\n\",\n    \"                                           alpha=0.01,\\n\",\n    \"                                           eta=0.9)\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": null,\n   \"metadata\": {},\n   \"outputs\": [],\n   \"source\": [\n    \"from pprint import pprint\\n\",\n    \"\\n\",\n    \"# Print the Keyword in the 10 topics\\n\",\n    \"pprint(lda_model.print_topics())\\n\",\n    \"doc_lda = lda_model[corpus]\"\n   ]\n  },\n  {\n   \"cell_type\": \"markdown\",\n   
\"metadata\": {},\n   \"source\": [\n    \"** **\\n\",\n    \"#### Step 8: Visualize Results\\n\",\n    \"** **\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": null,\n   \"metadata\": {},\n   \"outputs\": [],\n   \"source\": [\n    \"import pyLDAvis.gensim_models as gensimvis\\n\",\n    \"import pickle \\n\",\n    \"import pyLDAvis\\n\",\n    \"\\n\",\n    \"# Visualize the topics\\n\",\n    \"pyLDAvis.enable_notebook()\\n\",\n    \"\\n\",\n    \"LDAvis_data_filepath = os.path.join('./results/ldavis_tuned_'+str(num_topics))\\n\",\n    \"\\n\",\n    \"# # this is a bit time consuming - make the if statement True\\n\",\n    \"# # if you want to execute visualization prep yourself\\n\",\n    \"if 1 == 1:\\n\",\n    \"    LDAvis_prepared = gensimvis.prepare(lda_model, corpus, id2word)\\n\",\n    \"    with open(LDAvis_data_filepath, 'wb') as f:\\n\",\n    \"        pickle.dump(LDAvis_prepared, f)\\n\",\n    \"\\n\",\n    \"# load the pre-prepared pyLDAvis data from disk\\n\",\n    \"with open(LDAvis_data_filepath, 'rb') as f:\\n\",\n    \"    LDAvis_prepared = pickle.load(f)\\n\",\n    \"\\n\",\n    \"pyLDAvis.save_html(LDAvis_prepared, './results/ldavis_tuned_'+ str(num_topics) +'.html')\\n\",\n    \"\\n\",\n    \"LDAvis_prepared\"\n   ]\n  },\n  {\n   \"cell_type\": \"markdown\",\n   \"metadata\": {},\n   \"source\": [\n    \"** **\\n\",\n    \"#### Closing Notes\\n\",\n    \"\\n\",\n    \"We started with understanding why evaluating the topic model is essential. Next, we reviewed existing methods and scratched the surface of topic coherence, along with the available coherence measures. Then we built a default LDA model using Gensim implementation to establish the baseline coherence score and reviewed practical ways to optimize the LDA hyperparameters.\\n\",\n    \"\\n\",\n    \"Hopefully, this article has managed to shed light on the underlying topic evaluation strategies, and intuitions behind it.\\n\",\n    \"\\n\",\n    \"** **\\n\",\n    \"#### References:\\n\",\n    \"1. http://qpleple.com/perplexity-to-evaluate-topic-models/\\n\",\n    \"2. https://www.amazon.com/Machine-Learning-Probabilistic-Perspective-Computation/dp/0262018020\\n\",\n    \"3. https://papers.nips.cc/paper/3700-reading-tea-leaves-how-humans-interpret-topic-models.pdf\\n\",\n    \"4. https://github.com/mattilyra/pydataberlin-2017/blob/master/notebook/EvaluatingUnsupervisedModels.ipynb\\n\",\n    \"5. https://www.machinelearningplus.com/nlp/topic-modeling-gensim-python/\\n\",\n    \"6. http://svn.aksw.org/papers/2015/WSDM_Topic_Evaluation/public.pdf\\n\",\n    \"7. http://palmetto.aksw.org/palmetto-webapp/\"\n   ]\n  }\n ],\n \"metadata\": {\n  \"kernelspec\": {\n   \"display_name\": \"Python 3 (ipykernel)\",\n   \"language\": \"python\",\n   \"name\": \"python3\"\n  },\n  \"language_info\": {\n   \"codemirror_mode\": {\n    \"name\": \"ipython\",\n    \"version\": 3\n   },\n   \"file_extension\": \".py\",\n   \"mimetype\": \"text/x-python\",\n   \"name\": \"python\",\n   \"nbconvert_exporter\": \"python\",\n   \"pygments_lexer\": \"ipython3\",\n   \"version\": \"3.9.16\"\n  }\n },\n \"nbformat\": 4,\n \"nbformat_minor\": 4\n}\n"
  },
  {
    "path": "natural-language-processing/topic-modeling/Introduction to Topic Modeling.ipynb",
    "content": "{\n \"cells\": [\n  {\n   \"cell_type\": \"markdown\",\n   \"metadata\": {},\n   \"source\": [\n    \"## Introduction\\n\",\n    \"##### How to get started with topic modeling using LDA in Python\\n\",\n    \"** **\\n\",\n    \"Topic Models, in a nutshell, are a type of statistical language models used for uncovering hidden structure in a collection of texts. In a practical and more intuitively, you can think of it as a task of:\\n\",\n    \"\\n\",\n    \"- **Dimensionality Reduction**, where rather than representing a text T in its feature space as {Word_i: count(Word_i, T) for Word_i in Vocabulary}, you can represent it in a topic space as {Topic_i: Weight(Topic_i, T) for Topic_i in Topics}\\n\",\n    \"- **Unsupervised Learning**, where it can be compared to clustering, as in the case of clustering, the number of topics, like the number of clusters, is an output parameter. By doing topic modeling, we build clusters of words rather than clusters of texts. A text is thus a mixture of all the topics, each having a specific weight\\n\",\n    \"- **Tagging**, abstract “topics” that occur in a collection of documents that best represents the information in them.\\n\",\n    \"\\n\",\n    \"There are several existing algorithms you can use to perform the topic modeling. The most common of it are, Latent Semantic Analysis (LSA/LSI), Probabilistic Latent Semantic Analysis (pLSA), and Latent Dirichlet Allocation (LDA)\\n\",\n    \"\\n\",\n    \"In this tutorial, we’ll take a closer look at LDA, and implement our first topic model using the sklearn implementation in python 2.7\\n\",\n    \"\\n\",\n    \"### Theoretical Overview\\n\",\n    \"LDA is a generative probabilistic model that assumes each topic is a mixture over an underlying set of words, and each document is a mixture of over a set of topic probabilities.\\n\",\n    \"\\n\",\n    \"![LDA_Model](https://github.com/chdoig/pytexas2015-topic-modeling/blob/master/images/lda-4.png?raw=true)\\n\",\n    \"\\n\",\n    \"We can describe the generative process of LDA as, given the M number of documents, N number of words, and prior K number of topics, the model trains to output:\\n\",\n    \"\\n\",\n    \"- `psi`, the distribution of words for each topic K\\n\",\n    \"- `phi`, the distribution of topics for each document i\\n\",\n    \"\\n\",\n    \"#### Parameters of LDA\\n\",\n    \"\\n\",\n    \"- `Alpha parameter` is Dirichlet prior concentration parameter that represents document-topic density — with a higher alpha, documents are assumed to be made up of more topics and result in more specific topic distribution per document.\\n\",\n    \"- `Beta parameter` is the same prior concentration parameter that represents topic-word density — with high beta, topics are assumed to made of up most of the words and result in a more specific word distribution per topic.\\n\",\n    \"\\n\",\n    \"**To read more: https://towardsdatascience.com/end-to-end-topic-modeling-in-python-latent-dirichlet-allocation-lda-35ce4ed6b3e0**\"\n   ]\n  },\n  {\n   \"cell_type\": \"markdown\",\n   \"metadata\": {},\n   \"source\": [\n    \"** **\\n\",\n    \"### LDA Implementation\\n\",\n    \"\\n\",\n    \"1. [Loading data](#load_data)\\n\",\n    \"2. [Data cleaning](#clean_data)\\n\",\n    \"3. [Exploratory analysis](#eda)\\n\",\n    \"4. [Prepare data for LDA analysis](#data_preparation)\\n\",\n    \"5. [LDA model training](#train_model)\\n\",\n    \"6. 
[Analyzing LDA model results](#results)\"\n   ]\n  },\n  {\n   \"cell_type\": \"markdown\",\n   \"metadata\": {},\n   \"source\": [\n    \"** **\\n\",\n    \"<img src=\\\"https://s3.amazonaws.com/assets.datacamp.com/production/project_158/img/nips_logo.png\\\" alt=\\\"The logo of NIPS (Neural Information Processing Systems)\\\">\"\n   ]\n  },\n  {\n   \"cell_type\": \"markdown\",\n   \"metadata\": {},\n   \"source\": [\n    \"** **\\n\",\n    \"#### Step 1: Loading Data <a class=\\\"anchor\\\" id=\\\"load_data\\\"></a>\\n\",\n    \"** **\\n\",\n    \"For this tutorial, we’ll use the dataset of papers published at the NeurIPS (NIPS) conference, which is one of the most prestigious yearly events in the machine learning community. The CSV data file contains information on the different NeurIPS papers that were published from 1987 until 2016 (29 years!). These papers discuss a wide variety of topics in machine learning, from neural networks to optimization methods, and many more.\\n\",\n    \"\\n\",\n    \"Let’s start by looking at the content of the file\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": null,\n   \"metadata\": {},\n   \"outputs\": [],\n   \"source\": [\n    \"import zipfile\\n\",\n    \"import pandas as pd\\n\",\n    \"import os\\n\",\n    \"\\n\",\n    \"# Open the zip file\\n\",\n    \"with zipfile.ZipFile(\\\"./data/NIPS Papers.zip\\\", \\\"r\\\") as zip_ref:\\n\",\n    \"    # Extract the file to a temporary directory\\n\",\n    \"    zip_ref.extractall(\\\"temp\\\")\\n\",\n    \"\\n\",\n    \"# Read the CSV file into a pandas DataFrame\\n\",\n    \"papers = pd.read_csv(\\\"temp/NIPS Papers/papers.csv\\\")\\n\",\n    \"\\n\",\n    \"# Print head\\n\",\n    \"papers.head()\"\n   ]\n  },\n  {\n   \"cell_type\": \"markdown\",\n   \"metadata\": {},\n   \"source\": [\n    \"** **\\n\",\n    \"#### Step 2: Data Cleaning <a class=\\\"anchor\\\" id=\\\"clean_data\\\"></a>\\n\",\n    \"** **\\n\",\n    \"\\n\",\n    \"Since the goal of this analysis is to perform topic modeling, let's focus only on the text data from each paper, and drop other metadata columns. Also, for the demonstration, we'll only look at 100 papers.\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": null,\n   \"metadata\": {},\n   \"outputs\": [],\n   \"source\": [\n    \"# Remove the columns\\n\",\n    \"papers = papers.drop(columns=['id', 'event_type', 'pdf_name'], axis=1).sample(100)\\n\",\n    \"\\n\",\n    \"# Print out the first rows of papers\\n\",\n    \"papers.head()\"\n   ]\n  },\n  {\n   \"cell_type\": \"markdown\",\n   \"metadata\": {},\n   \"source\": [\n    \"##### Remove punctuation/lower casing\\n\",\n    \"\\n\",\n    \"Next, let’s perform simple preprocessing on the content of the `paper_text` column to make it more amenable to analysis and yield more reliable results. 
To do that, we’ll use a regular expression to remove any punctuation, and then lowercase the text.\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": null,\n   \"metadata\": {},\n   \"outputs\": [],\n   \"source\": [\n    \"# Load the regular expression library\\n\",\n    \"import re\\n\",\n    \"\\n\",\n    \"# Remove punctuation\\n\",\n    \"papers['paper_text_processed'] = \\\\\\n\",\n    \"papers['paper_text'].map(lambda x: re.sub('[,\\\\.!?]', '', x))\\n\",\n    \"\\n\",\n    \"# Convert the titles to lowercase\\n\",\n    \"papers['paper_text_processed'] = \\\\\\n\",\n    \"papers['paper_text_processed'].map(lambda x: x.lower())\\n\",\n    \"\\n\",\n    \"# Print out the first rows of papers\\n\",\n    \"papers['paper_text_processed'].head()\"\n   ]\n  },\n  {\n   \"cell_type\": \"markdown\",\n   \"metadata\": {},\n   \"source\": [\n    \"** **\\n\",\n    \"#### Step 3: Exploratory Analysis <a class=\\\"anchor\\\" id=\\\"eda\\\"></a>\\n\",\n    \"** **\\n\",\n    \"\\n\",\n    \"To verify whether the preprocessing worked, we’ll make a simple word cloud using the `wordcloud` package to get a visual representation of the most common words. It is key to understanding the data and ensuring we are on the right track, and to deciding whether any more preprocessing is necessary before training the model.\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": null,\n   \"metadata\": {},\n   \"outputs\": [],\n   \"source\": [\n    \"# Import the wordcloud library\\n\",\n    \"from wordcloud import WordCloud\\n\",\n    \"\\n\",\n    \"# Join the different processed papers together.\\n\",\n    \"long_string = ','.join(list(papers['paper_text_processed'].values))\\n\",\n    \"\\n\",\n    \"# Create a WordCloud object\\n\",\n    \"wordcloud = WordCloud(background_color=\\\"white\\\", max_words=1000, contour_width=3, contour_color='steelblue')\\n\",\n    \"\\n\",\n    \"# Generate a word cloud\\n\",\n    \"wordcloud.generate(long_string)\\n\",\n    \"\\n\",\n    \"# Visualize the word cloud\\n\",\n    \"wordcloud.to_image()\"\n   ]\n  },\n  {\n   \"cell_type\": \"markdown\",\n   \"metadata\": {},\n   \"source\": [\n    \"** **\\n\",\n    \"#### Step 4: Prepare text for LDA analysis <a class=\\\"anchor\\\" id=\\\"data_preparation\\\"></a>\\n\",\n    \"** **\\n\",\n    \"\\n\",\n    \"Next, let’s transform the textual data into a format that will serve as input for training the LDA model. We start by tokenizing the text and removing stopwords. 
Next, we convert the tokenized object into a corpus and dictionary.\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": null,\n   \"metadata\": {},\n   \"outputs\": [],\n   \"source\": [\n    \"import gensim\\n\",\n    \"from gensim.utils import simple_preprocess\\n\",\n    \"import nltk\\n\",\n    \"nltk.download('stopwords')\\n\",\n    \"from nltk.corpus import stopwords\\n\",\n    \"\\n\",\n    \"stop_words = stopwords.words('english')\\n\",\n    \"stop_words.extend(['from', 'subject', 're', 'edu', 'use'])\\n\",\n    \"\\n\",\n    \"def sent_to_words(sentences):\\n\",\n    \"    for sentence in sentences:\\n\",\n    \"        # deacc=True removes punctuation\\n\",\n    \"        yield(gensim.utils.simple_preprocess(str(sentence), deacc=True))\\n\",\n    \"\\n\",\n    \"def remove_stopwords(texts):\\n\",\n    \"    return [[word for word in simple_preprocess(str(doc)) \\n\",\n    \"             if word not in stop_words] for doc in texts]\\n\",\n    \"\\n\",\n    \"\\n\",\n    \"data = papers.paper_text_processed.values.tolist()\\n\",\n    \"data_words = list(sent_to_words(data))\\n\",\n    \"\\n\",\n    \"# remove stop words\\n\",\n    \"data_words = remove_stopwords(data_words)\\n\",\n    \"\\n\",\n    \"print(data_words[:1][0][:30])\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": null,\n   \"metadata\": {},\n   \"outputs\": [],\n   \"source\": [\n    \"import gensim.corpora as corpora\\n\",\n    \"\\n\",\n    \"# Create Dictionary\\n\",\n    \"id2word = corpora.Dictionary(data_words)\\n\",\n    \"\\n\",\n    \"# Create Corpus\\n\",\n    \"texts = data_words\\n\",\n    \"\\n\",\n    \"# Term Document Frequency\\n\",\n    \"corpus = [id2word.doc2bow(text) for text in texts]\\n\",\n    \"\\n\",\n    \"# View\\n\",\n    \"print(corpus[:1][0][:30])\"\n   ]\n  },\n  {\n   \"cell_type\": \"markdown\",\n   \"metadata\": {},\n   \"source\": [\n    \"** **\\n\",\n    \"#### Step 5: LDA model training <a class=\\\"anchor\\\" id=\\\"train_model\\\"></a>\\n\",\n    \"** **\\n\",\n    \"\\n\",\n    \"To keep things simple, we'll keep all the parameters at their defaults except for the number of topics. For this tutorial, we will build a model with 10 topics where each topic is a combination of keywords, and each keyword contributes a certain weightage to the topic.\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": null,\n   \"metadata\": {},\n   \"outputs\": [],\n   \"source\": [\n    \"from pprint import pprint\\n\",\n    \"\\n\",\n    \"# number of topics\\n\",\n    \"num_topics = 10\\n\",\n    \"\\n\",\n    \"# Build LDA model\\n\",\n    \"lda_model = gensim.models.LdaMulticore(corpus=corpus,\\n\",\n    \"                                       id2word=id2word,\\n\",\n    \"                                       num_topics=num_topics)\\n\",\n    \"\\n\",\n    \"# Print the keywords in the 10 topics\\n\",\n    \"pprint(lda_model.print_topics())\\n\",\n    \"doc_lda = lda_model[corpus]\"\n   ]\n  },\n  {\n   \"cell_type\": \"markdown\",\n   \"metadata\": {},\n   \"source\": [\n    \"** **\\n\",\n    \"#### Step 6: Analyzing our LDA model <a class=\\\"anchor\\\" id=\\\"results\\\"></a>\\n\",\n    \"** **\\n\",\n    \"\\n\",\n    \"Now that we have a trained model, let’s visualize the topics for interpretability. To do so, we’ll use a popular visualization package, pyLDAvis, which is designed to help interactively with:\\n\",\n    \"\\n\",\n    \"1. 
Better understanding and interpreting individual topics, and\\n\",\n    \"2. Better understanding the relationships between the topics.\\n\",\n    \"\\n\",\n    \"For (1), you can manually select each topic to view its top most frequent and/or “relevant” terms, using different values of the λ parameter. This can help when you’re trying to assign a human interpretable name or “meaning” to each topic.\\n\",\n    \"\\n\",\n    \"For (2), exploring the Intertopic Distance Plot can help you learn about how topics relate to each other, including potential higher-level structure between groups of topics.\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": null,\n   \"metadata\": {},\n   \"outputs\": [],\n   \"source\": [\n    \"import pyLDAvis.gensim_models as gensimvis\\n\",\n    \"import pickle \\n\",\n    \"import pyLDAvis\\n\",\n    \"\\n\",\n    \"# Visualize the topics\\n\",\n    \"pyLDAvis.enable_notebook()\\n\",\n    \"\\n\",\n    \"LDAvis_data_filepath = os.path.join('./results/ldavis_prepared_'+str(num_topics))\\n\",\n    \"\\n\",\n    \"# # this is a bit time consuming - make the if statement True\\n\",\n    \"# # if you want to execute visualization prep yourself\\n\",\n    \"if 1 == 1:\\n\",\n    \"    LDAvis_prepared = gensimvis.prepare(lda_model, corpus, id2word)\\n\",\n    \"    with open(LDAvis_data_filepath, 'wb') as f:\\n\",\n    \"        pickle.dump(LDAvis_prepared, f)\\n\",\n    \"\\n\",\n    \"# load the pre-prepared pyLDAvis data from disk\\n\",\n    \"with open(LDAvis_data_filepath, 'rb') as f:\\n\",\n    \"    LDAvis_prepared = pickle.load(f)\\n\",\n    \"\\n\",\n    \"pyLDAvis.save_html(LDAvis_prepared, './results/ldavis_prepared_'+ str(num_topics) +'.html')\\n\",\n    \"\\n\",\n    \"LDAvis_prepared\"\n   ]\n  },\n  {\n   \"cell_type\": \"markdown\",\n   \"metadata\": {},\n   \"source\": [\n    \"** **\\n\",\n    \"#### Closing Notes\\n\",\n    \"Machine learning has become increasingly popular over the past decade, and recent advances in computational power have led to exponential growth in the number of people looking for ways to apply new methods to advance the field of Natural Language Processing.\\n\",\n    \"\\n\",\n    \"Often, we treat topic models as black-box algorithms, but hopefully this article shed some light on the underlying math, the intuitions behind it, and the high-level code to get you started with any textual data.\\n\",\n    \"\\n\",\n    \"In the next article, we’ll go one step deeper into understanding how you can evaluate the performance of topic models and tune their hyperparameters to get more intuitive and reliable results.\\n\",\n    \"\\n\",\n    \"** **\\n\",\n    \"#### References:\\n\",\n    \"1. Topic model — Wikipedia. https://en.wikipedia.org/wiki/Topic_model\\n\",\n    \"2. Distributed Strategies for Topic Modeling. https://www.ideals.illinois.edu/bitstream/handle/2142/46405/ParallelTopicModels.pdf?sequence=2&isAllowed=y\\n\",\n    \"3. Topic Mapping — Software — Resources — Amaral Lab. https://amaral.northwestern.edu/resources/software/topic-mapping\\n\",\n    \"4. A Survey of Topic Modeling in Text Mining. 
https://thesai.org/Downloads/Volume6No1/Paper_21-A_Survey_of_Topic_Modeling_in_Text_Mining.pdf\\n\"\n   ]\n  }\n ],\n \"metadata\": {\n  \"kernelspec\": {\n   \"display_name\": \"Python 3 (ipykernel)\",\n   \"language\": \"python\",\n   \"name\": \"python3\"\n  },\n  \"language_info\": {\n   \"codemirror_mode\": {\n    \"name\": \"ipython\",\n    \"version\": 3\n   },\n   \"file_extension\": \".py\",\n   \"mimetype\": \"text/x-python\",\n   \"name\": \"python\",\n   \"nbconvert_exporter\": \"python\",\n   \"pygments_lexer\": \"ipython3\",\n   \"version\": \"3.9.16\"\n  }\n },\n \"nbformat\": 4,\n \"nbformat_minor\": 4\n}\n"
  },
  {
    "path": "natural-language-processing/topic-modeling/pyproject.toml",
    "content": "[tool.poetry]\nname = \"topic-modeling\"\nversion = \"0.1.0\"\ndescription = \"\"\nauthors = [\"Shashank Kapadia <shashank.kapadia@randstadusa.com>\"]\nreadme = \"README.md\"\npackages = [{include = \"topic_modeling\"}]\n\n[tool.poetry.dependencies]\npython = \">=3.9,<3.10\"\njupyterlab = \"^3.5.2\"\npandas = \"^1.5.2\"\ngensim = \"^4.3.0\"\nnltk = \"^3.8\"\nspacy = \"^3.4.4\"\ntqdm = \"^4.64.1\"\npyldavis = \"^3.3.1\"\nwordcloud = \"^1.8.2.2\"\n\n\n[build-system]\nrequires = [\"poetry-core\"]\nbuild-backend = \"poetry.core.masonry.api\"\n"
  },
  {
    "path": "natural-language-processing/topic-modeling/results/lda_tuning_results.csv",
    "content": "Validation_Set,Topics,Alpha,Beta,Coherence\n75% Corpus,2,0.01,0.01,0.25978135607988706\n75% Corpus,2,0.01,0.31,0.2668476722607104\n75% Corpus,2,0.01,0.61,0.2776409400108895\n75% Corpus,2,0.01,0.9099999999999999,0.2716233211418745\n75% Corpus,2,0.01,symmetric,0.27332996032921053\n75% Corpus,2,0.31,0.01,0.25978135607988706\n75% Corpus,2,0.31,0.31,0.2668476722607104\n75% Corpus,2,0.31,0.61,0.2776409400108895\n75% Corpus,2,0.31,0.9099999999999999,0.2716233211418745\n75% Corpus,2,0.31,symmetric,0.27332996032921053\n75% Corpus,2,0.61,0.01,0.25978135607988706\n75% Corpus,2,0.61,0.31,0.2668476722607104\n75% Corpus,2,0.61,0.61,0.2776409400108895\n75% Corpus,2,0.61,0.9099999999999999,0.2716233211418745\n75% Corpus,2,0.61,symmetric,0.27332996032921053\n75% Corpus,2,0.9099999999999999,0.01,0.25978135607988706\n75% Corpus,2,0.9099999999999999,0.31,0.2668476722607104\n75% Corpus,2,0.9099999999999999,0.61,0.27764094001088946\n75% Corpus,2,0.9099999999999999,0.9099999999999999,0.2766757371950356\n75% Corpus,2,0.9099999999999999,symmetric,0.27332996032921053\n75% Corpus,2,symmetric,0.01,0.25978135607988706\n75% Corpus,2,symmetric,0.31,0.2668476722607104\n75% Corpus,2,symmetric,0.61,0.2776409400108895\n75% Corpus,2,symmetric,0.9099999999999999,0.2716233211418745\n75% Corpus,2,symmetric,symmetric,0.27332996032921053\n75% Corpus,2,asymmetric,0.01,0.25978135607988706\n75% Corpus,2,asymmetric,0.31,0.2668476722607104\n75% Corpus,2,asymmetric,0.61,0.27764094001088957\n75% Corpus,2,asymmetric,0.9099999999999999,0.27162332114187443\n75% Corpus,2,asymmetric,symmetric,0.27332996032921053\n75% Corpus,3,0.01,0.01,0.27839627006352724\n75% Corpus,3,0.01,0.31,0.2714832672445246\n75% Corpus,3,0.01,0.61,0.27129619317484943\n75% Corpus,3,0.01,0.9099999999999999,0.26967265698428594\n75% Corpus,3,0.01,symmetric,0.2710418286483521\n75% Corpus,3,0.31,0.01,0.27839627006352724\n75% Corpus,3,0.31,0.31,0.2714832672445246\n75% Corpus,3,0.31,0.61,0.27129619317484943\n75% Corpus,3,0.31,0.9099999999999999,0.26967265698428594\n75% Corpus,3,0.31,symmetric,0.2741049024792329\n75% Corpus,3,0.61,0.01,0.27839627006352724\n75% Corpus,3,0.61,0.31,0.27351453271228904\n75% Corpus,3,0.61,0.61,0.2730873971333492\n75% Corpus,3,0.61,0.9099999999999999,0.26967265698428594\n75% Corpus,3,0.61,symmetric,0.27410490247923297\n75% Corpus,3,0.9099999999999999,0.01,0.27839627006352724\n75% Corpus,3,0.9099999999999999,0.31,0.27351453271228904\n75% Corpus,3,0.9099999999999999,0.61,0.2766291342273802\n75% Corpus,3,0.9099999999999999,0.9099999999999999,0.26645454631910853\n75% Corpus,3,0.9099999999999999,symmetric,0.2714832672445246\n75% Corpus,3,symmetric,0.01,0.27839627006352724\n75% Corpus,3,symmetric,0.31,0.2714832672445246\n75% Corpus,3,symmetric,0.61,0.27129619317484943\n75% Corpus,3,symmetric,0.9099999999999999,0.26967265698428594\n75% Corpus,3,symmetric,symmetric,0.27410490247923297\n75% Corpus,3,asymmetric,0.01,0.27839627006352724\n75% Corpus,3,asymmetric,0.31,0.2761361679469974\n75% Corpus,3,asymmetric,0.61,0.27335893705876174\n75% Corpus,3,asymmetric,0.9099999999999999,0.2679744346749598\n75% Corpus,3,asymmetric,symmetric,0.27410490247923297\n75% Corpus,4,0.01,0.01,0.2668910500383095\n75% Corpus,4,0.01,0.31,0.2838954443896148\n75% Corpus,4,0.01,0.61,0.2865926544014952\n75% Corpus,4,0.01,0.9099999999999999,0.27003887281216177\n75% Corpus,4,0.01,symmetric,0.2796510251967742\n75% Corpus,4,0.31,0.01,0.2668910500383095\n75% Corpus,4,0.31,0.31,0.2838954443896148\n75% Corpus,4,0.31,0.61,0.2865926544014952\n75% 
Corpus,4,0.31,0.9099999999999999,0.2725567640010955\n75% Corpus,4,0.31,symmetric,0.2806724093952981\n75% Corpus,4,0.61,0.01,0.2668910500383095\n75% Corpus,4,0.61,0.31,0.28505646918509386\n75% Corpus,4,0.61,0.61,0.2892439806686463\n75% Corpus,4,0.61,0.9099999999999999,0.2725567640010955\n75% Corpus,4,0.61,symmetric,0.2806724093952981\n75% Corpus,4,0.9099999999999999,0.01,0.2685576018605267\n75% Corpus,4,0.9099999999999999,0.31,0.2860778533836178\n75% Corpus,4,0.9099999999999999,0.61,0.2892439806686463\n75% Corpus,4,0.9099999999999999,0.9099999999999999,0.2725567640010955\n75% Corpus,4,0.9099999999999999,symmetric,0.27955372987186383\n75% Corpus,4,symmetric,0.01,0.2668910500383095\n75% Corpus,4,symmetric,0.31,0.2838954443896148\n75% Corpus,4,symmetric,0.61,0.2865926544014952\n75% Corpus,4,symmetric,0.9099999999999999,0.2725567640010955\n75% Corpus,4,symmetric,symmetric,0.2806724093952981\n75% Corpus,4,asymmetric,0.01,0.26812725545164134\n75% Corpus,4,asymmetric,0.31,0.2838954443896148\n75% Corpus,4,asymmetric,0.61,0.29144326581678964\n75% Corpus,4,asymmetric,0.9099999999999999,0.2721780507981776\n75% Corpus,4,asymmetric,symmetric,0.27675604929105146\n75% Corpus,5,0.01,0.01,0.2921917497600287\n75% Corpus,5,0.01,0.31,0.27552154453031585\n75% Corpus,5,0.01,0.61,0.269308750481684\n75% Corpus,5,0.01,0.9099999999999999,0.2709825760478901\n75% Corpus,5,0.01,symmetric,0.2748624070876895\n75% Corpus,5,0.31,0.01,0.2921917497600287\n75% Corpus,5,0.31,0.31,0.2748594189290374\n75% Corpus,5,0.31,0.61,0.27297615024851607\n75% Corpus,5,0.31,0.9099999999999999,0.27487775415711974\n75% Corpus,5,0.31,symmetric,0.27717279279720247\n75% Corpus,5,0.61,0.01,0.29578845568370865\n75% Corpus,5,0.61,0.31,0.27395015677623524\n75% Corpus,5,0.61,0.61,0.27297615024851607\n75% Corpus,5,0.61,0.9099999999999999,0.2740301275419981\n75% Corpus,5,0.61,symmetric,0.2771727927972025\n75% Corpus,5,0.9099999999999999,0.01,0.2957884556837086\n75% Corpus,5,0.9099999999999999,0.31,0.27395015677623524\n75% Corpus,5,0.9099999999999999,0.61,0.28404895953113735\n75% Corpus,5,0.9099999999999999,0.9099999999999999,0.277640513968081\n75% Corpus,5,0.9099999999999999,symmetric,0.27934536161569634\n75% Corpus,5,symmetric,0.01,0.2921917497600287\n75% Corpus,5,symmetric,0.31,0.2748594189290374\n75% Corpus,5,symmetric,0.61,0.2689322892718954\n75% Corpus,5,symmetric,0.9099999999999999,0.27487775415711974\n75% Corpus,5,symmetric,symmetric,0.2748624070876895\n75% Corpus,5,asymmetric,0.01,0.29578845568370865\n75% Corpus,5,asymmetric,0.31,0.2757452278352188\n75% Corpus,5,asymmetric,0.61,0.26239407325290853\n75% Corpus,5,asymmetric,0.9099999999999999,0.2686874093590562\n75% Corpus,5,asymmetric,symmetric,0.2770349759061833\n75% Corpus,6,0.01,0.01,0.30284473516050225\n75% Corpus,6,0.01,0.31,0.30249950909918444\n75% Corpus,6,0.01,0.61,0.3121782769555992\n75% Corpus,6,0.01,0.9099999999999999,0.35411824844325124\n75% Corpus,6,0.01,symmetric,0.30436985837998276\n75% Corpus,6,0.31,0.01,0.301390971353645\n75% Corpus,6,0.31,0.31,0.3024063840590527\n75% Corpus,6,0.31,0.61,0.31042451210080446\n75% Corpus,6,0.31,0.9099999999999999,0.35411824844325124\n75% Corpus,6,0.31,symmetric,0.30491985962114915\n75% Corpus,6,0.61,0.01,0.30064456740988804\n75% Corpus,6,0.61,0.31,0.30715638376430365\n75% Corpus,6,0.61,0.61,0.31042451210080446\n75% Corpus,6,0.61,0.9099999999999999,0.35418725691606756\n75% Corpus,6,0.61,symmetric,0.3049198596211492\n75% Corpus,6,0.9099999999999999,0.01,0.3013337842863722\n75% Corpus,6,0.9099999999999999,0.31,0.3024679491620339\n75% 
Corpus,6,0.9099999999999999,0.61,0.31042451210080446\n75% Corpus,6,0.9099999999999999,0.9099999999999999,0.3541872569160675\n75% Corpus,6,0.9099999999999999,symmetric,0.30491985962114915\n75% Corpus,6,symmetric,0.01,0.30065930794388435\n75% Corpus,6,symmetric,0.31,0.3024063840590527\n75% Corpus,6,symmetric,0.61,0.31042451210080446\n75% Corpus,6,symmetric,0.9099999999999999,0.35411824844325124\n75% Corpus,6,symmetric,symmetric,0.30436985837998276\n75% Corpus,6,asymmetric,0.01,0.2967291360253754\n75% Corpus,6,asymmetric,0.31,0.302076431936051\n75% Corpus,6,asymmetric,0.61,0.30847841491074096\n75% Corpus,6,asymmetric,0.9099999999999999,0.35477042247541973\n75% Corpus,6,asymmetric,symmetric,0.3073932906534706\n75% Corpus,7,0.01,0.01,0.2811852190841381\n75% Corpus,7,0.01,0.31,0.2920220131136521\n75% Corpus,7,0.01,0.61,0.28852068336946435\n75% Corpus,7,0.01,0.9099999999999999,0.316288019095773\n75% Corpus,7,0.01,symmetric,0.2726152139083012\n75% Corpus,7,0.31,0.01,0.2788837794017795\n75% Corpus,7,0.31,0.31,0.2909961139203829\n75% Corpus,7,0.31,0.61,0.28852068336946435\n75% Corpus,7,0.31,0.9099999999999999,0.3129742202428836\n75% Corpus,7,0.31,symmetric,0.2777182230257198\n75% Corpus,7,0.61,0.01,0.28327368320570007\n75% Corpus,7,0.61,0.31,0.28935990578371756\n75% Corpus,7,0.61,0.61,0.28852068336946435\n75% Corpus,7,0.61,0.9099999999999999,0.31405801487937557\n75% Corpus,7,0.61,symmetric,0.2759279034494479\n75% Corpus,7,0.9099999999999999,0.01,0.28327368320570007\n75% Corpus,7,0.9099999999999999,0.31,0.28722285177113666\n75% Corpus,7,0.9099999999999999,0.61,0.28966178578989427\n75% Corpus,7,0.9099999999999999,0.9099999999999999,0.31405801487937557\n75% Corpus,7,0.9099999999999999,symmetric,0.27734639862412164\n75% Corpus,7,symmetric,0.01,0.2788837794017795\n75% Corpus,7,symmetric,0.31,0.2909961139203829\n75% Corpus,7,symmetric,0.61,0.28852068336946435\n75% Corpus,7,symmetric,0.9099999999999999,0.31628801909577303\n75% Corpus,7,symmetric,symmetric,0.2756697323965734\n75% Corpus,7,asymmetric,0.01,0.2853642185426731\n75% Corpus,7,asymmetric,0.31,0.2907271450321393\n75% Corpus,7,asymmetric,0.61,0.29361676225955274\n75% Corpus,7,asymmetric,0.9099999999999999,0.2882054578232691\n75% Corpus,7,asymmetric,symmetric,0.2755194552200007\n75% Corpus,8,0.01,0.01,0.2808202028308223\n75% Corpus,8,0.01,0.31,0.29736349070880286\n75% Corpus,8,0.01,0.61,0.2981078394239831\n75% Corpus,8,0.01,0.9099999999999999,0.33233988889385435\n75% Corpus,8,0.01,symmetric,0.28156849418007524\n75% Corpus,8,0.31,0.01,0.28097129035113944\n75% Corpus,8,0.31,0.31,0.29941241997917767\n75% Corpus,8,0.31,0.61,0.2960462608237895\n75% Corpus,8,0.31,0.9099999999999999,0.327937391019251\n75% Corpus,8,0.31,symmetric,0.28207106831879114\n75% Corpus,8,0.61,0.01,0.2809772515659067\n75% Corpus,8,0.61,0.31,0.2993455174990556\n75% Corpus,8,0.61,0.61,0.2946710626297879\n75% Corpus,8,0.61,0.9099999999999999,0.32359289177614337\n75% Corpus,8,0.61,symmetric,0.28138865001186986\n75% Corpus,8,0.9099999999999999,0.01,0.27976485139485124\n75% Corpus,8,0.9099999999999999,0.31,0.2956217410776679\n75% Corpus,8,0.9099999999999999,0.61,0.28864637585951036\n75% Corpus,8,0.9099999999999999,0.9099999999999999,0.3282635584644864\n75% Corpus,8,0.9099999999999999,symmetric,0.28148737285248604\n75% Corpus,8,symmetric,0.01,0.2808202028308223\n75% Corpus,8,symmetric,0.31,0.29736349070880286\n75% Corpus,8,symmetric,0.61,0.297949677831361\n75% Corpus,8,symmetric,0.9099999999999999,0.3265139021008282\n75% Corpus,8,symmetric,symmetric,0.2817931736933517\n75% 
Corpus,8,asymmetric,0.01,0.27839412750067105\n75% Corpus,8,asymmetric,0.31,0.296460251572274\n75% Corpus,8,asymmetric,0.61,0.297148250775704\n75% Corpus,8,asymmetric,0.9099999999999999,0.32649247107917795\n75% Corpus,8,asymmetric,symmetric,0.2785059288493066\n75% Corpus,9,0.01,0.01,0.30218586423505117\n75% Corpus,9,0.01,0.31,0.3020235109956903\n75% Corpus,9,0.01,0.61,0.4185691849535152\n75% Corpus,9,0.01,0.9099999999999999,0.30527967183321003\n75% Corpus,9,0.01,symmetric,0.3060635646496002\n75% Corpus,9,0.31,0.01,0.30068508513688286\n75% Corpus,9,0.31,0.31,0.3021506007873651\n75% Corpus,9,0.31,0.61,0.40687439307467294\n75% Corpus,9,0.31,0.9099999999999999,0.3073421317714173\n75% Corpus,9,0.31,symmetric,0.3074145374333964\n75% Corpus,9,0.61,0.01,0.3008820210254886\n75% Corpus,9,0.61,0.31,0.30275011169684873\n75% Corpus,9,0.61,0.61,0.41015152208333183\n75% Corpus,9,0.61,0.9099999999999999,0.30255055512967155\n75% Corpus,9,0.61,symmetric,0.30715762371551203\n75% Corpus,9,0.9099999999999999,0.01,0.3022010046079812\n75% Corpus,9,0.9099999999999999,0.31,0.30477890773318156\n75% Corpus,9,0.9099999999999999,0.61,0.40433106836971433\n75% Corpus,9,0.9099999999999999,0.9099999999999999,0.307644042466123\n75% Corpus,9,0.9099999999999999,symmetric,0.30605307577755286\n75% Corpus,9,symmetric,0.01,0.3013158303146456\n75% Corpus,9,symmetric,0.31,0.304290454027783\n75% Corpus,9,symmetric,0.61,0.4140952699790183\n75% Corpus,9,symmetric,0.9099999999999999,0.30527967183321\n75% Corpus,9,symmetric,symmetric,0.3046097002166359\n75% Corpus,9,asymmetric,0.01,0.30129966466596664\n75% Corpus,9,asymmetric,0.31,0.30429095996678135\n75% Corpus,9,asymmetric,0.61,0.41788064240742073\n75% Corpus,9,asymmetric,0.9099999999999999,0.3025975330524115\n75% Corpus,9,asymmetric,symmetric,0.30669601144755215\n75% Corpus,10,0.01,0.01,0.29670164537919275\n75% Corpus,10,0.01,0.31,0.3047813348299382\n75% Corpus,10,0.01,0.61,0.31606878233356517\n75% Corpus,10,0.01,0.9099999999999999,0.3377296641607972\n75% Corpus,10,0.01,symmetric,0.29319673176987077\n75% Corpus,10,0.31,0.01,0.29805457831216586\n75% Corpus,10,0.31,0.31,0.30369633038354127\n75% Corpus,10,0.31,0.61,0.30998451194916454\n75% Corpus,10,0.31,0.9099999999999999,0.34268251293470897\n75% Corpus,10,0.31,symmetric,0.29319673176987077\n75% Corpus,10,0.61,0.01,0.29386231031811644\n75% Corpus,10,0.61,0.31,0.29541931052055126\n75% Corpus,10,0.61,0.61,0.309970817294693\n75% Corpus,10,0.61,0.9099999999999999,0.3453018084648031\n75% Corpus,10,0.61,symmetric,0.29319673176987077\n75% Corpus,10,0.9099999999999999,0.01,0.2945143416112511\n75% Corpus,10,0.9099999999999999,0.31,0.34045282991315623\n75% Corpus,10,0.9099999999999999,0.61,0.3126412466766651\n75% Corpus,10,0.9099999999999999,0.9099999999999999,0.3435699386195771\n75% Corpus,10,0.9099999999999999,symmetric,0.29319673176987077\n75% Corpus,10,symmetric,0.01,0.29805457831216586\n75% Corpus,10,symmetric,0.31,0.30478133482993813\n75% Corpus,10,symmetric,0.61,0.31606878233356517\n75% Corpus,10,symmetric,0.9099999999999999,0.33772966416079714\n75% Corpus,10,symmetric,symmetric,0.29319673176987077\n75% Corpus,10,asymmetric,0.01,0.29815109815940033\n75% Corpus,10,asymmetric,0.31,0.35146410607200645\n75% Corpus,10,asymmetric,0.61,0.3196615767839669\n75% Corpus,10,asymmetric,0.9099999999999999,0.34012655333540415\n75% Corpus,10,asymmetric,symmetric,0.2930959026114379\n100% Corpus,2,0.01,0.01,0.26049496936796\n100% Corpus,2,0.01,0.31,0.2631487596403128\n100% Corpus,2,0.01,0.61,0.2670430149871908\n100% 
Corpus,2,0.01,0.9099999999999999,0.2784737874245884\n100% Corpus,2,0.01,symmetric,0.2659027879879802\n100% Corpus,2,0.31,0.01,0.26049496936796\n100% Corpus,2,0.31,0.31,0.2631487596403128\n100% Corpus,2,0.31,0.61,0.2670430149871908\n100% Corpus,2,0.31,0.9099999999999999,0.2784737874245884\n100% Corpus,2,0.31,symmetric,0.2659027879879802\n100% Corpus,2,0.61,0.01,0.26049496936796007\n100% Corpus,2,0.61,0.31,0.2631487596403128\n100% Corpus,2,0.61,0.61,0.2670430149871908\n100% Corpus,2,0.61,0.9099999999999999,0.27847378742458834\n100% Corpus,2,0.61,symmetric,0.2659027879879803\n100% Corpus,2,0.9099999999999999,0.01,0.26049496936796007\n100% Corpus,2,0.9099999999999999,0.31,0.2631487596403128\n100% Corpus,2,0.9099999999999999,0.61,0.2670430149871908\n100% Corpus,2,0.9099999999999999,0.9099999999999999,0.27847378742458834\n100% Corpus,2,0.9099999999999999,symmetric,0.2659027879879803\n100% Corpus,2,symmetric,0.01,0.26049496936796007\n100% Corpus,2,symmetric,0.31,0.2631487596403128\n100% Corpus,2,symmetric,0.61,0.2670430149871908\n100% Corpus,2,symmetric,0.9099999999999999,0.27847378742458834\n100% Corpus,2,symmetric,symmetric,0.2659027879879802\n100% Corpus,2,asymmetric,0.01,0.26049496936796\n100% Corpus,2,asymmetric,0.31,0.2638307040567668\n100% Corpus,2,asymmetric,0.61,0.2670430149871908\n100% Corpus,2,asymmetric,0.9099999999999999,0.2784737874245884\n100% Corpus,2,asymmetric,symmetric,0.2659027879879803\n100% Corpus,3,0.01,0.01,0.27711689254474037\n100% Corpus,3,0.01,0.31,0.27802374195120044\n100% Corpus,3,0.01,0.61,0.27993004576503216\n100% Corpus,3,0.01,0.9099999999999999,0.28461303163756196\n100% Corpus,3,0.01,symmetric,0.27802374195120044\n100% Corpus,3,0.31,0.01,0.2803842121014907\n100% Corpus,3,0.31,0.31,0.27802374195120044\n100% Corpus,3,0.31,0.61,0.27993004576503216\n100% Corpus,3,0.31,0.9099999999999999,0.2836001930609014\n100% Corpus,3,0.31,symmetric,0.27802374195120044\n100% Corpus,3,0.61,0.01,0.2803842121014907\n100% Corpus,3,0.61,0.31,0.27802374195120044\n100% Corpus,3,0.61,0.61,0.28095006391110916\n100% Corpus,3,0.61,0.9099999999999999,0.2836495488056456\n100% Corpus,3,0.61,symmetric,0.27802374195120044\n100% Corpus,3,0.9099999999999999,0.01,0.28041475692228796\n100% Corpus,3,0.9099999999999999,0.31,0.27802374195120044\n100% Corpus,3,0.9099999999999999,0.61,0.28095006391110916\n100% Corpus,3,0.9099999999999999,0.9099999999999999,0.2836495488056456\n100% Corpus,3,0.9099999999999999,symmetric,0.27802374195120044\n100% Corpus,3,symmetric,0.01,0.2803842121014907\n100% Corpus,3,symmetric,0.31,0.27802374195120044\n100% Corpus,3,symmetric,0.61,0.27993004576503216\n100% Corpus,3,symmetric,0.9099999999999999,0.2836495488056456\n100% Corpus,3,symmetric,symmetric,0.27802374195120044\n100% Corpus,3,asymmetric,0.01,0.2803842121014907\n100% Corpus,3,asymmetric,0.31,0.27802374195120044\n100% Corpus,3,asymmetric,0.61,0.28095006391110916\n100% Corpus,3,asymmetric,0.9099999999999999,0.2836495488056456\n100% Corpus,3,asymmetric,symmetric,0.27802374195120044\n100% Corpus,4,0.01,0.01,0.2809362113240368\n100% Corpus,4,0.01,0.31,0.27835632925620063\n100% Corpus,4,0.01,0.61,0.27555206267257515\n100% Corpus,4,0.01,0.9099999999999999,0.28446330333455255\n100% Corpus,4,0.01,symmetric,0.27835632925620063\n100% Corpus,4,0.31,0.01,0.2809362113240368\n100% Corpus,4,0.31,0.31,0.27835632925620063\n100% Corpus,4,0.31,0.61,0.27548045699445267\n100% Corpus,4,0.31,0.9099999999999999,0.28446330333455255\n100% Corpus,4,0.31,symmetric,0.27835632925620063\n100% Corpus,4,0.61,0.01,0.2809362113240368\n100% 
Corpus,4,0.61,0.31,0.27835632925620063\n100% Corpus,4,0.61,0.61,0.27966327435739713\n100% Corpus,4,0.61,0.9099999999999999,0.28446330333455255\n100% Corpus,4,0.61,symmetric,0.2782671534254213\n100% Corpus,4,0.9099999999999999,0.01,0.28177729599690937\n100% Corpus,4,0.9099999999999999,0.31,0.27835632925620063\n100% Corpus,4,0.9099999999999999,0.61,0.28263222170754976\n100% Corpus,4,0.9099999999999999,0.9099999999999999,0.28446330333455255\n100% Corpus,4,0.9099999999999999,symmetric,0.27749145169031925\n100% Corpus,4,symmetric,0.01,0.2809362113240368\n100% Corpus,4,symmetric,0.31,0.27835632925620063\n100% Corpus,4,symmetric,0.61,0.27548045699445267\n100% Corpus,4,symmetric,0.9099999999999999,0.28446330333455255\n100% Corpus,4,symmetric,symmetric,0.27835632925620063\n100% Corpus,4,asymmetric,0.01,0.2809362113240368\n100% Corpus,4,asymmetric,0.31,0.27835632925620063\n100% Corpus,4,asymmetric,0.61,0.27335019355355233\n100% Corpus,4,asymmetric,0.9099999999999999,0.28118168667280763\n100% Corpus,4,asymmetric,symmetric,0.2782671534254213\n100% Corpus,5,0.01,0.01,0.3055810333138698\n100% Corpus,5,0.01,0.31,0.2921971083432925\n100% Corpus,5,0.01,0.61,0.29351918649837283\n100% Corpus,5,0.01,0.9099999999999999,0.31027882311552357\n100% Corpus,5,0.01,symmetric,0.28798663645353445\n100% Corpus,5,0.31,0.01,0.3055810333138698\n100% Corpus,5,0.31,0.31,0.2919593387953088\n100% Corpus,5,0.31,0.61,0.29351918649837283\n100% Corpus,5,0.31,0.9099999999999999,0.3013857090688862\n100% Corpus,5,0.31,symmetric,0.28798663645353445\n100% Corpus,5,0.61,0.01,0.30778470383924467\n100% Corpus,5,0.61,0.31,0.29467018241870313\n100% Corpus,5,0.61,0.61,0.29688907251265284\n100% Corpus,5,0.61,0.9099999999999999,0.3013857090688862\n100% Corpus,5,0.61,symmetric,0.29224715357405645\n100% Corpus,5,0.9099999999999999,0.01,0.30778470383924467\n100% Corpus,5,0.9099999999999999,0.31,0.2891142546834146\n100% Corpus,5,0.9099999999999999,0.61,0.29749568404827875\n100% Corpus,5,0.9099999999999999,0.9099999999999999,0.30064353604119265\n100% Corpus,5,0.9099999999999999,symmetric,0.2903640490356588\n100% Corpus,5,symmetric,0.01,0.3055810333138698\n100% Corpus,5,symmetric,0.31,0.2919593387953088\n100% Corpus,5,symmetric,0.61,0.29351918649837283\n100% Corpus,5,symmetric,0.9099999999999999,0.3013857090688862\n100% Corpus,5,symmetric,symmetric,0.28798663645353445\n100% Corpus,5,asymmetric,0.01,0.3025740990082661\n100% Corpus,5,asymmetric,0.31,0.2891198159919257\n100% Corpus,5,asymmetric,0.61,0.2931349532251887\n100% Corpus,5,asymmetric,0.9099999999999999,0.30951060360405697\n100% Corpus,5,asymmetric,symmetric,0.2903640490356588\n100% Corpus,6,0.01,0.01,0.2958760348579074\n100% Corpus,6,0.01,0.31,0.2941800616166869\n100% Corpus,6,0.01,0.61,0.2971723188349142\n100% Corpus,6,0.01,0.9099999999999999,0.30879954124012965\n100% Corpus,6,0.01,symmetric,0.29805452202591254\n100% Corpus,6,0.31,0.01,0.2969800967267447\n100% Corpus,6,0.31,0.31,0.2951136969205841\n100% Corpus,6,0.31,0.61,0.2971723188349142\n100% Corpus,6,0.31,0.9099999999999999,0.30879954124012965\n100% Corpus,6,0.31,symmetric,0.29820145833395656\n100% Corpus,6,0.61,0.01,0.2969800967267447\n100% Corpus,6,0.61,0.31,0.2951136969205841\n100% Corpus,6,0.61,0.61,0.2977299345012703\n100% Corpus,6,0.61,0.9099999999999999,0.3049789423287517\n100% Corpus,6,0.61,symmetric,0.29863606640969337\n100% Corpus,6,0.9099999999999999,0.01,0.29390398336601525\n100% Corpus,6,0.9099999999999999,0.31,0.2951136969205841\n100% Corpus,6,0.9099999999999999,0.61,0.2957487280234154\n100% 
Corpus,6,0.9099999999999999,0.9099999999999999,0.3049789423287517\n100% Corpus,6,0.9099999999999999,symmetric,0.29863606640969337\n100% Corpus,6,symmetric,0.01,0.2955217784385554\n100% Corpus,6,symmetric,0.31,0.2941800616166869\n100% Corpus,6,symmetric,0.61,0.2971723188349142\n100% Corpus,6,symmetric,0.9099999999999999,0.30879954124012965\n100% Corpus,6,symmetric,symmetric,0.29820145833395656\n100% Corpus,6,asymmetric,0.01,0.2949603603199485\n100% Corpus,6,asymmetric,0.31,0.2951136969205841\n100% Corpus,6,asymmetric,0.61,0.2977299345012703\n100% Corpus,6,asymmetric,0.9099999999999999,0.30674292283526966\n100% Corpus,6,asymmetric,symmetric,0.29823105460471666\n100% Corpus,7,0.01,0.01,0.30073678510163443\n100% Corpus,7,0.01,0.31,0.3093075297409759\n100% Corpus,7,0.01,0.61,0.2985273077717666\n100% Corpus,7,0.01,0.9099999999999999,0.28108775204553077\n100% Corpus,7,0.01,symmetric,0.29593451694501705\n100% Corpus,7,0.31,0.01,0.3021816720953532\n100% Corpus,7,0.31,0.31,0.3136135696595105\n100% Corpus,7,0.31,0.61,0.30525285357617626\n100% Corpus,7,0.31,0.9099999999999999,0.2810941659084761\n100% Corpus,7,0.31,symmetric,0.29621498771621146\n100% Corpus,7,0.61,0.01,0.3021816720953532\n100% Corpus,7,0.61,0.31,0.3136135696595104\n100% Corpus,7,0.61,0.61,0.30936066130234136\n100% Corpus,7,0.61,0.9099999999999999,0.28629245148004184\n100% Corpus,7,0.61,symmetric,0.29731679986925613\n100% Corpus,7,0.9099999999999999,0.01,0.3020919465269357\n100% Corpus,7,0.9099999999999999,0.31,0.3136135696595105\n100% Corpus,7,0.9099999999999999,0.61,0.3151197257139406\n100% Corpus,7,0.9099999999999999,0.9099999999999999,0.288468301912423\n100% Corpus,7,0.9099999999999999,symmetric,0.30014119040856624\n100% Corpus,7,symmetric,0.01,0.29800123606081913\n100% Corpus,7,symmetric,0.31,0.3093075297409759\n100% Corpus,7,symmetric,0.61,0.29852730777176656\n100% Corpus,7,symmetric,0.9099999999999999,0.2833650342764223\n100% Corpus,7,symmetric,symmetric,0.29621498771621146\n100% Corpus,7,asymmetric,0.01,0.3014826617961082\n100% Corpus,7,asymmetric,0.31,0.30946569097108306\n100% Corpus,7,asymmetric,0.61,0.29780622606183804\n100% Corpus,7,asymmetric,0.9099999999999999,0.27769095312678477\n100% Corpus,7,asymmetric,symmetric,0.29669074530709133\n100% Corpus,8,0.01,0.01,0.27689325792791325\n100% Corpus,8,0.01,0.31,0.28335268992548035\n100% Corpus,8,0.01,0.61,0.29447751279748635\n100% Corpus,8,0.01,0.9099999999999999,0.2933443347019431\n100% Corpus,8,0.01,symmetric,0.2719332438180937\n100% Corpus,8,0.31,0.01,0.2778826015547943\n100% Corpus,8,0.31,0.31,0.2825893964563385\n100% Corpus,8,0.31,0.61,0.2939449197205991\n100% Corpus,8,0.31,0.9099999999999999,0.2933443347019431\n100% Corpus,8,0.31,symmetric,0.2729328842173082\n100% Corpus,8,0.61,0.01,0.2778826015547943\n100% Corpus,8,0.61,0.31,0.28595528080117094\n100% Corpus,8,0.61,0.61,0.2927074117432516\n100% Corpus,8,0.61,0.9099999999999999,0.2872200703569201\n100% Corpus,8,0.61,symmetric,0.2732337709535984\n100% Corpus,8,0.9099999999999999,0.01,0.27786820825055075\n100% Corpus,8,0.9099999999999999,0.31,0.28418596093474835\n100% Corpus,8,0.9099999999999999,0.61,0.2900190376327208\n100% Corpus,8,0.9099999999999999,0.9099999999999999,0.29123665567102414\n100% Corpus,8,0.9099999999999999,symmetric,0.27323377095359846\n100% Corpus,8,symmetric,0.01,0.27689325792791325\n100% Corpus,8,symmetric,0.31,0.28420779222803205\n100% Corpus,8,symmetric,0.61,0.2952855947321743\n100% Corpus,8,symmetric,0.9099999999999999,0.2933443347019431\n100% Corpus,8,symmetric,symmetric,0.2725123137958163\n100% 
Corpus,8,asymmetric,0.01,0.2761789762953015\n100% Corpus,8,asymmetric,0.31,0.2799130295170522\n100% Corpus,8,asymmetric,0.61,0.2948589392161668\n100% Corpus,8,asymmetric,0.9099999999999999,0.2812546037633434\n100% Corpus,8,asymmetric,symmetric,0.2729328842173082\n100% Corpus,9,0.01,0.01,0.3055895680888938\n100% Corpus,9,0.01,0.31,0.30302760647148597\n100% Corpus,9,0.01,0.61,0.2996054006558133\n100% Corpus,9,0.01,0.9099999999999999,0.34404649042246366\n100% Corpus,9,0.01,symmetric,0.3022567927575817\n100% Corpus,9,0.31,0.01,0.3050065555696478\n100% Corpus,9,0.31,0.31,0.30328591827752077\n100% Corpus,9,0.31,0.61,0.29917768071256\n100% Corpus,9,0.31,0.9099999999999999,0.34479247631815907\n100% Corpus,9,0.31,symmetric,0.30430285315228395\n100% Corpus,9,0.61,0.01,0.30426709798506846\n100% Corpus,9,0.61,0.31,0.3018951218284658\n100% Corpus,9,0.61,0.61,0.3001071122248435\n100% Corpus,9,0.61,0.9099999999999999,0.34618320085549326\n100% Corpus,9,0.61,symmetric,0.30433563585169243\n100% Corpus,9,0.9099999999999999,0.01,0.3055723111286964\n100% Corpus,9,0.9099999999999999,0.31,0.3030863932378131\n100% Corpus,9,0.9099999999999999,0.61,0.29813786201586745\n100% Corpus,9,0.9099999999999999,0.9099999999999999,0.34500630064915166\n100% Corpus,9,0.9099999999999999,symmetric,0.3038428140388971\n100% Corpus,9,symmetric,0.01,0.3023576157049908\n100% Corpus,9,symmetric,0.31,0.30302760647148597\n100% Corpus,9,symmetric,0.61,0.3003121132770352\n100% Corpus,9,symmetric,0.9099999999999999,0.34214587751079284\n100% Corpus,9,symmetric,symmetric,0.3022567927575817\n100% Corpus,9,asymmetric,0.01,0.3057340479868098\n100% Corpus,9,asymmetric,0.31,0.30328591827752077\n100% Corpus,9,asymmetric,0.61,0.3011057586148252\n100% Corpus,9,asymmetric,0.9099999999999999,0.350861967504945\n100% Corpus,9,asymmetric,symmetric,0.30251782074437106\n100% Corpus,10,0.01,0.01,0.283398008758761\n100% Corpus,10,0.01,0.31,0.2751258030801618\n100% Corpus,10,0.01,0.61,0.2915458869869755\n100% Corpus,10,0.01,0.9099999999999999,0.2948656869892196\n100% Corpus,10,0.01,symmetric,0.2777952167218004\n100% Corpus,10,0.31,0.01,0.28125840155500537\n100% Corpus,10,0.31,0.31,0.2762321145354264\n100% Corpus,10,0.31,0.61,0.29037856739931606\n100% Corpus,10,0.31,0.9099999999999999,0.29658208038434386\n100% Corpus,10,0.31,symmetric,0.27688479666982796\n100% Corpus,10,0.61,0.01,0.2803111695414585\n100% Corpus,10,0.61,0.31,0.27818213083876936\n100% Corpus,10,0.61,0.61,0.2894460065949672\n100% Corpus,10,0.61,0.9099999999999999,0.29802196235401307\n100% Corpus,10,0.61,symmetric,0.27614054331196536\n100% Corpus,10,0.9099999999999999,0.01,0.2817458696637633\n100% Corpus,10,0.9099999999999999,0.31,0.2783073583069227\n100% Corpus,10,0.9099999999999999,0.61,0.2889530806223174\n100% Corpus,10,0.9099999999999999,0.9099999999999999,0.2950248587541427\n100% Corpus,10,0.9099999999999999,symmetric,0.27614054331196536\n100% Corpus,10,symmetric,0.01,0.2821805736812627\n100% Corpus,10,symmetric,0.31,0.2751258030801618\n100% Corpus,10,symmetric,0.61,0.2911575680096669\n100% Corpus,10,symmetric,0.9099999999999999,0.2948656869892196\n100% Corpus,10,symmetric,symmetric,0.2777952167218004\n100% Corpus,10,asymmetric,0.01,0.2818119554111271\n100% Corpus,10,asymmetric,0.31,0.2804636061791579\n100% Corpus,10,asymmetric,0.61,0.29020292429978206\n100% Corpus,10,asymmetric,0.9099999999999999,0.2973427325310818\n100% Corpus,10,asymmetric,symmetric,0.278615868791953\n"
  },
  {
    "path": "natural-language-processing/topic-modeling/results/ldavis_prepared_10.html",
    "content": "\n<link rel=\"stylesheet\" type=\"text/css\" href=\"https://cdn.jsdelivr.net/gh/bmabey/pyLDAvis@3.3.1/pyLDAvis/js/ldavis.v1.0.0.css\">\n\n\n<div id=\"ldavis_el86201113124030561323853923\"></div>\n<script type=\"text/javascript\">\n\nvar ldavis_el86201113124030561323853923_data = {\"mdsDat\": {\"x\": [-0.002952153551530786, 0.000283760695792541, -0.001135316994279039, -0.004901320566344728, 0.00644201232051609, -0.006685906154726805, 0.0014429215406945395, -0.0021481507639479207, -0.0003718253875058945, 0.01002597886133201], \"y\": [0.002162515492831472, -0.00033019139169675475, 0.00949486999706966, -0.0073470490196353945, 0.001992721482861383, -0.001060144934148983, 0.0019741007036596517, 0.0010477106792275742, -0.0038734793798218528, -0.0040610536303467785], \"topics\": [1, 2, 3, 4, 5, 6, 7, 8, 9, 10], \"cluster\": [1, 1, 1, 1, 1, 1, 1, 1, 1, 1], \"Freq\": [18.214307570253176, 17.857910161571517, 12.86414483798897, 10.760496727249047, 10.161985806280155, 9.221580754822813, 7.205607836564056, 6.939513418301538, 5.951860984816929, 0.8225919021517903]}, \"tinfo\": {\"Term\": [\"learning\", \"using\", \"data\", \"algorithm\", \"model\", \"set\", \"one\", \"training\", \"used\", \"first\", \"distribution\", \"number\", \"time\", \"function\", \"figure\", \"probability\", \"state\", \"given\", \"features\", \"models\", \"results\", \"two\", \"approach\", \"method\", \"problem\", \"neural\", \"error\", \"large\", \"order\", \"algorithms\", \"actor\", \"sts\", \"snippet\", \"appointments\", \"rsa\", \"week\", \"tours\", \"salesman\", \"actors\", \"appointment\", \"temperature\", \"macaque\", \"dorsal\", \"goodness\", \"serre\", \"ppr\", \"judged\", \"mallard\", \"streams\", \"tour\", \"roi\", \"polysensory\", \"waveforms\", \"bonn\", \"traveling\", \"era\", \"epv\", \"mcu\", \"invariance\", \"gross\", \"vdp\", \"snr\", \"zn\", \"cnn\", \"fmri\", \"ventral\", \"voxel\", \"mental\", \"starting\", \"topological\", \"action\", \"input\", \"point\", \"stimuli\", \"data\", \"order\", \"bias\", \"first\", \"average\", \"neurons\", \"neuron\", \"structure\", \"model\", \"tree\", \"two\", \"problem\", \"different\", \"using\", \"learning\", \"function\", \"space\", \"used\", \"neural\", \"algorithm\", \"see\", \"analysis\", \"figure\", \"one\", \"linear\", \"number\", \"set\", \"distribution\", \"given\", \"also\", \"random\", \"time\", \"state\", \"training\", \"matrix\", \"models\", \"results\", \"gs\", \"outbreaks\", \"rgbn\", \"neg\", \"svms\", \"sollich\", \"frequentist\", \"unrectified\", \"tiger\", \"cbinin\", \"baldwin\", \"sales\", \"varsample\", \"crystal\", \"tx\", \"ajk\", \"plain\", \"smo\", \"inadequate\", \"ltp\", \"blurred\", \"kotz\", \"qout\", \"disease\", \"discriminated\", \"sds\", \"fxi\", \"hinge\", \"heaviside\", \"oja\", \"updates\", \"beliefs\", \"excess\", \"scan\", \"postsynaptic\", \"plasticity\", \"patch\", \"yj\", \"bayes\", \"hyperplane\", \"yi\", \"risk\", \"st\", \"one\", \"probabilities\", \"search\", \"simple\", \"features\", \"units\", \"layer\", \"distribution\", \"eq\", \"problem\", \"agent\", \"data\", \"time\", \"results\", \"training\", \"consider\", \"function\", \"vector\", \"also\", \"model\", \"set\", \"learning\", \"approach\", \"number\", \"performance\", \"neural\", \"xi\", \"bayesian\", \"matrix\", \"models\", \"two\", \"feature\", \"method\", \"used\", \"algorithm\", \"using\", \"linear\", \"given\", \"probability\", \"based\", \"state\", \"figure\", \"tension\", \"suzuki\", \"tensions\", \"dornay\", \"muscles\", \"domay\", \"vref\", 
\"screening\", \"hogan\", \"uno\", \"muscle\", \"jerk\", \"xtj\", \"axon\", \"bhlmann\", \"daes\", \"mtll\", \"inertia\", \"morasso\", \"compliant\", \"kawato\", \"skeleton\", \"bandpass\", \"lgp\", \"unimodal\", \"cho\", \"nervous\", \"velocity\", \"abend\", \"zcn\", \"coverage\", \"gp\", \"lwpr\", \"marginal\", \"mlp\", \"motion\", \"gpr\", \"pulse\", \"virtual\", \"movements\", \"selection\", \"lasso\", \"primitives\", \"motor\", \"training\", \"models\", \"image\", \"unit\", \"data\", \"confidence\", \"local\", \"intervals\", \"method\", \"methods\", \"model\", \"et\", \"using\", \"learning\", \"given\", \"number\", \"error\", \"procedure\", \"used\", \"linear\", \"figure\", \"test\", \"function\", \"set\", \"network\", \"known\", \"distribution\", \"neural\", \"matrix\", \"input\", \"time\", \"space\", \"first\", \"based\", \"one\", \"problem\", \"two\", \"algorithm\", \"also\", \"results\", \"pab\", \"hosking\", \"trw\", \"truncated\", \"srd\", \"mincut\", \"wars\", \"aim\", \"im\", \"rounding\", \"unary\", \"komodakis\", \"wab\", \"player\", \"gop\", \"farima\", \"wavelet\", \"torr\", \"ecserpiedu\", \"ethernet\", \"sg\", \"neighbouring\", \"usg\", \"mpeg\", \"karnin\", \"hurst\", \"veksler\", \"ishikawa\", \"labelling\", \"khanna\", \"cm\", \"round\", \"fm\", \"descendant\", \"mrf\", \"traffic\", \"players\", \"expansion\", \"algorithm\", \"modeling\", \"energy\", \"problems\", \"models\", \"kernel\", \"best\", \"time\", \"algorithms\", \"state\", \"using\", \"information\", \"single\", \"feature\", \"based\", \"sample\", \"one\", \"distribution\", \"also\", \"model\", \"features\", \"used\", \"methods\", \"matrix\", \"data\", \"case\", \"learning\", \"results\", \"number\", \"given\", \"set\", \"function\", \"structure\", \"problem\", \"two\", \"probability\", \"approach\", \"training\", \"figure\", \"ach\", \"stratum\", \"joshua\", \"hasselmo\", \"barkai\", \"hippocampus\", \"suppression\", \"gyrus\", \"acetylcholine\", \"turing\", \"interneuron\", \"cholinergic\", \"berke\", \"rat\", \"schnell\", \"dentate\", \"cal\", \"multiset\", \"dominate\", \"cilz\", \"associations\", \"septum\", \"qx\", \"interfering\", \"edi\", \"pyramidal\", \"orlitsky\", \"elisabeth\", \"rgm\", \"sauer\", \"rnnat\", \"rnp\", \"entorhinal\", \"regret\", \"symbols\", \"hme\", \"inhibitory\", \"spikes\", \"winner\", \"afferent\", \"estimators\", \"sigmoidal\", \"network\", \"spike\", \"estimator\", \"region\", \"loo\", \"neurons\", \"set\", \"probability\", \"batch\", \"noise\", \"model\", \"optimal\", \"figure\", \"distribution\", \"system\", \"however\", \"two\", \"using\", \"methods\", \"approach\", \"object\", \"algorithm\", \"learning\", \"function\", \"data\", \"training\", \"since\", \"time\", \"results\", \"let\", \"networks\", \"given\", \"different\", \"used\", \"based\", \"neural\", \"problem\", \"one\", \"models\", \"method\", \"number\", \"first\", \"also\", \"eigenfunctions\", \"singla\", \"mlns\", \"ax\", \"domingos\", \"lifted\", \"eigenfunction\", \"mittal\", \"gcfove\", \"vl\", \"setineq\", \"gcfvoe\", \"compilation\", \"tuples\", \"ineq\", \"vly\", \"substitutions\", \"disjunction\", \"ptp\", \"constraint\", \"webkb\", \"alchemy\", \"hauz\", \"cseiitdacin\", \"decomposer\", \"linsker\", \"mln\", \"delhi\", \"surround\", \"fs\", \"formula\", \"predicate\", \"groundings\", \"tuple\", \"pulse\", \"dc\", \"displays\", \"centre\", \"balls\", \"motion\", \"world\", \"xj\", \"agent\", \"beliefs\", \"time\", \"constraints\", \"state\", \"clustering\", \"new\", \"shown\", \"control\", \"number\", 
\"two\", \"model\", \"linear\", \"second\", \"figure\", \"data\", \"action\", \"set\", \"one\", \"therefore\", \"inference\", \"given\", \"learning\", \"algorithm\", \"also\", \"show\", \"using\", \"error\", \"function\", \"first\", \"performance\", \"information\", \"algorithms\", \"problem\", \"models\", \"matrix\", \"results\", \"based\", \"used\", \"distribution\", \"pmca\", \"murata\", \"repetitively\", \"mca\", \"permuted\", \"alocal\", \"fresh\", \"gauss\", \"minor\", \"xca\", \"chambers\", \"nec\", \"hlm\", \"manipulating\", \"amf\", \"adequate\", \"schnabel\", \"bottou\", \"tsypkin\", \"yoshitatsu\", \"amari\", \"sigmoid\", \"br\", \"dropout\", \"logprobability\", \"lmd\", \"hollow\", \"wt\", \"maxkurt\", \"sami\", \"symbolic\", \"displays\", \"grammar\", \"boosting\", \"expressions\", \"zt\", \"ica\", \"phase\", \"ir\", \"separable\", \"learning\", \"batch\", \"non\", \"error\", \"matrix\", \"sum\", \"fig\", \"algorithm\", \"small\", \"solution\", \"model\", \"linear\", \"first\", \"map\", \"convergence\", \"using\", \"optimal\", \"consider\", \"data\", \"given\", \"training\", \"see\", \"mean\", \"neural\", \"set\", \"one\", \"figure\", \"number\", \"kernel\", \"distribution\", \"results\", \"models\", \"log\", \"network\", \"time\", \"two\", \"space\", \"analysis\", \"used\", \"different\", \"function\", \"problem\", \"identied\", \"vo\", \"srd\", \"eggs\", \"lrd\", \"identication\", \"participated\", \"urn\", \"statue\", \"token\", \"mother\", \"wavelet\", \"traffic\", \"vps\", \"developmental\", \"lobs\", \"matsuda\", \"eff\", \"photographs\", \"participants\", \"dene\", \"birds\", \"dinners\", \"gist\", \"blog\", \"deg\", \"whitening\", \"sigcomm\", \"notepad\", \"notepads\", \"lmica\", \"video\", \"object\", \"identifiability\", \"lasso\", \"replica\", \"tokens\", \"admixture\", \"balls\", \"discovery\", \"categorization\", \"learner\", \"lifted\", \"bci\", \"pseudo\", \"order\", \"tracklets\", \"condition\", \"screening\", \"word\", \"constraint\", \"objects\", \"set\", \"learning\", \"topic\", \"points\", \"based\", \"used\", \"problem\", \"data\", \"map\", \"state\", \"two\", \"using\", \"model\", \"log\", \"case\", \"models\", \"likelihood\", \"also\", \"control\", \"number\", \"function\", \"results\", \"figure\", \"matrix\", \"features\", \"let\", \"given\", \"one\", \"algorithm\", \"training\", \"time\", \"different\", \"linear\", \"random\", \"distribution\", \"method\", \"sgi\", \"hull\", \"ugi\", \"crm\", \"pumadyn\", \"griffin\", \"nrms\", \"kiji\", \"spambase\", \"hkl\", \"magic\", \"nrm\", \"wv\", \"unnormalized\", \"ngm\", \"pol\", \"slice\", \"hilbertian\", \"twonorm\", \"kgap\", \"abalone\", \"dag\", \"ygi\", \"fv\", \"covariate\", \"mushrooms\", \"sampler\", \"hulls\", \"reversible\", \"bg\", \"predictive\", \"kernel\", \"xg\", \"sngp\", \"embedded\", \"atoms\", \"losses\", \"kernels\", \"calibrated\", \"algorithm\", \"osi\", \"set\", \"large\", \"framework\", \"random\", \"loss\", \"using\", \"learning\", \"size\", \"xt\", \"training\", \"algorithms\", \"also\", \"model\", \"number\", \"space\", \"convex\", \"function\", \"shows\", \"problem\", \"value\", \"however\", \"non\", \"data\", \"given\", \"time\", \"models\", \"used\", \"distribution\", \"one\", \"error\", \"state\", \"results\", \"figure\", \"based\", \"two\", \"neural\", \"entorhinal\", \"radiatum\", \"calcium\", \"perforant\", \"degraded\", \"preparations\", \"eejj\", \"peking\", \"cholinergic\", \"alzheimer\", \"acute\", \"bower\", \"hippocampus\", \"berke\", \"edi\", \"unrobust\", 
\"collaterals\", \"hagan\", \"associations\", \"potassium\", \"stratum\", \"hetero\", \"sga\", \"err\", \"ydxi\", \"biophysical\", \"laminar\", \"wash\", \"eric\", \"sorting\", \"sort\", \"acetylcholine\", \"hippocampal\", \"xu\", \"morris\", \"modulation\", \"surrogate\", \"gyrus\", \"schnell\", \"cal\", \"stdp\", \"robust\", \"ranking\", \"ar\", \"suppression\", \"learning\", \"patterns\", \"risk\", \"estimator\", \"ii\", \"calibrated\", \"kl\", \"rules\", \"ln\", \"networks\", \"region\", \"using\", \"loss\", \"first\", \"probability\", \"assumption\", \"convex\", \"one\", \"used\", \"approach\", \"features\", \"data\", \"algorithm\", \"training\", \"xi\", \"eq\", \"distributions\", \"model\", \"method\", \"distribution\", \"theorem\", \"set\", \"state\", \"function\", \"figure\", \"results\", \"neural\", \"input\", \"number\", \"given\", \"time\", \"large\", \"two\", \"problem\", \"models\", \"error\", \"algorithms\", \"network\"], \"Freq\": [1195.0, 961.0, 1407.0, 873.0, 1431.0, 966.0, 823.0, 635.0, 618.0, 487.0, 679.0, 729.0, 758.0, 728.0, 610.0, 411.0, 500.0, 611.0, 457.0, 668.0, 540.0, 673.0, 455.0, 448.0, 642.0, 506.0, 427.0, 361.0, 372.0, 396.0, 20.6440106814847, 14.768286947470074, 6.817414786832448, 4.068094479611402, 19.070452390370296, 2.5917048001291505, 4.053280798666097, 3.6345950399536346, 4.1080120704424665, 1.4166834530247012, 6.989591819426287, 4.75570853219193, 12.718844534601768, 4.373582604653636, 1.4093784677042898, 2.0513466679080645, 1.3838053019546595, 1.7078679251365456, 7.0237989018992035, 5.694458273006332, 3.844971515906464, 1.0257849684044946, 4.0728345806845345, 1.3166072303553042, 1.2812401726626246, 1.937308014590823, 1.7087811611396535, 22.405998296131944, 14.002070205704758, 0.9871829940856499, 19.166508569217115, 11.526038555752448, 11.75102981676984, 11.669363950480722, 35.50956463537211, 12.605446801444863, 16.87543745049325, 6.865988109744369, 14.32324522779807, 14.018535534894234, 88.52457137934978, 91.40424190841937, 71.00519632844177, 22.762123627362975, 303.7359275582615, 92.37437717849379, 51.07185958564441, 115.18856398417694, 63.290231532848146, 66.40454744613898, 50.05176985484777, 91.22926649202688, 281.64954681814305, 44.66785755992362, 144.86701172508788, 138.810428145265, 92.73726918229085, 193.82779464514945, 228.0426138970156, 150.1613757384552, 96.23894400082344, 129.31418915352324, 107.092166838965, 164.51933536862776, 80.0199014721712, 82.82898292037058, 119.50766055176344, 148.90194174586745, 101.10699079204808, 128.01352443602946, 153.3930287871279, 118.16850933089518, 109.74651270817321, 103.46049295061538, 85.58888021564607, 115.53420109329122, 91.07990380992914, 99.82394686544126, 90.22281120926081, 94.09620768472892, 89.07906781913596, 21.53911159445337, 7.9914057530443445, 4.623563881405115, 3.9920644716157674, 14.014363943431178, 2.753546190230405, 13.44061879065286, 3.5411117230029547, 10.652435327929464, 2.6545271762357996, 2.542803983980876, 2.8991838712257296, 2.554636396531438, 0.9408850037424669, 7.5196303420538575, 1.178841714606221, 2.5121036703361757, 1.168303970014125, 1.5600006939862925, 1.781131957884527, 0.6190333814001633, 1.2088008061372961, 3.3967862632858012, 9.813091042190255, 0.9246950164516736, 10.087221973569472, 8.708252058367062, 4.170951090271558, 2.069233856680642, 5.713262168187321, 28.90010417843369, 24.906396480082442, 10.763526895688756, 12.808577735569735, 13.003404833614786, 11.286474216307502, 13.559470400790978, 25.260356369318114, 13.279193152390759, 10.442121186669734, 48.300212485610245, 
30.61373064144342, 34.03480479892645, 201.0155713546716, 32.480802030031555, 61.630922123720296, 58.23010243208127, 108.26370258351082, 47.476246087988386, 40.76916688048589, 151.97294371972356, 60.13232294206473, 139.32586832599782, 58.78972232523754, 268.94734107567143, 156.28852796908123, 116.70976291803635, 133.46956636813164, 67.50358908580023, 146.8605673817312, 76.93095020515628, 115.49138906235633, 247.37565051984416, 175.45002677365994, 207.69407906794112, 94.44652230817192, 135.56338062516897, 79.99944749798819, 99.42386258024368, 65.02453403877023, 60.12877333630507, 98.47957304376618, 118.02019995288707, 118.29075354721162, 71.44464744553017, 85.4400490834587, 104.7757523430453, 129.98313603864298, 130.60321251500363, 90.62803391221995, 97.63096237530499, 78.86417312289166, 81.82916542224248, 82.13206196691966, 87.5615804748321, 7.758201749923136, 3.2930108526257924, 1.9363811794289367, 1.269591004503053, 3.1046037424776416, 4.2436040164042845, 1.1793756396811061, 15.177509442577083, 2.7058338085683515, 7.5130801580775035, 12.681865681883318, 2.66470953478903, 1.4170664870479703, 2.8402481353942646, 1.3811144390090002, 2.3244827867639137, 1.668216533473952, 1.1303763557516302, 1.4025206913001185, 1.1533473667791363, 3.4097769620804455, 1.3921487189371176, 1.3134465398985713, 12.443806437223294, 1.6524267725945714, 1.3563033541638905, 2.4461386875761035, 18.803553785899556, 0.8061198482143254, 0.7764546216487868, 6.317385591517614, 5.839586947020346, 5.084101112518192, 25.730259362093815, 16.389577552747422, 39.34754601972535, 9.881636522130666, 8.390950824868938, 7.46879172664151, 15.513686689349738, 47.77525879288375, 19.032459521352234, 14.14929357714718, 14.362076963451774, 116.45431719721158, 119.91579899034537, 61.507882386738615, 26.74677662112406, 219.7467858238089, 17.95084591870384, 59.797590674559906, 17.907868288827814, 78.50192987338421, 72.99742018791045, 197.89980753555548, 45.03993386186432, 137.96072509831498, 164.10269855972277, 93.21240803071132, 106.89733952941677, 69.40642916049383, 26.657689718734577, 90.92379951095997, 78.70004962356077, 88.61872691558379, 43.99208827553768, 96.92274150469537, 117.28477034012718, 67.22885284343812, 34.546761658386416, 86.43830117775701, 69.9126603829269, 70.42843954512752, 54.543779012989745, 86.80522360664439, 59.477700689712655, 64.12292299821499, 63.22530785163538, 82.6191252707391, 72.74893573551279, 70.66713803931613, 78.16752076360555, 63.48541440450507, 60.681441554694, 2.3795347200032295, 0.8426863924587441, 6.2164806046057315, 17.287012480073997, 1.5542623624067713, 4.128648011675924, 1.579084092677598, 7.375743383691165, 10.995697468684115, 2.89834004470483, 3.4190792647149633, 1.027141056785191, 3.9237958981142613, 19.912090042364785, 2.3214504881650373, 4.4240101231695474, 17.50599932583188, 1.2453706276083947, 0.7364765830245493, 0.7229697579616675, 7.569728068648008, 1.94249784513201, 3.8726581528071407, 1.6752960508923502, 0.8768613569931746, 0.693206186567833, 1.184492122366873, 0.9365726633565068, 10.374441200231999, 0.4510073530675216, 6.383678377507041, 8.57206926014288, 12.569695909106743, 2.5265283620945493, 13.942779710099892, 10.928388881532468, 11.191064447042846, 9.570178244678884, 130.25411023570987, 22.71725824996586, 22.263408093904125, 52.98376117041707, 95.60269959885694, 56.644671385891186, 38.642469970045255, 101.90604852667356, 58.07847939291381, 69.32005727785284, 116.52270338075704, 58.132269378132996, 39.475230639055624, 47.73751401113219, 62.89702904599368, 33.178983741330605, 
94.47651729093204, 80.24406036427068, 69.37966654764047, 138.8052502654046, 58.27927599700136, 72.30858698963699, 54.309966453200474, 61.62757174254049, 123.64355086566111, 44.00860406180548, 105.75274921457192, 61.226991078503396, 74.78252811820438, 66.50968413830638, 83.41997331453886, 70.39561237887729, 47.21825236606502, 60.085131401789994, 61.21907275580616, 48.454082396080906, 49.703988496150174, 51.81402808765735, 50.780840563823304, 1.5713531418096451, 2.5438082460715217, 1.5467793604367406, 8.620739021108463, 2.7015541947132444, 2.3962381394705052, 5.416039590626053, 1.652997591446023, 2.036326085394666, 6.869154534584633, 0.9313803090510414, 8.78969744999268, 1.301499262344483, 1.091330477869197, 1.9633934256924699, 1.5341926420857712, 5.069188002756316, 1.9607931845414592, 1.066282277134632, 1.0801145561557495, 0.8215492889856338, 0.8294248014781628, 1.472703796624784, 0.8130777305116559, 1.2133409475766612, 2.0781386144642466, 1.6734179370698703, 0.634333185187615, 0.6155698658586253, 1.899565051489764, 3.5758760065566233, 3.20252240632034, 2.628721253730378, 10.986525682571084, 4.501194877999106, 7.691450538828819, 13.864904165754742, 15.0395678482386, 9.230727492587357, 11.915927649453872, 24.78404081076119, 8.240920156025663, 75.162472335404, 32.83317467350116, 35.358810106296296, 20.863085993567363, 6.669917204870094, 41.67921756470765, 127.95343073335425, 59.83780698983501, 16.81073212209051, 45.37089185014776, 153.37859105820803, 42.403563597815335, 75.48797035145334, 81.75150389301376, 33.623989108724324, 47.89901722914676, 76.67395832001651, 100.80235895953503, 52.92404024759112, 54.548790581205694, 25.047754854230714, 86.68875832229119, 105.18926275505156, 73.6180610427364, 117.0553847653627, 64.54756575170873, 39.87244061367769, 70.09873830866383, 55.75427372727449, 43.869537556021776, 35.73855510566107, 58.38995296376592, 44.74022097762267, 58.00387968688623, 49.61632802250833, 50.91853496879445, 57.667352847629665, 65.23057384298514, 57.16976651077156, 47.296477108070604, 56.949834692893326, 47.638193349771484, 46.86359714019988, 4.026358356452829, 2.070616793567729, 1.50201025397041, 3.2463132106246104, 1.45847010621715, 14.027036725503686, 2.823235212616134, 0.9444452436883586, 2.7454321843489673, 3.706480777931948, 4.226126569380088, 0.8825738934832695, 1.8086696537899047, 3.3856818405443785, 1.3354132453601848, 1.1073843143972122, 1.3185375096101326, 0.666085561221687, 1.943663352503726, 40.463622030535824, 1.973770825311503, 1.7249241045783272, 0.637342874086355, 0.6381035131110536, 3.2773411282525813, 8.334742712033236, 2.9582968554527302, 1.0393543474222497, 3.527122004503372, 1.8317156604026381, 9.332897327751985, 3.7549761240742274, 1.6364255135941492, 5.94563845232203, 6.767938444190835, 5.366513218004431, 14.342054928176413, 3.8556577361045092, 10.242322403405552, 25.929627088295195, 20.72074667258608, 19.784120791977983, 36.16253115661904, 13.859304318805922, 90.05624808406566, 29.719614849303596, 60.03993130511508, 30.98703415635482, 44.812457818422715, 35.23035241688997, 25.98222985470263, 76.24449855755894, 71.27630730229968, 124.56148863541425, 56.585888536594105, 28.621291706320278, 63.29198686050176, 119.42446933639222, 40.06547312726031, 87.77092710082353, 76.54371106824478, 26.17433805332188, 35.386839614245226, 58.64564234459521, 92.31041836276736, 72.94144164522359, 52.27611300224872, 37.58839981416754, 71.43932932272794, 43.03671653726441, 58.726497678790444, 46.12525770584434, 38.85653573367063, 40.9045166228196, 39.431236982953735, 
48.89170838933965, 48.80934183160645, 43.58420845049236, 42.682393116589594, 39.08707775930033, 39.41738978792289, 39.30585661590717, 2.5489232581353636, 1.3957910118129622, 0.4953265560078979, 2.2402834264741696, 0.4615639483854566, 0.7641910085386956, 0.9472958772207817, 0.750571677983846, 3.304469495925937, 5.818730821257241, 0.5936219353398935, 0.5765401135579997, 4.247989814384577, 0.2793635234796691, 0.4164587432663953, 1.0113191999632902, 0.4168063017970412, 2.643011122880891, 0.40900351011455477, 0.8239530416913212, 1.2672979772569894, 2.5539028174996856, 8.748968670621224, 0.6942142840346688, 0.6892103220678014, 0.6868985623471375, 0.5409193370295281, 9.84369370438062, 3.03285329284687, 0.2645510544566984, 3.22648711078631, 10.504639027586782, 6.19632121822615, 12.421020877656506, 9.187493675554343, 12.593991285982876, 5.101979960030689, 7.970869904669298, 2.026008577893751, 6.086686157806131, 116.13245880578458, 11.461355321205378, 35.87645580845802, 43.16239394102681, 50.93483378550415, 19.721412969305746, 27.287642489677093, 78.44123406040491, 23.394794582512876, 21.54528967171992, 114.86846398861559, 48.19130512873678, 45.07537133044871, 22.282273980606398, 20.48302557505206, 72.64550310867666, 28.138068188832353, 27.093987245062248, 90.99987133196812, 47.355043890469815, 48.555714787292686, 31.24234587569963, 27.461752797135055, 40.52108100120892, 64.71339739285496, 57.1840522276756, 45.5369723886456, 50.78175018884447, 31.285172562204178, 47.18758061387109, 40.44462681417407, 46.333298113142774, 31.693448815244334, 35.56972866468806, 49.08327191387223, 44.25360404772296, 33.595069114404446, 30.224660691436174, 34.782453567882314, 30.492518893723222, 33.46225420038203, 31.883750070935648, 0.9570146179845763, 1.8827661250557517, 0.9367498961789786, 3.3995638690456436, 1.0754084147550114, 4.214935571691202, 0.4271763725160795, 3.5773150332061547, 0.5599018369569364, 4.112495414799079, 0.7163003917450645, 10.061343666638088, 7.7791127245924265, 1.095832315852087, 0.5480709235317299, 0.5362742305015584, 1.087161314850833, 1.1790772959150944, 0.664189780539765, 8.440325502725923, 0.3927089830384294, 1.7171090331707082, 0.38442369231890255, 0.3882801932699303, 0.9126456116580868, 0.8945277754063324, 0.7812212960565919, 0.3966771731682275, 1.0169330393555631, 1.1504645746593696, 5.600401400878253, 14.887438756042409, 22.144405802276385, 7.550232048525477, 10.778193189680366, 6.4668636374258535, 5.723325186927127, 2.3721227036287345, 7.185363076465937, 6.312590530289151, 2.2451176912373163, 4.843017046088878, 6.427029034063596, 6.58741710224275, 4.14777005021187, 37.96276403288741, 2.7872888925596877, 17.407518066554434, 5.82151811703446, 9.964252667643242, 16.78987169178046, 11.525985745435246, 75.400218290589, 88.38800484758337, 13.587315955325526, 21.63051062103648, 40.29344325870188, 48.79392787197593, 50.11541245336035, 94.2748024092492, 20.58266080180068, 40.109973333449474, 49.502542489101415, 65.17093720416756, 88.61661955609092, 31.076959301446124, 27.835753892962543, 47.662017629896944, 22.127871791376748, 41.15296603477288, 17.825406295886438, 49.91964198705037, 49.77171633328822, 39.45156972302097, 42.712031182805234, 37.824523266805535, 34.50279021837123, 30.2139972896932, 41.31076000211836, 50.273650311391165, 51.993752396734656, 41.33724670251671, 44.16064092054756, 29.31221135130375, 32.8977438649659, 29.647985306623767, 32.68652160058419, 29.482815783717232, 3.7532063294964066, 4.099598613748449, 1.725718366705504, 2.199076292712959, 1.8569269336214778, 
0.9965387586161965, 1.6250332570868404, 0.47404167882462533, 0.6395661025429805, 2.248172183448534, 0.46762434301187644, 0.777005580862535, 1.2578698916046513, 0.4523934392853069, 0.7612537998640414, 2.4266367816357612, 9.484953791915387, 1.207091264931047, 0.594667071653781, 0.890582538158348, 0.4331385312563057, 8.246145501172848, 1.1560648589794345, 2.1207109831371045, 3.0988049343082325, 0.4127949184432781, 10.344089501665403, 0.4209130328518001, 0.4047523177194824, 0.8332728626819088, 7.759941989452648, 47.52491505094083, 2.918198028617337, 4.033109600947796, 6.508291259400894, 4.123242263076617, 6.386654838624866, 15.3351318832135, 5.555249563088389, 73.78108615835714, 3.5527627260987424, 74.66109117612253, 31.890936187221097, 20.061764670894107, 33.72945843004252, 15.408650748101644, 63.832919625317196, 75.43695023703616, 26.083579861368996, 17.120643519357035, 42.35944235048196, 29.276413295327178, 38.089275628156145, 76.35712853507594, 45.41986725787781, 30.302814176548665, 15.588898767139868, 43.82677707603109, 20.32854193341812, 39.37045338883722, 19.72051046360699, 25.599714914072667, 24.351602708091136, 60.030641986362795, 34.95219489155247, 39.78848141546454, 36.73586902803433, 34.58347100824438, 36.22351811637676, 40.30556451803163, 27.097296016758225, 29.394076201744923, 30.433251481299283, 32.213263204071914, 28.51206692493077, 32.77968191399354, 28.584235982704435, 0.3469596106642303, 0.21135871019468716, 0.2436753234471448, 0.10945504513795974, 0.13125340796572732, 0.08468436946452988, 0.13442697788417168, 0.04178208602692706, 0.8332281984310828, 0.061800332763790995, 0.06369204732842552, 0.18696571427786066, 0.20459813044001843, 0.12135317710785151, 0.12290262555070174, 0.14859485070165893, 0.08208921318861732, 0.06038354276418848, 0.08124948469928857, 0.08115575951151154, 0.20006828353649544, 0.06034788452252878, 0.040689685465837774, 0.4443749100608289, 0.06129678919840252, 0.03919200534295807, 0.039617702781308585, 0.03938829214063075, 0.13697919207028061, 0.11491970464548106, 0.6502629339072264, 0.17729523718970305, 0.411753897184447, 0.6566686434931827, 0.1195014761455672, 0.590622241734199, 1.4572516673277345, 0.13550857497646543, 0.1699567601847217, 0.40926464208126845, 0.2803209885654868, 1.2873812397684945, 1.0048752326525263, 0.7309485115085608, 0.3827354810357466, 12.637021669660681, 1.7428723348813564, 1.6381815855090516, 2.769626725360045, 2.469580450914501, 0.7498439550924326, 1.6866854912983553, 1.4616163230782029, 1.2055978666663076, 3.330988553244524, 1.6167421764253085, 8.725057733620723, 2.12478982346342, 4.951584755051168, 4.315875275877912, 1.5397209180111138, 2.203297313041425, 7.142928077432784, 5.682269960885034, 4.357923300500388, 4.308876258810682, 9.414885552218264, 6.671286951558469, 5.22712468548778, 3.039191508607327, 2.6572839130130412, 2.7063816154973, 8.471461537216205, 3.934069001872732, 5.137767306171601, 2.6542588330361423, 6.353685950801212, 4.123843468555472, 5.1567074856287745, 4.5192159640857446, 4.141319920678175, 3.966908937871499, 3.19194663245482, 4.719227824977662, 4.220253787157977, 4.705660733739358, 3.189032658340653, 4.140957250269099, 3.9992026939736243, 3.842910011639526, 3.277709921925665, 3.211894592095485, 3.2216564706064337], \"Total\": [1195.0, 961.0, 1407.0, 873.0, 1431.0, 966.0, 823.0, 635.0, 618.0, 487.0, 679.0, 729.0, 758.0, 728.0, 610.0, 411.0, 500.0, 611.0, 457.0, 668.0, 540.0, 673.0, 455.0, 448.0, 642.0, 506.0, 427.0, 361.0, 372.0, 396.0, 46.26949659094198, 35.56784901430009, 16.99601807451592, 
10.403003224865609, 48.80353294944979, 6.655490681157698, 10.565485858770964, 9.498133796461767, 10.829878561950213, 3.777344379589113, 18.68129499377603, 12.964792060551067, 34.87740927056573, 12.00358077984416, 3.8985050712953844, 5.689233230110952, 3.8645076572407833, 4.799944550589734, 19.818093617305923, 16.124253958087706, 10.949369239335514, 2.9310392106082115, 11.667424757072348, 3.794357238976192, 3.71118813286369, 5.6201264933949355, 4.967498710398547, 65.48756484455541, 41.663530023406736, 2.9397519335052804, 57.32453226622249, 34.7873591043794, 35.63244194529723, 35.47366361782921, 114.96772009043424, 39.37783633700974, 53.731468566411564, 20.919135857458, 46.07749249435479, 45.19743286383294, 339.29034503212927, 353.2032144071574, 274.3522996039958, 77.71135664815532, 1407.273660704956, 372.0398605412698, 192.76994848935044, 487.1202850504721, 248.08395488753914, 262.8474984018128, 191.28221975712316, 385.44722482884146, 1431.984008449568, 170.6696618202805, 673.671027390825, 642.8982434526419, 400.7476191147177, 961.5305415932702, 1195.6862574171353, 728.902310820616, 429.24297961056976, 618.5857198809623, 506.47366644701617, 873.441661941156, 356.92009944423427, 373.99838684654475, 610.2302484575663, 823.6936357079712, 521.2301962519678, 729.2915932180222, 966.4005498599995, 679.116562738571, 611.9734151321558, 562.8345892296189, 422.9248633538833, 758.4270425720434, 500.87214062634985, 635.8167746006652, 515.3836205188635, 668.1881093519099, 540.6046981534062, 57.519225846783286, 22.030677575857364, 12.762721234249081, 11.076081623172366, 39.772551231994086, 7.960908819967716, 39.464189950966116, 10.416784706941975, 32.06597908099562, 8.052683096862646, 7.757508002326741, 8.938244755540572, 7.892167465260508, 2.9137740767544713, 23.305939027642143, 3.6919424397062746, 7.903820763868093, 3.69235721977951, 4.936304351054405, 5.65151721850273, 1.9706876502478634, 3.8592361309170613, 10.857951333323772, 31.392613743296447, 2.9633588753443436, 32.363767793685696, 27.9497642568613, 13.392415642026759, 6.658935821243224, 18.465462729780537, 93.93700198326073, 81.36310600311428, 34.940056766417875, 41.76591986657553, 42.52071852766598, 36.95596170899812, 44.9594456983409, 85.49879868100693, 44.12254717652642, 34.821429100141756, 175.72231570527924, 108.95731563003001, 122.4720222877545, 823.6936357079712, 117.54288172041171, 234.79125123024468, 224.09978232045, 457.6020438017594, 183.9777113462449, 155.83254846190889, 679.116562738571, 243.7420205987841, 642.8982434526419, 243.6911925299078, 1407.273660704956, 758.4270425720434, 540.6046981534062, 635.8167746006652, 290.6353179951054, 728.902310820616, 344.5554213436141, 562.8345892296189, 1431.984008449568, 966.4005498599995, 1195.6862574171353, 455.574061055531, 729.2915932180222, 374.336987173198, 506.47366644701617, 293.1111838607104, 264.26049708072264, 515.3836205188635, 668.1881093519099, 673.671027390825, 335.66573368172186, 448.2831991780111, 618.5857198809623, 873.441661941156, 961.5305415932702, 521.2301962519678, 611.9734151321558, 411.93644338116826, 479.20331553453616, 500.87214062634985, 610.2302484575663, 19.769440970680876, 8.79029980453839, 5.724683985550034, 3.8061665792678, 9.633197836922887, 13.203591443492636, 3.6912851409869347, 47.523503150076365, 8.597812759957197, 24.060914457030652, 40.716315570542385, 8.641409036664125, 4.627660825549739, 9.394257978291074, 4.581293548854388, 7.723616819224283, 5.560070816114689, 3.7869905627957716, 4.757117498393694, 3.9150807755907975, 11.669688942189307, 
4.772910629026891, 4.568697013863613, 43.33326670513176, 5.7611943256755085, 4.739239721771799, 8.555252982728824, 65.91959652892822, 2.82976024829395, 2.73932483516138, 22.338371667706085, 21.13644214736056, 18.365515235230543, 98.12751382281128, 62.84326627549161, 158.9880817763593, 37.4926810858836, 31.687899726785353, 28.038238935133077, 62.02957400452683, 217.59398152552058, 79.32230082919055, 57.63808458163869, 59.951471280849134, 635.8167746006652, 668.1881093519099, 319.25998454254005, 122.68403702836332, 1407.273660704956, 78.46232112754637, 318.54991662618966, 78.55704420539193, 448.2831991780111, 425.3948949085436, 1431.984008449568, 240.85904958358205, 961.5305415932702, 1195.6862574171353, 611.9734151321558, 729.2915932180222, 427.55208645610537, 129.34622975005055, 618.5857198809623, 521.2301962519678, 610.2302484575663, 249.22173329384376, 728.902310820616, 966.4005498599995, 449.99716905582846, 182.15302429248712, 679.116562738571, 506.47366644701617, 515.3836205188635, 353.2032144071574, 758.4270425720434, 429.24297961056976, 487.1202850504721, 479.20331553453616, 823.6936357079712, 642.8982434526419, 673.671027390825, 873.441661941156, 562.8345892296189, 540.6046981534062, 7.651414401666015, 2.7823344864733426, 21.146857883700243, 59.99279643192883, 5.4827702644829595, 14.622764565486273, 5.634729883851929, 26.336557170888753, 39.262847419323464, 10.392677710059358, 12.378484447530726, 3.734200275746947, 14.277796352014802, 72.47357920659536, 8.578563574482063, 16.418363694148507, 65.20642147653092, 4.655976904150263, 2.8052562483358776, 2.769372599738421, 29.263507061576295, 7.558806865531336, 15.126928233462827, 6.590247520270733, 3.4786841585067805, 2.755373298993003, 4.731506028015227, 3.745325104636049, 41.59348067062316, 1.8324173377864301, 26.230868967860307, 35.427910544968356, 53.49855353142143, 10.352295859335308, 64.50253709531879, 51.04044808565002, 52.881475567792464, 46.01497142831539, 873.441661941156, 124.28478255112395, 123.90392071614436, 336.14673770867523, 668.1881093519099, 367.9968167945871, 236.14488281788385, 758.4270425720434, 396.96683484839036, 500.87214062634985, 961.5305415932702, 416.8716465960334, 262.59401742362815, 335.66573368172186, 479.20331553453616, 211.96813582125745, 823.6936357079712, 679.116562738571, 562.8345892296189, 1431.984008449568, 457.6020438017594, 618.5857198809623, 425.3948949085436, 515.3836205188635, 1407.273660704956, 327.4854432408144, 1195.6862574171353, 540.6046981534062, 729.2915932180222, 611.9734151321558, 966.4005498599995, 728.902310820616, 385.44722482884146, 642.8982434526419, 673.671027390825, 411.93644338116826, 455.574061055531, 635.8167746006652, 610.2302484575663, 5.142995453497826, 8.380516666918913, 5.187699165888176, 30.025117159214776, 9.512086762021099, 8.461738245621612, 19.384186683170675, 5.976624197896211, 7.675552542732718, 26.07338836943623, 3.558404979515328, 33.98799667193393, 5.044054604690889, 4.251032272310826, 7.6760377106973685, 6.114566112037209, 20.21993598980563, 7.925111281398334, 4.381764489030022, 4.456047616466653, 3.4009725570491076, 3.477341506485557, 6.186375733222197, 3.420936521271077, 5.113784740849575, 8.772528747848547, 7.077597746193902, 2.685747136937168, 2.614300349540071, 8.107035314573407, 15.34066476762932, 13.890947556134222, 11.398499331679394, 49.91938654429592, 19.965470713269518, 34.90713302458184, 64.79005418921342, 70.85791485259932, 42.87701534854738, 57.48013945226924, 126.67849687167354, 39.47625999881498, 449.99716905582846, 183.07213907288482, 
203.2210191592768, 115.82515269764814, 31.9356325341013, 262.8474984018128, 966.4005498599995, 411.93644338116826, 96.26335809251898, 321.2941028466383, 1431.984008449568, 301.03098892420627, 610.2302484575663, 679.116562738571, 229.14159375926826, 368.15876879435507, 673.671027390825, 961.5305415932702, 425.3948949085436, 455.574061055531, 164.2722459939792, 873.441661941156, 1195.6862574171353, 728.902310820616, 1407.273660704956, 635.8167746006652, 325.22218290354186, 758.4270425720434, 540.6046981534062, 379.3514952191112, 278.7057263684126, 611.9734151321558, 400.7476191147177, 618.5857198809623, 479.20331553453616, 506.47366644701617, 642.8982434526419, 823.6936357079712, 668.1881093519099, 448.2831991780111, 729.2915932180222, 487.1202850504721, 562.8345892296189, 13.166108260761519, 6.940840415398054, 5.279622227594939, 11.590188142284742, 5.2089749234288885, 50.172762657904485, 10.414380879288375, 3.5183735332533885, 10.518688065024335, 14.228294462146328, 16.454767431265047, 3.4518215999203368, 7.075910765248164, 13.268826779901474, 5.241309843165724, 4.361338630024036, 5.236990864069075, 2.646586863050284, 7.8924503550081155, 164.7227576651759, 8.035150814735763, 7.039484426770902, 2.6186657270752787, 2.637996388905934, 13.614305016790762, 34.89048078777976, 12.412757439620282, 4.385708805531656, 14.925874206726649, 7.779092766211843, 40.61044883018285, 16.064121021966656, 6.962491678057731, 27.22501369433256, 31.687899726785353, 24.894767772448617, 74.17229655094157, 17.75252998612476, 54.38330530311589, 158.9880817763593, 123.24203957150945, 118.51637631026557, 243.6911925299078, 81.36310600311428, 758.4270425720434, 202.89153183439126, 500.87214062634985, 228.15489387386432, 363.9001275370723, 270.42020264137363, 184.65268971066317, 729.2915932180222, 673.671027390825, 1431.984008449568, 521.2301962519678, 214.15828493822357, 610.2302484575663, 1407.273660704956, 339.29034503212927, 966.4005498599995, 823.6936357079712, 196.8163579059966, 299.9833727616921, 611.9734151321558, 1195.6862574171353, 873.441661941156, 562.8345892296189, 344.3387460816088, 961.5305415932702, 427.55208645610537, 728.902310820616, 487.1202850504721, 374.336987173198, 416.8716465960334, 396.96683484839036, 642.8982434526419, 668.1881093519099, 515.3836205188635, 540.6046981534062, 479.20331553453616, 618.5857198809623, 679.116562738571, 12.132782278815254, 7.1489827040948635, 2.6457512228243893, 12.811557351336688, 2.6634925635897373, 4.4179398958032685, 5.480829207978062, 4.486633737782545, 19.761754938208863, 35.333796351458794, 3.6360854359237442, 3.5580145264844045, 26.49035057952651, 1.76506427105297, 2.65632355811705, 6.483694463857461, 2.6797709224029775, 17.080267742351232, 2.6436318281786733, 5.3281414262704825, 8.218373825716087, 16.5636059443128, 56.876125505190174, 4.527051060763413, 4.513427726272856, 4.509043166939391, 3.5583410660368946, 65.07865524771572, 20.122797824044152, 1.7554746839395516, 22.177230405325158, 74.17229655094157, 43.7810821241646, 90.59570122393241, 66.43786835024711, 95.85021615407358, 37.499468849944954, 60.330217912516424, 14.140575484792876, 46.44414245470595, 1195.6862574171353, 96.26335809251898, 342.19693785537555, 427.55208645610537, 515.3836205188635, 179.29742938190142, 262.80202460465983, 873.441661941156, 221.9986786913688, 203.8907109707147, 1431.984008449568, 521.2301962519678, 487.1202850504721, 217.2703675808055, 198.4271360845006, 961.5305415932702, 301.03098892420627, 290.6353179951054, 1407.273660704956, 611.9734151321558, 635.8167746006652, 
356.92009944423427, 302.20900342696063, 506.47366644701617, 966.4005498599995, 823.6936357079712, 610.2302484575663, 729.2915932180222, 367.9968167945871, 679.116562738571, 540.6046981534062, 668.1881093519099, 376.1793268911525, 449.99716905582846, 758.4270425720434, 673.671027390825, 429.24297961056976, 373.99838684654475, 618.5857198809623, 400.7476191147177, 728.902310820616, 642.8982434526419, 5.263868109446318, 11.007518273200683, 5.4827702644829595, 20.401054497963116, 6.599268349887305, 25.896283571042854, 2.6260480604056067, 22.484051980555705, 3.5264690843259467, 26.260955795821022, 4.598372204116655, 65.20642147653092, 51.04044808565002, 7.2006549512439015, 3.614949114086691, 3.5554160418090954, 7.218784162377749, 7.8774142453466, 4.450736824711313, 56.903491810961015, 2.6760202687844816, 11.762912743085213, 2.6408933273489126, 2.667934846340648, 6.304548817159557, 6.18103138725949, 5.421676603319898, 2.7532377071868477, 7.069384388165992, 8.00761886561377, 39.49750769035408, 106.44089439580074, 164.2722459939792, 54.850715536045634, 79.32230082919055, 47.84903392755638, 42.50489709846814, 17.009905393507083, 54.38330530311589, 48.640817476353284, 16.206084901590604, 37.00561907351631, 50.172762657904485, 52.14352577981948, 31.454880082437327, 372.0398605412698, 20.61114527685216, 161.46243557103494, 47.523503150076365, 88.2619134240464, 164.7227576651759, 106.46646454173849, 966.4005498599995, 1195.6862574171353, 129.22064394783905, 225.49360897463828, 479.20331553453616, 618.5857198809623, 642.8982434526419, 1407.273660704956, 217.2703675808055, 500.87214062634985, 673.671027390825, 961.5305415932702, 1431.984008449568, 376.1793268911525, 327.4854432408144, 668.1881093519099, 242.26537501337432, 562.8345892296189, 184.65268971066317, 729.2915932180222, 728.902310820616, 540.6046981534062, 610.2302484575663, 515.3836205188635, 457.6020438017594, 379.3514952191112, 611.9734151321558, 823.6936357079712, 873.441661941156, 635.8167746006652, 758.4270425720434, 400.7476191147177, 521.2301962519678, 422.9248633538833, 679.116562738571, 448.2831991780111, 18.145783270373627, 20.32156993554304, 9.035747124146242, 11.861072117850796, 10.09039067672324, 5.42087327299379, 9.12460384720486, 2.701612573295905, 3.646628090004765, 12.866857279676852, 2.6779948310121453, 4.495516626500624, 7.360424012560908, 2.663119673406336, 4.492334347516771, 14.852182104989728, 58.22803938723736, 7.43180579720451, 3.703994473471762, 5.552328414842075, 2.7107214493885854, 52.13800676956095, 7.349152133958479, 13.68086297492341, 20.139304907542762, 2.698567708849897, 67.6652253116254, 2.7573848998546904, 2.6812436027776867, 5.52554337470265, 56.143418562668906, 367.9968167945871, 20.536148856516874, 28.98200256722934, 50.347558143132574, 31.379589753506018, 51.18977961552851, 138.39080996302872, 44.531310247496116, 873.441661941156, 26.993822629227292, 966.4005498599995, 361.4696035154088, 219.69114683128575, 422.9248633538833, 164.90828034540374, 961.5305415932702, 1195.6862574171353, 321.63796384055263, 194.42458112888693, 635.8167746006652, 396.96683484839036, 562.8345892296189, 1431.984008449568, 729.2915932180222, 429.24297961056976, 178.6428346747292, 728.902310820616, 257.74011031998793, 642.8982434526419, 250.44585178464158, 368.15876879435507, 342.19693785537555, 1407.273660704956, 611.9734151321558, 758.4270425720434, 668.1881093519099, 618.5857198809623, 679.116562738571, 823.6936357079712, 427.55208645610537, 500.87214062634985, 540.6046981534062, 610.2302484575663, 479.20331553453616, 
673.671027390825, 506.47366644701617, 11.398499331679394, 7.357650174268842, 9.321372046702704, 4.207148310561729, 5.063008588196205, 3.3720690676368097, 5.36590351924741, 1.7033616687831892, 33.98799667193393, 2.5285472534086244, 2.6085850341356913, 7.729712512544278, 8.461738245621612, 5.044054604690889, 5.113784740849575, 6.190561045511806, 3.424574581918268, 2.527062428446163, 3.4009725570491076, 3.3987117123416333, 8.380516666918913, 2.5400280212201576, 1.7281635605068049, 18.881491012670118, 2.617643577231054, 1.6781489922629345, 1.6970099996432317, 1.6913622305907052, 5.909513369030408, 4.967632546868985, 28.210766657438928, 7.675552542732718, 18.131136277862232, 29.422485169522748, 5.175141512901512, 27.771686997943974, 74.16138440858624, 5.976624197896211, 7.6760377106973685, 20.21993598980563, 13.362135243001166, 73.7547785993298, 56.530836137253296, 40.30012456652134, 19.384186683170675, 1195.6862574171353, 116.94492255046302, 108.95731563003001, 203.2210191592768, 181.47095348678437, 44.531310247496116, 119.24309695030803, 100.2832454153334, 79.39208746870557, 278.7057263684126, 115.82515269764814, 961.5305415932702, 164.90828034540374, 487.1202850504721, 411.93644338116826, 111.96371991644915, 178.6428346747292, 823.6936357079712, 618.5857198809623, 455.574061055531, 457.6020438017594, 1407.273660704956, 873.441661941156, 635.8167746006652, 293.1111838607104, 243.7420205987841, 251.10781474485762, 1431.984008449568, 448.2831991780111, 679.116562738571, 247.347028040399, 966.4005498599995, 500.87214062634985, 728.902310820616, 610.2302484575663, 540.6046981534062, 506.47366644701617, 353.2032144071574, 729.2915932180222, 611.9734151321558, 758.4270425720434, 361.4696035154088, 673.671027390825, 642.8982434526419, 668.1881093519099, 427.55208645610537, 396.96683484839036, 449.99716905582846], \"Category\": [\"Default\", \"Default\", \"Default\", \"Default\", \"Default\", \"Default\", \"Default\", \"Default\", \"Default\", \"Default\", \"Default\", \"Default\", \"Default\", \"Default\", \"Default\", \"Default\", \"Default\", \"Default\", \"Default\", \"Default\", \"Default\", \"Default\", \"Default\", \"Default\", \"Default\", \"Default\", \"Default\", \"Default\", \"Default\", \"Default\", \"Topic1\", \"Topic1\", \"Topic1\", \"Topic1\", \"Topic1\", \"Topic1\", \"Topic1\", \"Topic1\", \"Topic1\", \"Topic1\", \"Topic1\", \"Topic1\", \"Topic1\", \"Topic1\", \"Topic1\", \"Topic1\", \"Topic1\", \"Topic1\", \"Topic1\", \"Topic1\", \"Topic1\", \"Topic1\", \"Topic1\", \"Topic1\", \"Topic1\", \"Topic1\", \"Topic1\", \"Topic1\", \"Topic1\", \"Topic1\", \"Topic1\", \"Topic1\", \"Topic1\", \"Topic1\", \"Topic1\", \"Topic1\", \"Topic1\", \"Topic1\", \"Topic1\", \"Topic1\", \"Topic1\", \"Topic1\", \"Topic1\", \"Topic1\", \"Topic1\", \"Topic1\", \"Topic1\", \"Topic1\", \"Topic1\", \"Topic1\", \"Topic1\", \"Topic1\", \"Topic1\", \"Topic1\", \"Topic1\", \"Topic1\", \"Topic1\", \"Topic1\", \"Topic1\", \"Topic1\", \"Topic1\", \"Topic1\", \"Topic1\", \"Topic1\", \"Topic1\", \"Topic1\", \"Topic1\", \"Topic1\", \"Topic1\", \"Topic1\", \"Topic1\", \"Topic1\", \"Topic1\", \"Topic1\", \"Topic1\", \"Topic1\", \"Topic1\", \"Topic1\", \"Topic1\", \"Topic1\", \"Topic1\", \"Topic2\", \"Topic2\", \"Topic2\", \"Topic2\", \"Topic2\", \"Topic2\", \"Topic2\", \"Topic2\", \"Topic2\", \"Topic2\", \"Topic2\", \"Topic2\", \"Topic2\", \"Topic2\", \"Topic2\", \"Topic2\", \"Topic2\", \"Topic2\", \"Topic2\", \"Topic2\", \"Topic2\", \"Topic2\", \"Topic2\", \"Topic2\", \"Topic2\", \"Topic2\", \"Topic2\", \"Topic2\", 
\"Topic2\", \"Topic2\", \"Topic2\", \"Topic2\", \"Topic2\", \"Topic2\", \"Topic2\", \"Topic2\", \"Topic2\", \"Topic2\", \"Topic2\", \"Topic2\", \"Topic2\", \"Topic2\", \"Topic2\", \"Topic2\", \"Topic2\", \"Topic2\", \"Topic2\", \"Topic2\", \"Topic2\", \"Topic2\", \"Topic2\", \"Topic2\", \"Topic2\", \"Topic2\", \"Topic2\", \"Topic2\", \"Topic2\", \"Topic2\", \"Topic2\", \"Topic2\", \"Topic2\", \"Topic2\", \"Topic2\", \"Topic2\", \"Topic2\", \"Topic2\", \"Topic2\", \"Topic2\", \"Topic2\", \"Topic2\", \"Topic2\", \"Topic2\", \"Topic2\", \"Topic2\", \"Topic2\", \"Topic2\", \"Topic2\", \"Topic2\", \"Topic2\", \"Topic2\", \"Topic2\", \"Topic2\", \"Topic2\", \"Topic2\", \"Topic2\", \"Topic3\", \"Topic3\", \"Topic3\", \"Topic3\", \"Topic3\", \"Topic3\", \"Topic3\", \"Topic3\", \"Topic3\", \"Topic3\", \"Topic3\", \"Topic3\", \"Topic3\", \"Topic3\", \"Topic3\", \"Topic3\", \"Topic3\", \"Topic3\", \"Topic3\", \"Topic3\", \"Topic3\", \"Topic3\", \"Topic3\", \"Topic3\", \"Topic3\", \"Topic3\", \"Topic3\", \"Topic3\", \"Topic3\", \"Topic3\", \"Topic3\", \"Topic3\", \"Topic3\", \"Topic3\", \"Topic3\", \"Topic3\", \"Topic3\", \"Topic3\", \"Topic3\", \"Topic3\", \"Topic3\", \"Topic3\", \"Topic3\", \"Topic3\", \"Topic3\", \"Topic3\", \"Topic3\", \"Topic3\", \"Topic3\", \"Topic3\", \"Topic3\", \"Topic3\", \"Topic3\", \"Topic3\", \"Topic3\", \"Topic3\", \"Topic3\", \"Topic3\", \"Topic3\", \"Topic3\", \"Topic3\", \"Topic3\", \"Topic3\", \"Topic3\", \"Topic3\", \"Topic3\", \"Topic3\", \"Topic3\", \"Topic3\", \"Topic3\", \"Topic3\", \"Topic3\", \"Topic3\", \"Topic3\", \"Topic3\", \"Topic3\", \"Topic3\", \"Topic3\", \"Topic3\", \"Topic3\", \"Topic3\", \"Topic3\", \"Topic3\", \"Topic3\", \"Topic4\", \"Topic4\", \"Topic4\", \"Topic4\", \"Topic4\", \"Topic4\", \"Topic4\", \"Topic4\", \"Topic4\", \"Topic4\", \"Topic4\", \"Topic4\", \"Topic4\", \"Topic4\", \"Topic4\", \"Topic4\", \"Topic4\", \"Topic4\", \"Topic4\", \"Topic4\", \"Topic4\", \"Topic4\", \"Topic4\", \"Topic4\", \"Topic4\", \"Topic4\", \"Topic4\", \"Topic4\", \"Topic4\", \"Topic4\", \"Topic4\", \"Topic4\", \"Topic4\", \"Topic4\", \"Topic4\", \"Topic4\", \"Topic4\", \"Topic4\", \"Topic4\", \"Topic4\", \"Topic4\", \"Topic4\", \"Topic4\", \"Topic4\", \"Topic4\", \"Topic4\", \"Topic4\", \"Topic4\", \"Topic4\", \"Topic4\", \"Topic4\", \"Topic4\", \"Topic4\", \"Topic4\", \"Topic4\", \"Topic4\", \"Topic4\", \"Topic4\", \"Topic4\", \"Topic4\", \"Topic4\", \"Topic4\", \"Topic4\", \"Topic4\", \"Topic4\", \"Topic4\", \"Topic4\", \"Topic4\", \"Topic4\", \"Topic4\", \"Topic4\", \"Topic4\", \"Topic4\", \"Topic4\", \"Topic4\", \"Topic4\", \"Topic4\", \"Topic5\", \"Topic5\", \"Topic5\", \"Topic5\", \"Topic5\", \"Topic5\", \"Topic5\", \"Topic5\", \"Topic5\", \"Topic5\", \"Topic5\", \"Topic5\", \"Topic5\", \"Topic5\", \"Topic5\", \"Topic5\", \"Topic5\", \"Topic5\", \"Topic5\", \"Topic5\", \"Topic5\", \"Topic5\", \"Topic5\", \"Topic5\", \"Topic5\", \"Topic5\", \"Topic5\", \"Topic5\", \"Topic5\", \"Topic5\", \"Topic5\", \"Topic5\", \"Topic5\", \"Topic5\", \"Topic5\", \"Topic5\", \"Topic5\", \"Topic5\", \"Topic5\", \"Topic5\", \"Topic5\", \"Topic5\", \"Topic5\", \"Topic5\", \"Topic5\", \"Topic5\", \"Topic5\", \"Topic5\", \"Topic5\", \"Topic5\", \"Topic5\", \"Topic5\", \"Topic5\", \"Topic5\", \"Topic5\", \"Topic5\", \"Topic5\", \"Topic5\", \"Topic5\", \"Topic5\", \"Topic5\", \"Topic5\", \"Topic5\", \"Topic5\", \"Topic5\", \"Topic5\", \"Topic5\", \"Topic5\", \"Topic5\", \"Topic5\", \"Topic5\", \"Topic5\", \"Topic5\", \"Topic5\", \"Topic5\", \"Topic5\", \"Topic5\", \"Topic5\", 
\"Topic5\", \"Topic5\", \"Topic5\", \"Topic5\", \"Topic5\", \"Topic5\", \"Topic5\", \"Topic6\", \"Topic6\", \"Topic6\", \"Topic6\", \"Topic6\", \"Topic6\", \"Topic6\", \"Topic6\", \"Topic6\", \"Topic6\", \"Topic6\", \"Topic6\", \"Topic6\", \"Topic6\", \"Topic6\", \"Topic6\", \"Topic6\", \"Topic6\", \"Topic6\", \"Topic6\", \"Topic6\", \"Topic6\", \"Topic6\", \"Topic6\", \"Topic6\", \"Topic6\", \"Topic6\", \"Topic6\", \"Topic6\", \"Topic6\", \"Topic6\", \"Topic6\", \"Topic6\", \"Topic6\", \"Topic6\", \"Topic6\", \"Topic6\", \"Topic6\", \"Topic6\", \"Topic6\", \"Topic6\", \"Topic6\", \"Topic6\", \"Topic6\", \"Topic6\", \"Topic6\", \"Topic6\", \"Topic6\", \"Topic6\", \"Topic6\", \"Topic6\", \"Topic6\", \"Topic6\", \"Topic6\", \"Topic6\", \"Topic6\", \"Topic6\", \"Topic6\", \"Topic6\", \"Topic6\", \"Topic6\", \"Topic6\", \"Topic6\", \"Topic6\", \"Topic6\", \"Topic6\", \"Topic6\", \"Topic6\", \"Topic6\", \"Topic6\", \"Topic6\", \"Topic6\", \"Topic6\", \"Topic6\", \"Topic6\", \"Topic6\", \"Topic6\", \"Topic6\", \"Topic6\", \"Topic6\", \"Topic6\", \"Topic6\", \"Topic7\", \"Topic7\", \"Topic7\", \"Topic7\", \"Topic7\", \"Topic7\", \"Topic7\", \"Topic7\", \"Topic7\", \"Topic7\", \"Topic7\", \"Topic7\", \"Topic7\", \"Topic7\", \"Topic7\", \"Topic7\", \"Topic7\", \"Topic7\", \"Topic7\", \"Topic7\", \"Topic7\", \"Topic7\", \"Topic7\", \"Topic7\", \"Topic7\", \"Topic7\", \"Topic7\", \"Topic7\", \"Topic7\", \"Topic7\", \"Topic7\", \"Topic7\", \"Topic7\", \"Topic7\", \"Topic7\", \"Topic7\", \"Topic7\", \"Topic7\", \"Topic7\", \"Topic7\", \"Topic7\", \"Topic7\", \"Topic7\", \"Topic7\", \"Topic7\", \"Topic7\", \"Topic7\", \"Topic7\", \"Topic7\", \"Topic7\", \"Topic7\", \"Topic7\", \"Topic7\", \"Topic7\", \"Topic7\", \"Topic7\", \"Topic7\", \"Topic7\", \"Topic7\", \"Topic7\", \"Topic7\", \"Topic7\", \"Topic7\", \"Topic7\", \"Topic7\", \"Topic7\", \"Topic7\", \"Topic7\", \"Topic7\", \"Topic7\", \"Topic7\", \"Topic7\", \"Topic7\", \"Topic7\", \"Topic7\", \"Topic7\", \"Topic7\", \"Topic7\", \"Topic7\", \"Topic7\", \"Topic7\", \"Topic7\", \"Topic8\", \"Topic8\", \"Topic8\", \"Topic8\", \"Topic8\", \"Topic8\", \"Topic8\", \"Topic8\", \"Topic8\", \"Topic8\", \"Topic8\", \"Topic8\", \"Topic8\", \"Topic8\", \"Topic8\", \"Topic8\", \"Topic8\", \"Topic8\", \"Topic8\", \"Topic8\", \"Topic8\", \"Topic8\", \"Topic8\", \"Topic8\", \"Topic8\", \"Topic8\", \"Topic8\", \"Topic8\", \"Topic8\", \"Topic8\", \"Topic8\", \"Topic8\", \"Topic8\", \"Topic8\", \"Topic8\", \"Topic8\", \"Topic8\", \"Topic8\", \"Topic8\", \"Topic8\", \"Topic8\", \"Topic8\", \"Topic8\", \"Topic8\", \"Topic8\", \"Topic8\", \"Topic8\", \"Topic8\", \"Topic8\", \"Topic8\", \"Topic8\", \"Topic8\", \"Topic8\", \"Topic8\", \"Topic8\", \"Topic8\", \"Topic8\", \"Topic8\", \"Topic8\", \"Topic8\", \"Topic8\", \"Topic8\", \"Topic8\", \"Topic8\", \"Topic8\", \"Topic8\", \"Topic8\", \"Topic8\", \"Topic8\", \"Topic8\", \"Topic8\", \"Topic8\", \"Topic8\", \"Topic8\", \"Topic8\", \"Topic8\", \"Topic8\", \"Topic8\", \"Topic8\", \"Topic8\", \"Topic8\", \"Topic8\", \"Topic8\", \"Topic8\", \"Topic8\", \"Topic8\", \"Topic8\", \"Topic8\", \"Topic9\", \"Topic9\", \"Topic9\", \"Topic9\", \"Topic9\", \"Topic9\", \"Topic9\", \"Topic9\", \"Topic9\", \"Topic9\", \"Topic9\", \"Topic9\", \"Topic9\", \"Topic9\", \"Topic9\", \"Topic9\", \"Topic9\", \"Topic9\", \"Topic9\", \"Topic9\", \"Topic9\", \"Topic9\", \"Topic9\", \"Topic9\", \"Topic9\", \"Topic9\", \"Topic9\", \"Topic9\", \"Topic9\", \"Topic9\", \"Topic9\", \"Topic9\", \"Topic9\", \"Topic9\", \"Topic9\", \"Topic9\", \"Topic9\", 
\"Topic9\", \"Topic9\", \"Topic9\", \"Topic9\", \"Topic9\", \"Topic9\", \"Topic9\", \"Topic9\", \"Topic9\", \"Topic9\", \"Topic9\", \"Topic9\", \"Topic9\", \"Topic9\", \"Topic9\", \"Topic9\", \"Topic9\", \"Topic9\", \"Topic9\", \"Topic9\", \"Topic9\", \"Topic9\", \"Topic9\", \"Topic9\", \"Topic9\", \"Topic9\", \"Topic9\", \"Topic9\", \"Topic9\", \"Topic9\", \"Topic9\", \"Topic9\", \"Topic9\", \"Topic9\", \"Topic9\", \"Topic9\", \"Topic9\", \"Topic9\", \"Topic9\", \"Topic9\", \"Topic10\", \"Topic10\", \"Topic10\", \"Topic10\", \"Topic10\", \"Topic10\", \"Topic10\", \"Topic10\", \"Topic10\", \"Topic10\", \"Topic10\", \"Topic10\", \"Topic10\", \"Topic10\", \"Topic10\", \"Topic10\", \"Topic10\", \"Topic10\", \"Topic10\", \"Topic10\", \"Topic10\", \"Topic10\", \"Topic10\", \"Topic10\", \"Topic10\", \"Topic10\", \"Topic10\", \"Topic10\", \"Topic10\", \"Topic10\", \"Topic10\", \"Topic10\", \"Topic10\", \"Topic10\", \"Topic10\", \"Topic10\", \"Topic10\", \"Topic10\", \"Topic10\", \"Topic10\", \"Topic10\", \"Topic10\", \"Topic10\", \"Topic10\", \"Topic10\", \"Topic10\", \"Topic10\", \"Topic10\", \"Topic10\", \"Topic10\", \"Topic10\", \"Topic10\", \"Topic10\", \"Topic10\", \"Topic10\", \"Topic10\", \"Topic10\", \"Topic10\", \"Topic10\", \"Topic10\", \"Topic10\", \"Topic10\", \"Topic10\", \"Topic10\", \"Topic10\", \"Topic10\", \"Topic10\", \"Topic10\", \"Topic10\", \"Topic10\", \"Topic10\", \"Topic10\", \"Topic10\", \"Topic10\", \"Topic10\", \"Topic10\", \"Topic10\", \"Topic10\", \"Topic10\", \"Topic10\", \"Topic10\", \"Topic10\", \"Topic10\", \"Topic10\", \"Topic10\", \"Topic10\", \"Topic10\", \"Topic10\", \"Topic10\", \"Topic10\", \"Topic10\", \"Topic10\", \"Topic10\"], \"logprob\": [30.0, 29.0, 28.0, 27.0, 26.0, 25.0, 24.0, 23.0, 22.0, 21.0, 20.0, 19.0, 18.0, 17.0, 16.0, 15.0, 14.0, 13.0, 12.0, 11.0, 10.0, 9.0, 8.0, 7.0, 6.0, 5.0, 4.0, 3.0, 2.0, 1.0, -7.6025, -7.9375, -8.7105, -9.2268, -7.6818, -9.6776, -9.2304, -9.3395, -9.217, -10.2816, -8.6855, -9.0706, -8.0869, -9.1544, -10.2868, -9.9115, -10.3051, -10.0947, -8.6807, -8.8905, -9.2832, -10.6045, -9.2256, -10.3549, -10.3821, -9.9687, -10.0942, -7.5206, -7.9908, -10.6429, -7.6768, -8.1854, -8.166, -8.173, -7.0602, -8.0958, -7.8041, -8.7034, -7.9681, -7.9896, -6.1467, -6.1147, -6.3672, -7.5049, -4.9138, -6.1041, -6.6967, -5.8834, -6.4822, -6.4342, -6.7169, -6.1166, -4.9893, -6.8307, -5.6541, -5.6969, -6.1002, -5.363, -5.2004, -5.6183, -6.0631, -5.7677, -5.9563, -5.5269, -6.2477, -6.2132, -5.8466, -5.6267, -6.0138, -5.7778, -5.597, -5.8579, -5.9318, -5.9908, -6.1804, -5.8804, -6.1182, -6.0266, -6.1277, -6.0856, -6.1404, -7.5403, -8.5318, -9.079, -9.2259, -7.9701, -9.5973, -8.0119, -9.3458, -8.2444, -9.6339, -9.6769, -9.5458, -9.6723, -10.6711, -8.5927, -10.4457, -9.6891, -10.4546, -10.1655, -10.033, -11.0898, -10.4206, -9.3874, -8.3265, -10.6885, -8.2989, -8.4459, -9.1821, -9.883, -8.8674, -7.2464, -7.3951, -8.234, -8.0601, -8.045, -8.1866, -8.0031, -7.381, -8.024, -8.2644, -6.7328, -7.1888, -7.0828, -5.3068, -7.1296, -6.489, -6.5458, -5.9256, -6.75, -6.9023, -5.5865, -6.5137, -5.6734, -6.5362, -5.0157, -5.5585, -5.8505, -5.7163, -6.398, -5.6207, -6.2673, -5.861, -5.0993, -5.4428, -5.2741, -6.0622, -5.7008, -6.2282, -6.0108, -6.4354, -6.5137, -6.0204, -5.8393, -5.8371, -6.3413, -6.1624, -5.9584, -5.7428, -5.738, -6.1034, -6.029, -6.2425, -6.2056, -6.2019, -6.1379, -8.2334, -9.0904, -9.6214, -10.0435, -9.1493, -8.8368, -10.1172, -7.5624, -9.2868, -8.2656, -7.742, -9.3021, -9.9336, -9.2383, -9.9593, -9.4387, -9.7704, -10.1596, -9.9439, -10.1395, 
-9.0556, -9.9514, -10.0095, -7.761, -9.78, -9.9774, -9.3877, -7.3482, -10.4977, -10.5352, -8.4389, -8.5175, -8.6561, -7.0345, -7.4856, -6.6098, -7.9915, -8.155, -8.2715, -7.5405, -6.4157, -7.3361, -7.6325, -7.6176, -5.5247, -5.4954, -6.163, -6.9958, -4.8897, -7.3946, -6.1912, -7.397, -5.9191, -5.9918, -4.9944, -6.4746, -5.3552, -5.1817, -5.7473, -5.6103, -6.0422, -6.9991, -5.7722, -5.9166, -5.7979, -6.4982, -5.7083, -5.5176, -6.0741, -6.7399, -5.8228, -6.035, -6.0276, -6.2832, -5.8185, -6.1966, -6.1214, -6.1355, -5.868, -5.9952, -6.0242, -5.9233, -6.1314, -6.1766, -9.2367, -10.2748, -8.2764, -7.2537, -9.6626, -8.6857, -9.6468, -8.1054, -7.7061, -9.0395, -8.8743, -10.0769, -8.7366, -7.1123, -9.2614, -8.6166, -7.2411, -9.8842, -10.4095, -10.428, -8.0795, -9.4397, -8.7497, -9.5876, -10.235, -10.4701, -9.9343, -10.1692, -7.7643, -10.8999, -8.2499, -7.9551, -7.5723, -9.1768, -7.4687, -7.7123, -7.6885, -7.845, -5.2341, -6.9805, -7.0007, -6.1337, -5.5434, -6.0668, -6.4493, -5.4796, -6.0418, -5.8649, -5.3456, -6.0409, -6.428, -6.2379, -5.9621, -6.6017, -5.5553, -5.7186, -5.864, -5.1706, -6.0384, -5.8227, -6.1089, -5.9825, -5.2862, -6.3193, -5.4425, -5.989, -5.7891, -5.9063, -5.6797, -5.8495, -6.2489, -6.0079, -5.9892, -6.223, -6.1976, -6.156, -6.1761, -9.5945, -9.1127, -9.6102, -7.8922, -9.0526, -9.1725, -8.357, -9.5438, -9.3353, -8.1194, -10.1175, -7.8728, -9.7829, -9.959, -9.3717, -9.6184, -8.4232, -9.3731, -9.9822, -9.9693, -10.243, -10.2334, -9.6593, -10.2533, -9.853, -9.3149, -9.5315, -10.5016, -10.5316, -9.4048, -8.7722, -8.8825, -9.0799, -7.6497, -8.5421, -8.0063, -7.417, -7.3357, -7.8239, -7.5685, -6.8362, -7.9373, -5.7268, -6.555, -6.4809, -7.0084, -8.1488, -6.3164, -5.1947, -5.9548, -7.2244, -6.2315, -5.0135, -6.2992, -5.7224, -5.6427, -6.5312, -6.1773, -5.7068, -5.4332, -6.0776, -6.0473, -6.8256, -5.5841, -5.3906, -5.7475, -5.2838, -5.879, -6.3607, -5.7965, -6.0255, -6.2652, -6.4702, -5.9793, -6.2455, -5.9859, -6.1421, -6.1162, -5.9917, -5.8685, -6.0004, -6.19, -6.0042, -6.1828, -6.1992, -8.5564, -9.2215, -9.5425, -8.7718, -9.5719, -7.3083, -8.9114, -10.0065, -8.9394, -8.6392, -8.508, -10.0742, -9.3567, -8.7297, -9.6601, -9.8473, -9.6728, -10.3556, -9.2847, -6.2489, -9.2694, -9.4041, -10.3997, -10.3986, -8.7623, -7.8289, -8.8647, -9.9107, -8.6888, -9.344, -7.7158, -8.6262, -9.4568, -8.1666, -8.0371, -8.2691, -7.2861, -8.5998, -7.6228, -6.6939, -6.9182, -6.9644, -6.3613, -7.3203, -5.4489, -6.5575, -5.8543, -6.5157, -6.1468, -6.3874, -6.6919, -5.6154, -5.6827, -5.1245, -5.9135, -6.5952, -5.8015, -5.1666, -6.2588, -5.4746, -5.6114, -6.6845, -6.383, -5.8778, -5.4241, -5.6596, -5.9928, -6.3226, -5.6805, -6.1872, -5.8764, -6.1179, -6.2894, -6.2381, -6.2747, -6.0597, -6.0614, -6.1746, -6.1955, -6.2835, -6.2751, -6.2779, -8.7669, -9.3692, -10.4052, -8.896, -10.4757, -9.9716, -9.7568, -9.9895, -8.5073, -7.9415, -10.2241, -10.2533, -8.2562, -10.9779, -10.5786, -9.6914, -10.5777, -8.7307, -10.5966, -9.8963, -9.4657, -8.765, -7.5337, -10.0676, -10.0748, -10.0782, -10.3171, -7.4158, -8.5931, -11.0323, -8.5312, -7.3508, -7.8787, -7.1832, -7.4848, -7.1694, -8.073, -7.6268, -8.9965, -7.8965, -4.9479, -7.2636, -6.1225, -5.9376, -5.7721, -6.7209, -6.3962, -5.3403, -6.5501, -6.6325, -4.9588, -5.8274, -5.8943, -6.5988, -6.683, -5.417, -6.3655, -6.4033, -5.1918, -5.8449, -5.8199, -6.2608, -6.3898, -6.0008, -5.5326, -5.6563, -5.8841, -5.7751, -6.2595, -5.8485, -6.0027, -5.8668, -6.2465, -6.1311, -5.8091, -5.9127, -6.1882, -6.294, -6.1535, -6.2851, -6.1922, -6.2405, -9.7089, -9.0322, -9.7303, -8.4413, 
-9.5923, -8.2264, -10.5155, -8.3904, -10.245, -8.251, -9.9986, -7.3563, -7.6135, -9.5735, -10.2663, -10.2881, -9.5814, -9.5003, -10.0742, -7.532, -10.5997, -9.1243, -10.621, -10.611, -9.7564, -9.7764, -9.9119, -10.5896, -9.6482, -9.5248, -7.9421, -6.9645, -6.5674, -7.6434, -7.2875, -7.7983, -7.9204, -8.8012, -7.6929, -7.8224, -8.8562, -8.0874, -7.8045, -7.7798, -8.2424, -6.0284, -8.6399, -6.8081, -7.9034, -7.366, -6.8442, -7.2204, -5.3422, -5.1833, -7.0559, -6.5909, -5.9688, -5.7774, -5.7507, -5.1188, -6.6405, -5.9734, -5.763, -5.488, -5.1807, -6.2285, -6.3387, -5.8009, -6.5681, -5.9477, -6.7844, -5.7546, -5.7575, -5.9899, -5.9105, -6.032, -6.1239, -6.2567, -5.9439, -5.7475, -5.7139, -5.9432, -5.8772, -6.287, -6.1716, -6.2756, -6.178, -6.2812, -8.1888, -8.1006, -8.9658, -8.7234, -8.8925, -9.5149, -9.0259, -10.2579, -9.9584, -8.7013, -10.2715, -9.7638, -9.282, -10.3047, -9.7842, -8.625, -7.2618, -9.3232, -10.0312, -9.6273, -10.3482, -7.4017, -9.3664, -8.7597, -8.3804, -10.3963, -7.175, -10.3768, -10.4159, -9.6939, -7.4625, -5.6502, -8.4405, -8.1169, -7.6384, -8.0948, -7.6572, -6.7813, -7.7967, -5.2104, -8.2437, -5.1985, -6.0491, -6.5126, -5.9931, -6.7765, -5.3552, -5.1882, -6.2502, -6.6712, -5.7653, -6.1347, -5.8715, -5.176, -5.6955, -6.1002, -6.7649, -5.7312, -6.4994, -5.8384, -6.5298, -6.2689, -6.3189, -5.4166, -5.9575, -5.8279, -5.9077, -5.9681, -5.9218, -5.815, -6.212, -6.1307, -6.0959, -6.0391, -6.1611, -6.0217, -6.1586, -8.591, -9.0867, -8.9444, -9.7447, -9.5631, -10.0013, -9.5392, -10.7077, -7.7149, -10.3163, -10.2862, -9.2093, -9.1192, -9.6415, -9.6288, -9.439, -10.0324, -10.3395, -10.0427, -10.0438, -9.1416, -10.3401, -10.7342, -8.3435, -10.3245, -10.7717, -10.7609, -10.7667, -9.5204, -9.696, -7.9628, -9.2624, -8.4198, -7.953, -9.6569, -8.059, -7.1559, -9.5312, -9.3047, -8.4259, -8.8043, -7.2799, -7.5276, -7.8459, -8.4929, -4.9958, -6.9769, -7.0389, -6.5137, -6.6284, -7.8204, -7.0097, -7.1529, -7.3455, -6.3292, -7.052, -5.3663, -6.7788, -5.9328, -6.0702, -7.1009, -6.7425, -5.5663, -5.7951, -6.0605, -6.0718, -5.2902, -5.6346, -5.8786, -6.4209, -6.5552, -6.5368, -5.3958, -6.1628, -5.8958, -6.5563, -5.6834, -6.1157, -5.8922, -6.0241, -6.1114, -6.1545, -6.3718, -5.9808, -6.0926, -5.9837, -6.3727, -6.1115, -6.1464, -6.1862, -6.3453, -6.3656, -6.3626], \"loglift\": [30.0, 29.0, 28.0, 27.0, 26.0, 25.0, 24.0, 23.0, 22.0, 21.0, 20.0, 19.0, 18.0, 17.0, 16.0, 15.0, 14.0, 13.0, 12.0, 11.0, 10.0, 9.0, 8.0, 7.0, 6.0, 5.0, 4.0, 3.0, 2.0, 1.0, 0.8959, 0.824, 0.7895, 0.764, 0.7633, 0.7598, 0.7449, 0.7424, 0.7336, 0.7223, 0.7199, 0.7001, 0.6942, 0.6933, 0.6855, 0.6829, 0.676, 0.6696, 0.6657, 0.6621, 0.6564, 0.6531, 0.6505, 0.6445, 0.6394, 0.6379, 0.6358, 0.6304, 0.6125, 0.6117, 0.6074, 0.5983, 0.5936, 0.5911, 0.5281, 0.5639, 0.5448, 0.5889, 0.5345, 0.5323, 0.3594, 0.3512, 0.3513, 0.4751, 0.1697, 0.3098, 0.3747, 0.261, 0.3369, 0.3272, 0.3623, 0.2619, 0.0768, 0.3625, 0.166, 0.1701, 0.2394, 0.1014, 0.046, 0.1231, 0.2078, 0.1378, 0.1492, 0.0335, 0.2077, 0.1955, 0.0725, -0.0075, 0.063, -0.037, -0.1376, -0.0457, -0.0156, 0.0092, 0.1053, -0.1787, -0.0017, -0.1485, -0.0397, -0.2573, -0.1002, 0.7405, 0.7087, 0.7074, 0.7022, 0.6796, 0.6611, 0.6456, 0.6437, 0.6207, 0.613, 0.6073, 0.5968, 0.5948, 0.5923, 0.5915, 0.5811, 0.5765, 0.572, 0.5708, 0.568, 0.5647, 0.5619, 0.5607, 0.5599, 0.5581, 0.557, 0.5566, 0.5562, 0.5539, 0.5496, 0.5439, 0.5389, 0.5453, 0.5408, 0.5379, 0.5366, 0.524, 0.5035, 0.522, 0.5183, 0.4313, 0.4532, 0.4422, 0.3123, 0.4366, 0.3852, 0.375, 0.2813, 0.3681, 0.3819, 0.2256, 0.3232, 0.1936, 
0.3008, 0.0678, 0.1432, 0.1897, 0.1617, 0.2628, 0.1207, 0.2234, 0.1389, -0.0332, 0.0165, -0.0277, 0.1492, 0.0401, 0.1796, 0.0946, 0.2169, 0.2423, 0.0677, -0.011, -0.0169, 0.1755, 0.0651, -0.0529, -0.1823, -0.2736, -0.0267, -0.1128, 0.0696, -0.0448, -0.0853, -0.2188, 1.1153, 1.0689, 0.9668, 0.9528, 0.9184, 0.9157, 0.9097, 0.9093, 0.8946, 0.8868, 0.8843, 0.8743, 0.8673, 0.8545, 0.8516, 0.8499, 0.8469, 0.8417, 0.8294, 0.8286, 0.8204, 0.8186, 0.8042, 0.803, 0.8018, 0.7996, 0.7987, 0.7963, 0.795, 0.79, 0.7877, 0.7644, 0.7664, 0.7121, 0.7067, 0.6543, 0.7173, 0.7219, 0.7279, 0.6648, 0.5346, 0.6234, 0.6462, 0.6218, 0.3533, 0.3329, 0.4039, 0.5275, 0.1938, 0.5757, 0.3779, 0.5721, 0.3084, 0.2881, 0.0717, 0.3741, 0.1092, 0.0647, 0.1689, 0.1305, 0.2326, 0.4713, 0.1333, 0.1602, 0.1212, 0.3164, 0.0331, -0.0582, 0.1496, 0.3882, -0.0106, 0.0705, 0.0604, 0.1827, -0.1169, 0.0743, 0.023, 0.0253, -0.2488, -0.1282, -0.204, -0.3629, -0.1314, -0.1363, 1.0613, 1.0348, 1.005, 0.985, 0.9687, 0.9647, 0.9572, 0.9565, 0.9565, 0.9523, 0.9427, 0.9385, 0.9376, 0.9374, 0.9222, 0.9179, 0.9143, 0.9106, 0.8919, 0.8863, 0.8771, 0.8705, 0.8668, 0.8597, 0.8512, 0.8493, 0.8444, 0.8433, 0.8407, 0.8274, 0.8161, 0.8103, 0.7809, 0.8189, 0.6975, 0.688, 0.6764, 0.659, 0.3263, 0.5298, 0.5127, 0.3817, 0.2849, 0.358, 0.4192, 0.2221, 0.3072, 0.2517, 0.1188, 0.2592, 0.3344, 0.2789, 0.1987, 0.3748, 0.0638, 0.0936, 0.1359, -0.1045, 0.1685, 0.0828, 0.171, 0.1055, -0.2027, 0.2222, -0.1961, 0.0512, -0.0482, 0.0099, -0.2204, -0.1081, 0.1297, -0.1409, -0.169, 0.089, 0.0138, -0.278, -0.257, 1.1008, 1.0943, 1.0764, 1.0387, 1.0278, 1.0249, 1.0114, 1.0013, 0.9596, 0.9526, 0.9461, 0.9341, 0.9318, 0.9268, 0.9231, 0.9038, 0.903, 0.8898, 0.8732, 0.8693, 0.8659, 0.8532, 0.8513, 0.8497, 0.848, 0.8464, 0.8444, 0.8434, 0.8403, 0.8354, 0.8302, 0.8192, 0.8195, 0.7728, 0.7969, 0.7739, 0.7447, 0.7365, 0.7507, 0.713, 0.6551, 0.7199, 0.4969, 0.5681, 0.5378, 0.5724, 0.7204, 0.4449, 0.2646, 0.3573, 0.5414, 0.329, 0.0526, 0.3265, 0.1967, 0.1694, 0.3674, 0.2471, 0.1133, 0.0312, 0.2024, 0.1641, 0.4058, -0.0236, -0.1442, -0.0061, -0.2002, -0.001, 0.1877, -0.0948, 0.0148, 0.1293, 0.2326, -0.063, 0.0941, -0.0804, 0.0187, -0.0107, -0.1248, -0.2494, -0.172, 0.0375, -0.2634, -0.0384, -0.1992, 1.1988, 1.174, 1.1266, 1.111, 1.1106, 1.1091, 1.0783, 1.0685, 1.0404, 1.0385, 1.0243, 1.0198, 1.0195, 1.0178, 1.0163, 1.0128, 1.0044, 1.004, 0.9823, 0.9798, 0.9797, 0.9773, 0.9705, 0.9643, 0.9595, 0.9518, 0.9495, 0.9439, 0.941, 0.9374, 0.9131, 0.9301, 0.9356, 0.8621, 0.8399, 0.8491, 0.7404, 0.8566, 0.7141, 0.5702, 0.6006, 0.5935, 0.4757, 0.6137, 0.2528, 0.4628, 0.2623, 0.3872, 0.2892, 0.3456, 0.4226, 0.1255, 0.1374, -0.0584, 0.1632, 0.3711, 0.1175, -0.0831, 0.2473, -0.0152, 0.0077, 0.3661, 0.2462, 0.0384, -0.1777, -0.0992, 0.0072, 0.1687, -0.2161, 0.0876, -0.135, 0.0265, 0.1183, 0.0621, 0.0743, -0.1928, -0.233, -0.0866, -0.1553, -0.1227, -0.3696, -0.4658, 1.0701, 0.9968, 0.9548, 0.8866, 0.8775, 0.8757, 0.8749, 0.8423, 0.8418, 0.8266, 0.8179, 0.8104, 0.8, 0.7869, 0.7774, 0.7723, 0.7694, 0.7643, 0.7641, 0.7637, 0.7608, 0.7607, 0.7584, 0.7553, 0.751, 0.7487, 0.7465, 0.7415, 0.738, 0.7379, 0.7026, 0.6757, 0.6751, 0.6433, 0.6519, 0.6007, 0.6356, 0.6063, 0.6873, 0.5982, 0.2986, 0.5022, 0.375, 0.3372, 0.3159, 0.423, 0.3653, 0.2202, 0.3802, 0.3829, 0.1073, 0.2493, 0.2501, 0.353, 0.3595, 0.0474, 0.2602, 0.2576, -0.1082, 0.0713, 0.0581, 0.1946, 0.232, 0.1047, -0.0733, -0.0372, 0.035, -0.0342, 0.1654, -0.0364, 0.0376, -0.0384, 0.1564, 0.0926, -0.1074, -0.0925, 0.0827, 0.1147, -0.248, 0.0545, 
-0.4508, -0.3736, 0.9631, 0.9021, 0.901, 0.876, 0.8537, 0.8525, 0.8519, 0.8297, 0.8276, 0.8139, 0.8086, 0.7991, 0.7868, 0.7853, 0.7815, 0.7764, 0.7748, 0.7687, 0.7657, 0.7596, 0.7489, 0.7436, 0.7408, 0.7406, 0.7353, 0.735, 0.7306, 0.7305, 0.729, 0.7277, 0.7145, 0.7009, 0.664, 0.6849, 0.6719, 0.6666, 0.6629, 0.6979, 0.6439, 0.626, 0.6913, 0.6344, 0.613, 0.5991, 0.642, 0.3855, 0.6672, 0.4406, 0.5683, 0.4866, 0.3845, 0.4447, 0.1172, 0.0632, 0.4156, 0.3238, 0.192, 0.1281, 0.1163, -0.0353, 0.3112, 0.1432, 0.0572, -0.0236, -0.1146, 0.1743, 0.2028, 0.0275, 0.2747, 0.0522, 0.3301, -0.0137, -0.0162, 0.0503, 0.0086, 0.056, 0.083, 0.1378, -0.0276, -0.1284, -0.1534, -0.0652, -0.1755, 0.0526, -0.0948, 0.0101, -0.3659, -0.0537, 1.2456, 1.2207, 1.1659, 1.1362, 1.1288, 1.1277, 1.096, 1.0812, 1.0807, 1.0769, 1.0763, 1.0661, 1.0548, 1.0488, 1.0463, 1.0098, 1.0068, 1.0039, 0.9923, 0.9914, 0.9876, 0.9773, 0.9719, 0.9572, 0.9498, 0.9439, 0.9433, 0.9419, 0.9307, 0.9297, 0.8425, 0.7746, 0.8702, 0.8493, 0.7756, 0.7919, 0.7401, 0.6215, 0.74, 0.3501, 0.7936, 0.2608, 0.3936, 0.4281, 0.2926, 0.451, 0.1092, 0.0583, 0.3093, 0.3917, 0.1127, 0.2144, 0.1284, -0.1099, 0.0453, 0.1707, 0.3826, 0.0102, 0.2815, 0.0285, 0.2799, 0.1555, 0.1787, -0.3331, -0.0412, -0.1262, -0.0793, -0.0626, -0.1096, -0.1958, 0.0628, -0.0141, -0.0557, -0.12, -0.0003, -0.2015, -0.0532, 1.3084, 1.2505, 1.1562, 1.1514, 1.1479, 1.1161, 1.1137, 1.0926, 1.092, 1.089, 1.088, 1.0786, 1.0782, 1.0732, 1.0722, 1.0709, 1.0695, 1.0664, 1.0662, 1.0657, 1.0655, 1.0607, 1.0516, 1.0512, 1.0462, 1.0435, 1.0431, 1.0406, 1.036, 1.034, 1.0304, 1.0325, 1.0155, 0.9981, 1.0322, 0.9499, 0.8708, 1.0139, 0.9902, 0.9004, 0.9362, 0.7523, 0.7705, 0.7907, 0.8756, 0.2506, 0.5943, 0.6031, 0.5049, 0.5034, 0.7164, 0.5421, 0.572, 0.613, 0.3736, 0.5288, 0.0981, 0.4487, 0.2117, 0.2419, 0.5139, 0.405, 0.0528, 0.1104, 0.1509, 0.1351, -0.2067, -0.0742, -0.0006, 0.2315, 0.2817, 0.2702, -0.3296, 0.0647, -0.0837, 0.2658, -0.2241, 0.0009, -0.1508, -0.105, -0.0712, -0.049, 0.0941, -0.24, -0.1763, -0.282, 0.07, -0.2913, -0.2794, -0.3579, -0.0705, -0.0165, -0.1389]}, \"token.table\": {\"Topic\": [3, 3, 2, 3, 4, 5, 6, 7, 2, 3, 5, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 1, 2, 3, 4, 5, 6, 7, 8, 9, 1, 2, 3, 4, 5, 6, 7, 2, 1, 2, 3, 4, 7, 1, 2, 3, 4, 5, 6, 7, 8, 9, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 1, 2, 3, 4, 5, 6, 7, 8, 9, 2, 1, 2, 4, 5, 6, 8, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 2, 3, 7, 8, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 5, 1, 2, 3, 4, 5, 6, 7, 9, 2, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 1, 1, 2, 3, 4, 5, 6, 7, 9, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 2, 5, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 1, 2, 3, 4, 5, 6, 7, 8, 9, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 1, 2, 3, 4, 5, 6, 7, 8, 9, 1, 2, 3, 5, 6, 7, 8, 1, 2, 3, 4, 5, 6, 1, 2, 3, 4, 5, 6, 7, 8, 9, 3, 6, 2, 3, 4, 5, 6, 7, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 1, 2, 3, 4, 5, 6, 7, 8, 9, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 1, 2, 3, 4, 5, 6, 7, 8, 9, 1, 2, 3, 4, 5, 6, 7, 8, 9, 2, 3, 5, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 1, 2, 3, 4, 9, 1, 3, 5, 8, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 1, 2, 3, 4, 5, 6, 7, 8, 9, 1, 2, 3, 4, 5, 6, 8, 2, 1, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 1, 2, 3, 4, 5, 6, 7, 8, 9, 1, 2, 3, 4, 5, 6, 7, 1, 2, 3, 4, 5, 6, 7, 8, 9, 1, 2, 3, 4, 5, 6, 7, 8, 9, 1, 2, 3, 4, 5, 6, 7, 8, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 1, 2, 3, 4, 5, 6, 7, 8, 9, 1, 2, 3, 5, 6, 1, 2, 3, 4, 5, 6, 7, 8, 9, 1, 7, 1, 2, 3, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 1, 2, 5, 1, 2, 3, 4, 5, 6, 7, 8, 
9, 10, 1, 2, 3, 4, 5, 6, 7, 8, 9, 1, 2, 3, 4, 5, 6, 7, 8, 9, 2, 3, 5, 1, 2, 5, 6, 8, 1, 2, 3, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 1, 2, 3, 4, 5, 6, 7, 8, 9, 1, 2, 3, 4, 5, 6, 7, 8, 9, 1, 2, 3, 4, 5, 6, 7, 9, 2, 6, 1, 2, 3, 7, 9, 1, 2, 3, 4, 5, 6, 7, 8, 9, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 1, 2, 3, 4, 5, 6, 7, 8, 9, 1, 2, 3, 4, 5, 6, 7, 8, 9, 1, 2, 3, 4, 5, 6, 8, 2, 3, 4, 5, 1, 6, 8, 2, 3, 5, 1, 2, 3, 4, 5, 7, 8, 9, 1, 8, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 1, 2, 3, 4, 5, 6, 7, 8, 9, 1, 2, 1, 2, 3, 4, 5, 6, 7, 8, 9, 6, 1, 2, 3, 4, 5, 6, 7, 8, 9, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 1, 2, 3, 4, 5, 6, 7, 8, 9, 2, 3, 5, 1, 6, 8, 1, 3, 1, 2, 3, 4, 5, 6, 7, 8, 9, 1, 2, 4, 7, 2, 4, 2, 3, 5, 1, 2, 4, 5, 1, 2, 3, 4, 5, 6, 7, 8, 1, 2, 3, 4, 5, 6, 7, 8, 9, 1, 2, 3, 4, 5, 6, 7, 8, 9, 1, 2, 3, 4, 5, 6, 7, 8, 9, 2, 5, 1, 2, 3, 4, 5, 6, 7, 8, 9, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 1, 2, 3, 4, 5, 6, 7, 8, 1, 2, 5, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 1, 2, 3, 4, 2, 3, 4, 5, 1, 2, 3, 4, 5, 6, 7, 8, 9, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 2, 4, 1, 2, 3, 4, 5, 6, 7, 8, 9, 1, 2, 3, 4, 5, 6, 7, 8, 9, 1, 2, 3, 4, 5, 6, 7, 8, 9, 1, 2, 3, 4, 5, 6, 7, 8, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 1, 2, 3, 4, 5, 6, 7, 8, 9, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 1, 2, 3, 4, 5, 6, 7, 8, 9, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 1, 2, 3, 4, 5, 6, 7, 8, 9, 1, 2, 5, 7, 1, 2, 3, 4, 5, 6, 7, 8, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 1, 2, 3, 4, 5, 6, 7, 8, 9, 1, 2, 3, 4, 5, 6, 7, 8, 9, 1, 2, 5, 7, 1, 2, 3, 4, 5, 6, 7, 8, 9, 1, 6, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 1, 2, 3, 4, 5, 6, 7, 1, 2, 3, 4, 5, 8, 1, 2, 3, 4, 5, 6, 7, 8, 9, 1, 2, 3, 4, 5, 6, 7, 8, 9, 1, 2, 3, 4, 5, 6, 7, 8, 9, 1, 2, 3, 4, 9, 1, 1, 2, 4, 5, 6, 8, 1, 2, 3, 4, 5, 6, 7, 8, 9, 2, 3, 4, 5, 5, 1, 2, 3, 4, 5, 6, 7, 8, 9, 6, 1, 2, 3, 5, 5, 1, 2, 3, 4, 5, 6, 9, 1, 2, 3, 4, 5, 6, 7, 8, 9, 1, 2, 3, 4, 5, 6, 7, 8, 9, 1, 2, 3, 4, 5, 6, 7, 1, 2, 3, 4, 5, 6, 7, 8, 9, 1, 2, 3, 4, 5, 6, 7, 8, 9, 1, 2, 3, 4, 5, 6, 7, 8, 9, 1, 2, 3, 4, 5, 6, 7, 8, 9, 1, 7, 4, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 1, 2, 3, 4, 5, 6, 7, 8, 9, 2, 2, 4, 1, 2, 3, 4, 5, 6, 7, 8, 9, 1, 2, 3, 4, 5, 6, 7, 8, 9, 1, 2, 3, 4, 5, 6, 7, 8, 9, 1, 2, 4, 5, 6, 8, 1, 2, 3, 4, 5, 6, 7, 8, 9, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 1, 2, 3, 4, 5, 6, 7, 8, 9, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 1, 2, 3, 1, 2, 6, 8, 1, 3, 5, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 2, 5, 5, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 1, 2, 3, 4, 5, 6, 7, 8, 9, 1, 2, 3, 4, 5, 6, 7, 8, 1, 2, 4, 1, 2, 3, 4, 5, 7, 8, 9, 2, 3, 5, 1, 4, 1, 2, 3, 4, 5, 6, 7, 8, 9, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 1, 2, 3, 4, 9, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 1, 2, 4, 1, 2, 1, 2, 3, 4, 5, 6, 7, 8, 9, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 1, 2, 3, 4, 5, 6, 7, 8, 9, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 1, 2, 3, 4, 5, 6, 7, 8, 9, 1, 2, 3, 4, 5, 6, 7, 8, 9, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 1, 2, 3, 4, 5, 6, 7, 8, 9, 1, 2, 5, 7, 1, 2, 3, 4, 5, 6, 7, 8, 9, 1, 
2, 3, 4, 5, 6, 7, 8, 9, 10, 1, 2, 5, 8, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 1, 3, 4, 7, 1, 2, 3, 4, 5, 6, 7, 8, 9, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 2, 3, 4, 8, 1, 2, 5, 1, 2, 3, 4, 5, 6, 7, 8, 9, 1, 2, 3, 4, 5, 6, 7, 9, 1, 2, 4, 8, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 1, 2, 3, 5, 7, 8, 9, 1, 2, 3, 4, 5, 6, 7, 8, 9, 1, 2, 3, 4, 5, 6, 7, 8, 1, 2, 3, 4, 5, 6, 7, 8, 9, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 1, 2, 3, 4, 5, 6, 7, 8, 9, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 1, 2, 3, 4, 5, 6, 7, 8, 9, 1, 2, 3, 4, 5, 6, 7, 8, 1, 6, 1, 2, 3, 4, 5, 6, 7, 8, 9, 1, 6, 8, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 1, 3, 5, 2, 3, 5, 2, 4, 8, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 1, 2, 3, 4, 5, 6, 7, 8, 9, 1, 2, 3, 4, 5, 6, 7, 8, 9, 1, 2, 3, 4, 8, 1, 2, 3, 4, 5, 6, 7, 8, 9, 2, 3, 6, 8, 2, 3, 4, 5, 6, 8, 9, 1, 2, 4, 5, 7, 9, 1, 2, 3, 4, 5, 6, 7, 8, 9, 1, 2, 3, 4, 5, 6, 7, 8, 9, 1, 7, 1, 2, 3, 5, 6, 7, 8, 1, 2, 3, 4, 6, 1, 2, 3, 4, 5, 6, 7, 8, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 1, 3, 4, 9, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 1, 2, 3, 4, 5, 6, 8, 1, 2, 3, 4, 5, 6, 8, 9, 1, 4, 9, 1, 2, 3, 4, 5, 6, 7, 9, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 1, 2, 3, 4, 5, 6, 7, 8, 9, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 2, 3, 4, 5, 8, 9, 1, 2, 3, 4, 5, 6, 7, 8, 9, 1, 2, 3, 4, 5, 6, 7, 8, 9, 1, 2, 3, 4, 6, 1, 2, 3, 4, 5, 6, 7, 8, 9, 1, 2, 3, 4, 5, 6, 7, 8, 9, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 2, 3, 5, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 1, 1, 2, 3, 4, 5, 6, 7, 8, 9, 1, 5, 6, 8, 1, 2, 3, 5, 6, 8, 1, 2, 3, 4, 5, 6, 7, 8, 9, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 1, 2, 3, 4, 5, 6, 7, 8, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 1, 2, 3, 4, 5, 6, 7, 8, 9, 1, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 2, 3, 5, 1, 2, 4, 1, 2, 3, 4, 5, 6, 7, 8, 9, 1, 2, 3, 4, 5, 6, 7, 8, 9, 2, 3, 5, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 1, 2, 3, 4, 5, 6, 7, 8, 9, 1, 2, 3, 4, 5, 6, 8, 1, 2, 3, 4, 5, 6, 7, 8, 9, 1, 2, 3, 4, 5, 6, 7, 9, 1, 2, 3, 4, 5, 6, 7, 1, 2, 3, 4, 5, 6, 7, 8, 2, 3, 4, 5, 8, 2, 3, 4, 5, 6, 7, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 2, 3, 5, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 1, 2, 3, 4, 5, 6, 7, 5, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 1, 2, 3, 4, 5, 6, 7, 8, 9, 1, 2, 3, 4, 5, 6, 7, 8, 9, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 1, 2, 3, 4, 5, 6, 8, 1, 2, 3, 4, 5, 6, 7, 8, 9, 1, 2, 3, 4, 5, 6, 7, 9, 1, 2, 3, 4, 5, 6, 7, 8, 9, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 1, 2, 3, 4, 5, 6, 7, 8, 1, 2, 3, 4, 5, 6, 7, 9, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 1, 2, 3, 4, 5, 6, 7, 8, 9, 1, 2, 3, 4, 5, 8, 1, 2, 3, 4, 5, 6, 7, 8, 9, 1, 2, 3, 4, 5, 6, 7, 1, 2, 3, 4, 5, 6, 7, 8, 9, 1, 2, 3, 4, 5, 6, 7, 8, 9, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 1, 2, 3, 4, 5, 6, 7, 8, 9, 2, 5, 1, 3, 1, 2, 3, 4, 5, 6, 7, 8, 
9, 10, 1, 2, 3, 4, 5, 6, 7, 8, 9, 1, 2, 3, 4, 5, 6, 7, 8, 9, 1, 2, 3, 4, 5, 6, 7, 8, 9, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 4, 1, 2, 3, 4, 5, 6, 7, 8, 9, 1, 2, 3, 4, 5, 6, 7, 8, 9, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 1, 2, 3, 5, 6, 8, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 1, 3, 5, 1, 2, 3, 4, 5, 6, 7, 8, 9, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 2, 1, 2, 3, 4, 5, 6, 7, 8, 9, 1, 2, 3, 4, 5, 6, 7, 8, 9, 1, 2, 3, 4, 5, 6, 7, 8, 9, 1, 2, 5, 6, 8, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 1, 2, 3, 4, 9, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 2, 3, 9, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 2, 3, 4, 8, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 1, 2, 3, 4, 5, 6, 7, 8, 9, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 5, 6, 8, 1, 2, 3, 4, 5, 6, 7, 8, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 1, 2, 3, 4, 5, 6, 7, 1, 2, 3, 4, 5, 6, 7, 8, 9, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 1, 2, 3, 4, 5, 6, 7, 8, 9, 1, 6, 8, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 1, 2, 3, 4, 5, 6, 7, 8, 9, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 1, 2, 3, 4, 5, 6, 7, 8, 9, 1, 2, 3, 4, 5, 6, 7, 8, 1, 2, 3, 4, 5, 6, 7, 8, 9, 1, 2, 3, 4, 5, 6, 7, 8, 9, 1, 2, 3, 4, 5, 6, 7, 8, 9, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 1, 2, 3, 4, 5, 6, 7, 8, 9, 1, 2, 3, 4, 5, 6, 7, 8, 9, 1, 3, 5, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 1, 2, 3, 4, 5, 6, 7, 8, 9, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 1, 2, 3, 4, 5, 6, 7, 8, 9, 1, 2, 3, 4, 5, 6, 7, 8, 9, 1, 2, 3, 4, 5, 6, 7, 8, 9, 1, 2, 3, 4, 5, 6, 7, 8, 9, 1, 2, 3, 4, 1, 2, 3, 4, 5, 6, 7, 8, 9, 1, 2, 3, 4, 5, 6, 7, 8, 9, 1, 2, 3, 4, 5, 6, 7, 8, 9, 1, 2, 3, 4, 5, 6, 7, 8, 9, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 1, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 1, 2, 3, 4, 5, 6, 7, 8, 9, 1, 2, 3, 4, 5, 6, 7, 8, 9, 1, 2, 3, 4, 5, 6, 7, 8, 9, 1, 2, 3, 4, 5, 6, 7, 8, 9, 1, 2, 3, 4, 5, 6, 7, 8, 9, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 2, 3, 9, 1, 2, 3, 4, 5, 6, 7, 8, 9, 1, 2, 3, 4, 5, 6, 7, 9, 1, 2, 3, 4, 5, 6, 8, 9, 1, 2, 3, 7, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 1, 2, 3, 4, 5, 6, 7, 8, 9, 1, 2, 3, 4, 5, 7, 1, 2, 4, 5, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 1, 2, 3, 4, 5, 6, 7, 8, 9, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 1, 2, 3, 4, 5, 7, 8, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 1, 2, 3, 5, 6, 7, 1, 2, 3, 4, 5, 6, 7, 8, 9, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 1, 2, 3, 4, 1, 2, 3, 4, 5, 6, 7, 8, 9, 1, 2, 3, 4, 5, 6, 7, 8, 9, 1, 2, 3, 4, 5, 6, 7, 8, 9, 1, 2, 3, 4, 5, 6, 7, 8, 9, 1, 2, 3, 4, 5, 6, 7, 8, 9, 1, 6, 1, 2, 3, 4, 5, 6, 7, 8, 9, 1, 2, 3, 4, 5, 6, 7, 8, 9, 1, 2, 3, 4, 5, 6, 8, 3, 6, 1, 2, 3, 4, 5, 6, 8, 9, 2, 4, 8, 1, 2, 3, 4, 5, 6, 7, 8, 9, 1, 2, 3, 4, 5, 6, 7, 8, 9, 1, 2, 3, 4, 5, 6, 8, 1, 2, 4, 9, 1, 2, 3, 7, 8, 1, 2, 3, 4, 5, 6, 7, 8, 9, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 1, 2, 3, 4, 5, 7, 9, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 1, 2, 3, 4, 5, 6, 7, 8, 9, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 1, 3, 8, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 2, 1, 2, 3, 4, 6, 7, 9, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 1, 2, 3, 7, 8, 3, 1, 2, 3, 4, 5, 6, 7, 8, 9, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10], \"Freq\": [0.3689054809469834, 0.35338682865550025, 0.13028378014906677, 0.13028378014906677, 0.13028378014906677, 0.26056756029813355, 0.13028378014906677, 0.13028378014906677, 0.19443921524758212, 0.19443921524758212, 0.38887843049516424, 0.2623122093013643, 0.15326106610866227, 
0.10610381499830467, 0.10315648680390731, 0.08841984583192054, 0.11789312777589407, 0.06484122027674173, 0.05305190749915233, 0.04126259472156292, 0.005894656388794703, 0.45386272916812104, 0.10806255456383834, 0.10806255456383834, 0.064837532738303, 0.08645004365107067, 0.04322502182553534, 0.064837532738303, 0.02161251091276767, 0.04322502182553534, 0.36934855521405696, 0.09233713880351424, 0.09233713880351424, 0.09233713880351424, 0.09233713880351424, 0.09233713880351424, 0.09233713880351424, 0.3833495887287923, 0.15423305425238254, 0.15423305425238254, 0.15423305425238254, 0.15423305425238254, 0.15423305425238254, 0.17636782395892348, 0.17636782395892348, 0.17636782395892348, 0.11757854930594898, 0.11757854930594898, 0.05878927465297449, 0.05878927465297449, 0.11757854930594898, 0.05878927465297449, 0.13917850715451197, 0.19137044733745395, 0.08698656697156998, 0.06958925357725598, 0.20876776073176795, 0.06958925357725598, 0.08698656697156998, 0.05219194018294199, 0.08698656697156998, 0.017397313394313996, 0.16414216527374442, 0.24210969377877303, 0.09848529916424666, 0.12721017808715193, 0.061553311977654165, 0.14772794874637, 0.04513909545027972, 0.05744975784581055, 0.04924264958212333, 0.00820710826368722, 0.151880140370869, 0.151880140370869, 0.11391010527815174, 0.2657902456490207, 0.0759400701854345, 0.0759400701854345, 0.03797003509271725, 0.0759400701854345, 0.0759400701854345, 0.2708601275158446, 0.142055857982587, 0.142055857982587, 0.142055857982587, 0.142055857982587, 0.284111715965174, 0.142055857982587, 0.18890786550449215, 0.14883650009444838, 0.08930190005666902, 0.14883650009444838, 0.09960596544782313, 0.08357741928380562, 0.08930190005666902, 0.05953460003777935, 0.0847223154383783, 0.008014273082008758, 0.2040472726920261, 0.17381804710802223, 0.08061126822401031, 0.14610792365601868, 0.10328318741201321, 0.09824498314801257, 0.0680157575640087, 0.04282473624400548, 0.07305396182800934, 0.007557306396000967, 0.2263498425929084, 0.2263498425929084, 0.2263498425929084, 0.2263498425929084, 0.1830022567393761, 0.204322908009983, 0.11193341917068635, 0.12259374480598981, 0.08350588414321045, 0.09238948883929667, 0.053301628176517304, 0.07284555850790699, 0.06751539569025525, 0.0053301628176517304, 0.3954840071317408, 0.24335714612321557, 0.12167857306160779, 0.12167857306160779, 0.12167857306160779, 0.12167857306160779, 0.12167857306160779, 0.12167857306160779, 0.12167857306160779, 0.3764601631244259, 0.22192609091133786, 0.16310230777821216, 0.0962570996723875, 0.12299518291471738, 0.08823567469968856, 0.09893090799662049, 0.0802142497269896, 0.06951901643005765, 0.05347616648465973, 0.005347616648465973, 0.26473625370339593, 0.38450435067049343, 0.09612608766762336, 0.09612608766762336, 0.09612608766762336, 0.09612608766762336, 0.09612608766762336, 0.09612608766762336, 0.09612608766762336, 0.18877281950764369, 0.20633308178742452, 0.12731190152841088, 0.10975163924863006, 0.12072680317349306, 0.07682614747404104, 0.05268078683934243, 0.05926588519426023, 0.04829072126939722, 0.008780131139890405, 0.14888291449561425, 0.14888291449561425, 0.0992552763304095, 0.198510552660819, 0.07444145724780712, 0.07444145724780712, 0.04962763816520475, 0.0992552763304095, 0.07444145724780712, 0.024813819082602374, 0.2940335398847385, 0.2940335398847385, 0.13397196887700294, 0.16969782724420374, 0.11610903969340255, 0.13397196887700294, 0.11610903969340255, 0.08038318132620177, 0.06252025214260137, 0.08038318132620177, 0.08931464591800196, 0.01786292918360039, 0.2230749367657968, 
0.09560354432819863, 0.09560354432819863, 0.12747139243759817, 0.06373569621879908, 0.12747139243759817, 0.06373569621879908, 0.06373569621879908, 0.12747139243759817, 0.25394629019260445, 0.19348288776579387, 0.08464876339753481, 0.08867965689265551, 0.09271055038777622, 0.08464876339753481, 0.0765869764072934, 0.05240161543656917, 0.06449429592193129, 0.008061786990241411, 0.17255975273631283, 0.08627987636815641, 0.08627987636815641, 0.08627987636815641, 0.08627987636815641, 0.2588396291044692, 0.08627987636815641, 0.08627987636815641, 0.08627987636815641, 0.10644800284502212, 0.10644800284502212, 0.31934400853506634, 0.10644800284502212, 0.21289600569004424, 0.10644800284502212, 0.10644800284502212, 0.12890737588669787, 0.3867221276600937, 0.12890737588669787, 0.12890737588669787, 0.12890737588669787, 0.12890737588669787, 0.1838799599300385, 0.09193997996501925, 0.09193997996501925, 0.09193997996501925, 0.1103279759580231, 0.1838799599300385, 0.01838799599300385, 0.12871597195102694, 0.05516398797901155, 0.21888078744673187, 0.21888078744673187, 0.1051294027292412, 0.2102588054584824, 0.1051294027292412, 0.3153882081877236, 0.1051294027292412, 0.1051294027292412, 0.1794645345975324, 0.17111734694183323, 0.1314682055772621, 0.1314682055772621, 0.10433984569623976, 0.08138507964306702, 0.05216992284811988, 0.08347187655699181, 0.06051711050381906, 0.006260390741774386, 0.15582252995561882, 0.15582252995561882, 0.03116450599112376, 0.11426985530078713, 0.17659886728303464, 0.07271718064595545, 0.11426985530078713, 0.08310534930966336, 0.08310534930966336, 0.01038816866370792, 0.13598489624805823, 0.29463394187079284, 0.11332074687338187, 0.11332074687338187, 0.06799244812402912, 0.0906565974987055, 0.06799244812402912, 0.06799244812402912, 0.04532829874935275, 0.23083283606087582, 0.22704869120741883, 0.11730849045716639, 0.10595605589679545, 0.052978027948397725, 0.10217191104333848, 0.06811460736222565, 0.052978027948397725, 0.030273158827655845, 0.007568289706913961, 0.17260052643933735, 0.1150670176262249, 0.13424485389726237, 0.07671134508414994, 0.09588918135518741, 0.15342269016829987, 0.03835567254207497, 0.13424485389726237, 0.05753350881311245, 0.12290582908200723, 0.30726457270501806, 0.061452914541003614, 0.13519641199020796, 0.036871748724602166, 0.17206816071481013, 0.07374349744920433, 0.061452914541003614, 0.04916233163280289, 0.19825320667028787, 0.19825320667028787, 0.19825320667028787, 0.13974471776062058, 0.1778569135135171, 0.11010189884170106, 0.1651528482625516, 0.13551002934363207, 0.08045907992278155, 0.06775501467181604, 0.06775501467181604, 0.05081626100386203, 0.008469376833977004, 0.18097767625501876, 0.18097767625501876, 0.18097767625501876, 0.18097767625501876, 0.18097767625501876, 0.21827896189932272, 0.21827896189932272, 0.21827896189932272, 0.21827896189932272, 0.2645640588673887, 0.1919386309430075, 0.12968826415068074, 0.1037506113205446, 0.08300048905643567, 0.07262542792438122, 0.031125183396163377, 0.08818801962246291, 0.02593765283013615, 0.0051875305660272295, 0.17002591481227242, 0.08501295740613621, 0.17002591481227242, 0.08501295740613621, 0.17002591481227242, 0.08501295740613621, 0.08501295740613621, 0.17002591481227242, 0.08501295740613621, 0.15861563277585003, 0.15861563277585003, 0.15861563277585003, 0.15861563277585003, 0.15861563277585003, 0.15861563277585003, 0.15861563277585003, 0.5074370866809994, 0.26354924879709635, 0.2097229751888142, 0.1434946672344518, 0.07726635928008944, 0.13245661590872476, 0.13245661590872476, 
0.044152205302908254, 0.13245661590872476, 0.06622830795436238, 0.044152205302908254, 0.011038051325727063, 0.17564127478876548, 0.11709418319251033, 0.058547091596255166, 0.11709418319251033, 0.11709418319251033, 0.058547091596255166, 0.17564127478876548, 0.058547091596255166, 0.058547091596255166, 0.12937091753116758, 0.25874183506233517, 0.12937091753116758, 0.12937091753116758, 0.25874183506233517, 0.12937091753116758, 0.12937091753116758, 0.2461489750866107, 0.12307448754330536, 0.08791034824521811, 0.08791034824521811, 0.12307448754330536, 0.05274620894713086, 0.1582386268413926, 0.05274620894713086, 0.07032827859617449, 0.049456140736754764, 0.19782456294701906, 0.14836842221026428, 0.09891228147350953, 0.2472807036837738, 0.049456140736754764, 0.049456140736754764, 0.049456140736754764, 0.049456140736754764, 0.1072803440297971, 0.2145606880595942, 0.2145606880595942, 0.1072803440297971, 0.2145606880595942, 0.1072803440297971, 0.1072803440297971, 0.1072803440297971, 0.15719276978591917, 0.13473665981650215, 0.0898244398776681, 0.1796488797553362, 0.11228054984708512, 0.06736832990825108, 0.04491221993883405, 0.0898244398776681, 0.13473665981650215, 0.022456109969417024, 0.1893213920791242, 0.19237496291911008, 0.12214283359943498, 0.13435711695937846, 0.06717855847968923, 0.06107141679971749, 0.07939284183963273, 0.08549998351960449, 0.06412498763970337, 0.006107141679971749, 0.18511565367065025, 0.1234104357804335, 0.1234104357804335, 0.1234104357804335, 0.1234104357804335, 0.1234104357804335, 0.06170521789021675, 0.1234104357804335, 0.06170521789021675, 0.1241822120616672, 0.37254663618500156, 0.1241822120616672, 0.1241822120616672, 0.1241822120616672, 0.2253199968188404, 0.1126599984094202, 0.0563299992047101, 0.1126599984094202, 0.0563299992047101, 0.2253199968188404, 0.0563299992047101, 0.1126599984094202, 0.0563299992047101, 0.2750210405179742, 0.2750210405179742, 0.2110043084349704, 0.2110043084349704, 0.2110043084349704, 0.029422151874745953, 0.1765329112484757, 0.1765329112484757, 0.11768860749898381, 0.26479936687271355, 0.05884430374949191, 0.05884430374949191, 0.05884430374949191, 0.029422151874745953, 0.029422151874745953, 0.22441411898397373, 0.22441411898397373, 0.22441411898397373, 0.16217052972992765, 0.18408546617991786, 0.15340455514993157, 0.09204273308995893, 0.08327675850996284, 0.1358726059899394, 0.06136182205997262, 0.05259584747997653, 0.06574480934997066, 0.008765974579996089, 0.1524920888019779, 0.11436906660148342, 0.11436906660148342, 0.22873813320296685, 0.07624604440098895, 0.11436906660148342, 0.038123022200494475, 0.07624604440098895, 0.07624604440098895, 0.33827912812390626, 0.19732949140561198, 0.14094963671829427, 0.11275970937463542, 0.05637985468731771, 0.05637985468731771, 0.028189927343658855, 0.05637985468731771, 0.028189927343658855, 0.29200707301864404, 0.29200707301864404, 0.29200707301864404, 0.1413245634627401, 0.1413245634627401, 0.1413245634627401, 0.2826491269254802, 0.1413245634627401, 0.25542257167072036, 0.25542257167072036, 0.25542257167072036, 0.16722155778531797, 0.15483477572714627, 0.14244799366897457, 0.11148103852354531, 0.12386782058171701, 0.08051408337811607, 0.04335373720360096, 0.10528764749445946, 0.061933910290858506, 0.00619339102908585, 0.17842959268617548, 0.11470473815539851, 0.22940947631079703, 0.07646982543693234, 0.10195976724924313, 0.07646982543693234, 0.06372485453077696, 0.07646982543693234, 0.07646982543693234, 0.012744970906155392, 0.17203690296456692, 0.233970188031811, 0.11354435595661416, 
0.09978140371944881, 0.08257771342299212, 0.07913697536370078, 0.09289992760086613, 0.055051808948661414, 0.06537402312653542, 0.006881476118582677, 0.20640742349099195, 0.0971329051722315, 0.0789204854524381, 0.10320371174549597, 0.0789204854524381, 0.24283226293057877, 0.03642483943958681, 0.10320371174549597, 0.04856645258611575, 0.006070806573264469, 0.15771974172938755, 0.17743470944556097, 0.1133610643679973, 0.09857483858086721, 0.07393112893565042, 0.14786225787130083, 0.09364609665182386, 0.09364609665182386, 0.04435867736139024, 0.004928741929043361, 0.15705159803218036, 0.15705159803218036, 0.16788274272405487, 0.09206472988093332, 0.08123358518905881, 0.1408048809943686, 0.048740151113435284, 0.09748030222687057, 0.05415572345937254, 0.005415572345937254, 0.1461493653148846, 0.2318231311891273, 0.09575303244768302, 0.10079266573440318, 0.11591156559456366, 0.07055486601408223, 0.10079266573440318, 0.06551523272736207, 0.05543596615392175, 0.010079266573440318, 0.14554180159165353, 0.13994403999197455, 0.1287485167926166, 0.1511395631913325, 0.10075970879422168, 0.0951619471945427, 0.061575377596468805, 0.07277090079582677, 0.08956418559486372, 0.011195523199357965, 0.19861658673740434, 0.09930829336870217, 0.09930829336870217, 0.14896244005305326, 0.049654146684351086, 0.09930829336870217, 0.049654146684351086, 0.049654146684351086, 0.14896244005305326, 0.179064081281389, 0.0895320406406945, 0.26859612192208354, 0.0895320406406945, 0.0895320406406945, 0.0895320406406945, 0.04476602032034725, 0.0895320406406945, 0.0895320406406945, 0.2529282319669088, 0.08430941065563625, 0.08430941065563625, 0.1686188213112725, 0.08430941065563625, 0.08430941065563625, 0.08430941065563625, 0.1686188213112725, 0.3431975073077242, 0.3790755757685983, 0.12947302065930744, 0.12947302065930744, 0.2589460413186149, 0.12947302065930744, 0.12947302065930744, 0.0958993315969855, 0.13425906423577968, 0.17261879687457388, 0.13425906423577968, 0.11507919791638259, 0.0767194652775884, 0.057539598958191294, 0.0383597326387942, 0.1534389305551768, 0.21602052855001566, 0.19114974401300727, 0.15633064566119553, 0.08811363664540113, 0.08313947973799944, 0.0845606674258285, 0.06466403979622179, 0.06679582132796537, 0.04263563063487151, 0.0063953445952307265, 0.2008454164225452, 0.16067633313803614, 0.08033816656901807, 0.08033816656901807, 0.08033816656901807, 0.2008454164225452, 0.040169083284509036, 0.08033816656901807, 0.040169083284509036, 0.2938086075687837, 0.07345215189219592, 0.07345215189219592, 0.07345215189219592, 0.07345215189219592, 0.22035645567658776, 0.07345215189219592, 0.07345215189219592, 0.07345215189219592, 0.16178529720156853, 0.16178529720156853, 0.16178529720156853, 0.16178529720156853, 0.16178529720156853, 0.16178529720156853, 0.16178529720156853, 0.19751102187173447, 0.19751102187173447, 0.19751102187173447, 0.19751102187173447, 0.2280133142306915, 0.2280133142306915, 0.2280133142306915, 0.16354390183653225, 0.16354390183653225, 0.3270878036730645, 0.09659693014842093, 0.19319386029684185, 0.09659693014842093, 0.2897907904452628, 0.09659693014842093, 0.09659693014842093, 0.09659693014842093, 0.09659693014842093, 0.2766290668112621, 0.2766290668112621, 0.23206625707582282, 0.15720617414813803, 0.10729945219634819, 0.09232743561081123, 0.11229012439152718, 0.08983209951322174, 0.07486008292768478, 0.07236474683009529, 0.05240205804937935, 0.004990672195178986, 0.1644709200023046, 0.1439120550020165, 0.0822354600011523, 0.10279432500144037, 0.12335319000172844, 0.12335319000172844, 
0.06167659500086422, 0.12335319000172844, 0.0822354600011523, 0.3374549091303697, 0.3374549091303697, 0.1274185078282676, 0.31854626957066906, 0.1274185078282676, 0.0637092539141338, 0.09556388087120071, 0.1274185078282676, 0.0637092539141338, 0.0637092539141338, 0.0318546269570669, 0.3778451461243426, 0.18874971722609118, 0.16178547190807815, 0.053928490636026054, 0.08089273595403908, 0.06741061329503258, 0.18874971722609118, 0.14830334924907165, 0.06741061329503258, 0.053928490636026054, 0.17375514966703093, 0.22382019279142965, 0.1266351090793615, 0.11780010146917351, 0.12074510400590284, 0.05742754946622208, 0.06920755961313943, 0.04859254185603407, 0.053010045661128075, 0.007362506341823344, 0.11548824169198375, 0.22301177706038244, 0.10752353536839868, 0.12743530117736138, 0.12743530117736138, 0.07964706323585087, 0.0915941227212285, 0.05973529742688815, 0.05973529742688815, 0.01194705948537763, 0.07573696931472748, 0.07573696931472748, 0.3029478772589099, 0.07573696931472748, 0.15147393862945496, 0.07573696931472748, 0.07573696931472748, 0.07573696931472748, 0.07573696931472748, 0.22821856412035665, 0.22821856412035665, 0.22821856412035665, 0.1919763513358852, 0.1919763513358852, 0.1919763513358852, 0.2627315381956751, 0.2627315381956751, 0.3727341070304542, 0.14335927193479006, 0.11468741754783204, 0.05734370877391602, 0.08601556316087404, 0.05734370877391602, 0.05734370877391602, 0.02867185438695801, 0.02867185438695801, 0.22089434967216084, 0.22089434967216084, 0.22089434967216084, 0.22089434967216084, 0.356473673516712, 0.356473673516712, 0.19554988148247038, 0.19554988148247038, 0.19554988148247038, 0.1863619046471887, 0.3727238092943774, 0.1863619046471887, 0.1863619046471887, 0.12694520928497913, 0.12694520928497913, 0.12694520928497913, 0.12694520928497913, 0.12694520928497913, 0.12694520928497913, 0.12694520928497913, 0.12694520928497913, 0.14705122229341264, 0.09803414819560842, 0.09803414819560842, 0.09803414819560842, 0.14705122229341264, 0.14705122229341264, 0.04901707409780421, 0.14705122229341264, 0.04901707409780421, 0.1920421408801655, 0.09602107044008275, 0.09602107044008275, 0.09602107044008275, 0.09602107044008275, 0.28806321132024826, 0.09602107044008275, 0.09602107044008275, 0.09602107044008275, 0.22785776484466505, 0.15190517656311003, 0.07595258828155502, 0.07595258828155502, 0.07595258828155502, 0.30381035312622007, 0.07595258828155502, 0.07595258828155502, 0.07595258828155502, 0.37233587117974265, 0.37233587117974265, 0.17875742800502836, 0.13903355511502205, 0.13903355511502205, 0.13903355511502205, 0.0794477457800126, 0.059585809335009454, 0.059585809335009454, 0.0794477457800126, 0.13903355511502205, 0.1775569317971829, 0.20176924067861693, 0.12913231403431483, 0.1775569317971829, 0.08070769627144676, 0.06456615701715741, 0.06456615701715741, 0.05649538739001274, 0.048424617762868064, 0.008070769627144677, 0.08773084692129077, 0.17546169384258153, 0.17546169384258153, 0.08773084692129077, 0.2631925407638723, 0.08773084692129077, 0.08773084692129077, 0.08773084692129077, 0.4026171150912162, 0.2013085575456081, 0.2013085575456081, 0.1600052379322671, 0.24616190451118014, 0.14769714270670808, 0.12308095225559007, 0.09436206339595239, 0.06564317453631471, 0.0533350793107557, 0.0533350793107557, 0.04923238090223603, 0.012308095225559007, 0.355863876435967, 0.1779319382179835, 0.1779319382179835, 0.1779319382179835, 0.16921867124298817, 0.16921867124298817, 0.16921867124298817, 0.16921867124298817, 0.15888575737937743, 0.15888575737937743, 
0.052961919126459145, 0.21184767650583658, 0.052961919126459145, 0.10592383825291829, 0.052961919126459145, 0.10592383825291829, 0.10592383825291829, 0.16840053476710581, 0.1707394310833156, 0.1613838458184764, 0.0795224747511333, 0.0795224747511333, 0.10057254159702152, 0.10057254159702152, 0.06548909685387448, 0.06315020053766468, 0.007016688948629409, 0.12793952174613493, 0.19683003345559222, 0.13286027258252475, 0.07381126254584708, 0.17222627927364317, 0.06396976087306747, 0.05412825920028786, 0.0885735150550165, 0.08365276421862669, 0.014762252509169417, 0.1263039931410628, 0.19734998928291062, 0.1263039931410628, 0.07893999571316425, 0.19734998928291062, 0.07893999571316425, 0.04736399742789855, 0.10262199442711352, 0.04736399742789855, 0.007893999571316425, 0.17437584376677245, 0.15361681474691857, 0.18683126117868476, 0.07058069866750313, 0.1204023683151524, 0.09549153349132776, 0.07058069866750313, 0.05812528125559081, 0.05812528125559081, 0.008303611607941545, 0.36109261718500946, 0.36109261718500946, 0.08586133731996126, 0.3148249035065246, 0.11448178309328168, 0.08586133731996126, 0.1431022288666021, 0.05724089154664084, 0.08586133731996126, 0.05724089154664084, 0.08586133731996126, 0.1303923443557305, 0.1303923443557305, 0.1303923443557305, 0.21732057392621748, 0.10866028696310874, 0.06519617217786525, 0.04346411478524349, 0.06519617217786525, 0.06519617217786525, 0.24082651050228177, 0.12041325525114088, 0.07525828453196305, 0.09030994143835566, 0.10536159834474827, 0.07525828453196305, 0.13546491215753348, 0.06020662762557044, 0.09030994143835566, 0.06090740944887214, 0.3045370472443607, 0.06090740944887214, 0.24362963779548855, 0.06090740944887214, 0.06090740944887214, 0.06090740944887214, 0.12181481889774427, 0.1668326384876068, 0.21151995236821575, 0.12810363312441236, 0.14299940441794867, 0.06554139369155981, 0.0834163192438034, 0.06852054795026707, 0.05958308517414528, 0.06852054795026707, 0.005958308517414528, 0.1617125644483747, 0.23601293189762793, 0.1136358560988579, 0.12674768564872613, 0.08741219699912146, 0.07648567237423128, 0.06337384282436306, 0.07648567237423128, 0.05026201327449484, 0.008741219699912145, 0.16742641182536697, 0.19025728616518975, 0.11795951742241764, 0.10654408025250625, 0.09893378880589866, 0.09132349735929107, 0.10273893452920246, 0.0799080601893797, 0.03424631150973415, 0.007610291446607589, 0.1966470857570812, 0.14420786288852622, 0.14584658860316854, 0.08357501144675951, 0.12290442859817574, 0.10323972002246762, 0.07538138287354779, 0.07046520572962076, 0.052439222868554985, 0.008193628573211716, 0.23608132021864883, 0.16217760258498484, 0.13138438690429152, 0.07390371763366398, 0.09853829017821865, 0.09443252808745953, 0.09237964704207997, 0.05337490717986843, 0.047216264043729764, 0.010264405226897776, 0.14953675327504587, 0.14953675327504587, 0.1121525649562844, 0.24299722407194954, 0.0560762824781422, 0.1121525649562844, 0.03738418831876147, 0.03738418831876147, 0.09346047079690367, 0.3131313726294842, 0.16526377999889444, 0.12177331157813275, 0.07828284315737105, 0.07828284315737105, 0.07828284315737105, 0.05218856210491403, 0.06088665578906637, 0.043490468420761694, 0.008698093684152338, 0.12312102289014488, 0.12312102289014488, 0.07387261373408693, 0.09849681831211592, 0.07387261373408693, 0.2216178412022608, 0.07387261373408693, 0.12312102289014488, 0.07387261373408693, 0.15476272253296888, 0.16386641209373173, 0.11834796428991737, 0.14110718819182455, 0.10014058516839162, 0.09103689560762875, 0.06827767170572156, 
0.06827767170572156, 0.09103689560762875, 0.009103689560762875, 0.12669714002016647, 0.3294125640524328, 0.10135771201613317, 0.07601828401209987, 0.10135771201613317, 0.10135771201613317, 0.07601828401209987, 0.05067885600806658, 0.02533942800403329, 0.18245414371686122, 0.18245414371686122, 0.18245414371686122, 0.18245414371686122, 0.12854969468206592, 0.12854969468206592, 0.12854969468206592, 0.12854969468206592, 0.12854969468206592, 0.25709938936413185, 0.12854969468206592, 0.12854969468206592, 0.2057888934816606, 0.2016731156120274, 0.13307681778480718, 0.09603481695810827, 0.10152252078428589, 0.08094363143611984, 0.04527355656596533, 0.0685962978272202, 0.060364742087953775, 0.00685962978272202, 0.14618960833581454, 0.14618960833581454, 0.14618960833581454, 0.07309480416790727, 0.07309480416790727, 0.07309480416790727, 0.07309480416790727, 0.07309480416790727, 0.14618960833581454, 0.07155695417033889, 0.322006293766525, 0.07155695417033889, 0.14311390834067778, 0.10733543125550833, 0.14311390834067778, 0.035778477085169444, 0.035778477085169444, 0.035778477085169444, 0.22288425096500877, 0.22288425096500877, 0.22288425096500877, 0.22288425096500877, 0.19013778026655198, 0.09506889013327599, 0.09506889013327599, 0.09506889013327599, 0.09506889013327599, 0.285206670399828, 0.09506889013327599, 0.09506889013327599, 0.09506889013327599, 0.2897021097565061, 0.2897021097565061, 0.17974637015277123, 0.16013767522701436, 0.15196738567461568, 0.10948188000214247, 0.09477535880782482, 0.09640941671830457, 0.07680072179254771, 0.06699637432966928, 0.05719202686679085, 0.006536231641918954, 0.3332338968982163, 0.16661694844910815, 0.16661694844910815, 0.08330847422455408, 0.08330847422455408, 0.08330847422455408, 0.08330847422455408, 0.11656963212053549, 0.23313926424107098, 0.11656963212053549, 0.23313926424107098, 0.11656963212053549, 0.11656963212053549, 0.18924660887165937, 0.09462330443582968, 0.283869913307489, 0.09462330443582968, 0.04731165221791484, 0.09462330443582968, 0.04731165221791484, 0.04731165221791484, 0.04731165221791484, 0.16003123346276413, 0.18670310570655815, 0.2667187224379402, 0.053343744487588045, 0.053343744487588045, 0.1333593612189701, 0.026671872243794022, 0.053343744487588045, 0.053343744487588045, 0.22840915561748037, 0.15988640893223627, 0.09136366224699215, 0.09136366224699215, 0.06852274668524411, 0.09136366224699215, 0.13704549337048821, 0.06852274668524411, 0.09136366224699215, 0.1844721227817468, 0.1844721227817468, 0.1844721227817468, 0.1844721227817468, 0.1844721227817468, 0.34016475628527854, 0.14362674258577524, 0.14362674258577524, 0.14362674258577524, 0.14362674258577524, 0.28725348517155047, 0.14362674258577524, 0.10431294774346385, 0.38248080839270077, 0.052156473871731925, 0.12169843903404115, 0.06954196516230923, 0.13908393032461847, 0.052156473871731925, 0.034770982581154616, 0.034770982581154616, 0.16731853415712553, 0.16731853415712553, 0.16731853415712553, 0.33463706831425105, 0.3957163814963126, 0.033305448724722055, 0.16652724362361027, 0.13322179489888822, 0.09991634617416616, 0.29974903852249846, 0.06661089744944411, 0.09991634617416616, 0.033305448724722055, 0.033305448724722055, 0.3818738641059295, 0.1501741459663595, 0.300348291932719, 0.1501741459663595, 0.1501741459663595, 0.3936964441516784, 0.13455679915319532, 0.13455679915319532, 0.13455679915319532, 0.13455679915319532, 0.13455679915319532, 0.13455679915319532, 0.13455679915319532, 0.1493382563279919, 0.2986765126559838, 0.07466912816399594, 0.07466912816399594, 
0.07466912816399594, 0.07466912816399594, 0.07466912816399594, 0.07466912816399594, 0.07466912816399594, 0.0551537413141051, 0.2206149652564204, 0.1654612239423153, 0.1103074826282102, 0.2206149652564204, 0.0551537413141051, 0.1103074826282102, 0.0551537413141051, 0.0551537413141051, 0.11817902787496808, 0.11817902787496808, 0.11817902787496808, 0.11817902787496808, 0.23635805574993615, 0.11817902787496808, 0.11817902787496808, 0.07771905588628047, 0.15543811177256095, 0.15543811177256095, 0.07771905588628047, 0.07771905588628047, 0.07771905588628047, 0.07771905588628047, 0.07771905588628047, 0.15543811177256095, 0.1509983791264531, 0.18874797390806636, 0.11324878434483981, 0.07549918956322654, 0.1509983791264531, 0.07549918956322654, 0.1509983791264531, 0.03774959478161327, 0.03774959478161327, 0.11458976012676758, 0.14323720015845948, 0.14323720015845948, 0.14323720015845948, 0.22917952025353516, 0.05729488006338379, 0.028647440031691895, 0.08594232009507569, 0.08594232009507569, 0.1163086505741698, 0.1163086505741698, 0.3489259517225094, 0.1163086505741698, 0.1163086505741698, 0.1163086505741698, 0.1163086505741698, 0.1163086505741698, 0.1163086505741698, 0.28102983425187816, 0.28102983425187816, 0.3594104177127594, 0.18470292103237454, 0.16568938504374775, 0.10864877707786738, 0.10864877707786738, 0.13037853249344086, 0.0869190216622939, 0.06790548567366711, 0.07062170510061379, 0.07062170510061379, 0.008148658280840054, 0.09841759304737277, 0.19683518609474554, 0.19683518609474554, 0.09841759304737277, 0.09841759304737277, 0.049208796523686385, 0.049208796523686385, 0.049208796523686385, 0.19683518609474554, 0.3626624632827642, 0.36292723035585295, 0.36292723035585295, 0.11487179312763232, 0.2871794828190808, 0.11487179312763232, 0.05743589656381616, 0.08615384484572425, 0.05743589656381616, 0.08615384484572425, 0.05743589656381616, 0.08615384484572425, 0.18666931065105674, 0.13333522189361197, 0.13333522189361197, 0.05333408875744478, 0.08000113313616718, 0.08000113313616718, 0.13333522189361197, 0.13333522189361197, 0.08000113313616718, 0.1544623184645996, 0.0772311592322998, 0.0772311592322998, 0.0772311592322998, 0.1544623184645996, 0.1544623184645996, 0.0386155796161499, 0.1544623184645996, 0.0772311592322998, 0.18997436470823456, 0.18997436470823456, 0.18997436470823456, 0.18997436470823456, 0.18997436470823456, 0.18997436470823456, 0.1640817245872676, 0.1640817245872676, 0.1640817245872676, 0.0729252109276745, 0.10938781639151174, 0.05469390819575587, 0.0729252109276745, 0.145850421855349, 0.05469390819575587, 0.17633676015445857, 0.19286833141893905, 0.1432736176254976, 0.1102104750965366, 0.11572099885136343, 0.08265785632240245, 0.06061576130309513, 0.06612628505792197, 0.04408419003861464, 0.01102104750965366, 0.17828559210799635, 0.17828559210799635, 0.10187748120456934, 0.2801630733125657, 0.05093874060228467, 0.076408110903427, 0.025469370301142335, 0.025469370301142335, 0.076408110903427, 0.20359582518033675, 0.1409509558940793, 0.19419909478739814, 0.09709954739369907, 0.09709954739369907, 0.07830608660782183, 0.06577711275057034, 0.07517384314350896, 0.0469836519646931, 0.003132243464312873, 0.2025807018536849, 0.4051614037073698, 0.2025807018536849, 0.19079200236634078, 0.19079200236634078, 0.19079200236634078, 0.19079200236634078, 0.26406192025515457, 0.26406192025515457, 0.26406192025515457, 0.20667812162127075, 0.13667424171729195, 0.12334016935462931, 0.10333906081063537, 0.10667257890130102, 0.116673133173298, 0.07333739799464445, 0.0700038799039788, 
0.05666980754131617, 0.010000554271996972, 0.1991020513813706, 0.17031621262743749, 0.13433391418502114, 0.1391315539773433, 0.08875633615796039, 0.09835161574260474, 0.045577578027060736, 0.06476813719634947, 0.05037521781938292, 0.007196459688483274, 0.21608254808854274, 0.12347574176488157, 0.06173787088244079, 0.06173787088244079, 0.21608254808854274, 0.07717233860305098, 0.12347574176488157, 0.04630340316183059, 0.07717233860305098, 0.015434467720610197, 0.25764204935886886, 0.12457417771198055, 0.1557177221399757, 0.07644324541416989, 0.1160804837770728, 0.07927447672580581, 0.07927447672580581, 0.04529970098617475, 0.059455857544354354, 0.008493693934907766, 0.2923176135488307, 0.2923176135488307, 0.28102478659868696, 0.15275523820149478, 0.11456642865112109, 0.22913285730224217, 0.10183682546766319, 0.10183682546766319, 0.10183682546766319, 0.0636480159172895, 0.10183682546766319, 0.038188809550373695, 0.012729603183457899, 0.33602529579550133, 0.14401084105521486, 0.14401084105521486, 0.09600722737014325, 0.09600722737014325, 0.048003613685071625, 0.07200542052760743, 0.024001806842535812, 0.024001806842535812, 0.14143695934800174, 0.14143695934800174, 0.14143695934800174, 0.14143695934800174, 0.14143695934800174, 0.07071847967400087, 0.14143695934800174, 0.07071847967400087, 0.2669995186164686, 0.2669995186164686, 0.2669995186164686, 0.1157218684773697, 0.1157218684773697, 0.3471656054321091, 0.1157218684773697, 0.1157218684773697, 0.1157218684773697, 0.1157218684773697, 0.1157218684773697, 0.19276368347947406, 0.19276368347947406, 0.3855273669589481, 0.25876517494442985, 0.2874650167807268, 0.1713841739833716, 0.0856920869916858, 0.2570762609750574, 0.0856920869916858, 0.1713841739833716, 0.0856920869916858, 0.0856920869916858, 0.0856920869916858, 0.0856920869916858, 0.1902190367018132, 0.14674039974139877, 0.11141400721106202, 0.15489264417147647, 0.05978312582056987, 0.07608761468072528, 0.08423985911080299, 0.04619605177044035, 0.13043591088124334, 0.002717414810025903, 0.17342192018683633, 0.20955148689242722, 0.13729235348124544, 0.11561461345789088, 0.06503322007006362, 0.06503322007006362, 0.0722591334111818, 0.050581393387827266, 0.10838870011677271, 0.00722591334111818, 0.18010462013141618, 0.18010462013141618, 0.18010462013141618, 0.18010462013141618, 0.18010462013141618, 0.17611082349489207, 0.2096557422558239, 0.10063475628279547, 0.10063475628279547, 0.09224852659256251, 0.10902098597302842, 0.0754760672120966, 0.0754760672120966, 0.04193114845116478, 0.016772459380465912, 0.16469668904222168, 0.18116635794644384, 0.19214613721592527, 0.098818013425333, 0.098818013425333, 0.08234834452111084, 0.06038878598214795, 0.07136856525162939, 0.054898896347407224, 0.005489889634740722, 0.26779495639129086, 0.26779495639129086, 0.26779495639129086, 0.25911863541824076, 0.25911863541824076, 0.1682956051558338, 0.14425337584785752, 0.14425337584785752, 0.24042229307976257, 0.048084458615952515, 0.09616891723190503, 0.048084458615952515, 0.048084458615952515, 0.048084458615952515, 0.16598906081308248, 0.16045609211931305, 0.132791248650466, 0.1051264051816189, 0.10235992083473419, 0.09406046779408007, 0.08022804605965653, 0.06086265563146358, 0.08852749910031066, 0.008299453040654124, 0.11346115664723641, 0.13867474701328894, 0.2395291084774991, 0.0756407710981576, 0.08824756628118388, 0.0756407710981576, 0.06303397591513134, 0.13867474701328894, 0.06303397591513134, 0.012606795183026268, 0.1540114708825831, 0.26310292942441277, 0.16684576012279836, 0.08342288006139918, 
0.08984002468150681, 0.07058859082118392, 0.08984002468150681, 0.038502867720645775, 0.038502867720645775, 0.006417144620107629, 0.1351146157038171, 0.1351146157038171, 0.08106876942229026, 0.08106876942229026, 0.10809169256305368, 0.16213753884458051, 0.08106876942229026, 0.1351146157038171, 0.08106876942229026, 0.1906854733720155, 0.17395867746218957, 0.13715972646057253, 0.08865201832207738, 0.08781567852658607, 0.07694326118519923, 0.09701541627699034, 0.07359790200323404, 0.0627254846618472, 0.010872417341386848, 0.17925328055112474, 0.15289250399948875, 0.11862349448236197, 0.11862349448236197, 0.11598741682719838, 0.09226271793072598, 0.07117409668941718, 0.07908232965490798, 0.05799370841359919, 0.007908232965490798, 0.1615387099161629, 0.13846175135671107, 0.27692350271342214, 0.06923087567835554, 0.06923087567835554, 0.11538479279725922, 0.04615391711890369, 0.04615391711890369, 0.06923087567835554, 0.21924246179150753, 0.05979339867041115, 0.05979339867041115, 0.05979339867041115, 0.05979339867041115, 0.2790358604619187, 0.05979339867041115, 0.1195867973408223, 0.039862265780274096, 0.1816190200418486, 0.23115148005326186, 0.11970344502758204, 0.07017098501616878, 0.0908095100209243, 0.07017098501616878, 0.0825541000190221, 0.0908095100209243, 0.057787870013315465, 0.008255410001902209, 0.19377234996411377, 0.17458696877954805, 0.1515645113580692, 0.09017129156745889, 0.08057860097517602, 0.10935667275202461, 0.09208982968591546, 0.06331175790906687, 0.04028930048758801, 0.005755614355369716, 0.20062778849558646, 0.11464445056890654, 0.05732222528445327, 0.05732222528445327, 0.0859833379266799, 0.22928890113781308, 0.05732222528445327, 0.0859833379266799, 0.05732222528445327, 0.2217765417133434, 0.2217765417133434, 0.2217765417133434, 0.2217765417133434, 0.12659026587698038, 0.15190831905237645, 0.17722637222777254, 0.07595415952618823, 0.07595415952618823, 0.05063610635079215, 0.12659026587698038, 0.15190831905237645, 0.07595415952618823, 0.13855285017333663, 0.23931855939030874, 0.13855285017333663, 0.12595713652121512, 0.08816999556485058, 0.07557428191272908, 0.08816999556485058, 0.05038285460848605, 0.03778714095636454, 0.012595713652121513, 0.28126103618837583, 0.28126103618837583, 0.28126103618837583, 0.28126103618837583, 0.1726573988231211, 0.1569612716573828, 0.18835352598885938, 0.08789831212813437, 0.09417676299442969, 0.10987289016016796, 0.05964528322980547, 0.06278450866295313, 0.06592373409610078, 0.006278450866295312, 0.1834231577004367, 0.19671469086713503, 0.11696549186694515, 0.10899057196692616, 0.1063322653335865, 0.0637993592001519, 0.0850658122668692, 0.08240750563352954, 0.04784951940011392, 0.007974919900018987, 0.22156109738480073, 0.22156109738480073, 0.22156109738480073, 0.22156109738480073, 0.12525194219117927, 0.1565649277389741, 0.09393895664338446, 0.09393895664338446, 0.21919089883456375, 0.09393895664338446, 0.03131298554779482, 0.09393895664338446, 0.09393895664338446, 0.14553544521676845, 0.21223919094112068, 0.12127953768064038, 0.1273435145646724, 0.11521556079660836, 0.06063976884032019, 0.05457579195628817, 0.06063976884032019, 0.09095965326048028, 0.012127953768064038, 0.1562811963654789, 0.2148866450025335, 0.07814059818273945, 0.11721089727410917, 0.07814059818273945, 0.07814059818273945, 0.03907029909136973, 0.07814059818273945, 0.11721089727410917, 0.019535149545684864, 0.3030638995054889, 0.15153194975274445, 0.3030638995054889, 0.15153194975274445, 0.17694363501646243, 0.35388727003292486, 0.17694363501646243, 0.16334962355126875, 
0.10889974903417916, 0.2722493725854479, 0.05444987451708958, 0.05444987451708958, 0.16334962355126875, 0.05444987451708958, 0.05444987451708958, 0.05444987451708958, 0.3856598684072898, 0.1542639473629159, 0.1542639473629159, 0.07713197368145795, 0.07713197368145795, 0.07713197368145795, 0.07713197368145795, 0.07713197368145795, 0.41667148003913396, 0.20833574001956698, 0.20833574001956698, 0.20833574001956698, 0.17949985740920435, 0.16569217607003478, 0.09665376937418695, 0.09665376937418695, 0.11506401115974638, 0.0920512089277971, 0.10125632982057681, 0.09665376937418695, 0.04602560446389855, 0.00920512089277971, 0.1426715041948355, 0.13248068246663297, 0.26496136493326594, 0.09171739555382283, 0.10190821728202537, 0.0815265738256203, 0.06114493036921522, 0.0815265738256203, 0.04076328691281015, 0.010190821728202537, 0.17462720276091104, 0.1901496207841031, 0.1358211577029308, 0.12029873967973871, 0.0853732991275565, 0.0853732991275565, 0.09895541489784958, 0.07373148561016243, 0.031044836046384184, 0.003880604505798023, 0.1385274829536691, 0.1385274829536691, 0.1385274829536691, 0.1385274829536691, 0.1385274829536691, 0.1385274829536691, 0.1385274829536691, 0.1490846365516522, 0.19877951540220293, 0.1490846365516522, 0.04969487885055073, 0.04969487885055073, 0.09938975770110146, 0.1490846365516522, 0.09938975770110146, 0.09938975770110146, 0.23416356948103553, 0.07805452316034517, 0.15610904632069034, 0.07805452316034517, 0.07805452316034517, 0.07805452316034517, 0.15610904632069034, 0.07805452316034517, 0.3359416410156693, 0.15270074591621333, 0.10689052214134934, 0.06108029836648533, 0.10689052214134934, 0.06108029836648533, 0.09162044754972799, 0.045810223774863996, 0.030540149183242667, 0.20846500033287774, 0.1323587303700811, 0.12574079385157705, 0.11250492081456893, 0.09926904777756083, 0.10588698429606488, 0.08934214299980475, 0.05294349214803244, 0.06617936518504056, 0.006617936518504055, 0.3346218528192402, 0.14340936549396008, 0.14340936549396008, 0.09560624366264005, 0.047803121831320024, 0.14340936549396008, 0.047803121831320024, 0.047803121831320024, 0.047803121831320024, 0.16953568668055474, 0.18961228115588358, 0.17622788483899768, 0.10038297237664424, 0.10484443781560622, 0.06469124886494852, 0.06915271430391048, 0.06469124886494852, 0.051306852548062616, 0.008922930877923934, 0.17160525637171645, 0.15985147168872216, 0.17160525637171645, 0.12694087457633818, 0.12459011763973933, 0.07052270809796567, 0.05406740954177367, 0.06582119422476795, 0.04466438179537825, 0.007052270809796566, 0.1367730425422117, 0.20515956381331757, 0.1367730425422117, 0.2735460850844234, 0.06838652127110585, 0.06838652127110585, 0.06838652127110585, 0.06838652127110585, 0.06838652127110585, 0.253013966402985, 0.101205586561194, 0.151808379841791, 0.101205586561194, 0.050602793280597, 0.050602793280597, 0.151808379841791, 0.101205586561194, 0.2842222380735438, 0.2842222380735438, 0.16112455348689886, 0.16112455348689886, 0.08056227674344943, 0.08056227674344943, 0.08056227674344943, 0.2416868302303483, 0.08056227674344943, 0.08056227674344943, 0.08056227674344943, 0.18940749108398547, 0.37881498216797094, 0.18940749108398547, 0.15912603835965675, 0.27051426521141647, 0.25460166137545076, 0.06365041534386269, 0.06365041534386269, 0.031825207671931345, 0.04773781150789702, 0.031825207671931345, 0.04773781150789702, 0.015912603835965673, 0.19692957347011572, 0.17248795974155526, 0.13826970052157062, 0.09706812309342584, 0.10684476858485002, 0.08729147760200165, 0.08030815939384152, 
0.06215153205262518, 0.05307321838201701, 0.005586654566528106, 0.14482867194618768, 0.20919697058893774, 0.08850641063378135, 0.18505885859790647, 0.056322261312406315, 0.08850641063378135, 0.06436829864275008, 0.09655244796412511, 0.056322261312406315, 0.00804603733034376, 0.1406789475663861, 0.17659697673227193, 0.17959014582942906, 0.14367211666354326, 0.0853053192689788, 0.0733326428803502, 0.06884288923461447, 0.07183605833177163, 0.055373628297407294, 0.005986338194314302, 0.07201579076373957, 0.21604737229121873, 0.10802368614560937, 0.10802368614560937, 0.18003947690934893, 0.10802368614560937, 0.07201579076373957, 0.036007895381869784, 0.036007895381869784, 0.036007895381869784, 0.21021133077702278, 0.21021133077702278, 0.21021133077702278, 0.1932314309680271, 0.1932314309680271, 0.1932314309680271, 0.21746825955166446, 0.21746825955166446, 0.21746825955166446, 0.15095471139628955, 0.11950581318872922, 0.2453014060189705, 0.06289779641512064, 0.06918757605663271, 0.16353427067931367, 0.08176713533965684, 0.05660801677360858, 0.03773867784907239, 0.0062897796415120645, 0.16680157778203508, 0.13344126222562808, 0.23352220889484912, 0.06672063111281404, 0.08340078889101754, 0.11676110444742456, 0.05004047333461053, 0.06672063111281404, 0.03336031555640702, 0.1934561085188858, 0.0967280542594429, 0.2579414780251811, 0.08060671188286908, 0.12897073901259054, 0.08060671188286908, 0.04836402712972145, 0.06448536950629527, 0.04836402712972145, 0.15173936895755916, 0.15173936895755916, 0.15173936895755916, 0.3034787379151183, 0.15173936895755916, 0.15503266150946085, 0.2170457261132452, 0.09301959690567652, 0.2170457261132452, 0.1085228630566226, 0.04650979845283826, 0.06201306460378434, 0.09301959690567652, 0.03100653230189217, 0.17985382436168107, 0.35970764872336214, 0.17985382436168107, 0.17985382436168107, 0.2523623869729578, 0.1261811934864789, 0.1261811934864789, 0.2523623869729578, 0.1261811934864789, 0.1261811934864789, 0.1261811934864789, 0.13988004187325986, 0.13988004187325986, 0.13988004187325986, 0.13988004187325986, 0.13988004187325986, 0.13988004187325986, 0.17192125323501495, 0.0736805371007207, 0.31928232743645635, 0.04912035806714713, 0.12280089516786782, 0.04912035806714713, 0.0736805371007207, 0.0736805371007207, 0.0736805371007207, 0.10380768846738726, 0.10380768846738726, 0.3114230654021618, 0.10380768846738726, 0.10380768846738726, 0.10380768846738726, 0.10380768846738726, 0.10380768846738726, 0.10380768846738726, 0.2810556259836516, 0.2810556259836516, 0.18056927242354218, 0.36113854484708435, 0.09028463621177109, 0.09028463621177109, 0.09028463621177109, 0.09028463621177109, 0.09028463621177109, 0.13229601149886058, 0.13229601149886058, 0.13229601149886058, 0.26459202299772117, 0.13229601149886058, 0.11688725067730671, 0.11688725067730671, 0.23377450135461342, 0.11688725067730671, 0.11688725067730671, 0.11688725067730671, 0.11688725067730671, 0.11688725067730671, 0.1866678409916366, 0.17555665998022965, 0.148889825552853, 0.06666708606844164, 0.1666677151711041, 0.07111155847300442, 0.08000050328212997, 0.04888919645019054, 0.04888919645019054, 0.006666708606844164, 0.17940069137261472, 0.1901647328549716, 0.1327565116157349, 0.07176027654904589, 0.1291684977882826, 0.07176027654904589, 0.07893630420395048, 0.06817226272159359, 0.0645842488941413, 0.010764041482356882, 0.21126468578439628, 0.1954691952584601, 0.1382105421019415, 0.092798506839875, 0.10069625210284308, 0.07107970736671276, 0.08095188894542288, 0.04541203526206649, 0.05725865315651862, 
0.007897745262968086, 0.2613938716493698, 0.17251995528858408, 0.09932967122676052, 0.07319028406182354, 0.1359248132576723, 0.08364603892779834, 0.09410179379377313, 0.03659514203091177, 0.031367264597924376, 0.010455754865974793, 0.2510961694568093, 0.14837500922447822, 0.14457052180846597, 0.06467628607220846, 0.15978847147251501, 0.08369872315226977, 0.07989423573625751, 0.030435899328098096, 0.026631411912085835, 0.007608974832024524, 0.15114037022258772, 0.18686445772974483, 0.12366030290938997, 0.1181642894467504, 0.09892824232751196, 0.12366030290938997, 0.06870016828299443, 0.06870016828299443, 0.05221212789507576, 0.0054960134626395535, 0.22260141891548438, 0.22260141891548438, 0.22260141891548438, 0.22260141891548438, 0.2085316829857309, 0.16184548530235832, 0.1493958325867923, 0.09025998218785368, 0.14005859305011778, 0.07158550311450465, 0.05913585039893862, 0.07158550311450465, 0.03734895814669807, 0.0062248263577830126, 0.15780386679796152, 0.19579368658265597, 0.1168917531836752, 0.10520257786530768, 0.08474652105816452, 0.08182422722857265, 0.10520257786530768, 0.07013505191020512, 0.07013505191020512, 0.00876688148877564, 0.14145503272873095, 0.14145503272873095, 0.14145503272873095, 0.14145503272873095, 0.14145503272873095, 0.14145503272873095, 0.14145503272873095, 0.12488106849018366, 0.12488106849018366, 0.12488106849018366, 0.12488106849018366, 0.12488106849018366, 0.12488106849018366, 0.12488106849018366, 0.12488106849018366, 0.22244384418580487, 0.22244384418580487, 0.22244384418580487, 0.21918759800324483, 0.10959379900162242, 0.10959379900162242, 0.10959379900162242, 0.10959379900162242, 0.10959379900162242, 0.10959379900162242, 0.21918759800324483, 0.1755127869158562, 0.1864823360980972, 0.14671772031247354, 0.10283952358350949, 0.0781580379234672, 0.10421071723128962, 0.06993087603678645, 0.06855968238900632, 0.06170371415010569, 0.006855968238900632, 0.14609893384473255, 0.1400114782678687, 0.14609893384473255, 0.09131183365295785, 0.15218638942159643, 0.060874555768638566, 0.05478710019177471, 0.13392402269100484, 0.060874555768638566, 0.0060874555768638565, 0.12210417670912287, 0.16906732159724705, 0.1033189187538732, 0.1314968056867477, 0.1596746926196222, 0.08453366079862353, 0.04696314488812418, 0.11271154773149802, 0.06574840284337385, 0.009392628977624835, 0.21662062080626454, 0.3249309312093968, 0.054155155201566135, 0.10831031040313227, 0.10831031040313227, 0.054155155201566135, 0.054155155201566135, 0.054155155201566135, 0.054155155201566135, 0.18089249878922922, 0.24402276682305418, 0.10076562013091292, 0.11412009990729897, 0.0789128350422812, 0.09348135843470234, 0.06920048611400044, 0.060702180801754775, 0.048561744641403816, 0.008498305312245668, 0.17606160810687954, 0.15613010530232713, 0.13287668536368266, 0.09633559688866994, 0.13952051963186682, 0.0597945084136572, 0.09301367975457787, 0.06643834268184133, 0.06643834268184133, 0.006643834268184133, 0.24728533084103388, 0.13170631751315934, 0.10751536123523212, 0.0940759410808281, 0.0913880570499473, 0.0940759410808281, 0.05913344867937766, 0.10213959317347052, 0.06450921674113927, 0.008063652092642409, 0.1412908780437214, 0.1412908780437214, 0.1412908780437214, 0.2825817560874428, 0.1412908780437214, 0.1412908780437214, 0.18522756367919155, 0.11113653820751493, 0.07409102547167662, 0.07409102547167662, 0.14818205094335324, 0.11113653820751493, 0.11113653820751493, 0.03704551273583831, 0.14818205094335324, 0.13617375088306832, 0.3631300023548488, 0.13617375088306832, 0.0453912502943561, 
0.0907825005887122, 0.0907825005887122, 0.0453912502943561, 0.0453912502943561, 0.0453912502943561, 0.1306947902053587, 0.1306947902053587, 0.1306947902053587, 0.2613895804107174, 0.1306947902053587, 0.17573613994060297, 0.15816252594654268, 0.08786806997030149, 0.10544168396436178, 0.10544168396436178, 0.10544168396436178, 0.05272084198218089, 0.14058891195248238, 0.05272084198218089, 0.15569586971705737, 0.31139173943411474, 0.22242267102436766, 0.044484534204873534, 0.0667268013073103, 0.044484534204873534, 0.0667268013073103, 0.044484534204873534, 0.044484534204873534, 0.2052248141824519, 0.2137758481067207, 0.12826550886403243, 0.12826550886403243, 0.09406137316695712, 0.06840827139415062, 0.042755169621344145, 0.07695930531841945, 0.025653101772806487, 0.017102067848537655, 0.2376906935963192, 0.2376906935963192, 0.2376906935963192, 0.18165450471111957, 0.21371118201308184, 0.1415836580836667, 0.09884142168105035, 0.08281308303006922, 0.1041842012313774, 0.06411335460392455, 0.06144196482876103, 0.04808501595294341, 0.005342779550327046, 0.3754468901734962, 0.14917897384442919, 0.16575441538269908, 0.09945264922961945, 0.09945264922961945, 0.09945264922961945, 0.06630176615307963, 0.13260353230615926, 0.09945264922961945, 0.06630176615307963, 0.22468189861234109, 0.22468189861234109, 0.22468189861234109, 0.22468189861234109, 0.1265210876961492, 0.3795632630884476, 0.1265210876961492, 0.1265210876961492, 0.1265210876961492, 0.1265210876961492, 0.21647386862759257, 0.2976515693629398, 0.054118467156898144, 0.054118467156898144, 0.10823693431379629, 0.054118467156898144, 0.08117770073534722, 0.08117770073534722, 0.027059233578449072, 0.13798131828833374, 0.12418318645950037, 0.06899065914416687, 0.2759626365766675, 0.13798131828833374, 0.08278879097300024, 0.06899065914416687, 0.04139439548650012, 0.04139439548650012, 0.013798131828833373, 0.15128170903144034, 0.13237149540251028, 0.05673064088679012, 0.20801234991823045, 0.11346128177358024, 0.11346128177358024, 0.0945510681446502, 0.037820427257860084, 0.05673064088679012, 0.018910213628930042, 0.16484265142482196, 0.08242132571241098, 0.16484265142482196, 0.08242132571241098, 0.08242132571241098, 0.08242132571241098, 0.24726397713723294, 0.08242132571241098, 0.2587913427461059, 0.13850804259650737, 0.1640226820221798, 0.06925402129825368, 0.080188866766399, 0.10934845468145318, 0.05467422734072659, 0.06196412431949014, 0.05102927885134482, 0.007289896978763546, 0.1818233349780275, 0.19956219692710334, 0.15964975754168267, 0.07539016328357237, 0.0842595942581103, 0.07539016328357237, 0.06208601682176548, 0.09756374071991719, 0.057651301334496516, 0.008869430974537925, 0.06733017363583502, 0.20199052090750505, 0.13466034727167003, 0.06733017363583502, 0.06733017363583502, 0.06733017363583502, 0.06733017363583502, 0.06733017363583502, 0.13466034727167003, 0.3411759202608869, 0.21166152199766278, 0.3057333095521796, 0.0705538406658876, 0.0470358937772584, 0.117589734443146, 0.0705538406658876, 0.0705538406658876, 0.0940717875545168, 0.0235179468886292, 0.0235179468886292, 0.29422913287076746, 0.29422913287076746, 0.29422913287076746, 0.35154122165615553, 0.17577061082807777, 0.17577061082807777, 0.2490021081471097, 0.12450105407355486, 0.06225052703677743, 0.06225052703677743, 0.06225052703677743, 0.2490021081471097, 0.06225052703677743, 0.12450105407355486, 0.06225052703677743, 0.17811526722117413, 0.1246806870548219, 0.1246806870548219, 0.10686916033270448, 0.08905763361058706, 0.10686916033270448, 0.05343458016635224, 
0.07124610688846965, 0.1424922137769393, 0.29655383088010506, 0.29655383088010506, 0.29655383088010506, 0.1214474778405446, 0.1214474778405446, 0.2428949556810892, 0.034699279383012746, 0.08674819845753186, 0.15614675722355734, 0.06939855876602549, 0.1214474778405446, 0.034699279383012746, 0.017349639691506373, 0.17015066933250908, 0.27224107093201455, 0.07656780119962908, 0.06806026773300364, 0.10209040159950546, 0.07656780119962908, 0.07656780119962908, 0.08507533466625454, 0.05955273426637818, 0.008507533466625455, 0.1577913317561344, 0.19177715705745565, 0.10681259380415252, 0.11652282960453003, 0.14565353700566253, 0.08496456325330315, 0.07039920955273689, 0.06311653270245376, 0.05340629690207626, 0.009710235800377501, 0.2162083990982925, 0.2162083990982925, 0.11354829592931909, 0.09332736651724856, 0.09021645430000694, 0.07621734932241965, 0.0497745954758659, 0.07777280543104047, 0.060662788236211566, 0.0062218244344832375, 0.21121728113135171, 0.17551858572886972, 0.08627184722266479, 0.15766923802762875, 0.07139739080496396, 0.08329695593912462, 0.06247271695434346, 0.07734717337204429, 0.06842249952142379, 0.0059497825670803296, 0.20874207197360886, 0.13143019346486484, 0.20874207197360886, 0.09277425421049283, 0.07731187850874402, 0.07731187850874402, 0.061849502806995214, 0.06958069065786962, 0.061849502806995214, 0.007731187850874402, 0.254332554409157, 0.1271662772045785, 0.09537470790343387, 0.09537470790343387, 0.09537470790343387, 0.09537470790343387, 0.03179156930114462, 0.1271662772045785, 0.06358313860228924, 0.25340672541968035, 0.12670336270984017, 0.12670336270984017, 0.12670336270984017, 0.12670336270984017, 0.25340672541968035, 0.12670336270984017, 0.12623114925533718, 0.12623114925533718, 0.25246229851067437, 0.031557787313834296, 0.06311557462766859, 0.22090451119684004, 0.06311557462766859, 0.09467336194150289, 0.031557787313834296, 0.09910419051532111, 0.19820838103064223, 0.19820838103064223, 0.09910419051532111, 0.09910419051532111, 0.09910419051532111, 0.09910419051532111, 0.19820838103064223, 0.11399221692436733, 0.22798443384873465, 0.11399221692436733, 0.11399221692436733, 0.22798443384873465, 0.11399221692436733, 0.11399221692436733, 0.18419681011664396, 0.27629521517496597, 0.09209840505832198, 0.09209840505832198, 0.09209840505832198, 0.09209840505832198, 0.09209840505832198, 0.09209840505832198, 0.16164553255790468, 0.16164553255790468, 0.16164553255790468, 0.16164553255790468, 0.16164553255790468, 0.13591295812040613, 0.13591295812040613, 0.13591295812040613, 0.27182591624081226, 0.13591295812040613, 0.13591295812040613, 0.20334581258240972, 0.17970095065422254, 0.13714019918348563, 0.11113085106247973, 0.0662056133989241, 0.08039253055583641, 0.06857009959174282, 0.07093458578456154, 0.08039253055583641, 0.007093458578456153, 0.10613676375549053, 0.1415156850073207, 0.12382622438140561, 0.19458406688506596, 0.10613676375549053, 0.08844730312957544, 0.05306838187774526, 0.08844730312957544, 0.08844730312957544, 0.017689460625915088, 0.2352369815005917, 0.2352369815005917, 0.2352369815005917, 0.12087184582908195, 0.21584258183764632, 0.12087184582908195, 0.07770332946155267, 0.18130776874362292, 0.08633703273505854, 0.06906962618804682, 0.05180221964103512, 0.05180221964103512, 0.017267406547011706, 0.10016148727234969, 0.18029067709022945, 0.10016148727234969, 0.12019378472681963, 0.22035527199916932, 0.08012918981787975, 0.060096892363409815, 0.060096892363409815, 0.08012918981787975, 0.02003229745446994, 0.16719250825653098, 0.16719250825653098, 
0.12539438119239824, 0.06269719059619912, 0.10449531766033185, 0.12539438119239824, 0.06269719059619912, 0.12539438119239824, 0.041798127064132745, 0.020899063532066373, 0.16463045974999033, 0.21642431225560527, 0.11283660724437539, 0.11283660724437539, 0.10358770501122987, 0.0795405592050515, 0.07399121786516419, 0.07214143741853508, 0.05549341339887314, 0.007399121786516419, 0.23505958838538485, 0.3917659806423081, 0.15670639225692323, 0.07835319612846162, 0.07835319612846162, 0.07835319612846162, 0.07835319612846162, 0.3825115198320377, 0.12849068388888818, 0.28451508575396667, 0.10095696591269784, 0.09177905992063441, 0.10095696591269784, 0.06424534194444409, 0.07342324793650752, 0.07342324793650752, 0.08260115392857097, 0.01835581198412688, 0.06518622335781186, 0.26074489343124746, 0.06518622335781186, 0.06518622335781186, 0.26074489343124746, 0.06518622335781186, 0.06518622335781186, 0.06518622335781186, 0.06518622335781186, 0.07198932945063215, 0.2879573178025286, 0.07198932945063215, 0.07198932945063215, 0.21596798835189646, 0.07198932945063215, 0.07198932945063215, 0.07198932945063215, 0.07198932945063215, 0.1627013222450246, 0.2576104268879556, 0.06779221760209357, 0.12202599168376843, 0.10846754816334972, 0.0813506611225123, 0.04067533056125615, 0.0813506611225123, 0.04067533056125615, 0.013558443520418715, 0.3653178473176368, 0.1826589236588184, 0.0913294618294092, 0.0913294618294092, 0.0913294618294092, 0.0913294618294092, 0.0913294618294092, 0.14113166492428453, 0.1693579979091414, 0.0846789989545707, 0.2540369968637121, 0.11290533193942762, 0.0846789989545707, 0.0846789989545707, 0.05645266596971381, 0.05645266596971381, 0.09622159253837657, 0.19244318507675315, 0.09622159253837657, 0.2886647776151297, 0.09622159253837657, 0.09622159253837657, 0.09622159253837657, 0.09622159253837657, 0.38931607717170824, 0.10245159925571269, 0.061470959553427615, 0.14343223895799778, 0.061470959553427615, 0.061470959553427615, 0.04098063970228508, 0.10245159925571269, 0.02049031985114254, 0.2293503755761307, 0.1595480873573083, 0.07977404367865415, 0.09971755459831769, 0.129632820977813, 0.10968931005814946, 0.05983053275899061, 0.07977404367865415, 0.039887021839327075, 0.009971755459831769, 0.11187878910790931, 0.3356363673237279, 0.11187878910790931, 0.11187878910790931, 0.11187878910790931, 0.11187878910790931, 0.11187878910790931, 0.11187878910790931, 0.42113536045260547, 0.10528384011315137, 0.10528384011315137, 0.10528384011315137, 0.10528384011315137, 0.10528384011315137, 0.10528384011315137, 0.10528384011315137, 0.16040146726897936, 0.16040146726897936, 0.11322456513104426, 0.15568377705518585, 0.10378918470345724, 0.0849184238482832, 0.07548304342069617, 0.07548304342069617, 0.06604766299310916, 0.009435380427587022, 0.20690095888285903, 0.11822911936163373, 0.11822911936163373, 0.16256503912224637, 0.059114559680816865, 0.07389319960102109, 0.07389319960102109, 0.04433591976061265, 0.14778639920204217, 0.12334965387437936, 0.24669930774875873, 0.12334965387437936, 0.12334965387437936, 0.24669930774875873, 0.12334965387437936, 0.2154866941456383, 0.31125855821036646, 0.0718288980485461, 0.0718288980485461, 0.0718288980485461, 0.09577186406472814, 0.0718288980485461, 0.0718288980485461, 0.023942966016182034, 0.373166225381418, 0.13027554549483186, 0.13027554549483186, 0.13027554549483186, 0.2605510909896637, 0.13027554549483186, 0.13027554549483186, 0.1262533189325787, 0.0420844396441929, 0.31563329733144674, 0.06312665946628936, 0.1262533189325787, 0.06312665946628936, 
0.06312665946628936, 0.1262533189325787, 0.06312665946628936, 0.12359500369361866, 0.3089875092340466, 0.06179750184680933, 0.12359500369361866, 0.030898750923404664, 0.1544937546170233, 0.06179750184680933, 0.06179750184680933, 0.030898750923404664, 0.15758679169743203, 0.26406435365515635, 0.0809229470878705, 0.14480948426250512, 0.08944115204448845, 0.10221845947941537, 0.06814563965294358, 0.046850127261398714, 0.04259102478308974, 0.008518204956617948, 0.22880272885139427, 0.1494221902702983, 0.13541385993245783, 0.0840499820270428, 0.0840499820270428, 0.13541385993245783, 0.060702764797308684, 0.060702764797308684, 0.05603332135136186, 0.009338886891893643, 0.2241398008253646, 0.1596996080880723, 0.12047514294363348, 0.0924576678404629, 0.0924576678404629, 0.09525941535077996, 0.08685417281982878, 0.0728454352682435, 0.04762970767538998, 0.005603495020634115, 0.1700414677860033, 0.14246717571259734, 0.2205943365872475, 0.09191430691135312, 0.08731859156578547, 0.0735314455290825, 0.0735314455290825, 0.0735314455290825, 0.059744299492379534, 0.009191430691135313, 0.15071868334799507, 0.2153124047828501, 0.10765620239142505, 0.08612496191314004, 0.08612496191314004, 0.06459372143485503, 0.12918744286971007, 0.08612496191314004, 0.08612496191314004, 0.28757601119559567, 0.28757601119559567, 0.2565085800100608, 0.2565085800100608, 0.15831944634361478, 0.18108433405315413, 0.12106781190982306, 0.08588571272235311, 0.13245025576459274, 0.09105955083815752, 0.06725989550545726, 0.07760757173706606, 0.07760757173706606, 0.006208605738965285, 0.1823179824650587, 0.06077266082168624, 0.06077266082168624, 0.06077266082168624, 0.12154532164337248, 0.24309064328674496, 0.06077266082168624, 0.12154532164337248, 0.06077266082168624, 0.1366890165140903, 0.17086127064261286, 0.034172254128522574, 0.2733780330281806, 0.1366890165140903, 0.034172254128522574, 0.034172254128522574, 0.10251676238556771, 0.034172254128522574, 0.2755461103827594, 0.055109222076551875, 0.11021844415310375, 0.16532766622965564, 0.055109222076551875, 0.055109222076551875, 0.055109222076551875, 0.055109222076551875, 0.2204368883062075, 0.17424701891020974, 0.16843878494653608, 0.11326056229163634, 0.10164409436428902, 0.11326056229163634, 0.1103564453097995, 0.08131527549143121, 0.06969880756408389, 0.06389057360041024, 0.005808233963673658, 0.1775015310659194, 0.11093845691619962, 0.13312614829943956, 0.1257302511716929, 0.10354255978845298, 0.1294281997355662, 0.09244871409683302, 0.06656307414971978, 0.05546922845809981, 0.007395897127746642, 0.22891275993771665, 0.16683472334443755, 0.10087680946407852, 0.1086365640382384, 0.09311705488991863, 0.08923717760283868, 0.05819815930619914, 0.07371766845451892, 0.07759754574159886, 0.007759754574159886, 0.36320874052744306, 0.1811199813667425, 0.1811199813667425, 0.120746654244495, 0.0603733271222475, 0.1811199813667425, 0.0603733271222475, 0.1811199813667425, 0.0603733271222475, 0.0603733271222475, 0.1266584017875577, 0.22798512321760386, 0.10132672143004616, 0.10132672143004616, 0.20265344286009232, 0.07599504107253462, 0.07599504107253462, 0.02533168035751154, 0.05066336071502308, 0.19187881199506224, 0.25881328129566533, 0.0981705549742179, 0.09370825702084436, 0.08478366111409727, 0.062472171347229564, 0.08032136316072373, 0.062472171347229564, 0.05800987339385603, 0.00892459590674708, 0.16911515539612662, 0.18756408143934045, 0.1291424823024967, 0.09531945122327137, 0.1229928402880921, 0.10761873525208059, 0.06764606215845065, 0.06457124115124835, 0.05227195712243914, 
0.009224463021606908, 0.14407477195146698, 0.14407477195146698, 0.14407477195146698, 0.14407477195146698, 0.28814954390293396, 0.14407477195146698, 0.1751753541505509, 0.1561345547863606, 0.11805295605797997, 0.14851823504068448, 0.09520399682095158, 0.09139583694811353, 0.09139583694811353, 0.057122398092570956, 0.057122398092570956, 0.0038081598728380637, 0.20209057172188422, 0.21141782887827887, 0.09327257156394655, 0.07150897153235902, 0.10570891443913943, 0.08705440012635011, 0.06839988581356081, 0.07461805725115725, 0.08083622868875369, 0.006218171437596437, 0.20951576044990428, 0.20951576044990428, 0.20951576044990428, 0.18891242287664972, 0.12021699637604982, 0.10304313975089985, 0.1373908530011998, 0.08586928312574987, 0.0686954265005999, 0.08586928312574987, 0.017173856625149975, 0.15456470962634977, 0.1666676586460176, 0.19819937784931824, 0.12162234549844528, 0.11711781418368805, 0.10360422023941636, 0.08108156366563019, 0.10360422023941636, 0.06306343840660125, 0.04054078183281509, 0.009009062629514465, 0.270829700507611, 0.2760333755903326, 0.06900834389758315, 0.10351251584637472, 0.1380166877951663, 0.06900834389758315, 0.06900834389758315, 0.06900834389758315, 0.03450417194879157, 0.1380166877951663, 0.4118611765008595, 0.11767462185738843, 0.17651193278608265, 0.058837310928694216, 0.11767462185738843, 0.058837310928694216, 0.058837310928694216, 0.058837310928694216, 0.058837310928694216, 0.34495288831767956, 0.11498429610589318, 0.08623822207941989, 0.14373037013236648, 0.08623822207941989, 0.08623822207941989, 0.028746074026473296, 0.08623822207941989, 0.028746074026473296, 0.1256137989536797, 0.37684139686103907, 0.1256137989536797, 0.1256137989536797, 0.1256137989536797, 0.19618353288171766, 0.1520422379833312, 0.1275192963731165, 0.08337800147473, 0.1177101197290306, 0.07847341315268706, 0.10790094308494472, 0.06866423650860118, 0.06375964818655824, 0.0049045883220429414, 0.17723729598399782, 0.14178983678719825, 0.07089491839359913, 0.17723729598399782, 0.07089491839359913, 0.10634237759039869, 0.03544745919679956, 0.07089491839359913, 0.10634237759039869, 0.03544745919679956, 0.20130313395065486, 0.20130313395065486, 0.20130313395065486, 0.20130313395065486, 0.20130313395065486, 0.22364955179254392, 0.18171526083144193, 0.13745128703916762, 0.10949509306509964, 0.06756080210399765, 0.07920921625985931, 0.07920921625985931, 0.04426397379227432, 0.06989048493516997, 0.0069890484935169975, 0.2742259356639501, 0.2742259356639501, 0.2742259356639501, 0.22395543203696902, 0.2021061215943379, 0.07101025893855116, 0.06554793132789337, 0.18025681115170678, 0.10924655221315563, 0.06554793132789337, 0.04369862088526225, 0.027311638053288907, 0.005462327610657782, 0.23991674092250515, 0.14112749466029714, 0.098789246262208, 0.056450997864118856, 0.2116912419904457, 0.08467649679617828, 0.098789246262208, 0.04233824839808914, 0.028225498932059428, 0.014112749466029714, 0.18238954976427466, 0.18238954976427466, 0.3647790995285493, 0.18238954976427466, 0.11431182190415913, 0.27761442462438646, 0.10614669176814777, 0.16330260272022734, 0.0734861712241023, 0.08981643149612503, 0.057155910952079565, 0.06532104108809093, 0.0489907808160682, 0.008165130136011367, 0.3038359780909349, 0.10851284931819104, 0.08681027945455283, 0.10851284931819104, 0.06510770959091462, 0.08681027945455283, 0.08681027945455283, 0.06510770959091462, 0.06510770959091462, 0.18168309358592558, 0.16371443597852636, 0.11779453320406165, 0.13775970832339413, 0.07586766545346343, 0.1197910507159949, 
0.0539059728221977, 0.07986070047732993, 0.0578990078460642, 0.007986070047732993, 0.2835697622998278, 0.2835697622998278, 0.2835697622998278, 0.22451501541053054, 0.22451501541053054, 0.07483833847017685, 0.07483833847017685, 0.07483833847017685, 0.07483833847017685, 0.07483833847017685, 0.07483833847017685, 0.29596703740657143, 0.16728571679501866, 0.15441758473386336, 0.0643406603057764, 0.09007692442808697, 0.07720879236693168, 0.0643406603057764, 0.051472528244621125, 0.03860439618346584, 0.012868132061155281, 0.1193243853266685, 0.1193243853266685, 0.1193243853266685, 0.1193243853266685, 0.3579731559800055, 0.1193243853266685, 0.1193243853266685, 0.3532125811479329, 0.15137682049197126, 0.15137682049197126, 0.10091788032798084, 0.10091788032798084, 0.05045894016399042, 0.05045894016399042, 0.05045894016399042, 0.05045894016399042, 0.23608938951476097, 0.18420161159942888, 0.1193418892052638, 0.1219362781010304, 0.09339800024759776, 0.06745411128993171, 0.0622653334983985, 0.07004850018569832, 0.03632144454073246, 0.007783166687299813, 0.42172918564654377, 0.11246111617241167, 0.1405763952155146, 0.08434583712930875, 0.08434583712930875, 0.028115279043102917, 0.08434583712930875, 0.028115279043102917, 0.028115279043102917, 0.19094934972313712, 0.19094934972313712, 0.19094934972313712, 0.20636101770979903, 0.14501044487715609, 0.10039184645341574, 0.12827847046825344, 0.08365987204451313, 0.08923719684748066, 0.11154649605935082, 0.05577324802967541, 0.07808254724154558, 0.005577324802967541, 0.051588442494118085, 0.20635376997647234, 0.15476532748235425, 0.10317688498823617, 0.25794221247059046, 0.10317688498823617, 0.051588442494118085, 0.051588442494118085, 0.051588442494118085, 0.14832517067637754, 0.12135695782612708, 0.09438874497587663, 0.18877748995175325, 0.09438874497587663, 0.09438874497587663, 0.040452319275375695, 0.09438874497587663, 0.10787285140100186, 0.013484106425125233, 0.2009932522845455, 0.13399550152303036, 0.06699775076151518, 0.06699775076151518, 0.06699775076151518, 0.2679910030460607, 0.06699775076151518, 0.06699775076151518, 0.06699775076151518, 0.11376176265156561, 0.11376176265156561, 0.3412852879546968, 0.11376176265156561, 0.11376176265156561, 0.11376176265156561, 0.11376176265156561, 0.11376176265156561, 0.12571484214917217, 0.3520015580176821, 0.0754289052895033, 0.0754289052895033, 0.0754289052895033, 0.0754289052895033, 0.0754289052895033, 0.0754289052895033, 0.0754289052895033, 0.22545646632229643, 0.13527387979337785, 0.09018258652891857, 0.09018258652891857, 0.09018258652891857, 0.09018258652891857, 0.13527387979337785, 0.045091293264459285, 0.09018258652891857, 0.10017294501705654, 0.20034589003411307, 0.10017294501705654, 0.10017294501705654, 0.25043236254264134, 0.10017294501705654, 0.05008647250852827, 0.10017294501705654, 0.05008647250852827, 0.15710809813874693, 0.16147221197593434, 0.13965164278999728, 0.12655930127843504, 0.1483798704643721, 0.08728227674374829, 0.05236936604624898, 0.07418993523218605, 0.05236936604624898, 0.00872822767437483, 0.3747063574731923, 0.16058843891708244, 0.10705895927805495, 0.053529479639027475, 0.053529479639027475, 0.053529479639027475, 0.053529479639027475, 0.053529479639027475, 0.053529479639027475, 0.15174935924840557, 0.050583119749468525, 0.4046649579957482, 0.050583119749468525, 0.10116623949893705, 0.050583119749468525, 0.050583119749468525, 0.050583119749468525, 0.050583119749468525, 0.17468213136727737, 0.34936426273455473, 0.17468213136727737, 0.16451213727680458, 0.17253711958299017, 
0.17654961073608294, 0.060187367296391917, 0.10432476998041265, 0.09228729652113427, 0.08827480536804147, 0.06821234960257751, 0.060187367296391917, 0.008024982306185589, 0.1536303075927538, 0.17788772458108335, 0.12937289060442425, 0.11320127927887122, 0.125329987773036, 0.07277225096498864, 0.06468644530221213, 0.0768151537963769, 0.07277225096498864, 0.012128708494164774, 0.15750723329006128, 0.18291162575620018, 0.13210284082392235, 0.09653669137132788, 0.11177932685101122, 0.13210284082392235, 0.06097054191873339, 0.06605142041196117, 0.05588966342550561, 0.010161756986455566, 0.1559284994033856, 0.3430426986874483, 0.06237139976135424, 0.12474279952270848, 0.03118569988067712, 0.1559284994033856, 0.06237139976135424, 0.06237139976135424, 0.03118569988067712, 0.15294813276516453, 0.20568886820142818, 0.1147110995738734, 0.13448887536247228, 0.09229628701346136, 0.11866665473159319, 0.06460740090942295, 0.05801480897989, 0.05274073543626363, 0.006592591929532954, 0.19039672580369918, 0.1142380354822195, 0.07615869032147968, 0.07615869032147968, 0.15231738064295935, 0.15231738064295935, 0.03807934516073984, 0.15231738064295935, 0.03807934516073984, 0.18821360704548834, 0.1176335044034302, 0.07058010264205812, 0.1176335044034302, 0.14116020528411624, 0.1176335044034302, 0.023526700880686043, 0.14116020528411624, 0.07058010264205812, 0.15477403136973347, 0.17799013607519348, 0.12381922509578677, 0.14703532980124678, 0.10060312039032675, 0.06964831411638006, 0.07738701568486674, 0.10834182195881342, 0.03869350784243337, 0.309752105660028, 0.154876052830014, 0.08850060161715084, 0.11062575202143855, 0.08850060161715084, 0.08850060161715084, 0.06637545121286313, 0.04425030080857542, 0.04425030080857542, 0.21477769769618404, 0.21477769769618404, 0.21477769769618404, 0.21477769769618404, 0.3721102393695853, 0.12403674645652844, 0.06201837322826422, 0.12403674645652844, 0.06201837322826422, 0.06201837322826422, 0.06201837322826422, 0.06201837322826422, 0.06201837322826422, 0.3785912028531457, 0.09464780071328642, 0.09464780071328642, 0.09464780071328642, 0.09464780071328642, 0.09464780071328642, 0.09464780071328642, 0.09464780071328642, 0.09464780071328642, 0.14555231937398558, 0.09703487958265705, 0.14555231937398558, 0.09703487958265705, 0.14555231937398558, 0.14555231937398558, 0.048517439791328526, 0.14555231937398558, 0.048517439791328526, 0.0979615224303986, 0.1959230448607972, 0.11755382691647832, 0.21551534934687694, 0.07836921794431888, 0.07836921794431888, 0.05877691345823916, 0.15673843588863776, 0.03918460897215944, 0.1572780146651629, 0.20917975950466666, 0.18244249701158896, 0.08178456762588471, 0.10223070953235588, 0.05032896469285213, 0.07706622718592983, 0.06448398601271679, 0.06605676615936842, 0.007863900733258145, 0.26945548546695824, 0.2636672477114658, 0.18749670948370903, 0.12304471559868405, 0.08788908257048861, 0.07031126605639089, 0.0820298103991227, 0.05859272171365907, 0.04687417737092726, 0.0761705382277568, 0.011718544342731814, 0.18335534687870755, 0.11668067528645026, 0.11668067528645026, 0.28336735426709353, 0.050006003694192974, 0.10001200738838595, 0.050006003694192974, 0.033337335796128645, 0.050006003694192974, 0.1891533967835077, 0.1891533967835077, 0.09457669839175385, 0.28373009517526154, 0.047288349195876926, 0.09457669839175385, 0.047288349195876926, 0.047288349195876926, 0.047288349195876926, 0.2203855640759153, 0.11019278203795765, 0.07346185469197176, 0.11019278203795765, 0.07346185469197176, 0.2203855640759153, 0.07346185469197176, 
0.11019278203795765, 0.07346185469197176, 0.15072922671878083, 0.15072922671878083, 0.07536461335939042, 0.07536461335939042, 0.07536461335939042, 0.22609384007817127, 0.07536461335939042, 0.07536461335939042, 0.07536461335939042, 0.076706562709143, 0.230119688127429, 0.1150598440637145, 0.1150598440637145, 0.2684729694820005, 0.0383532813545715, 0.0383532813545715, 0.076706562709143, 0.076706562709143, 0.2152385869429403, 0.17515967765011695, 0.10539268739964663, 0.09054864692082316, 0.11429911168694071, 0.10539268739964663, 0.06531377810682326, 0.07422020239411735, 0.048985333580117446, 0.0059376161915293876, 0.2699788045479177, 0.2699788045479177, 0.2699788045479177, 0.08581503614284275, 0.343260144571371, 0.08581503614284275, 0.1716300722856855, 0.12872255421426412, 0.12872255421426412, 0.042907518071421374, 0.042907518071421374, 0.042907518071421374, 0.2213430690922499, 0.11067153454612495, 0.11067153454612495, 0.11067153454612495, 0.11067153454612495, 0.11067153454612495, 0.11067153454612495, 0.2213430690922499, 0.16157066791799074, 0.16157066791799074, 0.08078533395899537, 0.2423560018769861, 0.08078533395899537, 0.08078533395899537, 0.08078533395899537, 0.08078533395899537, 0.17357512062097447, 0.17357512062097447, 0.34715024124194893, 0.17357512062097447, 0.19562447227304267, 0.18747345259499923, 0.220077531307173, 0.05705713774630411, 0.073359177102391, 0.08151019678043445, 0.08966121645847788, 0.04075509839021722, 0.04075509839021722, 0.008151019678043445, 0.22828852306438505, 0.25546572819109753, 0.17393411281096002, 0.04891896922808251, 0.059789851278767514, 0.07609617435479502, 0.08153161538013752, 0.043483528202740006, 0.021741764101370003, 0.005435441025342501, 0.1662447205464043, 0.041561180136601074, 0.3324894410928086, 0.08312236027320215, 0.12468354040980323, 0.041561180136601074, 0.041561180136601074, 0.08312236027320215, 0.08312236027320215, 0.1919978243063002, 0.3839956486126004, 0.1919978243063002, 0.0959989121531501, 0.0959989121531501, 0.0959989121531501, 0.161536247304274, 0.323072494608548, 0.161536247304274, 0.161536247304274, 0.11709975587639218, 0.3087175382195794, 0.06387259411439573, 0.09580889117159361, 0.12774518822879147, 0.06387259411439573, 0.07451802646679503, 0.04258172940959716, 0.08516345881919432, 0.01064543235239929, 0.17790387619897052, 0.08895193809948526, 0.08895193809948526, 0.08895193809948526, 0.17790387619897052, 0.1334279071492279, 0.04447596904974263, 0.17790387619897052, 0.04447596904974263, 0.20854021658441152, 0.16974203675475358, 0.14710976518745308, 0.11639453948897388, 0.0937622679216734, 0.06304704222319418, 0.05658067891825119, 0.07921295048555167, 0.05658067891825119, 0.00969954495741449, 0.06610727469360658, 0.19832182408081972, 0.06610727469360658, 0.26442909877442633, 0.19832182408081972, 0.06610727469360658, 0.13221454938721316, 0.20176166185895578, 0.13624112218310933, 0.14352118214709225, 0.12168100225514343, 0.105040865194611, 0.07384060820611268, 0.07592062533867924, 0.06760055680841302, 0.06656054824212974, 0.009360077096549495, 0.16370804190941804, 0.1796795581932637, 0.14773652562557238, 0.09982197677403538, 0.08784333956115115, 0.1038148558449968, 0.06388606513538264, 0.0758647023482669, 0.0798575814192283, 0.00798575814192283, 0.12670790431168222, 0.38012371293504665, 0.12670790431168222, 0.12670790431168222, 0.12670790431168222, 0.12670790431168222, 0.3314462281482134, 0.12211176826513125, 0.06977815329436071, 0.08722269161795089, 0.12211176826513125, 0.13955630658872142, 0.052333614970770534, 0.017444538323590178, 
0.034889076647180356, 0.15091911715457254, 0.2234763850173478, 0.10448246572239638, 0.09867788429337435, 0.10448246572239638, 0.09867788429337435, 0.0841664307208193, 0.07255726786277526, 0.0580458142902202, 0.00870687214353303, 0.2113491970799581, 0.2113491970799581, 0.2113491970799581, 0.2113491970799581, 0.15169995762355112, 0.12135996609884091, 0.28822991948474713, 0.04550998728706534, 0.060679983049420454, 0.10618997033648579, 0.09101997457413068, 0.07584997881177556, 0.04550998728706534, 0.3301349492323882, 0.12697498047399544, 0.15236997656879453, 0.07618498828439726, 0.10157998437919635, 0.050789992189598177, 0.07618498828439726, 0.050789992189598177, 0.025394996094799088, 0.13152839497891633, 0.19729259246837447, 0.0939488535563688, 0.17850282175710072, 0.08455396820073192, 0.08455396820073192, 0.0469744267781844, 0.1409232803345532, 0.03757954142254752, 0.10699673424356462, 0.14266231232475282, 0.24965904656831744, 0.10699673424356462, 0.10699673424356462, 0.07133115616237641, 0.07133115616237641, 0.07133115616237641, 0.035665578081188205, 0.2108474777480429, 0.07028249258268096, 0.07028249258268096, 0.07028249258268096, 0.07028249258268096, 0.28112997033072384, 0.07028249258268096, 0.07028249258268096, 0.07028249258268096, 0.22928740114694757, 0.22928740114694757, 0.1816939977169309, 0.1816939977169309, 0.09084699885846545, 0.09084699885846545, 0.09084699885846545, 0.09084699885846545, 0.09084699885846545, 0.1816939977169309, 0.09084699885846545, 0.3163881511816147, 0.09305533858282786, 0.16749960944909015, 0.07444427086626228, 0.09305533858282786, 0.09305533858282786, 0.07444427086626228, 0.05583320314969671, 0.03722213543313114, 0.13887625594769704, 0.13887625594769704, 0.13887625594769704, 0.13887625594769704, 0.13887625594769704, 0.13887625594769704, 0.13887625594769704, 0.2709083589604869, 0.2709083589604869, 0.1400776387819659, 0.1400776387819659, 0.1400776387819659, 0.2801552775639318, 0.07003881939098296, 0.07003881939098296, 0.07003881939098296, 0.07003881939098296, 0.17747079640247015, 0.3549415928049403, 0.17747079640247015, 0.3428348657295049, 0.08570871643237622, 0.17141743286475244, 0.08570871643237622, 0.08570871643237622, 0.08570871643237622, 0.08570871643237622, 0.08570871643237622, 0.08570871643237622, 0.04600773868689849, 0.2760464321213909, 0.09201547737379698, 0.2760464321213909, 0.061343651582531314, 0.04600773868689849, 0.030671825791265657, 0.15335912895632828, 0.015335912895632828, 0.24890634240892842, 0.12445317120446421, 0.12445317120446421, 0.12445317120446421, 0.12445317120446421, 0.24890634240892842, 0.12445317120446421, 0.45075564578480654, 0.15025188192826885, 0.15025188192826885, 0.15025188192826885, 0.18444478953017268, 0.18444478953017268, 0.18444478953017268, 0.18444478953017268, 0.18444478953017268, 0.3031927454446855, 0.04664503776072085, 0.11661259440180212, 0.06996755664108127, 0.20990266992324383, 0.11661259440180212, 0.06996755664108127, 0.023322518880360424, 0.023322518880360424, 0.1246290678874306, 0.19260855946239272, 0.15861881367491165, 0.06797949157496214, 0.13595898314992427, 0.07930940683745583, 0.056649576312468454, 0.11329915262493691, 0.056649576312468454, 0.01132991526249369, 0.1622822867061956, 0.2028528583827445, 0.0811411433530978, 0.10548348635902714, 0.06491291468247824, 0.17039640104150539, 0.06491291468247824, 0.09736937202371736, 0.04868468601185868, 0.00811411433530978, 0.1997582763583042, 0.12292817006664875, 0.06146408503332437, 0.10756214880831765, 0.169026233841642, 0.06146408503332437, 0.15366021258331092, 
0.06146408503332437, 0.06146408503332437, 0.015366021258331093, 0.13586173816799865, 0.2717234763359973, 0.13586173816799865, 0.13586173816799865, 0.13586173816799865, 0.13586173816799865, 0.13586173816799865, 0.1981106114489449, 0.08490454776383352, 0.16980909552766704, 0.11320606368511137, 0.05660303184255568, 0.08490454776383352, 0.16980909552766704, 0.08490454776383352, 0.02830151592127784, 0.02830151592127784, 0.1947784868500626, 0.0973892434250313, 0.14608386513754695, 0.14608386513754695, 0.04869462171251565, 0.0973892434250313, 0.0973892434250313, 0.04869462171251565, 0.14608386513754695, 0.16717205858404, 0.22175885322372652, 0.10917358927937305, 0.10917358927937305, 0.08870354128949061, 0.09893856528443183, 0.06823349329960816, 0.07164516796458857, 0.05458679463968653, 0.010235023994941224, 0.1856283552106412, 0.126564787643619, 0.11812713513404441, 0.084376525095746, 0.1012518301148952, 0.168753050191492, 0.0506259150574476, 0.084376525095746, 0.0675012200765968, 0.0084376525095746, 0.23659559780409617, 0.12858456402396531, 0.14401471170684116, 0.12858456402396531, 0.04629044304862751, 0.05657720817054474, 0.07200735585342058, 0.08743750353629641, 0.08743750353629641, 0.010286765121917226, 0.21609189560282993, 0.21609189560282993, 0.21609189560282993, 0.16993805829764663, 0.2719008932762346, 0.06797522331905866, 0.1359504466381173, 0.1359504466381173, 0.06797522331905866, 0.03398761165952933, 0.06797522331905866, 0.03398761165952933, 0.03398761165952933, 0.3820229800184642, 0.27214023652586145, 0.13607011826293072, 0.13607011826293072, 0.13607011826293072, 0.13607011826293072, 0.13607011826293072, 0.13607011826293072, 0.15934231169000462, 0.2731582486114365, 0.10243434322928868, 0.08536195269107391, 0.0910527495371455, 0.07398035899893071, 0.07398035899893071, 0.06259876530678753, 0.07398035899893071, 0.0056907968460715936, 0.16374499075985285, 0.29240176921402294, 0.11696070768560919, 0.07017642461136551, 0.10526463691704827, 0.058480353842804596, 0.08187249537992643, 0.04678428307424368, 0.04678428307424368, 0.01169607076856092, 0.18768270584363334, 0.18768270584363334, 0.18768270584363334, 0.18768270584363334, 0.18768270584363334, 0.3650534566635606, 0.3367717547515365, 0.14032156447980687, 0.08419293868788412, 0.11225725158384549, 0.14032156447980687, 0.08419293868788412, 0.056128625791922744, 0.028064312895961372, 0.056128625791922744, 0.17736005908086322, 0.14606122512541678, 0.1251953358217858, 0.1251953358217858, 0.11476239116997032, 0.0625976679108929, 0.1356282804736013, 0.052164723259077415, 0.052164723259077415, 0.010432944651815483], \"Term\": [\"abalone\", \"abend\", \"acetylcholine\", \"acetylcholine\", \"acetylcholine\", \"acetylcholine\", \"acetylcholine\", \"acetylcholine\", \"ach\", \"ach\", \"ach\", \"action\", \"action\", \"action\", \"action\", \"action\", \"action\", \"action\", \"action\", \"action\", \"action\", \"actor\", \"actor\", \"actor\", \"actor\", \"actor\", \"actor\", \"actor\", \"actor\", \"actor\", \"actors\", \"actors\", \"actors\", \"actors\", \"actors\", \"actors\", \"actors\", \"acute\", \"adequate\", \"adequate\", \"adequate\", \"adequate\", \"adequate\", \"admixture\", \"admixture\", \"admixture\", \"admixture\", \"admixture\", \"admixture\", \"admixture\", \"admixture\", \"admixture\", \"afferent\", \"afferent\", \"afferent\", \"afferent\", \"afferent\", \"afferent\", \"afferent\", \"afferent\", \"afferent\", \"afferent\", \"agent\", \"agent\", \"agent\", \"agent\", \"agent\", \"agent\", \"agent\", \"agent\", \"agent\", \"agent\", 
\"aim\", \"aim\", \"aim\", \"aim\", \"aim\", \"aim\", \"aim\", \"aim\", \"aim\", \"ajk\", \"alchemy\", \"alchemy\", \"alchemy\", \"alchemy\", \"alchemy\", \"alchemy\", \"algorithm\", \"algorithm\", \"algorithm\", \"algorithm\", \"algorithm\", \"algorithm\", \"algorithm\", \"algorithm\", \"algorithm\", \"algorithm\", \"algorithms\", \"algorithms\", \"algorithms\", \"algorithms\", \"algorithms\", \"algorithms\", \"algorithms\", \"algorithms\", \"algorithms\", \"algorithms\", \"alocal\", \"alocal\", \"alocal\", \"alocal\", \"also\", \"also\", \"also\", \"also\", \"also\", \"also\", \"also\", \"also\", \"also\", \"also\", \"alzheimer\", \"amari\", \"amari\", \"amari\", \"amari\", \"amari\", \"amari\", \"amari\", \"amari\", \"amf\", \"analysis\", \"analysis\", \"analysis\", \"analysis\", \"analysis\", \"analysis\", \"analysis\", \"analysis\", \"analysis\", \"analysis\", \"appointment\", \"appointments\", \"appointments\", \"appointments\", \"appointments\", \"appointments\", \"appointments\", \"appointments\", \"appointments\", \"approach\", \"approach\", \"approach\", \"approach\", \"approach\", \"approach\", \"approach\", \"approach\", \"approach\", \"approach\", \"ar\", \"ar\", \"ar\", \"ar\", \"ar\", \"ar\", \"ar\", \"ar\", \"ar\", \"ar\", \"associations\", \"associations\", \"assumption\", \"assumption\", \"assumption\", \"assumption\", \"assumption\", \"assumption\", \"assumption\", \"assumption\", \"assumption\", \"assumption\", \"atoms\", \"atoms\", \"atoms\", \"atoms\", \"atoms\", \"atoms\", \"atoms\", \"atoms\", \"atoms\", \"average\", \"average\", \"average\", \"average\", \"average\", \"average\", \"average\", \"average\", \"average\", \"average\", \"ax\", \"ax\", \"ax\", \"ax\", \"ax\", \"ax\", \"ax\", \"ax\", \"ax\", \"axon\", \"axon\", \"axon\", \"axon\", \"axon\", \"axon\", \"axon\", \"baldwin\", \"baldwin\", \"baldwin\", \"baldwin\", \"baldwin\", \"baldwin\", \"balls\", \"balls\", \"balls\", \"balls\", \"balls\", \"balls\", \"balls\", \"balls\", \"balls\", \"bandpass\", \"bandpass\", \"barkai\", \"barkai\", \"barkai\", \"barkai\", \"barkai\", \"barkai\", \"based\", \"based\", \"based\", \"based\", \"based\", \"based\", \"based\", \"based\", \"based\", \"based\", \"batch\", \"batch\", \"batch\", \"batch\", \"batch\", \"batch\", \"batch\", \"batch\", \"batch\", \"batch\", \"bayes\", \"bayes\", \"bayes\", \"bayes\", \"bayes\", \"bayes\", \"bayes\", \"bayes\", \"bayes\", \"bayesian\", \"bayesian\", \"bayesian\", \"bayesian\", \"bayesian\", \"bayesian\", \"bayesian\", \"bayesian\", \"bayesian\", \"bayesian\", \"bci\", \"bci\", \"bci\", \"bci\", \"bci\", \"bci\", \"bci\", \"bci\", \"bci\", \"beliefs\", \"beliefs\", \"beliefs\", \"beliefs\", \"beliefs\", \"beliefs\", \"beliefs\", \"beliefs\", \"beliefs\", \"berke\", \"berke\", \"berke\", \"best\", \"best\", \"best\", \"best\", \"best\", \"best\", \"best\", \"best\", \"best\", \"best\", \"bg\", \"bg\", \"bg\", \"bg\", \"bg\", \"bhlmann\", \"bhlmann\", \"bhlmann\", \"bhlmann\", \"bias\", \"bias\", \"bias\", \"bias\", \"bias\", \"bias\", \"bias\", \"bias\", \"bias\", \"bias\", \"birds\", \"birds\", \"birds\", \"birds\", \"birds\", \"birds\", \"birds\", \"birds\", \"birds\", \"blog\", \"blog\", \"blog\", \"blog\", \"blog\", \"blog\", \"blog\", \"blurred\", \"bonn\", \"boosting\", \"boosting\", \"boosting\", \"boosting\", \"boosting\", \"boosting\", \"boosting\", \"boosting\", \"boosting\", \"boosting\", \"bottou\", \"bottou\", \"bottou\", \"bottou\", \"bottou\", \"bottou\", \"bottou\", \"bottou\", \"bottou\", \"bower\", \"bower\", 
\"bower\", \"bower\", \"bower\", \"bower\", \"bower\", \"br\", \"br\", \"br\", \"br\", \"br\", \"br\", \"br\", \"br\", \"br\", \"cal\", \"cal\", \"cal\", \"cal\", \"cal\", \"cal\", \"cal\", \"cal\", \"cal\", \"calcium\", \"calcium\", \"calcium\", \"calcium\", \"calcium\", \"calcium\", \"calcium\", \"calcium\", \"calibrated\", \"calibrated\", \"calibrated\", \"calibrated\", \"calibrated\", \"calibrated\", \"calibrated\", \"calibrated\", \"calibrated\", \"calibrated\", \"case\", \"case\", \"case\", \"case\", \"case\", \"case\", \"case\", \"case\", \"case\", \"case\", \"categorization\", \"categorization\", \"categorization\", \"categorization\", \"categorization\", \"categorization\", \"categorization\", \"categorization\", \"categorization\", \"cbinin\", \"cbinin\", \"cbinin\", \"cbinin\", \"cbinin\", \"centre\", \"centre\", \"centre\", \"centre\", \"centre\", \"centre\", \"centre\", \"centre\", \"centre\", \"chambers\", \"chambers\", \"cho\", \"cho\", \"cho\", \"cholinergic\", \"cholinergic\", \"cholinergic\", \"cholinergic\", \"cholinergic\", \"cholinergic\", \"cholinergic\", \"cholinergic\", \"cholinergic\", \"cholinergic\", \"cilz\", \"cilz\", \"cilz\", \"clustering\", \"clustering\", \"clustering\", \"clustering\", \"clustering\", \"clustering\", \"clustering\", \"clustering\", \"clustering\", \"clustering\", \"cm\", \"cm\", \"cm\", \"cm\", \"cm\", \"cm\", \"cm\", \"cm\", \"cm\", \"cnn\", \"cnn\", \"cnn\", \"cnn\", \"cnn\", \"cnn\", \"cnn\", \"cnn\", \"cnn\", \"collaterals\", \"collaterals\", \"collaterals\", \"compilation\", \"compilation\", \"compilation\", \"compilation\", \"compilation\", \"compliant\", \"compliant\", \"compliant\", \"condition\", \"condition\", \"condition\", \"condition\", \"condition\", \"condition\", \"condition\", \"condition\", \"condition\", \"condition\", \"confidence\", \"confidence\", \"confidence\", \"confidence\", \"confidence\", \"confidence\", \"confidence\", \"confidence\", \"confidence\", \"confidence\", \"consider\", \"consider\", \"consider\", \"consider\", \"consider\", \"consider\", \"consider\", \"consider\", \"consider\", \"consider\", \"constraint\", \"constraint\", \"constraint\", \"constraint\", \"constraint\", \"constraint\", \"constraint\", \"constraint\", \"constraint\", \"constraint\", \"constraints\", \"constraints\", \"constraints\", \"constraints\", \"constraints\", \"constraints\", \"constraints\", \"constraints\", \"constraints\", \"constraints\", \"control\", \"control\", \"control\", \"control\", \"control\", \"control\", \"control\", \"control\", \"control\", \"control\", \"convergence\", \"convergence\", \"convergence\", \"convergence\", \"convergence\", \"convergence\", \"convergence\", \"convergence\", \"convergence\", \"convergence\", \"convex\", \"convex\", \"convex\", \"convex\", \"convex\", \"convex\", \"convex\", \"convex\", \"convex\", \"convex\", \"covariate\", \"covariate\", \"covariate\", \"covariate\", \"covariate\", \"covariate\", \"covariate\", \"covariate\", \"covariate\", \"coverage\", \"coverage\", \"coverage\", \"coverage\", \"coverage\", \"coverage\", \"coverage\", \"coverage\", \"coverage\", \"crm\", \"crm\", \"crm\", \"crm\", \"crm\", \"crm\", \"crm\", \"crm\", \"crystal\", \"cseiitdacin\", \"daes\", \"daes\", \"daes\", \"daes\", \"daes\", \"dag\", \"dag\", \"dag\", \"dag\", \"dag\", \"dag\", \"dag\", \"dag\", \"dag\", \"data\", \"data\", \"data\", \"data\", \"data\", \"data\", \"data\", \"data\", \"data\", \"data\", \"dc\", \"dc\", \"dc\", \"dc\", \"dc\", \"dc\", \"dc\", \"dc\", \"dc\", \"decomposer\", 
\"decomposer\", \"decomposer\", \"decomposer\", \"decomposer\", \"decomposer\", \"decomposer\", \"decomposer\", \"decomposer\", \"deg\", \"deg\", \"deg\", \"deg\", \"deg\", \"deg\", \"deg\", \"degraded\", \"degraded\", \"degraded\", \"degraded\", \"delhi\", \"delhi\", \"delhi\", \"dentate\", \"dentate\", \"dentate\", \"descendant\", \"descendant\", \"descendant\", \"descendant\", \"descendant\", \"descendant\", \"descendant\", \"descendant\", \"developmental\", \"developmental\", \"different\", \"different\", \"different\", \"different\", \"different\", \"different\", \"different\", \"different\", \"different\", \"different\", \"discovery\", \"discovery\", \"discovery\", \"discovery\", \"discovery\", \"discovery\", \"discovery\", \"discovery\", \"discovery\", \"discriminated\", \"discriminated\", \"disease\", \"disease\", \"disease\", \"disease\", \"disease\", \"disease\", \"disease\", \"disease\", \"disease\", \"disjunction\", \"displays\", \"displays\", \"displays\", \"displays\", \"displays\", \"displays\", \"displays\", \"displays\", \"displays\", \"distribution\", \"distribution\", \"distribution\", \"distribution\", \"distribution\", \"distribution\", \"distribution\", \"distribution\", \"distribution\", \"distribution\", \"distributions\", \"distributions\", \"distributions\", \"distributions\", \"distributions\", \"distributions\", \"distributions\", \"distributions\", \"distributions\", \"distributions\", \"domay\", \"domay\", \"domay\", \"domay\", \"domay\", \"domay\", \"domay\", \"domay\", \"domay\", \"dominate\", \"dominate\", \"dominate\", \"domingos\", \"domingos\", \"domingos\", \"dornay\", \"dornay\", \"dorsal\", \"dorsal\", \"dorsal\", \"dorsal\", \"dorsal\", \"dorsal\", \"dorsal\", \"dorsal\", \"dorsal\", \"dropout\", \"dropout\", \"dropout\", \"dropout\", \"ecserpiedu\", \"ecserpiedu\", \"edi\", \"edi\", \"edi\", \"eejj\", \"eejj\", \"eejj\", \"eejj\", \"eff\", \"eff\", \"eff\", \"eff\", \"eff\", \"eff\", \"eff\", \"eff\", \"eggs\", \"eggs\", \"eggs\", \"eggs\", \"eggs\", \"eggs\", \"eggs\", \"eggs\", \"eggs\", \"eigenfunction\", \"eigenfunction\", \"eigenfunction\", \"eigenfunction\", \"eigenfunction\", \"eigenfunction\", \"eigenfunction\", \"eigenfunction\", \"eigenfunction\", \"eigenfunctions\", \"eigenfunctions\", \"eigenfunctions\", \"eigenfunctions\", \"eigenfunctions\", \"eigenfunctions\", \"eigenfunctions\", \"eigenfunctions\", \"eigenfunctions\", \"elisabeth\", \"elisabeth\", \"embedded\", \"embedded\", \"embedded\", \"embedded\", \"embedded\", \"embedded\", \"embedded\", \"embedded\", \"embedded\", \"energy\", \"energy\", \"energy\", \"energy\", \"energy\", \"energy\", \"energy\", \"energy\", \"energy\", \"energy\", \"entorhinal\", \"entorhinal\", \"entorhinal\", \"entorhinal\", \"entorhinal\", \"entorhinal\", \"entorhinal\", \"entorhinal\", \"epv\", \"epv\", \"epv\", \"eq\", \"eq\", \"eq\", \"eq\", \"eq\", \"eq\", \"eq\", \"eq\", \"eq\", \"eq\", \"era\", \"era\", \"era\", \"era\", \"eric\", \"eric\", \"eric\", \"eric\", \"err\", \"err\", \"err\", \"err\", \"err\", \"err\", \"err\", \"err\", \"err\", \"error\", \"error\", \"error\", \"error\", \"error\", \"error\", \"error\", \"error\", \"error\", \"error\", \"estimator\", \"estimator\", \"estimator\", \"estimator\", \"estimator\", \"estimator\", \"estimator\", \"estimator\", \"estimator\", \"estimator\", \"estimators\", \"estimators\", \"estimators\", \"estimators\", \"estimators\", \"estimators\", \"estimators\", \"estimators\", \"estimators\", \"estimators\", \"et\", \"et\", \"et\", \"et\", \"et\", \"et\", 
\"et\", \"et\", \"et\", \"et\", \"ethernet\", \"ethernet\", \"excess\", \"excess\", \"excess\", \"excess\", \"excess\", \"excess\", \"excess\", \"excess\", \"excess\", \"expansion\", \"expansion\", \"expansion\", \"expansion\", \"expansion\", \"expansion\", \"expansion\", \"expansion\", \"expansion\", \"expressions\", \"expressions\", \"expressions\", \"expressions\", \"expressions\", \"expressions\", \"expressions\", \"expressions\", \"expressions\", \"farima\", \"farima\", \"farima\", \"farima\", \"farima\", \"farima\", \"farima\", \"farima\", \"feature\", \"feature\", \"feature\", \"feature\", \"feature\", \"feature\", \"feature\", \"feature\", \"feature\", \"feature\", \"features\", \"features\", \"features\", \"features\", \"features\", \"features\", \"features\", \"features\", \"features\", \"features\", \"fig\", \"fig\", \"fig\", \"fig\", \"fig\", \"fig\", \"fig\", \"fig\", \"fig\", \"fig\", \"figure\", \"figure\", \"figure\", \"figure\", \"figure\", \"figure\", \"figure\", \"figure\", \"figure\", \"figure\", \"first\", \"first\", \"first\", \"first\", \"first\", \"first\", \"first\", \"first\", \"first\", \"first\", \"fm\", \"fm\", \"fm\", \"fm\", \"fm\", \"fm\", \"fm\", \"fm\", \"fm\", \"fmri\", \"fmri\", \"fmri\", \"fmri\", \"fmri\", \"fmri\", \"fmri\", \"fmri\", \"fmri\", \"fmri\", \"formula\", \"formula\", \"formula\", \"formula\", \"formula\", \"formula\", \"formula\", \"formula\", \"formula\", \"framework\", \"framework\", \"framework\", \"framework\", \"framework\", \"framework\", \"framework\", \"framework\", \"framework\", \"framework\", \"frequentist\", \"frequentist\", \"frequentist\", \"frequentist\", \"frequentist\", \"frequentist\", \"frequentist\", \"frequentist\", \"frequentist\", \"fresh\", \"fresh\", \"fresh\", \"fresh\", \"fs\", \"fs\", \"fs\", \"fs\", \"fs\", \"fs\", \"fs\", \"fs\", \"function\", \"function\", \"function\", \"function\", \"function\", \"function\", \"function\", \"function\", \"function\", \"function\", \"fv\", \"fv\", \"fv\", \"fv\", \"fv\", \"fv\", \"fv\", \"fv\", \"fv\", \"fxi\", \"fxi\", \"fxi\", \"fxi\", \"fxi\", \"fxi\", \"fxi\", \"fxi\", \"fxi\", \"gauss\", \"gauss\", \"gauss\", \"gauss\", \"gcfove\", \"gcfove\", \"gcfove\", \"gcfove\", \"gcfove\", \"gcfove\", \"gcfove\", \"gcfove\", \"gcfove\", \"gcfvoe\", \"gcfvoe\", \"given\", \"given\", \"given\", \"given\", \"given\", \"given\", \"given\", \"given\", \"given\", \"given\", \"goodness\", \"goodness\", \"goodness\", \"goodness\", \"goodness\", \"goodness\", \"goodness\", \"gop\", \"gop\", \"gop\", \"gop\", \"gop\", \"gop\", \"gp\", \"gp\", \"gp\", \"gp\", \"gp\", \"gp\", \"gp\", \"gp\", \"gp\", \"gpr\", \"gpr\", \"gpr\", \"gpr\", \"gpr\", \"gpr\", \"gpr\", \"gpr\", \"gpr\", \"grammar\", \"grammar\", \"grammar\", \"grammar\", \"grammar\", \"grammar\", \"grammar\", \"grammar\", \"grammar\", \"griffin\", \"griffin\", \"griffin\", \"griffin\", \"griffin\", \"gross\", \"groundings\", \"groundings\", \"groundings\", \"groundings\", \"groundings\", \"groundings\", \"gs\", \"gs\", \"gs\", \"gs\", \"gs\", \"gs\", \"gs\", \"gs\", \"gs\", \"gyrus\", \"gyrus\", \"gyrus\", \"gyrus\", \"hagan\", \"hasselmo\", \"hasselmo\", \"hasselmo\", \"hasselmo\", \"hasselmo\", \"hasselmo\", \"hasselmo\", \"hasselmo\", \"hasselmo\", \"hauz\", \"heaviside\", \"heaviside\", \"heaviside\", \"heaviside\", \"hetero\", \"hilbertian\", \"hilbertian\", \"hilbertian\", \"hilbertian\", \"hilbertian\", \"hilbertian\", \"hilbertian\", \"hinge\", \"hinge\", \"hinge\", \"hinge\", \"hinge\", \"hinge\", \"hinge\", \"hinge\", 
\"hinge\", \"hippocampal\", \"hippocampal\", \"hippocampal\", \"hippocampal\", \"hippocampal\", \"hippocampal\", \"hippocampal\", \"hippocampal\", \"hippocampal\", \"hippocampus\", \"hippocampus\", \"hippocampus\", \"hippocampus\", \"hippocampus\", \"hippocampus\", \"hippocampus\", \"hkl\", \"hkl\", \"hkl\", \"hkl\", \"hkl\", \"hkl\", \"hkl\", \"hkl\", \"hkl\", \"hlm\", \"hlm\", \"hlm\", \"hlm\", \"hlm\", \"hlm\", \"hlm\", \"hlm\", \"hlm\", \"hme\", \"hme\", \"hme\", \"hme\", \"hme\", \"hme\", \"hme\", \"hme\", \"hme\", \"hogan\", \"hogan\", \"hogan\", \"hogan\", \"hogan\", \"hogan\", \"hogan\", \"hogan\", \"hogan\", \"hollow\", \"hollow\", \"hosking\", \"however\", \"however\", \"however\", \"however\", \"however\", \"however\", \"however\", \"however\", \"however\", \"however\", \"hull\", \"hull\", \"hull\", \"hull\", \"hull\", \"hull\", \"hull\", \"hull\", \"hull\", \"hulls\", \"hurst\", \"hurst\", \"hyperplane\", \"hyperplane\", \"hyperplane\", \"hyperplane\", \"hyperplane\", \"hyperplane\", \"hyperplane\", \"hyperplane\", \"hyperplane\", \"ica\", \"ica\", \"ica\", \"ica\", \"ica\", \"ica\", \"ica\", \"ica\", \"ica\", \"identication\", \"identication\", \"identication\", \"identication\", \"identication\", \"identication\", \"identication\", \"identication\", \"identication\", \"identied\", \"identied\", \"identied\", \"identied\", \"identied\", \"identied\", \"identifiability\", \"identifiability\", \"identifiability\", \"identifiability\", \"identifiability\", \"identifiability\", \"identifiability\", \"identifiability\", \"identifiability\", \"ii\", \"ii\", \"ii\", \"ii\", \"ii\", \"ii\", \"ii\", \"ii\", \"ii\", \"ii\", \"im\", \"im\", \"im\", \"im\", \"im\", \"im\", \"im\", \"im\", \"im\", \"image\", \"image\", \"image\", \"image\", \"image\", \"image\", \"image\", \"image\", \"image\", \"image\", \"inadequate\", \"inadequate\", \"inadequate\", \"ineq\", \"ineq\", \"ineq\", \"ineq\", \"inertia\", \"inertia\", \"inertia\", \"inference\", \"inference\", \"inference\", \"inference\", \"inference\", \"inference\", \"inference\", \"inference\", \"inference\", \"inference\", \"information\", \"information\", \"information\", \"information\", \"information\", \"information\", \"information\", \"information\", \"information\", \"information\", \"inhibitory\", \"inhibitory\", \"inhibitory\", \"inhibitory\", \"inhibitory\", \"inhibitory\", \"inhibitory\", \"inhibitory\", \"inhibitory\", \"inhibitory\", \"input\", \"input\", \"input\", \"input\", \"input\", \"input\", \"input\", \"input\", \"input\", \"input\", \"interfering\", \"interfering\", \"interneuron\", \"intervals\", \"intervals\", \"intervals\", \"intervals\", \"intervals\", \"intervals\", \"intervals\", \"intervals\", \"intervals\", \"intervals\", \"invariance\", \"invariance\", \"invariance\", \"invariance\", \"invariance\", \"invariance\", \"invariance\", \"invariance\", \"invariance\", \"ir\", \"ir\", \"ir\", \"ir\", \"ir\", \"ir\", \"ir\", \"ir\", \"ishikawa\", \"ishikawa\", \"ishikawa\", \"jerk\", \"jerk\", \"jerk\", \"jerk\", \"jerk\", \"jerk\", \"jerk\", \"jerk\", \"joshua\", \"joshua\", \"joshua\", \"judged\", \"karnin\", \"kawato\", \"kawato\", \"kawato\", \"kawato\", \"kawato\", \"kawato\", \"kawato\", \"kawato\", \"kawato\", \"kernel\", \"kernel\", \"kernel\", \"kernel\", \"kernel\", \"kernel\", \"kernel\", \"kernel\", \"kernel\", \"kernel\", \"kernels\", \"kernels\", \"kernels\", \"kernels\", \"kernels\", \"kernels\", \"kernels\", \"kernels\", \"kernels\", \"kernels\", \"kgap\", \"kgap\", \"kgap\", \"kgap\", \"kgap\", 
\"kl\", \"kl\", \"kl\", \"kl\", \"kl\", \"kl\", \"kl\", \"kl\", \"kl\", \"kl\", \"known\", \"known\", \"known\", \"known\", \"known\", \"known\", \"known\", \"known\", \"known\", \"known\", \"komodakis\", \"komodakis\", \"komodakis\", \"kotz\", \"kotz\", \"labelling\", \"labelling\", \"labelling\", \"labelling\", \"labelling\", \"labelling\", \"labelling\", \"labelling\", \"labelling\", \"large\", \"large\", \"large\", \"large\", \"large\", \"large\", \"large\", \"large\", \"large\", \"large\", \"lasso\", \"lasso\", \"lasso\", \"lasso\", \"lasso\", \"lasso\", \"lasso\", \"lasso\", \"lasso\", \"lasso\", \"layer\", \"layer\", \"layer\", \"layer\", \"layer\", \"layer\", \"layer\", \"layer\", \"layer\", \"layer\", \"learner\", \"learner\", \"learner\", \"learner\", \"learner\", \"learner\", \"learner\", \"learner\", \"learner\", \"learning\", \"learning\", \"learning\", \"learning\", \"learning\", \"learning\", \"learning\", \"learning\", \"learning\", \"learning\", \"let\", \"let\", \"let\", \"let\", \"let\", \"let\", \"let\", \"let\", \"let\", \"let\", \"lgp\", \"lgp\", \"lgp\", \"lgp\", \"lgp\", \"lgp\", \"lgp\", \"lgp\", \"lgp\", \"lifted\", \"lifted\", \"lifted\", \"lifted\", \"lifted\", \"lifted\", \"lifted\", \"lifted\", \"lifted\", \"likelihood\", \"likelihood\", \"likelihood\", \"likelihood\", \"likelihood\", \"likelihood\", \"likelihood\", \"likelihood\", \"likelihood\", \"likelihood\", \"linear\", \"linear\", \"linear\", \"linear\", \"linear\", \"linear\", \"linear\", \"linear\", \"linear\", \"linear\", \"linsker\", \"linsker\", \"linsker\", \"linsker\", \"linsker\", \"linsker\", \"linsker\", \"linsker\", \"linsker\", \"lmd\", \"lmd\", \"lmd\", \"lmd\", \"lmica\", \"lmica\", \"lmica\", \"lmica\", \"lmica\", \"lmica\", \"lmica\", \"lmica\", \"lmica\", \"ln\", \"ln\", \"ln\", \"ln\", \"ln\", \"ln\", \"ln\", \"ln\", \"ln\", \"ln\", \"lobs\", \"lobs\", \"lobs\", \"lobs\", \"local\", \"local\", \"local\", \"local\", \"local\", \"local\", \"local\", \"local\", \"local\", \"local\", \"log\", \"log\", \"log\", \"log\", \"log\", \"log\", \"log\", \"log\", \"log\", \"log\", \"logprobability\", \"logprobability\", \"logprobability\", \"logprobability\", \"loo\", \"loo\", \"loo\", \"loo\", \"loo\", \"loo\", \"loo\", \"loo\", \"loo\", \"loss\", \"loss\", \"loss\", \"loss\", \"loss\", \"loss\", \"loss\", \"loss\", \"loss\", \"loss\", \"losses\", \"losses\", \"losses\", \"losses\", \"losses\", \"losses\", \"losses\", \"losses\", \"losses\", \"losses\", \"lrd\", \"lrd\", \"lrd\", \"lrd\", \"ltp\", \"ltp\", \"ltp\", \"lwpr\", \"lwpr\", \"lwpr\", \"lwpr\", \"lwpr\", \"lwpr\", \"lwpr\", \"lwpr\", \"lwpr\", \"macaque\", \"macaque\", \"macaque\", \"macaque\", \"macaque\", \"macaque\", \"macaque\", \"macaque\", \"mallard\", \"mallard\", \"mallard\", \"mallard\", \"map\", \"map\", \"map\", \"map\", \"map\", \"map\", \"map\", \"map\", \"map\", \"map\", \"marginal\", \"marginal\", \"marginal\", \"marginal\", \"marginal\", \"marginal\", \"marginal\", \"marginal\", \"marginal\", \"marginal\", \"matrix\", \"matrix\", \"matrix\", \"matrix\", \"matrix\", \"matrix\", \"matrix\", \"matrix\", \"matrix\", \"matrix\", \"matsuda\", \"matsuda\", \"matsuda\", \"matsuda\", \"matsuda\", \"matsuda\", \"matsuda\", \"maxkurt\", \"maxkurt\", \"maxkurt\", \"maxkurt\", \"maxkurt\", \"maxkurt\", \"maxkurt\", \"maxkurt\", \"maxkurt\", \"mca\", \"mca\", \"mca\", \"mca\", \"mca\", \"mca\", \"mca\", \"mca\", \"mcu\", \"mcu\", \"mcu\", \"mcu\", \"mcu\", \"mcu\", \"mcu\", \"mcu\", \"mcu\", \"mean\", \"mean\", \"mean\", \"mean\", 
\"mean\", \"mean\", \"mean\", \"mean\", \"mean\", \"mean\", \"mental\", \"mental\", \"mental\", \"mental\", \"mental\", \"mental\", \"mental\", \"mental\", \"mental\", \"method\", \"method\", \"method\", \"method\", \"method\", \"method\", \"method\", \"method\", \"method\", \"method\", \"methods\", \"methods\", \"methods\", \"methods\", \"methods\", \"methods\", \"methods\", \"methods\", \"methods\", \"methods\", \"mincut\", \"mincut\", \"mincut\", \"mincut\", \"mincut\", \"mincut\", \"mincut\", \"mincut\", \"mincut\", \"minor\", \"minor\", \"minor\", \"minor\", \"minor\", \"minor\", \"minor\", \"minor\", \"mittal\", \"mittal\", \"mln\", \"mln\", \"mln\", \"mln\", \"mln\", \"mln\", \"mln\", \"mln\", \"mln\", \"mlns\", \"mlns\", \"mlns\", \"mlp\", \"mlp\", \"mlp\", \"mlp\", \"mlp\", \"mlp\", \"mlp\", \"mlp\", \"mlp\", \"mlp\", \"model\", \"model\", \"model\", \"model\", \"model\", \"model\", \"model\", \"model\", \"model\", \"model\", \"modeling\", \"modeling\", \"modeling\", \"modeling\", \"modeling\", \"modeling\", \"modeling\", \"modeling\", \"modeling\", \"modeling\", \"models\", \"models\", \"models\", \"models\", \"models\", \"models\", \"models\", \"models\", \"models\", \"models\", \"modulation\", \"modulation\", \"modulation\", \"modulation\", \"modulation\", \"modulation\", \"modulation\", \"modulation\", \"modulation\", \"modulation\", \"morasso\", \"morasso\", \"morasso\", \"morris\", \"morris\", \"morris\", \"mother\", \"mother\", \"mother\", \"motion\", \"motion\", \"motion\", \"motion\", \"motion\", \"motion\", \"motion\", \"motion\", \"motion\", \"motion\", \"motor\", \"motor\", \"motor\", \"motor\", \"motor\", \"motor\", \"motor\", \"motor\", \"motor\", \"movements\", \"movements\", \"movements\", \"movements\", \"movements\", \"movements\", \"movements\", \"movements\", \"movements\", \"mpeg\", \"mpeg\", \"mpeg\", \"mpeg\", \"mpeg\", \"mrf\", \"mrf\", \"mrf\", \"mrf\", \"mrf\", \"mrf\", \"mrf\", \"mrf\", \"mrf\", \"mtll\", \"mtll\", \"mtll\", \"mtll\", \"multiset\", \"multiset\", \"multiset\", \"multiset\", \"multiset\", \"multiset\", \"multiset\", \"murata\", \"murata\", \"murata\", \"murata\", \"murata\", \"murata\", \"muscle\", \"muscle\", \"muscle\", \"muscle\", \"muscle\", \"muscle\", \"muscle\", \"muscle\", \"muscle\", \"muscles\", \"muscles\", \"muscles\", \"muscles\", \"muscles\", \"muscles\", \"muscles\", \"muscles\", \"muscles\", \"nec\", \"nec\", \"neg\", \"neg\", \"neg\", \"neg\", \"neg\", \"neg\", \"neg\", \"neighbouring\", \"neighbouring\", \"neighbouring\", \"neighbouring\", \"neighbouring\", \"nervous\", \"nervous\", \"nervous\", \"nervous\", \"nervous\", \"nervous\", \"nervous\", \"nervous\", \"network\", \"network\", \"network\", \"network\", \"network\", \"network\", \"network\", \"network\", \"network\", \"network\", \"networks\", \"networks\", \"networks\", \"networks\", \"networks\", \"networks\", \"networks\", \"networks\", \"networks\", \"networks\", \"neural\", \"neural\", \"neural\", \"neural\", \"neural\", \"neural\", \"neural\", \"neural\", \"neural\", \"neural\", \"neuron\", \"neuron\", \"neuron\", \"neuron\", \"neuron\", \"neuron\", \"neuron\", \"neuron\", \"neuron\", \"neuron\", \"neurons\", \"neurons\", \"neurons\", \"neurons\", \"neurons\", \"neurons\", \"neurons\", \"neurons\", \"neurons\", \"neurons\", \"new\", \"new\", \"new\", \"new\", \"new\", \"new\", \"new\", \"new\", \"new\", \"new\", \"ngm\", \"ngm\", \"ngm\", \"ngm\", \"noise\", \"noise\", \"noise\", \"noise\", \"noise\", \"noise\", \"noise\", \"noise\", \"noise\", \"noise\", 
\"non\", \"non\", \"non\", \"non\", \"non\", \"non\", \"non\", \"non\", \"non\", \"non\", \"notepad\", \"notepad\", \"notepad\", \"notepad\", \"notepad\", \"notepad\", \"notepad\", \"notepads\", \"notepads\", \"notepads\", \"notepads\", \"notepads\", \"notepads\", \"notepads\", \"notepads\", \"nrm\", \"nrm\", \"nrm\", \"nrms\", \"nrms\", \"nrms\", \"nrms\", \"nrms\", \"nrms\", \"nrms\", \"nrms\", \"number\", \"number\", \"number\", \"number\", \"number\", \"number\", \"number\", \"number\", \"number\", \"number\", \"object\", \"object\", \"object\", \"object\", \"object\", \"object\", \"object\", \"object\", \"object\", \"object\", \"objects\", \"objects\", \"objects\", \"objects\", \"objects\", \"objects\", \"objects\", \"objects\", \"objects\", \"objects\", \"oja\", \"oja\", \"oja\", \"oja\", \"oja\", \"oja\", \"oja\", \"oja\", \"oja\", \"one\", \"one\", \"one\", \"one\", \"one\", \"one\", \"one\", \"one\", \"one\", \"one\", \"optimal\", \"optimal\", \"optimal\", \"optimal\", \"optimal\", \"optimal\", \"optimal\", \"optimal\", \"optimal\", \"optimal\", \"order\", \"order\", \"order\", \"order\", \"order\", \"order\", \"order\", \"order\", \"order\", \"order\", \"orlitsky\", \"orlitsky\", \"orlitsky\", \"orlitsky\", \"orlitsky\", \"orlitsky\", \"osi\", \"osi\", \"osi\", \"osi\", \"osi\", \"osi\", \"osi\", \"osi\", \"osi\", \"outbreaks\", \"outbreaks\", \"outbreaks\", \"outbreaks\", \"outbreaks\", \"outbreaks\", \"outbreaks\", \"outbreaks\", \"outbreaks\", \"pab\", \"pab\", \"pab\", \"pab\", \"pab\", \"participants\", \"participants\", \"participants\", \"participants\", \"participants\", \"participants\", \"participants\", \"participants\", \"participants\", \"patch\", \"patch\", \"patch\", \"patch\", \"patch\", \"patch\", \"patch\", \"patch\", \"patch\", \"patterns\", \"patterns\", \"patterns\", \"patterns\", \"patterns\", \"patterns\", \"patterns\", \"patterns\", \"patterns\", \"patterns\", \"perforant\", \"perforant\", \"perforant\", \"performance\", \"performance\", \"performance\", \"performance\", \"performance\", \"performance\", \"performance\", \"performance\", \"performance\", \"performance\", \"permuted\", \"phase\", \"phase\", \"phase\", \"phase\", \"phase\", \"phase\", \"phase\", \"phase\", \"phase\", \"photographs\", \"photographs\", \"photographs\", \"photographs\", \"plain\", \"plain\", \"plain\", \"plain\", \"plain\", \"plain\", \"plasticity\", \"plasticity\", \"plasticity\", \"plasticity\", \"plasticity\", \"plasticity\", \"plasticity\", \"plasticity\", \"plasticity\", \"player\", \"player\", \"player\", \"player\", \"player\", \"player\", \"player\", \"player\", \"player\", \"player\", \"players\", \"players\", \"players\", \"players\", \"players\", \"players\", \"players\", \"players\", \"players\", \"players\", \"pmca\", \"pmca\", \"pmca\", \"pmca\", \"pmca\", \"pmca\", \"pmca\", \"pmca\", \"point\", \"point\", \"point\", \"point\", \"point\", \"point\", \"point\", \"point\", \"point\", \"point\", \"points\", \"points\", \"points\", \"points\", \"points\", \"points\", \"points\", \"points\", \"points\", \"points\", \"pol\", \"pol\", \"pol\", \"pol\", \"pol\", \"pol\", \"pol\", \"pol\", \"pol\", \"polysensory\", \"postsynaptic\", \"postsynaptic\", \"postsynaptic\", \"postsynaptic\", \"postsynaptic\", \"postsynaptic\", \"postsynaptic\", \"postsynaptic\", \"postsynaptic\", \"postsynaptic\", \"potassium\", \"potassium\", \"potassium\", \"ppr\", \"ppr\", \"ppr\", \"predicate\", \"predicate\", \"predicate\", \"predicate\", \"predicate\", \"predicate\", \"predicate\", 
\"predicate\", \"predicate\", \"predictive\", \"predictive\", \"predictive\", \"predictive\", \"predictive\", \"predictive\", \"predictive\", \"predictive\", \"predictive\", \"preparations\", \"preparations\", \"preparations\", \"primitives\", \"primitives\", \"primitives\", \"primitives\", \"primitives\", \"primitives\", \"primitives\", \"primitives\", \"primitives\", \"primitives\", \"probabilities\", \"probabilities\", \"probabilities\", \"probabilities\", \"probabilities\", \"probabilities\", \"probabilities\", \"probabilities\", \"probabilities\", \"probabilities\", \"probability\", \"probability\", \"probability\", \"probability\", \"probability\", \"probability\", \"probability\", \"probability\", \"probability\", \"probability\", \"problem\", \"problem\", \"problem\", \"problem\", \"problem\", \"problem\", \"problem\", \"problem\", \"problem\", \"problem\", \"problems\", \"problems\", \"problems\", \"problems\", \"problems\", \"problems\", \"problems\", \"problems\", \"problems\", \"problems\", \"procedure\", \"procedure\", \"procedure\", \"procedure\", \"procedure\", \"procedure\", \"procedure\", \"procedure\", \"procedure\", \"procedure\", \"pseudo\", \"pseudo\", \"pseudo\", \"pseudo\", \"pseudo\", \"pseudo\", \"pseudo\", \"pseudo\", \"pseudo\", \"ptp\", \"ptp\", \"ptp\", \"ptp\", \"ptp\", \"ptp\", \"ptp\", \"pulse\", \"pulse\", \"pulse\", \"pulse\", \"pulse\", \"pulse\", \"pulse\", \"pulse\", \"pulse\", \"pumadyn\", \"pumadyn\", \"pumadyn\", \"pumadyn\", \"pumadyn\", \"pumadyn\", \"pumadyn\", \"pumadyn\", \"pyramidal\", \"pyramidal\", \"pyramidal\", \"pyramidal\", \"pyramidal\", \"pyramidal\", \"pyramidal\", \"qout\", \"qout\", \"qout\", \"qout\", \"qout\", \"qout\", \"qout\", \"qout\", \"qx\", \"qx\", \"qx\", \"qx\", \"qx\", \"radiatum\", \"radiatum\", \"radiatum\", \"radiatum\", \"radiatum\", \"radiatum\", \"random\", \"random\", \"random\", \"random\", \"random\", \"random\", \"random\", \"random\", \"random\", \"random\", \"ranking\", \"ranking\", \"ranking\", \"ranking\", \"ranking\", \"ranking\", \"ranking\", \"ranking\", \"ranking\", \"ranking\", \"rat\", \"rat\", \"rat\", \"region\", \"region\", \"region\", \"region\", \"region\", \"region\", \"region\", \"region\", \"region\", \"region\", \"regret\", \"regret\", \"regret\", \"regret\", \"regret\", \"regret\", \"regret\", \"regret\", \"regret\", \"regret\", \"replica\", \"replica\", \"replica\", \"replica\", \"replica\", \"replica\", \"replica\", \"replica\", \"replica\", \"replica\", \"results\", \"results\", \"results\", \"results\", \"results\", \"results\", \"results\", \"results\", \"results\", \"results\", \"rgbn\", \"rgbn\", \"rgbn\", \"rgbn\", \"rgbn\", \"rgbn\", \"rgbn\", \"rgm\", \"risk\", \"risk\", \"risk\", \"risk\", \"risk\", \"risk\", \"risk\", \"risk\", \"risk\", \"risk\", \"rnnat\", \"rnnat\", \"rnnat\", \"rnnat\", \"rnnat\", \"rnnat\", \"rnnat\", \"rnnat\", \"rnnat\", \"rnp\", \"rnp\", \"rnp\", \"rnp\", \"rnp\", \"rnp\", \"rnp\", \"rnp\", \"rnp\", \"robust\", \"robust\", \"robust\", \"robust\", \"robust\", \"robust\", \"robust\", \"robust\", \"robust\", \"robust\", \"roi\", \"roi\", \"roi\", \"roi\", \"roi\", \"roi\", \"roi\", \"round\", \"round\", \"round\", \"round\", \"round\", \"round\", \"round\", \"round\", \"round\", \"rounding\", \"rounding\", \"rounding\", \"rounding\", \"rounding\", \"rounding\", \"rounding\", \"rounding\", \"rsa\", \"rsa\", \"rsa\", \"rsa\", \"rsa\", \"rsa\", \"rsa\", \"rsa\", \"rsa\", \"rules\", \"rules\", \"rules\", \"rules\", \"rules\", \"rules\", \"rules\", \"rules\", 
\"rules\", \"rules\", \"sales\", \"sales\", \"sales\", \"sales\", \"sales\", \"sales\", \"sales\", \"sales\", \"salesman\", \"salesman\", \"salesman\", \"salesman\", \"salesman\", \"salesman\", \"salesman\", \"salesman\", \"sample\", \"sample\", \"sample\", \"sample\", \"sample\", \"sample\", \"sample\", \"sample\", \"sample\", \"sample\", \"sampler\", \"sampler\", \"sampler\", \"sampler\", \"sampler\", \"sampler\", \"sampler\", \"sampler\", \"sampler\", \"sauer\", \"sauer\", \"sauer\", \"sauer\", \"sauer\", \"sauer\", \"scan\", \"scan\", \"scan\", \"scan\", \"scan\", \"scan\", \"scan\", \"scan\", \"scan\", \"schnabel\", \"schnell\", \"schnell\", \"schnell\", \"schnell\", \"schnell\", \"schnell\", \"screening\", \"screening\", \"screening\", \"screening\", \"screening\", \"screening\", \"screening\", \"screening\", \"screening\", \"sds\", \"sds\", \"sds\", \"sds\", \"sds\", \"sds\", \"sds\", \"sds\", \"sds\", \"search\", \"search\", \"search\", \"search\", \"search\", \"search\", \"search\", \"search\", \"search\", \"search\", \"second\", \"second\", \"second\", \"second\", \"second\", \"second\", \"second\", \"second\", \"second\", \"second\", \"see\", \"see\", \"see\", \"see\", \"see\", \"see\", \"see\", \"see\", \"see\", \"see\", \"selection\", \"selection\", \"selection\", \"selection\", \"selection\", \"selection\", \"selection\", \"selection\", \"selection\", \"selection\", \"separable\", \"separable\", \"separable\", \"separable\", \"separable\", \"separable\", \"separable\", \"separable\", \"separable\", \"septum\", \"septum\", \"serre\", \"serre\", \"set\", \"set\", \"set\", \"set\", \"set\", \"set\", \"set\", \"set\", \"set\", \"set\", \"setineq\", \"setineq\", \"setineq\", \"setineq\", \"setineq\", \"setineq\", \"setineq\", \"setineq\", \"setineq\", \"sg\", \"sg\", \"sg\", \"sg\", \"sg\", \"sg\", \"sg\", \"sg\", \"sg\", \"sgi\", \"sgi\", \"sgi\", \"sgi\", \"sgi\", \"sgi\", \"sgi\", \"sgi\", \"sgi\", \"show\", \"show\", \"show\", \"show\", \"show\", \"show\", \"show\", \"show\", \"show\", \"show\", \"shown\", \"shown\", \"shown\", \"shown\", \"shown\", \"shown\", \"shown\", \"shown\", \"shown\", \"shown\", \"shows\", \"shows\", \"shows\", \"shows\", \"shows\", \"shows\", \"shows\", \"shows\", \"shows\", \"shows\", \"sigcomm\", \"sigmoid\", \"sigmoid\", \"sigmoid\", \"sigmoid\", \"sigmoid\", \"sigmoid\", \"sigmoid\", \"sigmoid\", \"sigmoid\", \"sigmoidal\", \"sigmoidal\", \"sigmoidal\", \"sigmoidal\", \"sigmoidal\", \"sigmoidal\", \"sigmoidal\", \"sigmoidal\", \"sigmoidal\", \"simple\", \"simple\", \"simple\", \"simple\", \"simple\", \"simple\", \"simple\", \"simple\", \"simple\", \"simple\", \"since\", \"since\", \"since\", \"since\", \"since\", \"since\", \"since\", \"since\", \"since\", \"since\", \"singla\", \"singla\", \"singla\", \"singla\", \"singla\", \"singla\", \"single\", \"single\", \"single\", \"single\", \"single\", \"single\", \"single\", \"single\", \"single\", \"single\", \"size\", \"size\", \"size\", \"size\", \"size\", \"size\", \"size\", \"size\", \"size\", \"size\", \"skeleton\", \"skeleton\", \"skeleton\", \"slice\", \"slice\", \"slice\", \"slice\", \"slice\", \"slice\", \"slice\", \"slice\", \"slice\", \"small\", \"small\", \"small\", \"small\", \"small\", \"small\", \"small\", \"small\", \"small\", \"small\", \"smo\", \"sngp\", \"sngp\", \"sngp\", \"sngp\", \"sngp\", \"sngp\", \"sngp\", \"sngp\", \"sngp\", \"snippet\", \"snippet\", \"snippet\", \"snippet\", \"snippet\", \"snippet\", \"snippet\", \"snippet\", \"snippet\", \"snr\", \"snr\", \"snr\", \"snr\", 
\"snr\", \"snr\", \"snr\", \"snr\", \"snr\", \"sollich\", \"sollich\", \"sollich\", \"sollich\", \"sollich\", \"solution\", \"solution\", \"solution\", \"solution\", \"solution\", \"solution\", \"solution\", \"solution\", \"solution\", \"solution\", \"sort\", \"sort\", \"sort\", \"sort\", \"sort\", \"sort\", \"sort\", \"sort\", \"sort\", \"sort\", \"sorting\", \"sorting\", \"sorting\", \"sorting\", \"sorting\", \"space\", \"space\", \"space\", \"space\", \"space\", \"space\", \"space\", \"space\", \"space\", \"space\", \"spambase\", \"spambase\", \"spambase\", \"spike\", \"spike\", \"spike\", \"spike\", \"spike\", \"spike\", \"spike\", \"spike\", \"spike\", \"spike\", \"spikes\", \"spikes\", \"spikes\", \"spikes\", \"spikes\", \"spikes\", \"spikes\", \"spikes\", \"spikes\", \"spikes\", \"srd\", \"srd\", \"srd\", \"srd\", \"st\", \"st\", \"st\", \"st\", \"st\", \"st\", \"st\", \"st\", \"st\", \"st\", \"starting\", \"starting\", \"starting\", \"starting\", \"starting\", \"starting\", \"starting\", \"starting\", \"starting\", \"state\", \"state\", \"state\", \"state\", \"state\", \"state\", \"state\", \"state\", \"state\", \"state\", \"statue\", \"statue\", \"statue\", \"stdp\", \"stdp\", \"stdp\", \"stdp\", \"stdp\", \"stdp\", \"stdp\", \"stdp\", \"stimuli\", \"stimuli\", \"stimuli\", \"stimuli\", \"stimuli\", \"stimuli\", \"stimuli\", \"stimuli\", \"stimuli\", \"stimuli\", \"stratum\", \"stratum\", \"stratum\", \"stratum\", \"stratum\", \"stratum\", \"stratum\", \"streams\", \"streams\", \"streams\", \"streams\", \"streams\", \"streams\", \"streams\", \"streams\", \"streams\", \"structure\", \"structure\", \"structure\", \"structure\", \"structure\", \"structure\", \"structure\", \"structure\", \"structure\", \"structure\", \"sts\", \"sts\", \"sts\", \"sts\", \"sts\", \"sts\", \"sts\", \"sts\", \"sts\", \"substitutions\", \"substitutions\", \"substitutions\", \"sum\", \"sum\", \"sum\", \"sum\", \"sum\", \"sum\", \"sum\", \"sum\", \"sum\", \"sum\", \"suppression\", \"suppression\", \"suppression\", \"suppression\", \"suppression\", \"suppression\", \"suppression\", \"suppression\", \"suppression\", \"surrogate\", \"surrogate\", \"surrogate\", \"surrogate\", \"surrogate\", \"surrogate\", \"surrogate\", \"surrogate\", \"surrogate\", \"surrogate\", \"surround\", \"surround\", \"surround\", \"surround\", \"surround\", \"surround\", \"surround\", \"surround\", \"surround\", \"suzuki\", \"suzuki\", \"suzuki\", \"suzuki\", \"suzuki\", \"suzuki\", \"suzuki\", \"suzuki\", \"svms\", \"svms\", \"svms\", \"svms\", \"svms\", \"svms\", \"svms\", \"svms\", \"svms\", \"symbolic\", \"symbolic\", \"symbolic\", \"symbolic\", \"symbolic\", \"symbolic\", \"symbolic\", \"symbolic\", \"symbolic\", \"symbols\", \"symbols\", \"symbols\", \"symbols\", \"symbols\", \"symbols\", \"symbols\", \"symbols\", \"symbols\", \"system\", \"system\", \"system\", \"system\", \"system\", \"system\", \"system\", \"system\", \"system\", \"system\", \"temperature\", \"temperature\", \"temperature\", \"temperature\", \"temperature\", \"temperature\", \"temperature\", \"temperature\", \"temperature\", \"tension\", \"tension\", \"tension\", \"tension\", \"tension\", \"tension\", \"tension\", \"tension\", \"tension\", \"tensions\", \"tensions\", \"tensions\", \"test\", \"test\", \"test\", \"test\", \"test\", \"test\", \"test\", \"test\", \"test\", \"test\", \"theorem\", \"theorem\", \"theorem\", \"theorem\", \"theorem\", \"theorem\", \"theorem\", \"theorem\", \"theorem\", \"theorem\", \"therefore\", \"therefore\", \"therefore\", 
\"therefore\", \"therefore\", \"therefore\", \"therefore\", \"therefore\", \"therefore\", \"therefore\", \"tiger\", \"tiger\", \"tiger\", \"tiger\", \"tiger\", \"tiger\", \"tiger\", \"tiger\", \"tiger\", \"time\", \"time\", \"time\", \"time\", \"time\", \"time\", \"time\", \"time\", \"time\", \"time\", \"token\", \"token\", \"token\", \"token\", \"token\", \"token\", \"token\", \"token\", \"token\", \"tokens\", \"tokens\", \"tokens\", \"tokens\", \"tokens\", \"tokens\", \"tokens\", \"tokens\", \"tokens\", \"topic\", \"topic\", \"topic\", \"topic\", \"topic\", \"topic\", \"topic\", \"topic\", \"topic\", \"topological\", \"topological\", \"topological\", \"topological\", \"topological\", \"topological\", \"topological\", \"topological\", \"topological\", \"torr\", \"torr\", \"torr\", \"torr\", \"tour\", \"tour\", \"tour\", \"tour\", \"tour\", \"tour\", \"tour\", \"tour\", \"tour\", \"tours\", \"tours\", \"tours\", \"tours\", \"tours\", \"tours\", \"tours\", \"tours\", \"tours\", \"tracklets\", \"tracklets\", \"tracklets\", \"tracklets\", \"tracklets\", \"tracklets\", \"tracklets\", \"tracklets\", \"tracklets\", \"traffic\", \"traffic\", \"traffic\", \"traffic\", \"traffic\", \"traffic\", \"traffic\", \"traffic\", \"traffic\", \"training\", \"training\", \"training\", \"training\", \"training\", \"training\", \"training\", \"training\", \"training\", \"training\", \"traveling\", \"tree\", \"tree\", \"tree\", \"tree\", \"tree\", \"tree\", \"tree\", \"tree\", \"tree\", \"tree\", \"truncated\", \"truncated\", \"truncated\", \"truncated\", \"truncated\", \"truncated\", \"truncated\", \"truncated\", \"truncated\", \"trw\", \"trw\", \"trw\", \"trw\", \"trw\", \"trw\", \"trw\", \"trw\", \"trw\", \"tuple\", \"tuple\", \"tuple\", \"tuple\", \"tuple\", \"tuple\", \"tuple\", \"tuple\", \"tuple\", \"tuples\", \"tuples\", \"tuples\", \"tuples\", \"tuples\", \"tuples\", \"tuples\", \"tuples\", \"tuples\", \"turing\", \"turing\", \"turing\", \"turing\", \"turing\", \"turing\", \"turing\", \"turing\", \"turing\", \"two\", \"two\", \"two\", \"two\", \"two\", \"two\", \"two\", \"two\", \"two\", \"two\", \"twonorm\", \"twonorm\", \"twonorm\", \"tx\", \"tx\", \"tx\", \"tx\", \"tx\", \"tx\", \"tx\", \"tx\", \"tx\", \"ugi\", \"ugi\", \"ugi\", \"ugi\", \"ugi\", \"ugi\", \"ugi\", \"ugi\", \"unary\", \"unary\", \"unary\", \"unary\", \"unary\", \"unary\", \"unary\", \"unary\", \"unimodal\", \"unimodal\", \"unimodal\", \"unimodal\", \"unit\", \"unit\", \"unit\", \"unit\", \"unit\", \"unit\", \"unit\", \"unit\", \"unit\", \"unit\", \"units\", \"units\", \"units\", \"units\", \"units\", \"units\", \"units\", \"units\", \"units\", \"units\", \"uno\", \"uno\", \"uno\", \"uno\", \"uno\", \"uno\", \"uno\", \"uno\", \"uno\", \"unrectified\", \"unrectified\", \"unrectified\", \"unrectified\", \"unrectified\", \"unrectified\", \"unrobust\", \"unrobust\", \"unrobust\", \"unrobust\", \"updates\", \"updates\", \"updates\", \"updates\", \"updates\", \"updates\", \"updates\", \"updates\", \"updates\", \"updates\", \"urn\", \"urn\", \"urn\", \"urn\", \"urn\", \"urn\", \"urn\", \"urn\", \"urn\", \"used\", \"used\", \"used\", \"used\", \"used\", \"used\", \"used\", \"used\", \"used\", \"used\", \"usg\", \"usg\", \"usg\", \"usg\", \"usg\", \"usg\", \"usg\", \"using\", \"using\", \"using\", \"using\", \"using\", \"using\", \"using\", \"using\", \"using\", \"using\", \"value\", \"value\", \"value\", \"value\", \"value\", \"value\", \"value\", \"value\", \"value\", \"value\", \"varsample\", \"varsample\", \"varsample\", \"varsample\", 
\"varsample\", \"varsample\", \"vdp\", \"vdp\", \"vdp\", \"vdp\", \"vdp\", \"vdp\", \"vdp\", \"vdp\", \"vdp\", \"vector\", \"vector\", \"vector\", \"vector\", \"vector\", \"vector\", \"vector\", \"vector\", \"vector\", \"vector\", \"veksler\", \"veksler\", \"veksler\", \"veksler\", \"velocity\", \"velocity\", \"velocity\", \"velocity\", \"velocity\", \"velocity\", \"velocity\", \"velocity\", \"velocity\", \"ventral\", \"ventral\", \"ventral\", \"ventral\", \"ventral\", \"ventral\", \"ventral\", \"ventral\", \"ventral\", \"video\", \"video\", \"video\", \"video\", \"video\", \"video\", \"video\", \"video\", \"video\", \"virtual\", \"virtual\", \"virtual\", \"virtual\", \"virtual\", \"virtual\", \"virtual\", \"virtual\", \"virtual\", \"vl\", \"vl\", \"vl\", \"vl\", \"vl\", \"vl\", \"vl\", \"vl\", \"vl\", \"vly\", \"vly\", \"vo\", \"vo\", \"vo\", \"vo\", \"vo\", \"vo\", \"vo\", \"vo\", \"vo\", \"voxel\", \"voxel\", \"voxel\", \"voxel\", \"voxel\", \"voxel\", \"voxel\", \"voxel\", \"voxel\", \"vps\", \"vps\", \"vps\", \"vps\", \"vps\", \"vps\", \"vps\", \"vref\", \"vref\", \"wab\", \"wab\", \"wab\", \"wab\", \"wab\", \"wab\", \"wab\", \"wab\", \"wars\", \"wars\", \"wars\", \"waveforms\", \"waveforms\", \"waveforms\", \"waveforms\", \"waveforms\", \"waveforms\", \"waveforms\", \"waveforms\", \"waveforms\", \"wavelet\", \"wavelet\", \"wavelet\", \"wavelet\", \"wavelet\", \"wavelet\", \"wavelet\", \"wavelet\", \"wavelet\", \"webkb\", \"webkb\", \"webkb\", \"webkb\", \"webkb\", \"webkb\", \"webkb\", \"week\", \"week\", \"week\", \"week\", \"whitening\", \"whitening\", \"whitening\", \"whitening\", \"whitening\", \"winner\", \"winner\", \"winner\", \"winner\", \"winner\", \"winner\", \"winner\", \"winner\", \"winner\", \"word\", \"word\", \"word\", \"word\", \"word\", \"word\", \"word\", \"word\", \"word\", \"word\", \"world\", \"world\", \"world\", \"world\", \"world\", \"world\", \"world\", \"world\", \"world\", \"world\", \"wt\", \"wt\", \"wt\", \"wt\", \"wt\", \"wt\", \"wt\", \"wt\", \"wt\", \"wt\", \"wv\", \"wv\", \"wv\", \"wv\", \"wv\", \"wv\", \"wv\", \"xca\", \"xca\", \"xca\", \"xca\", \"xca\", \"xca\", \"xca\", \"xca\", \"xca\", \"xca\", \"xg\", \"xg\", \"xg\", \"xg\", \"xg\", \"xg\", \"xg\", \"xg\", \"xg\", \"xi\", \"xi\", \"xi\", \"xi\", \"xi\", \"xi\", \"xi\", \"xi\", \"xi\", \"xi\", \"xj\", \"xj\", \"xj\", \"xj\", \"xj\", \"xj\", \"xj\", \"xj\", \"xj\", \"xj\", \"xt\", \"xt\", \"xt\", \"xt\", \"xt\", \"xt\", \"xt\", \"xt\", \"xt\", \"xt\", \"xtj\", \"xtj\", \"xtj\", \"xu\", \"xu\", \"xu\", \"xu\", \"xu\", \"xu\", \"xu\", \"xu\", \"xu\", \"xu\", \"ydxi\", \"ygi\", \"ygi\", \"ygi\", \"ygi\", \"ygi\", \"ygi\", \"ygi\", \"yi\", \"yi\", \"yi\", \"yi\", \"yi\", \"yi\", \"yi\", \"yi\", \"yi\", \"yi\", \"yj\", \"yj\", \"yj\", \"yj\", \"yj\", \"yj\", \"yj\", \"yj\", \"yj\", \"yj\", \"yoshitatsu\", \"yoshitatsu\", \"yoshitatsu\", \"yoshitatsu\", \"yoshitatsu\", \"zcn\", \"zn\", \"zn\", \"zn\", \"zn\", \"zn\", \"zn\", \"zn\", \"zn\", \"zn\", \"zt\", \"zt\", \"zt\", \"zt\", \"zt\", \"zt\", \"zt\", \"zt\", \"zt\", \"zt\"]}, \"R\": 30, \"lambda.step\": 0.01, \"plot.opts\": {\"xlab\": \"PC1\", \"ylab\": \"PC2\"}, \"topic.order\": [9, 4, 1, 7, 6, 10, 2, 8, 5, 3]};\n\nfunction LDAvis_load_lib(url, callback){\n  var s = document.createElement('script');\n  s.src = url;\n  s.async = true;\n  s.onreadystatechange = s.onload = callback;\n  s.onerror = function(){console.warn(\"failed to load library \" + url);};\n  document.getElementsByTagName(\"head\")[0].appendChild(s);\n}\n\nif(typeof(LDAvis) !== 
\"undefined\"){\n   // already loaded: just create the visualization\n   !function(LDAvis){\n       new LDAvis(\"#\" + \"ldavis_el86201113124030561323853923\", ldavis_el86201113124030561323853923_data);\n   }(LDAvis);\n}else if(typeof define === \"function\" && define.amd){\n   // require.js is available: use it to load d3/LDAvis\n   require.config({paths: {d3: \"https://d3js.org/d3.v5\"}});\n   require([\"d3\"], function(d3){\n      window.d3 = d3;\n      LDAvis_load_lib(\"https://cdn.jsdelivr.net/gh/bmabey/pyLDAvis@3.3.1/pyLDAvis/js/ldavis.v3.0.0.js\", function(){\n        new LDAvis(\"#\" + \"ldavis_el86201113124030561323853923\", ldavis_el86201113124030561323853923_data);\n      });\n    });\n}else{\n    // require.js not available: dynamically load d3 & LDAvis\n    LDAvis_load_lib(\"https://d3js.org/d3.v5.js\", function(){\n         LDAvis_load_lib(\"https://cdn.jsdelivr.net/gh/bmabey/pyLDAvis@3.3.1/pyLDAvis/js/ldavis.v3.0.0.js\", function(){\n                 new LDAvis(\"#\" + \"ldavis_el86201113124030561323853923\", ldavis_el86201113124030561323853923_data);\n            })\n         });\n}\n</script>"
  },
  {
    "path": "natural-language-processing/topic-modeling/results/ldavis_tuned_8.html",
    "content": "\n<link rel=\"stylesheet\" type=\"text/css\" href=\"https://cdn.jsdelivr.net/gh/bmabey/pyLDAvis@3.3.1/pyLDAvis/js/ldavis.v1.0.0.css\">\n\n\n<div id=\"ldavis_el84897113971359365306739010\"></div>\n<script type=\"text/javascript\">\n\nvar ldavis_el84897113971359365306739010_data = {\"mdsDat\": {\"x\": [0.11984545581139927, 0.04682497714931578, 0.025769644407420713, 0.014115843466547522, 0.007824420420313066, -0.015698310483255363, -0.07523205394554648, -0.1234499768261943], \"y\": [0.006314845027184323, -0.023082131902803334, 0.04034576558352655, -0.04608242934410893, -0.020886249492120916, 0.043061054566658756, 0.016899374954880645, -0.01657022939321701], \"topics\": [1, 2, 3, 4, 5, 6, 7, 8], \"cluster\": [1, 1, 1, 1, 1, 1, 1, 1], \"Freq\": [41.517365237605745, 16.267799607218823, 13.168878232197931, 10.078000121848616, 7.9102613515596305, 7.6062171858665515, 2.2767851645791137, 1.17469309912358]}, \"tinfo\": {\"Term\": [\"model\", \"class\", \"image\", \"layer\", \"neuron\", \"domain\", \"learn\", \"object\", \"strategy\", \"input\", \"feature\", \"training\", \"distribution\", \"time\", \"number\", \"rule\", \"datum\", \"prior\", \"use\", \"sample\", \"representation\", \"network\", \"show\", \"human\", \"cluster\", \"state\", \"information\", \"figure\", \"cell\", \"gp\", \"swapout\", \"online\", \"quantum\", \"entropy_rate\", \"loss\", \"transition_probabilitie\", \"batch\", \"validation\", \"pauc\", \"resnet\", \"estimator\", \"cost\", \"regret_bound\", \"ekf\", \"motor_program\", \"non_decomposable\", \"gradient\", \"reservoir\", \"least_square\", \"gradient_descent\", \"asynchronous\", \"forward_operation\", \"ranknet\", \"surrogate\", \"intersection\", \"document\", \"sdp\", \"rank\", \"bind\", \"prec\", \"stochastic\", \"graph\", \"theorem\", \"target\", \"convergence\", \"state\", \"bound\", \"node\", \"problem\", \"let\", \"proof\", \"algorithm\", \"error\", \"method\", \"step\", \"vector\", \"function\", \"output\", \"hide\", \"follow\", \"matrix\", \"training\", \"point\", \"time\", \"network\", \"set\", \"case\", \"use\", \"give\", \"estimate\", \"datum\", \"result\", \"learning\", \"sample\", \"also\", \"learn\", \"value\", \"model\", \"show\", \"number\", \"gp\", \"bump\", \"gpdm\", \"click\", \"gls\", \"physics_engine\", \"bdc_lstm\", \"biomedical\", \"bayesian_hebb\", \"tracking\", \"physical_propertie\", \"token\", \"cann\", \"extrapolation\", \"human_click\", \"fcn\", \"segmentation\", \"synaptic_weight\", \"glm\", \"travelling_wave\", \"pmatch\", \"video\", \"head_direction\", \"latent_coordinate\", \"motion\", \"gp_ucb\", \"track\", \"collision\", \"ext\", \"genetic\", \"rule\", \"human\", \"synaptic_plasticity\", \"gaussian_processe\", \"subject\", \"dynamical\", \"reward\", \"physical\", \"model\", \"dynamic\", \"kernel\", \"prediction\", \"choice\", \"object\", \"location\", \"move\", \"evidence\", \"learning\", \"search\", \"function\", \"gaussian\", \"system\", \"learn\", \"time\", \"weight\", \"network\", \"fig\", \"show\", \"neural\", \"state\", \"use\", \"base\", \"value\", \"mean\", \"process\", \"result\", \"figure\", \"parameter\", \"approach\", \"space\", \"set\", \"give\", \"datum\", \"cogan\", \"quantication\", \"shuffle_ideal\", \"redundancy\", \"blackboard\", \"logic\", \"sound\", \"site\", \"intuitionistic_modal\", \"coordinator\", \"cooperative_competitive\", \"assembly\", \"weight_share\", \"intuitionistic\", \"binaural\", \"tag\", \"pe\", \"domain_adaptation\", \"atom\", \"oq_complexity\", \"saccade\", \"spectral_sparsifi\", 
\"elevation\", \"domain\", \"node_perturbation\", \"density_ratio\", \"statistical_querie\", \"relative_density\", \"categorization\", \"auditory\", \"activate\", \"discriminative\", \"neuron\", \"string\", \"communication\", \"concept\", \"chip\", \"language\", \"object\", \"group\", \"feature\", \"edge\", \"image\", \"learn\", \"activity\", \"network\", \"distribution\", \"connection\", \"transformation\", \"information\", \"model\", \"field\", \"unit\", \"set\", \"use\", \"neural\", \"show\", \"figure\", \"layer\", \"representation\", \"result\", \"learning\", \"input\", \"different\", \"example\", \"value\", \"represent\", \"mean\", \"number\", \"give\", \"problem\", \"investor\", \"sta_stc\", \"asset\", \"face_recognition\", \"factor_loade\", \"teach\", \"black_schole\", \"wlrmf\", \"pedagogical\", \"transistor\", \"safe\", \"price\", \"spline\", \"stimulation\", \"teaching\", \"epsp\", \"principal_component\", \"bstc\", \"hca\", \"receptor\", \"hedge\", \"ltp\", \"visuomotor_map\", \"irl\", \"stc\", \"spike_triggere\", \"ard\", \"finger_position\", \"rvm\", \"device\", \"tile\", \"option\", \"pca\", \"gabor\", \"nonlinearity\", \"basis_function\", \"weak\", \"miss\", \"current\", \"component\", \"factor\", \"show\", \"model\", \"prior\", \"voltage\", \"sparse\", \"use\", \"feature\", \"function\", \"fig\", \"generalization\", \"datum\", \"method\", \"input\", \"well\", \"matrix\", \"space\", \"figure\", \"point\", \"estimate\", \"distribution\", \"base\", \"weight\", \"value\", \"parameter\", \"also\", \"result\", \"number\", \"set\", \"learn\", \"image\", \"bayes_net\", \"contrary_update\", \"belief_divergence\", \"rsb\", \"completeness\", \"rate_distortion\", \"global_coordinate\", \"cpds\", \"normative\", \"belief_polarization\", \"hemodynamic\", \"ise\", \"ptail\", \"student\", \"trimmed_graphical\", \"prior_belief\", \"alice\", \"balloon\", \"free_energy\", \"tail\", \"wavelet\", \"belief_revision\", \"poll\", \"spending\", \"pyramid\", \"fidelity_criterion\", \"structural_primitive\", \"mfas\", \"overcomplete\", \"carol\", \"outlier\", \"discriminator\", \"sparse_code\", \"coordinate\", \"tensor\", \"natural_scene\", \"posterior\", \"prior\", \"model\", \"generative\", \"distribution\", \"moment\", \"representation\", \"belief\", \"datum\", \"natural_image\", \"family\", \"inference\", \"represent\", \"likelihood\", \"code\", \"number\", \"figure\", \"approach\", \"probability\", \"use\", \"image\", \"sample\", \"feature\", \"noise\", \"case\", \"parameter\", \"set\", \"example\", \"learn\", \"give\", \"result\", \"function\", \"provide\", \"estimate\", \"show\", \"method\", \"value\", \"beat\", \"player\", \"rae\", \"base_strategy\", \"graft\", \"abstraction\", \"weibull\", \"pretraine\", \"wei_bull\", \"heartbeat\", \"image_identification\", \"syntactic\", \"extensive_game\", \"grafted_strategy\", \"recursive_autoencoder\", \"entity_resolution\", \"paraphrase\", \"bet\", \"phrase\", \"texture\", \"cardiologist\", \"dynamic_poole\", \"poker\", \"grafted_strategie\", \"patient\", \"spatial_frequency\", \"nash_equilibrium\", \"equilibrium\", \"abstract_game\", \"paraphrase_detection\", \"child\", \"record\", \"game\", \"contrast\", \"sentence\", \"strategy\", \"image\", \"natural_image\", \"cluster\", \"frequency\", \"word\", \"response\", \"similarity\", \"model\", \"use\", \"filter\", \"feature\", \"set\", \"classifier\", \"datum\", \"size\", \"result\", \"parameter\", \"base\", \"show\", \"information\", \"large\", \"figure\", \"well\", \"first\", \"method\", \"training\", 
\"strata\", \"vocabulary\", \"stratum\", \"stratie\", \"sharing\", \"sub_stratum\", \"stratication\", \"square_lattice\", \"horse\", \"cow_horse\", \"optimal_oracle\", \"hyper_cube\", \"allocate\", \"composition\", \"mean_square\", \"self_connection\", \"phase_diagram\", \"connection_topology\", \"osc\", \"lmc_ucb\", \"pseudo_regret\", \"rhythm\", \"contour_fragment\", \"pseudo_risk\", \"sub_strata\", \"detection\", \"stratify\", \"informational_coherence\", \"sequential\", \"oscillate\", \"coordination\", \"hierarchy\", \"class\", \"efcient\", \"oscillation\", \"lattice\", \"shape\", \"share\", \"layer\", \"hypothesis\", \"joint\", \"strategy\", \"object\", \"hierarchical\", \"number\", \"training\", \"representation\", \"time\", \"degree\", \"learn\", \"sample\", \"measure\", \"partition\", \"model\", \"function\", \"order\", \"image\", \"information\", \"also\", \"learning\", \"fig\", \"use\", \"domain\", \"odor\", \"fluid\", \"cochlea\", \"mitral\", \"bulbar\", \"cochlear\", \"olfactory\", \"granule_cell\", \"oscillation\", \"oscillator\", \"bm_velocity\", \"bulb\", \"damp\", \"iin\", \"sniff\", \"freeman_wj\", \"shepherd\", \"bm\", \"tilt\", \"feedbackward\", \"longitudinal\", \"duct\", \"bidirectional_coupling\", \"gy\", \"olfactory_bulb\", \"stape\", \"iin_iin\", \"bms\", \"motile\", \"saturation\", \"segment\", \"silicon\", \"wave\", \"mode\", \"circuit\", \"frequency\", \"cell\", \"hopfield\", \"amplitude\", \"active\", \"response\", \"input\", \"force\", \"chip\", \"receptor\", \"couple\", \"nonlinear\", \"model\", \"equation\", \"current\", \"increase\", \"figure\", \"system\"], \"Freq\": [1825.0, 227.0, 508.0, 271.0, 283.0, 233.0, 887.0, 256.0, 191.0, 490.0, 504.0, 490.0, 576.0, 684.0, 542.0, 284.0, 785.0, 256.0, 1279.0, 518.0, 232.0, 1063.0, 759.0, 208.0, 173.0, 734.0, 318.0, 563.0, 120.0, 101.0, 81.38383066917298, 121.07878553311674, 48.5237518069592, 49.862226736667644, 188.11843335526154, 43.479869790011676, 57.333289417147746, 68.8237394391394, 32.08744977210894, 35.32926479475675, 205.50646742064117, 238.4215792729787, 27.207619637667527, 27.95546838620176, 32.505558819198875, 25.257027767809742, 174.9416848527076, 25.213588010561207, 54.20969570602726, 54.906551496898544, 29.00539644955131, 22.33485568602881, 22.290158537855483, 24.572934052081802, 30.716289319505183, 68.8816723977693, 36.79683481103228, 99.58307668808735, 73.27025893585613, 19.401333880178505, 188.73839792010875, 191.2968873866253, 140.65231555849908, 212.38976182784126, 164.84065623004258, 565.1764648883297, 125.18704473390058, 120.2027028337016, 551.5454617770309, 204.2184917898347, 94.55247696964113, 308.1626075289172, 288.3150636854528, 536.5845641008556, 248.23932303320777, 248.27881573408501, 660.9647888945925, 255.65247489222216, 147.91234022552467, 302.91248683570694, 306.07700330851, 322.65483487504093, 364.95389061855474, 422.41573497546386, 595.4070471162389, 534.7137789215428, 300.05898866461064, 672.7663413395147, 368.7761465131427, 302.0938722144742, 413.6586270248928, 402.9198709768515, 367.0532021710153, 303.3206253040552, 298.97607401293305, 373.9265259994737, 305.4675760338072, 480.817399354863, 321.5277521534137, 273.54410287784617, 95.5659521854219, 43.60141961092034, 25.471001498864638, 49.954186896630034, 21.926672361939183, 18.62263819599713, 17.3601623443239, 15.084859687964299, 16.757594696134, 25.595709790330577, 14.903876012322533, 21.229911666515683, 15.962243219302428, 13.551907571303566, 12.816144091431315, 12.809947363040303, 40.08691353048141, 24.292174541244556, 
16.02386354193646, 13.175091144806464, 12.005907975855374, 35.98281962319121, 11.266697435705144, 10.451848469963553, 28.60265489400564, 9.78319944453226, 36.73535087413383, 9.646807232093906, 9.751078636994803, 10.552575665429165, 201.85384050916565, 147.43970928125938, 20.915512706184437, 25.95007001563578, 58.52098000772863, 35.41383435691431, 82.81691560920603, 33.28874511542621, 530.9893148767928, 91.49806021767921, 66.91308505554017, 74.91419250209749, 63.03931792177123, 95.77108957029886, 55.2112608812778, 35.8626461156038, 44.841691675730026, 160.127999877257, 68.72798397697602, 209.5665537430349, 72.16954144911648, 101.34926927674475, 162.1668692416361, 138.48247370694835, 112.97861324043137, 169.52935976341752, 56.31144039163961, 127.157296815561, 102.41796552091391, 123.22889022003302, 158.27567077335627, 97.7017417126647, 105.6028341017756, 87.1478111634524, 82.7823513268817, 106.87405858202551, 93.0278194231967, 88.2700296540716, 84.73371011820474, 78.31099196962202, 95.42446831625689, 82.34493030688631, 81.68508629123906, 33.49968990700663, 29.69625694233479, 26.69684837453321, 26.87283974165071, 20.566058778629486, 26.938880038730726, 48.845513594540726, 32.952361249145895, 16.55272675805103, 17.11356091365809, 15.110308141621172, 17.781641388480192, 14.305393395830128, 13.088593970668462, 15.131638598652353, 12.295047323479526, 15.684074825952976, 12.848607019449908, 13.778266108987465, 11.695975209529355, 13.721210752593834, 10.905122078476163, 11.924386922609814, 172.41106345665477, 10.232413614902578, 12.243929499005565, 9.529545634889356, 12.213962376575623, 13.731032677439188, 12.297122931846904, 21.075228823121915, 20.48229435894821, 171.57027073889503, 41.82997135616548, 43.94823782893298, 63.0864395494285, 30.997775735547027, 42.0639954974193, 115.34459595788097, 50.56824045274673, 165.2364914194037, 57.37036015659984, 151.58644819403148, 217.679357311851, 46.251649129804825, 220.84632561440353, 145.40274181944633, 49.449574611552535, 34.72192270752242, 89.80703524296871, 259.0871094812199, 45.56980756811403, 63.610256816702886, 133.25139456484382, 149.14271284763527, 89.72956084350224, 108.16266520084298, 92.26389832077103, 67.04987411337551, 60.92967180534841, 97.18118587205883, 90.10708002060082, 78.47712334094601, 74.45911066902217, 70.68093608871816, 72.27076783797855, 59.44446869330252, 63.12225573062037, 63.15600472945987, 62.525407382468394, 59.84241814816064, 21.952348532016078, 18.403550747259047, 18.283029157033127, 16.025796971243864, 15.942713357828772, 14.78299585511848, 15.239440399207764, 13.460746913594395, 14.784995914680557, 15.459243828383233, 12.3094659866833, 34.991599819790494, 17.814056566562794, 12.384043167302133, 13.549391143975736, 11.135669612257152, 37.22785986033829, 10.419081909995858, 10.393666859741158, 17.876641987593416, 10.334792457339148, 9.26712820646689, 9.248564515539563, 10.450857171152423, 8.562779450091107, 8.5629078794934, 15.22999465848176, 8.00490254626575, 8.93887017726593, 27.120386370335606, 22.667651940254437, 22.88978644368576, 41.91574154031915, 19.70891578995442, 17.83084696448774, 30.0372362651964, 30.952945426610267, 26.103797256823984, 46.84009667842087, 49.0241703179932, 38.33252467574746, 111.1035945497035, 181.38580363117651, 55.57976156813775, 26.996305297428545, 30.13272325712086, 120.3128274264503, 72.08332021480555, 98.24500635301143, 38.66032809710599, 32.73059105726769, 79.30944406113204, 78.8664518252311, 64.03443140429656, 59.765601429961485, 59.640344610057696, 51.42662337872394, 
63.149491563281536, 61.96882663395951, 58.370073573240916, 60.09782411242655, 55.898554511598135, 56.59720186229051, 59.28519163698733, 47.80068842308646, 48.384563951772755, 50.97421656906492, 47.35999179558133, 49.091529658730835, 45.75969051128424, 39.50852530711172, 14.636496686313231, 13.497032010482451, 11.804577861938045, 11.773851397560062, 16.782068543025844, 11.207973259115995, 9.457892543995953, 7.289373480406901, 9.015692958777485, 6.723838246774631, 6.694560586859769, 6.112287971128508, 6.096678842869065, 8.366053255285342, 6.815286700131408, 6.728106209616646, 5.595242345968646, 5.569969881739896, 5.570795299595596, 6.651488705702875, 14.654768071856873, 5.0287935430591935, 5.028686411666579, 5.026796428318908, 6.778689519793547, 5.006541424903344, 5.00251134436619, 4.9761983559422385, 5.9617599289616106, 4.466175320101435, 9.447109389729823, 8.364272512215969, 9.623842445373786, 25.404495656381773, 15.86474544933091, 9.066586275149552, 37.05423828803723, 68.204932858125, 237.18403088028447, 30.296363048141956, 89.50818067759454, 14.622699512631247, 44.73830575124913, 26.10803341656484, 95.03434520773754, 14.704789153111111, 18.769031951194712, 33.09949052164041, 41.88967696028881, 27.913059039228052, 18.405515734303513, 52.49301919767935, 52.43138237409197, 46.761884476172725, 39.97835168952327, 73.66658235750266, 47.36923005554727, 47.76248156054619, 45.80549638411555, 34.550592826599974, 42.22279316830551, 42.1789654625204, 51.704272333316496, 37.30746653222544, 47.91933541620302, 42.099168602553355, 43.98149922586247, 42.92120211851141, 29.15145744725317, 33.25757596653517, 36.23156029661494, 35.74063870372197, 31.836031803286097, 38.104263527866905, 45.67326939848294, 16.157612951876306, 14.99404574912154, 14.422753314143899, 27.659153137725582, 13.281256170656999, 17.140103303921435, 12.124957511439273, 12.081117511317636, 11.544314360035452, 11.521906059203591, 11.520567693302898, 10.94318853243507, 10.93908421033961, 12.206179287863565, 10.36515154444878, 11.523949297811983, 19.595428828791974, 22.486031501110606, 9.766017854213311, 9.204883210856513, 9.206461671488483, 9.206187509261552, 26.547449641449823, 9.17299927944774, 7.469794023093865, 18.55499453342444, 6.311491465113919, 6.308067276235265, 8.627025482787833, 40.50799464170027, 40.460154144231836, 47.58708396862424, 31.24726831642002, 68.31674301962778, 138.8376724752114, 20.268662566598344, 51.70618811267592, 28.312016135392543, 32.311475234707274, 32.43007878743422, 31.35430189871818, 117.75680998924882, 96.42783231017627, 32.501227940976804, 59.39521863726617, 73.7976450123365, 21.609332078903112, 56.82388842290858, 35.461634911890535, 50.93478969727332, 42.97324191911462, 42.43700331490328, 46.946926524574536, 35.86734309537263, 36.59621554262882, 38.75434272446043, 36.220944233121045, 33.92700779099551, 33.81274541057191, 30.06641753753678, 9.44543447821773, 8.361211461341437, 9.149401171736686, 5.702289668120833, 5.149711158516078, 4.270428188281113, 3.6940511511804517, 3.1347218992784027, 2.863561256529893, 2.8632288667224914, 2.8326676998507203, 2.8313519800798588, 5.1281808454692195, 4.881506818393516, 6.434171719630667, 2.26663553676318, 2.2665459444776204, 2.2662497315009733, 2.2657357291472335, 2.259262504417405, 2.2573927207017, 3.631582019583252, 1.9947702956062852, 1.9717283251884383, 1.9709182312568125, 10.599209273200879, 2.2480282630817765, 1.9199697206722537, 11.193935975812268, 1.9730339688621434, 4.77317736247824, 5.155779639776997, 45.70969467790693, 4.129716826877772, 
4.432925771319335, 3.9551641040825745, 12.376417207397836, 8.74039514953266, 21.971880159122072, 6.184417077597834, 10.255979042044016, 13.843859207278365, 14.89696238018513, 5.983890118713504, 16.747396104589452, 15.687721859551555, 11.106604620775972, 16.395155171801992, 6.873655143561431, 16.278532723730677, 13.17108087091548, 10.484212597335821, 6.988415334640214, 11.950915521559864, 9.668145846672763, 7.991916916214839, 8.463584608450887, 7.847850578654685, 8.001839079742433, 7.816797746801184, 6.877062078387917, 7.334777653828838, 6.899842313077218, 5.75922448367709, 3.9858681771738778, 3.259369595613262, 3.2323556500032127, 2.512428475094294, 6.536720427849816, 2.5121756805038955, 2.3310600065809313, 5.388547942517828, 3.2255575048933283, 1.8027160556314985, 1.7891767632009423, 1.7918336117974547, 1.4375049134734743, 1.4276136048224004, 1.42757602386248, 1.4272113632813848, 1.256504863637199, 1.6202581150900408, 1.074687553192519, 1.0743155016893062, 1.074275164792325, 1.07419561842716, 1.067184013464927, 1.0666502385641896, 0.8925162927403215, 0.8925221177891092, 0.8923828176976706, 0.8921779388311707, 1.9830856222301745, 3.421970383282063, 1.6208324774274923, 2.798474448290734, 2.8687781521729323, 4.711077481269489, 4.575242855414994, 5.468673555545115, 1.7754241361879042, 2.5673177219157703, 3.630272624690978, 4.199182465937523, 6.835135297115022, 2.6247897828427154, 2.7155956541396726, 2.1488078844299716, 2.419718000748566, 3.1376769355165823, 5.874960311086639, 3.0411078987459508, 2.912157603669425, 2.5089216704542463, 2.6712273223178054, 2.3845968399247353], \"Total\": [1825.0, 227.0, 508.0, 271.0, 283.0, 233.0, 887.0, 256.0, 191.0, 490.0, 504.0, 490.0, 576.0, 684.0, 542.0, 284.0, 785.0, 256.0, 1279.0, 518.0, 232.0, 1063.0, 759.0, 208.0, 173.0, 734.0, 318.0, 563.0, 120.0, 101.0, 86.06516364255972, 128.76516431093188, 52.17211030702518, 53.887168452938006, 203.7727766894985, 47.21574889227339, 62.81213032228597, 75.58816862377101, 35.5435891012685, 39.28656578789786, 230.55199802259008, 267.87940294357713, 30.651744995938277, 31.568670642411657, 36.845907072912865, 28.691123235617564, 198.88980732433558, 28.68911470440214, 61.750834587727894, 62.817318080916586, 33.27364933183521, 25.757153691365833, 25.737001673683576, 28.523319440222767, 35.72670582579321, 80.22284813532417, 42.87521546227007, 116.61164336107312, 86.05411694645288, 22.815125071011558, 224.23215479923576, 228.92636387919256, 168.77502718682297, 258.15216306905006, 199.6392423305378, 734.1734786492065, 154.3748680074759, 147.96769933773982, 746.1681126415724, 263.17148339662737, 115.45145056419435, 420.2034172629995, 392.4087436343339, 793.2453162707446, 347.50248799827506, 349.0628695548875, 1093.2025570833098, 365.027551854018, 194.3407608502001, 449.37695436316756, 459.50354418742825, 490.8927481562456, 572.2880729351133, 684.5437684897005, 1063.0085810642267, 942.8615858603507, 460.8612717197628, 1279.1986293294094, 612.4584051856418, 493.3632159154102, 785.829239633656, 759.942364093312, 686.3566158536219, 518.9706646300303, 535.8659019083696, 887.2602354846034, 600.6529042479251, 1825.0463440462318, 759.4319832841003, 542.3927715569774, 101.67122513953828, 48.874525646855965, 29.327026049124388, 57.74837720518549, 25.605390426971695, 22.47338873538547, 21.015903876938555, 18.72712969934245, 20.95038369323402, 32.041769862325, 18.67501289958084, 26.772668513672073, 20.178300587762088, 17.193655376344502, 16.440372390691664, 16.43670315982459, 51.484679480604576, 31.32762066565518, 20.772546660661988, 
17.105046570710595, 15.66137452349667, 47.78061532642052, 15.104585198068111, 14.118119742930087, 38.80313514690064, 13.386950190711923, 50.39787189445995, 13.347522302092896, 13.563787871447191, 14.80264916113793, 284.63342841622784, 208.90733357560862, 30.10328425215766, 37.84963784179655, 90.40891292903605, 55.537229360446716, 167.2565033208479, 54.660254179704054, 1825.0463440462318, 197.5473334196008, 145.29047537893527, 174.22495005019846, 139.92599574620496, 256.93472000472707, 121.44454887351057, 66.61408615767537, 93.2769869782095, 686.3566158536219, 183.3084562330581, 1093.2025570833098, 205.15651435270215, 385.08620920082245, 887.2602354846034, 684.5437684897005, 489.7944270127256, 1063.0085810642267, 145.59219341251475, 759.4319832841003, 496.1874982082559, 734.1734786492065, 1279.1986293294094, 471.99597956849647, 600.6529042479251, 390.33836838950543, 353.262004749768, 759.942364093312, 563.3049306376946, 477.2259806972987, 463.3374967680325, 327.3896577236677, 942.8615858603507, 612.4584051856418, 785.829239633656, 37.62660849505184, 33.490989947476436, 30.73283174900905, 31.22732459440466, 24.47907348378231, 32.589383234970136, 59.565206870295576, 40.29986809753879, 20.258717297755275, 20.989129206916008, 18.853786411167192, 22.38227223918192, 18.1408677269141, 16.77302378927389, 19.590294737458034, 16.05945643511131, 20.527870090348472, 16.831195351038946, 18.059590372063244, 15.375382890629634, 18.07946184803903, 14.713707345490963, 16.089947876529624, 233.68653056710485, 14.021765171309022, 17.05417221370108, 13.301764118302481, 17.061036456171117, 19.314283704286236, 17.30616611919952, 30.15899587292671, 29.600069904468068, 283.21742144348053, 64.1598013687206, 67.73066426776404, 104.69793023995948, 48.359453773292415, 73.8414472354022, 256.93472000472707, 99.24557926454868, 504.88360448986697, 127.60100818558844, 508.11873230098206, 887.2602354846034, 102.96263246460569, 1063.0085810642267, 576.1969400771171, 117.010876686853, 68.06172919026166, 318.90035519297584, 1825.0463440462318, 114.32701972103493, 230.81760824677355, 942.8615858603507, 1279.1986293294094, 496.1874982082559, 759.4319832841003, 563.3049306376946, 271.32010202917223, 232.03824396047847, 759.942364093312, 686.3566158536219, 490.23709092096806, 428.3014703175286, 410.86709378715204, 600.6529042479251, 279.7103005262711, 390.33836838950543, 542.3927715569774, 612.4584051856418, 746.1681126415724, 26.334918739408952, 22.424330905034346, 22.546581601506084, 19.897701566573275, 19.880169145763762, 18.63988712721076, 19.382705244617945, 17.375174661273533, 19.24670718664522, 20.243911440658454, 16.137065346436636, 46.09918427539324, 23.49613534629984, 16.344455931104847, 18.084574966749233, 14.875297251022655, 50.755148805634434, 14.265304155020017, 14.248726936477716, 24.698840528507926, 14.336474715886617, 12.996369260391106, 12.992477291516787, 14.89710100567347, 12.384444409317627, 12.385692653370674, 22.191653507226263, 11.74011757472342, 13.141688875267594, 40.0502094798381, 35.06126958571149, 36.20023516687962, 70.64596498956494, 31.71555451925086, 28.498833507301732, 54.556173082307446, 59.89664293674649, 51.71926396476849, 155.5409103686371, 173.17971455270586, 119.97645625920907, 759.4319832841003, 1825.0463440462318, 256.9286218757639, 67.38323506573562, 84.0172144848423, 1279.1986293294094, 504.88360448986697, 1093.2025570833098, 145.59219341251475, 101.57155252724601, 785.829239633656, 793.2453162707446, 490.23709092096806, 436.4741335645426, 459.50354418742825, 327.3896577236677, 
563.3049306376946, 572.2880729351133, 493.3632159154102, 576.1969400771171, 471.99597956849647, 489.7944270127256, 600.6529042479251, 477.2259806972987, 535.8659019083696, 759.942364093312, 542.3927715569774, 942.8615858603507, 887.2602354846034, 508.11873230098206, 18.505501521547277, 17.36663369688566, 15.652467436015241, 15.665532308622536, 22.575534706449087, 15.093456342172571, 13.398355796932504, 11.082441963702811, 13.746514633618027, 10.511388177554567, 10.511352467481798, 9.962546002431909, 9.977680764731929, 13.767919723850286, 11.372291728310518, 11.266546099869348, 9.370121718011376, 9.372529666025995, 9.379723986410124, 11.26619763328372, 25.182525711928236, 8.79836858690096, 8.798970259445964, 8.799280016017978, 11.892881308406794, 8.810522823825814, 8.809216208997988, 8.818952440526479, 10.929386687076402, 8.22744087477822, 17.85657012730733, 15.85162212315464, 18.9508345635339, 61.31718586069551, 35.00248112043054, 18.1084143614052, 107.41619523787308, 256.9286218757639, 1825.0463440462318, 112.58385849161265, 576.1969400771171, 38.873906488600916, 232.03824396047847, 99.73715636027923, 785.829239633656, 40.258018179268184, 63.421835545549776, 176.58335947007458, 279.7103005262711, 135.71475633047365, 68.7996401329457, 542.3927715569774, 563.3049306376946, 463.3374967680325, 342.3071573873944, 1279.1986293294094, 508.11873230098206, 518.9706646300303, 504.88360448986697, 267.58435929701295, 460.8612717197628, 477.2259806972987, 942.8615858603507, 410.86709378715204, 887.2602354846034, 612.4584051856418, 759.942364093312, 1093.2025570833098, 269.766378166267, 493.3632159154102, 759.4319832841003, 793.2453162707446, 600.6529042479251, 44.80522118491405, 56.46212586932094, 20.01326875067065, 18.855748648811044, 18.268182169273484, 35.11555332429844, 17.075848640640462, 22.09989819542709, 15.908511480204403, 15.941260677362209, 15.32433256835418, 15.341561950990544, 15.343743203131611, 14.758918255876617, 14.754686513990231, 16.496754189293842, 14.172446897073378, 15.93717255095661, 27.114278089888963, 31.28625127195665, 13.603501228556807, 13.001670262704279, 13.003932944561223, 13.004777471662418, 38.76612284373784, 13.574786281231962, 11.250682985630899, 29.56293037925961, 10.08028082521149, 10.079462877641383, 13.816452323157986, 66.78978635025369, 71.10872913951309, 96.41882512858126, 60.84579913659289, 191.7656223375638, 508.11873230098206, 40.258018179268184, 173.9979305001518, 76.80884651732119, 102.78032896403678, 112.79384357004434, 108.86995065078959, 1825.0463440462318, 1279.1986293294094, 136.2561011691899, 504.88360448986697, 942.8615858603507, 66.794361957875, 785.829239633656, 253.50602911097192, 759.942364093312, 477.2259806972987, 471.99597956849647, 759.4319832841003, 318.90035519297584, 395.02255893393186, 563.3049306376946, 436.4741335645426, 394.7139119990368, 793.2453162707446, 490.8927481562456, 13.80295627019506, 13.11480129774097, 14.761885864919453, 9.934366556164534, 10.089117178620613, 8.438304681939492, 7.845086010368984, 7.237980199937241, 6.902039801258095, 6.903360954063059, 6.948052548960403, 6.9510559811799135, 12.974278046579402, 12.678561369530428, 17.434760774727046, 6.3486915565793405, 6.350015864451559, 6.3503485712040755, 6.352186009740349, 6.3512978556677835, 6.353279813106655, 10.91836359112476, 6.015195187891038, 6.05272437501432, 6.053579784467059, 33.15665651670511, 7.074037738389098, 6.159990483355112, 36.01191925052385, 6.473814165231572, 16.206568510006154, 18.571431112617447, 227.8684049847063, 15.017081967692825, 
17.52082182305697, 15.52244457006588, 94.22779971899406, 55.89820685005704, 271.32010202917223, 40.875719544270694, 106.88222796554754, 191.7656223375638, 256.93472000472707, 40.21924831031474, 542.3927715569774, 490.8927481562456, 232.03824396047847, 684.5437684897005, 65.79054297097501, 887.2602354846034, 518.9706646300303, 299.13614392441514, 82.74362848250821, 1825.0463440462318, 1093.2025570833098, 336.17586377763496, 508.11873230098206, 318.90035519297584, 535.8659019083696, 686.3566158536219, 145.59219341251475, 1279.1986293294094, 233.68653056710485, 10.449360019242757, 8.285087264736378, 7.511688069571701, 7.64472823834174, 6.835078116555772, 17.968473139423423, 7.531037998056053, 7.262438825484704, 17.52082182305697, 10.539341781361982, 5.9724963636775055, 6.038843203677123, 6.753299120864578, 5.5906101198794245, 5.641155680125167, 5.641301898021389, 5.642992080665541, 5.394850885400011, 7.525321447216173, 5.201442594257102, 5.202388553506477, 5.203001238571655, 5.203090605385536, 5.237930211196236, 5.2400964575447375, 5.009385695915859, 5.009681324134846, 5.010236423140635, 5.010321712383831, 12.254337997982049, 25.549067886748226, 11.623623693765762, 26.701657547495813, 28.77047618926905, 71.96192246113206, 76.80884651732119, 120.43736178078221, 14.08670036829836, 30.387567591761023, 73.99483732153047, 112.79384357004434, 490.23709092096806, 40.33957801604758, 48.359453773292415, 24.698840528507926, 42.696403872782106, 124.02387914626271, 1825.0463440462318, 197.85920072762315, 155.5409103686371, 169.00702995179253, 563.3049306376946, 385.08620920082245], \"Category\": [\"Default\", \"Default\", \"Default\", \"Default\", \"Default\", \"Default\", \"Default\", \"Default\", \"Default\", \"Default\", \"Default\", \"Default\", \"Default\", \"Default\", \"Default\", \"Default\", \"Default\", \"Default\", \"Default\", \"Default\", \"Default\", \"Default\", \"Default\", \"Default\", \"Default\", \"Default\", \"Default\", \"Default\", \"Default\", \"Default\", \"Topic1\", \"Topic1\", \"Topic1\", \"Topic1\", \"Topic1\", \"Topic1\", \"Topic1\", \"Topic1\", \"Topic1\", \"Topic1\", \"Topic1\", \"Topic1\", \"Topic1\", \"Topic1\", \"Topic1\", \"Topic1\", \"Topic1\", \"Topic1\", \"Topic1\", \"Topic1\", \"Topic1\", \"Topic1\", \"Topic1\", \"Topic1\", \"Topic1\", \"Topic1\", \"Topic1\", \"Topic1\", \"Topic1\", \"Topic1\", \"Topic1\", \"Topic1\", \"Topic1\", \"Topic1\", \"Topic1\", \"Topic1\", \"Topic1\", \"Topic1\", \"Topic1\", \"Topic1\", \"Topic1\", \"Topic1\", \"Topic1\", \"Topic1\", \"Topic1\", \"Topic1\", \"Topic1\", \"Topic1\", \"Topic1\", \"Topic1\", \"Topic1\", \"Topic1\", \"Topic1\", \"Topic1\", \"Topic1\", \"Topic1\", \"Topic1\", \"Topic1\", \"Topic1\", \"Topic1\", \"Topic1\", \"Topic1\", \"Topic1\", \"Topic1\", \"Topic1\", \"Topic1\", \"Topic1\", \"Topic1\", \"Topic1\", \"Topic1\", \"Topic2\", \"Topic2\", \"Topic2\", \"Topic2\", \"Topic2\", \"Topic2\", \"Topic2\", \"Topic2\", \"Topic2\", \"Topic2\", \"Topic2\", \"Topic2\", \"Topic2\", \"Topic2\", \"Topic2\", \"Topic2\", \"Topic2\", \"Topic2\", \"Topic2\", \"Topic2\", \"Topic2\", \"Topic2\", \"Topic2\", \"Topic2\", \"Topic2\", \"Topic2\", \"Topic2\", \"Topic2\", \"Topic2\", \"Topic2\", \"Topic2\", \"Topic2\", \"Topic2\", \"Topic2\", \"Topic2\", \"Topic2\", \"Topic2\", \"Topic2\", \"Topic2\", \"Topic2\", \"Topic2\", \"Topic2\", \"Topic2\", \"Topic2\", \"Topic2\", \"Topic2\", \"Topic2\", \"Topic2\", \"Topic2\", \"Topic2\", \"Topic2\", \"Topic2\", \"Topic2\", \"Topic2\", \"Topic2\", \"Topic2\", \"Topic2\", \"Topic2\", \"Topic2\", 
\"Topic2\", \"Topic2\", \"Topic2\", \"Topic2\", \"Topic2\", \"Topic2\", \"Topic2\", \"Topic2\", \"Topic2\", \"Topic2\", \"Topic2\", \"Topic2\", \"Topic2\", \"Topic2\", \"Topic3\", \"Topic3\", \"Topic3\", \"Topic3\", \"Topic3\", \"Topic3\", \"Topic3\", \"Topic3\", \"Topic3\", \"Topic3\", \"Topic3\", \"Topic3\", \"Topic3\", \"Topic3\", \"Topic3\", \"Topic3\", \"Topic3\", \"Topic3\", \"Topic3\", \"Topic3\", \"Topic3\", \"Topic3\", \"Topic3\", \"Topic3\", \"Topic3\", \"Topic3\", \"Topic3\", \"Topic3\", \"Topic3\", \"Topic3\", \"Topic3\", \"Topic3\", \"Topic3\", \"Topic3\", \"Topic3\", \"Topic3\", \"Topic3\", \"Topic3\", \"Topic3\", \"Topic3\", \"Topic3\", \"Topic3\", \"Topic3\", \"Topic3\", \"Topic3\", \"Topic3\", \"Topic3\", \"Topic3\", \"Topic3\", \"Topic3\", \"Topic3\", \"Topic3\", \"Topic3\", \"Topic3\", \"Topic3\", \"Topic3\", \"Topic3\", \"Topic3\", \"Topic3\", \"Topic3\", \"Topic3\", \"Topic3\", \"Topic3\", \"Topic3\", \"Topic3\", \"Topic3\", \"Topic3\", \"Topic3\", \"Topic3\", \"Topic3\", \"Topic3\", \"Topic4\", \"Topic4\", \"Topic4\", \"Topic4\", \"Topic4\", \"Topic4\", \"Topic4\", \"Topic4\", \"Topic4\", \"Topic4\", \"Topic4\", \"Topic4\", \"Topic4\", \"Topic4\", \"Topic4\", \"Topic4\", \"Topic4\", \"Topic4\", \"Topic4\", \"Topic4\", \"Topic4\", \"Topic4\", \"Topic4\", \"Topic4\", \"Topic4\", \"Topic4\", \"Topic4\", \"Topic4\", \"Topic4\", \"Topic4\", \"Topic4\", \"Topic4\", \"Topic4\", \"Topic4\", \"Topic4\", \"Topic4\", \"Topic4\", \"Topic4\", \"Topic4\", \"Topic4\", \"Topic4\", \"Topic4\", \"Topic4\", \"Topic4\", \"Topic4\", \"Topic4\", \"Topic4\", \"Topic4\", \"Topic4\", \"Topic4\", \"Topic4\", \"Topic4\", \"Topic4\", \"Topic4\", \"Topic4\", \"Topic4\", \"Topic4\", \"Topic4\", \"Topic4\", \"Topic4\", \"Topic4\", \"Topic4\", \"Topic4\", \"Topic4\", \"Topic4\", \"Topic4\", \"Topic4\", \"Topic4\", \"Topic4\", \"Topic4\", \"Topic4\", \"Topic5\", \"Topic5\", \"Topic5\", \"Topic5\", \"Topic5\", \"Topic5\", \"Topic5\", \"Topic5\", \"Topic5\", \"Topic5\", \"Topic5\", \"Topic5\", \"Topic5\", \"Topic5\", \"Topic5\", \"Topic5\", \"Topic5\", \"Topic5\", \"Topic5\", \"Topic5\", \"Topic5\", \"Topic5\", \"Topic5\", \"Topic5\", \"Topic5\", \"Topic5\", \"Topic5\", \"Topic5\", \"Topic5\", \"Topic5\", \"Topic5\", \"Topic5\", \"Topic5\", \"Topic5\", \"Topic5\", \"Topic5\", \"Topic5\", \"Topic5\", \"Topic5\", \"Topic5\", \"Topic5\", \"Topic5\", \"Topic5\", \"Topic5\", \"Topic5\", \"Topic5\", \"Topic5\", \"Topic5\", \"Topic5\", \"Topic5\", \"Topic5\", \"Topic5\", \"Topic5\", \"Topic5\", \"Topic5\", \"Topic5\", \"Topic5\", \"Topic5\", \"Topic5\", \"Topic5\", \"Topic5\", \"Topic5\", \"Topic5\", \"Topic5\", \"Topic5\", \"Topic5\", \"Topic5\", \"Topic5\", \"Topic5\", \"Topic5\", \"Topic5\", \"Topic5\", \"Topic5\", \"Topic6\", \"Topic6\", \"Topic6\", \"Topic6\", \"Topic6\", \"Topic6\", \"Topic6\", \"Topic6\", \"Topic6\", \"Topic6\", \"Topic6\", \"Topic6\", \"Topic6\", \"Topic6\", \"Topic6\", \"Topic6\", \"Topic6\", \"Topic6\", \"Topic6\", \"Topic6\", \"Topic6\", \"Topic6\", \"Topic6\", \"Topic6\", \"Topic6\", \"Topic6\", \"Topic6\", \"Topic6\", \"Topic6\", \"Topic6\", \"Topic6\", \"Topic6\", \"Topic6\", \"Topic6\", \"Topic6\", \"Topic6\", \"Topic6\", \"Topic6\", \"Topic6\", \"Topic6\", \"Topic6\", \"Topic6\", \"Topic6\", \"Topic6\", \"Topic6\", \"Topic6\", \"Topic6\", \"Topic6\", \"Topic6\", \"Topic6\", \"Topic6\", \"Topic6\", \"Topic6\", \"Topic6\", \"Topic6\", \"Topic6\", \"Topic6\", \"Topic6\", \"Topic6\", \"Topic6\", \"Topic6\", \"Topic6\", \"Topic7\", \"Topic7\", \"Topic7\", \"Topic7\", \"Topic7\", 
\"Topic7\", \"Topic7\", \"Topic7\", \"Topic7\", \"Topic7\", \"Topic7\", \"Topic7\", \"Topic7\", \"Topic7\", \"Topic7\", \"Topic7\", \"Topic7\", \"Topic7\", \"Topic7\", \"Topic7\", \"Topic7\", \"Topic7\", \"Topic7\", \"Topic7\", \"Topic7\", \"Topic7\", \"Topic7\", \"Topic7\", \"Topic7\", \"Topic7\", \"Topic7\", \"Topic7\", \"Topic7\", \"Topic7\", \"Topic7\", \"Topic7\", \"Topic7\", \"Topic7\", \"Topic7\", \"Topic7\", \"Topic7\", \"Topic7\", \"Topic7\", \"Topic7\", \"Topic7\", \"Topic7\", \"Topic7\", \"Topic7\", \"Topic7\", \"Topic7\", \"Topic7\", \"Topic7\", \"Topic7\", \"Topic7\", \"Topic7\", \"Topic7\", \"Topic7\", \"Topic7\", \"Topic7\", \"Topic7\", \"Topic7\", \"Topic7\", \"Topic7\", \"Topic8\", \"Topic8\", \"Topic8\", \"Topic8\", \"Topic8\", \"Topic8\", \"Topic8\", \"Topic8\", \"Topic8\", \"Topic8\", \"Topic8\", \"Topic8\", \"Topic8\", \"Topic8\", \"Topic8\", \"Topic8\", \"Topic8\", \"Topic8\", \"Topic8\", \"Topic8\", \"Topic8\", \"Topic8\", \"Topic8\", \"Topic8\", \"Topic8\", \"Topic8\", \"Topic8\", \"Topic8\", \"Topic8\", \"Topic8\", \"Topic8\", \"Topic8\", \"Topic8\", \"Topic8\", \"Topic8\", \"Topic8\", \"Topic8\", \"Topic8\", \"Topic8\", \"Topic8\", \"Topic8\", \"Topic8\", \"Topic8\", \"Topic8\", \"Topic8\", \"Topic8\", \"Topic8\", \"Topic8\", \"Topic8\", \"Topic8\", \"Topic8\", \"Topic8\", \"Topic8\"], \"logprob\": [30.0, 29.0, 28.0, 27.0, 26.0, 25.0, 24.0, 23.0, 22.0, 21.0, 20.0, 19.0, 18.0, 17.0, 16.0, 15.0, 14.0, 13.0, 12.0, 11.0, 10.0, 9.0, 8.0, 7.0, 6.0, 5.0, 4.0, 3.0, 2.0, 1.0, -6.8401, -6.4429, -7.3573, -7.33, -6.0022, -7.467, -7.1904, -7.0078, -7.7708, -7.6746, -5.9138, -5.7653, -7.9358, -7.9087, -7.7579, -8.0102, -6.0749, -8.0119, -7.2465, -7.2337, -7.8718, -8.1332, -8.1352, -8.0377, -7.8145, -7.0069, -7.6339, -6.6383, -6.9452, -8.274, -5.999, -5.9855, -6.293, -5.8809, -6.1343, -4.9022, -6.4095, -6.4501, -4.9266, -5.9201, -6.6902, -5.5087, -5.5753, -4.9541, -5.7249, -5.7248, -4.7456, -5.6955, -6.2427, -5.5259, -5.5155, -5.4627, -5.3395, -5.1933, -4.8501, -4.9576, -5.5353, -4.7279, -5.3291, -5.5286, -5.2143, -5.2406, -5.3338, -5.5245, -5.5389, -5.3153, -5.5175, -5.0638, -5.4662, -5.6279, -5.7426, -6.5273, -7.0648, -6.3913, -7.2147, -7.378, -7.4482, -7.5887, -7.4835, -7.06, -7.6008, -7.247, -7.5322, -7.6959, -7.7517, -7.7522, -6.6113, -7.1122, -7.5283, -7.7241, -7.817, -6.7193, -7.8805, -7.9556, -6.9489, -8.0217, -6.6986, -8.0358, -8.025, -7.946, -4.9948, -5.309, -7.2619, -7.0462, -6.233, -6.7353, -5.8858, -6.7972, -4.0276, -5.7861, -6.099, -5.986, -6.1586, -5.7404, -6.2912, -6.7227, -6.4993, -5.2264, -6.0722, -4.9573, -6.0234, -5.6838, -5.2138, -5.3716, -5.5752, -5.1694, -6.2715, -5.457, -5.6733, -5.4883, -5.2381, -5.7205, -5.6427, -5.8348, -5.8862, -5.6307, -5.7695, -5.822, -5.8629, -5.9417, -5.7441, -5.8915, -5.8995, -6.5795, -6.7, -6.8065, -6.7999, -7.0674, -6.7975, -6.2024, -6.596, -7.2845, -7.2512, -7.3757, -7.2129, -7.4304, -7.5193, -7.3743, -7.5819, -7.3384, -7.5378, -7.468, -7.6318, -7.4721, -7.7018, -7.6125, -4.9412, -7.7655, -7.586, -7.8367, -7.5885, -7.4714, -7.5817, -7.043, -7.0715, -4.9461, -6.3574, -6.308, -5.9466, -6.6571, -6.3519, -5.3431, -6.1677, -4.9837, -6.0415, -5.0699, -4.708, -6.257, -4.6936, -5.1116, -6.1901, -6.5437, -5.5934, -4.5339, -6.2718, -5.9383, -5.1988, -5.0862, -5.5943, -5.4074, -5.5664, -5.8856, -5.9813, -5.5145, -5.5901, -5.7283, -5.7808, -5.8329, -5.8106, -6.006, -5.946, -5.9455, -5.9555, -5.9993, -6.7347, -6.911, -6.9176, -7.0494, -7.0546, -7.1301, -7.0997, -7.2238, -7.1299, -7.0853, -7.3132, -6.2684, -6.9436, -7.3071, -7.2172, 
-7.4134, -6.2065, -7.4799, -7.4824, -6.9401, -7.488, -7.5971, -7.5991, -7.4769, -7.6761, -7.6761, -7.1003, -7.7435, -7.6331, -6.5233, -6.7026, -6.6929, -6.0879, -6.8425, -6.9426, -6.4211, -6.3911, -6.5615, -5.9768, -5.9312, -6.1773, -5.1131, -4.6229, -5.8057, -6.5279, -6.4179, -5.0335, -5.5457, -5.2361, -6.1687, -6.3352, -5.4502, -5.4558, -5.6641, -5.7331, -5.7352, -5.8834, -5.6781, -5.6969, -5.7568, -5.7276, -5.8, -5.7876, -5.7412, -5.9565, -5.9444, -5.8922, -5.9658, -5.9299, -6.0002, -6.147, -6.8978, -6.9789, -7.1129, -7.1155, -6.7611, -7.1647, -7.3345, -7.5949, -7.3824, -7.6757, -7.6801, -7.7711, -7.7736, -7.4572, -7.6622, -7.6751, -7.8594, -7.864, -7.8638, -7.6865, -6.8966, -7.9662, -7.9662, -7.9666, -7.6676, -7.9706, -7.9714, -7.9767, -7.796, -8.0848, -7.3357, -7.4574, -7.3171, -6.3464, -6.8173, -7.3768, -5.969, -5.3588, -4.1125, -6.1703, -5.087, -6.8988, -5.7805, -6.3191, -5.0271, -6.8932, -6.6492, -6.0818, -5.8463, -6.2523, -6.6687, -5.6207, -5.6219, -5.7363, -5.893, -5.2818, -5.7234, -5.7151, -5.757, -6.0389, -5.8384, -5.8394, -5.6358, -5.9622, -5.7118, -5.8413, -5.7976, -5.822, -6.2089, -6.0771, -5.9914, -6.0051, -6.1208, -5.9018, -5.7207, -6.7598, -6.8345, -6.8734, -6.2222, -6.9558, -6.7007, -7.0469, -7.0505, -7.096, -7.0979, -7.098, -7.1495, -7.1498, -7.0402, -7.2037, -7.0977, -6.5669, -6.4293, -7.2633, -7.3224, -7.3223, -7.3223, -6.2632, -7.3259, -7.5313, -6.6214, -7.6998, -7.7003, -7.3873, -5.8407, -5.8419, -5.6796, -6.1002, -5.318, -4.6089, -6.5331, -5.5966, -6.1989, -6.0667, -6.0631, -6.0968, -4.7735, -4.9734, -6.0609, -5.458, -5.2408, -6.469, -5.5022, -5.9737, -5.6116, -5.7816, -5.7941, -5.6931, -5.9623, -5.9422, -5.8849, -5.9525, -6.018, -6.0213, -6.1388, -6.0904, -6.2124, -6.1223, -6.5951, -6.697, -6.8843, -7.0292, -7.1934, -7.2839, -7.284, -7.2947, -7.2952, -6.7012, -6.7505, -6.4743, -7.5177, -7.5177, -7.5178, -7.5181, -7.5209, -7.5218, -7.0463, -7.6454, -7.6571, -7.6575, -5.9752, -7.5259, -7.6837, -5.9206, -7.6564, -6.773, -6.6958, -4.5137, -6.9178, -6.8469, -6.9609, -5.8202, -6.168, -5.2462, -6.5139, -6.0081, -5.7081, -5.6348, -6.5469, -5.5177, -5.5831, -5.9284, -5.539, -6.4083, -5.5461, -5.7579, -5.9861, -6.3917, -5.8552, -6.0671, -6.2575, -6.2002, -6.2757, -6.2563, -6.2797, -6.4078, -6.3433, -6.4045, -5.9234, -6.2915, -6.4927, -6.501, -6.753, -5.7968, -6.7531, -6.8279, -5.9899, -6.5031, -7.0849, -7.0925, -7.091, -7.3113, -7.3182, -7.3182, -7.3185, -7.4459, -7.1916, -7.6022, -7.6025, -7.6026, -7.6026, -7.6092, -7.6097, -7.7879, -7.7879, -7.7881, -7.7883, -6.9896, -6.444, -7.1913, -6.6451, -6.6203, -6.1243, -6.1535, -5.9752, -7.1002, -6.7313, -6.3849, -6.2393, -5.7521, -6.7092, -6.6752, -6.9093, -6.7906, -6.5307, -5.9035, -6.562, -6.6053, -6.7544, -6.6917, -6.8052], \"loglift\": [30.0, 29.0, 28.0, 27.0, 26.0, 25.0, 24.0, 23.0, 22.0, 21.0, 20.0, 19.0, 18.0, 17.0, 16.0, 15.0, 14.0, 13.0, 12.0, 11.0, 10.0, 9.0, 8.0, 7.0, 6.0, 5.0, 4.0, 3.0, 2.0, 1.0, 0.8231, 0.8175, 0.8066, 0.8014, 0.7991, 0.7966, 0.7878, 0.7853, 0.7768, 0.7729, 0.7641, 0.7626, 0.7599, 0.7575, 0.7537, 0.7516, 0.7508, 0.7499, 0.7488, 0.7445, 0.7418, 0.7365, 0.7353, 0.73, 0.728, 0.7266, 0.7262, 0.7212, 0.7182, 0.717, 0.7067, 0.6995, 0.6968, 0.6839, 0.6875, 0.6175, 0.6695, 0.6712, 0.5768, 0.6254, 0.6794, 0.5689, 0.5708, 0.4882, 0.5427, 0.5384, 0.3759, 0.5229, 0.6061, 0.4846, 0.4727, 0.4594, 0.4292, 0.3963, 0.2994, 0.3119, 0.4499, 0.2365, 0.3718, 0.3886, 0.2374, 0.2446, 0.2532, 0.342, 0.2955, 0.015, 0.2029, -0.4548, 0.0196, 0.1945, 1.7541, 1.7018, 1.675, 1.671, 1.6609, 1.628, 1.6249, 1.5997, 1.5927, 1.5914, 
1.5904, 1.584, 1.5816, 1.578, 1.5669, 1.5667, 1.5657, 1.5616, 1.5564, 1.5549, 1.5502, 1.5324, 1.5228, 1.5153, 1.511, 1.5024, 1.4998, 1.4913, 1.486, 1.4775, 1.4723, 1.4675, 1.4518, 1.4385, 1.381, 1.366, 1.1131, 1.3201, 0.5814, 1.0463, 1.0406, 0.972, 1.0186, 0.8291, 1.0277, 1.1968, 1.0835, 0.3606, 0.835, 0.1642, 0.7712, 0.4811, 0.1165, 0.218, 0.3492, -0.0198, 0.8661, 0.0288, 0.2381, 0.0313, -0.2737, 0.2409, 0.0777, 0.3166, 0.365, -0.1456, 0.0151, 0.1284, 0.117, 0.3855, -0.4746, -0.1906, -0.4479, 1.9111, 1.9071, 1.8865, 1.8771, 1.8531, 1.8369, 1.8289, 1.826, 1.8253, 1.8232, 1.806, 1.7972, 1.7898, 1.7793, 1.7691, 1.7602, 1.7582, 1.7573, 1.7567, 1.7538, 1.7515, 1.7278, 1.7277, 1.7232, 1.7123, 1.6959, 1.6938, 1.6931, 1.6861, 1.6856, 1.6689, 1.6591, 1.5261, 1.5996, 1.5948, 1.5207, 1.5826, 1.4646, 1.2264, 1.353, 0.9104, 1.2279, 0.8178, 0.6222, 1.227, 0.4559, 0.6504, 1.166, 1.3543, 0.7601, 0.0751, 1.1075, 0.7385, 0.0706, -0.1218, 0.3172, 0.0784, 0.2181, 0.6295, 0.6901, -0.0294, -0.0031, 0.1952, 0.2777, 0.2672, -0.0903, 0.4786, 0.2054, -0.1231, -0.2546, -0.4959, 2.1128, 2.0972, 2.0852, 2.0784, 2.0741, 2.063, 2.0543, 2.0396, 2.0311, 2.0252, 2.0241, 2.0191, 2.018, 2.0173, 2.0061, 2.0053, 1.9849, 1.9806, 1.9793, 1.9716, 1.9675, 1.9566, 1.9549, 1.9403, 1.9258, 1.9257, 1.9184, 1.9119, 1.9094, 1.905, 1.8587, 1.8364, 1.7728, 1.8191, 1.8259, 1.698, 1.6347, 1.6111, 1.0946, 1.0328, 1.1538, 0.3727, -0.0139, 0.7638, 1.3801, 1.2694, -0.0691, 0.3483, -0.1146, 0.9688, 1.1624, 0.0014, -0.0136, 0.2593, 0.3065, 0.253, 0.4438, 0.1065, 0.0718, 0.1604, 0.0343, 0.1614, 0.1368, -0.0208, -0.0061, -0.1099, -0.4071, -0.1434, -0.6604, -0.6699, -0.2594, 2.3025, 2.2849, 2.2549, 2.2514, 2.2405, 2.2394, 2.1887, 2.1181, 2.1152, 2.0902, 2.0858, 2.0485, 2.0444, 2.0389, 2.025, 2.0215, 2.0214, 2.0166, 2.016, 2.01, 1.9956, 1.9776, 1.9775, 1.9771, 1.9749, 1.9718, 1.9712, 1.9648, 1.9309, 1.9261, 1.9003, 1.8977, 1.8594, 1.6559, 1.7457, 1.8452, 1.4727, 1.2107, 0.4965, 1.2243, 0.6749, 1.5593, 0.8909, 1.1967, 0.4245, 1.5299, 1.3194, 0.8627, 0.6383, 0.9555, 1.2185, 0.2017, 0.1627, 0.2436, 0.3896, -0.3174, 0.1643, 0.1514, 0.1371, 0.49, 0.1469, 0.1109, -0.3664, 0.1379, -0.3816, -0.1404, -0.3125, -0.7005, 0.312, -0.16, -0.5056, -0.5628, -0.4004, 2.4142, 2.3641, 2.3622, 2.347, 2.3399, 2.3375, 2.3249, 2.3221, 2.3046, 2.2989, 2.293, 2.2899, 2.2896, 2.2771, 2.277, 2.275, 2.2634, 2.252, 2.2514, 2.2459, 2.2448, 2.2309, 2.2309, 2.2308, 2.1976, 2.1843, 2.1666, 2.1104, 2.108, 2.1075, 2.1052, 2.0762, 2.0123, 1.8701, 1.9098, 1.5441, 1.2788, 1.89, 1.3627, 1.5782, 1.419, 1.3297, 1.3314, -0.1645, -0.009, 1.1429, 0.4361, 0.0286, 1.4477, -0.0506, 0.6093, -0.1265, 0.1688, 0.1673, -0.2073, 0.3912, 0.1972, -0.1004, 0.0871, 0.1223, -0.5791, -0.2166, 3.4031, 3.3323, 3.304, 3.2273, 3.1099, 3.1013, 3.0292, 2.9456, 2.9027, 2.9023, 2.8852, 2.8843, 2.8542, 2.8279, 2.7856, 2.7525, 2.7522, 2.752, 2.7515, 2.7488, 2.7476, 2.6816, 2.6786, 2.6608, 2.6603, 2.6419, 2.636, 2.6166, 2.6139, 2.5942, 2.56, 2.5009, 2.1759, 2.4914, 2.4081, 2.4151, 1.7525, 1.9268, 1.2689, 1.8939, 1.4385, 1.154, 0.9347, 1.8771, 0.3047, 0.3391, 0.743, 0.0506, 1.5236, -0.2159, 0.1086, 0.4314, 1.3109, -1.2461, -0.9456, 0.0432, -0.3125, 0.0778, -0.4218, -0.6927, 0.7298, -1.379, 0.2599, 3.8484, 3.7125, 3.6092, 3.5834, 3.4433, 3.433, 3.3463, 3.3078, 3.265, 3.2602, 3.2463, 3.2277, 3.1174, 3.086, 3.0701, 3.07, 3.0695, 2.9871, 2.9085, 2.8673, 2.8667, 2.8666, 2.8665, 2.8533, 2.8523, 2.7191, 2.7191, 2.7188, 2.7186, 2.6229, 2.4338, 2.4741, 2.1885, 2.1387, 1.7179, 1.6235, 1.3521, 2.373, 1.973, 1.4295, 1.1535, 0.1714, 
1.7118, 1.5645, 2.0023, 1.5737, 0.7672, -1.2945, 0.2688, 0.4661, 0.2341, -0.9071, -0.6403]}, \"token.table\": {\"Topic\": [1, 2, 3, 4, 5, 6, 1, 2, 3, 4, 5, 6, 1, 2, 3, 4, 5, 6, 1, 2, 3, 4, 5, 6, 7, 8, 1, 2, 3, 4, 5, 6, 7, 8, 1, 2, 3, 4, 5, 6, 7, 1, 2, 3, 4, 5, 6, 1, 2, 3, 4, 5, 6, 7, 1, 2, 3, 4, 5, 6, 7, 8, 1, 2, 3, 4, 5, 6, 7, 8, 1, 2, 3, 4, 5, 6, 7, 8, 1, 2, 3, 4, 5, 6, 1, 2, 3, 4, 5, 6, 1, 2, 3, 4, 5, 6, 1, 2, 3, 4, 5, 6, 1, 2, 3, 4, 5, 6, 1, 2, 3, 4, 5, 6, 1, 2, 3, 4, 5, 6, 1, 2, 3, 4, 5, 6, 7, 8, 1, 2, 3, 4, 5, 6, 1, 2, 3, 4, 5, 6, 1, 2, 3, 4, 5, 6, 1, 2, 3, 4, 5, 6, 1, 2, 3, 4, 5, 6, 1, 2, 3, 4, 5, 6, 1, 2, 3, 4, 5, 6, 1, 2, 3, 4, 5, 6, 1, 2, 3, 4, 5, 6, 1, 2, 3, 4, 5, 6, 1, 2, 3, 4, 5, 6, 1, 2, 3, 4, 5, 6, 1, 2, 3, 4, 5, 6, 8, 1, 2, 3, 4, 5, 6, 1, 2, 3, 4, 5, 6, 7, 1, 2, 3, 4, 5, 6, 1, 2, 3, 4, 5, 6, 1, 2, 3, 4, 5, 6, 1, 2, 3, 4, 5, 6, 8, 1, 2, 3, 4, 5, 6, 8, 1, 2, 3, 4, 5, 6, 8, 1, 2, 3, 4, 5, 6, 7, 1, 2, 3, 4, 5, 6, 1, 2, 3, 4, 5, 6, 8, 1, 2, 3, 4, 5, 6, 8, 1, 2, 3, 4, 5, 6, 1, 2, 3, 4, 5, 6, 1, 2, 3, 4, 5, 6, 1, 2, 3, 4, 5, 6, 1, 2, 3, 4, 5, 6, 7, 8, 1, 2, 3, 4, 5, 6, 7, 1, 2, 3, 4, 5, 6, 7, 8, 1, 2, 3, 4, 5, 6, 1, 2, 3, 4, 5, 6, 7, 8, 1, 2, 3, 4, 5, 6, 7, 1, 2, 3, 4, 5, 6, 8, 1, 2, 3, 4, 5, 6, 7, 8, 1, 2, 3, 4, 5, 6, 1, 2, 3, 4, 5, 6, 1, 2, 3, 4, 5, 6, 1, 2, 3, 4, 5, 6, 8, 1, 2, 3, 4, 5, 6, 8, 1, 2, 3, 4, 5, 6, 7, 1, 2, 3, 4, 5, 6, 1, 2, 3, 4, 5, 6, 1, 2, 3, 4, 5, 6, 1, 2, 3, 4, 5, 6, 1, 2, 3, 4, 5, 6, 8, 1, 2, 3, 4, 5, 6, 7, 1, 2, 3, 4, 5, 6, 7, 1, 2, 3, 4, 5, 6, 7, 8, 1, 2, 3, 4, 5, 6, 7, 1, 2, 3, 4, 5, 6, 7, 1, 2, 3, 4, 5, 6, 1, 2, 3, 4, 5, 6, 7, 1, 2, 3, 4, 5, 6, 7, 1, 2, 3, 4, 5, 6, 1, 2, 3, 4, 5, 6, 7, 1, 2, 3, 4, 5, 6, 7, 1, 2, 3, 4, 5, 6, 1, 2, 3, 4, 5, 6, 1, 2, 3, 4, 5, 6, 7, 8, 1, 2, 3, 4, 5, 6, 7, 1, 2, 3, 4, 5, 6, 1, 2, 3, 4, 5, 6, 7, 8, 1, 2, 3, 4, 5, 6, 8, 1, 2, 3, 4, 5, 6, 7, 8, 1, 2, 3, 4, 5, 6, 7, 1, 2, 3, 4, 5, 6, 1, 2, 3, 4, 5, 6, 7, 1, 2, 3, 4, 5, 6, 7, 1, 2, 3, 4, 5, 6, 7, 8, 1, 2, 3, 4, 5, 6, 7, 1, 2, 3, 4, 5, 6, 1, 2, 3, 4, 5, 6, 7, 1, 2, 3, 4, 5, 6, 1, 2, 3, 4, 5, 6, 7, 8, 1, 2, 3, 4, 5, 6, 1, 2, 3, 4, 5, 6, 8, 1, 2, 3, 4, 5, 6, 7, 8, 1, 2, 3, 4, 5, 6, 1, 2, 3, 4, 5, 6, 7, 1, 2, 3, 4, 5, 6, 7, 1, 2, 3, 4, 5, 6, 7, 1, 2, 3, 4, 5, 6, 1, 2, 3, 4, 5, 6, 1, 2, 3, 4, 5, 6, 1, 2, 3, 4, 5, 6, 1, 2, 3, 4, 5, 6, 1, 2, 3, 4, 5, 6, 7, 8, 1, 2, 3, 4, 5, 6, 8, 1, 2, 3, 4, 5, 6, 7, 1, 2, 3, 4, 5, 6, 7, 1, 2, 3, 4, 5, 6, 1, 2, 3, 4, 5, 6, 1, 2, 3, 4, 5, 6, 7, 1, 2, 3, 4, 5, 6, 1, 2, 3, 4, 5, 6, 1, 2, 3, 4, 5, 6, 1, 2, 3, 4, 5, 6, 1, 2, 3, 4, 5, 6, 8, 1, 2, 3, 4, 5, 6, 1, 2, 3, 4, 5, 6, 1, 2, 3, 4, 5, 6, 1, 2, 3, 4, 5, 6, 7, 1, 2, 3, 4, 5, 6, 8, 1, 2, 3, 4, 5, 6, 1, 2, 3, 4, 5, 6, 1, 2, 3, 4, 5, 6, 7, 8, 1, 2, 3, 4, 5, 6, 7, 8, 1, 2, 3, 4, 5, 6, 7, 8, 1, 2, 3, 4, 5, 6, 1, 2, 3, 4, 5, 6, 7, 8, 1, 2, 3, 4, 5, 6, 8, 1, 2, 3, 4, 5, 6, 7, 8, 1, 2, 3, 4, 5, 6, 7, 8, 1, 2, 3, 4, 5, 6, 1, 2, 3, 4, 5, 6, 1, 2, 3, 4, 5, 6, 8, 1, 2, 3, 4, 5, 6, 8, 1, 2, 3, 4, 5, 6, 7, 8, 1, 2, 3, 4, 5, 6, 7, 1, 2, 3, 4, 5, 6, 1, 2, 3, 4, 5, 6, 7, 1, 2, 3, 4, 5, 6, 1, 2, 3, 4, 5, 6, 1, 2, 3, 4, 5, 6, 7, 1, 2, 3, 4, 5, 6, 1, 2, 3, 4, 5, 6, 7, 1, 2, 3, 4, 5, 6, 1, 2, 3, 4, 5, 6, 1, 2, 3, 4, 5, 6, 1, 2, 3, 4, 5, 6, 1, 2, 3, 4, 5, 6, 1, 2, 3, 4, 5, 6, 1, 2, 3, 4, 5, 6, 1, 2, 3, 4, 5, 6, 1, 2, 3, 4, 5, 6, 1, 2, 3, 4, 5, 6, 1, 2, 3, 4, 5, 6, 1, 2, 3, 4, 5, 6, 8, 1, 2, 3, 4, 5, 6, 7, 1, 2, 3, 4, 5, 6, 8, 1, 2, 3, 4, 5, 6, 8, 1, 2, 3, 4, 5, 6, 1, 2, 3, 4, 5, 6, 1, 2, 3, 4, 5, 6, 1, 2, 3, 4, 5, 6, 1, 2, 3, 4, 5, 6, 1, 2, 3, 4, 5, 6, 1, 2, 3, 4, 5, 6, 7, 1, 2, 3, 4, 5, 6, 7, 1, 2, 3, 4, 5, 6, 7, 8, 1, 2, 3, 4, 
5, 6, 7, 1, 2, 3, 4, 5, 6, 1, 2, 3, 4, 5, 6, 1, 2, 3, 4, 5, 6, 7, 1, 2, 3, 4, 5, 6, 7, 1, 2, 3, 4, 5, 6, 8, 1, 2, 3, 4, 5, 6, 8, 1, 2, 3, 4, 5, 6, 7, 1, 2, 3, 4, 5, 6, 1, 2, 3, 4, 5, 6, 7, 8, 1, 2, 3, 4, 5, 6, 7, 1, 2, 3, 4, 5, 6, 7, 8, 1, 2, 3, 4, 5, 6, 7, 1, 2, 3, 4, 5, 6, 7, 8, 1, 2, 3, 4, 5, 6, 1, 2, 3, 4, 5, 6, 1, 2, 3, 4, 5, 6, 1, 2, 3, 4, 5, 6, 1, 2, 3, 4, 5, 6, 7, 1, 2, 3, 4, 5, 6, 1, 2, 3, 4, 5, 6, 7, 1, 2, 3, 4, 5, 6, 1, 2, 3, 4, 5, 6, 1, 2, 3, 4, 5, 6, 7, 8, 1, 2, 3, 4, 5, 6, 1, 2, 3, 4, 5, 6, 7, 1, 2, 3, 4, 5, 6, 7, 1, 2, 3, 4, 5, 6, 7, 1, 2, 3, 4, 5, 6, 7, 1, 2, 3, 4, 5, 6, 1, 2, 3, 4, 5, 6, 7, 1, 2, 3, 4, 5, 6, 7, 1, 2, 3, 4, 5, 6, 7, 1, 2, 3, 4, 5, 6, 7, 8, 1, 2, 3, 4, 5, 6, 1, 2, 3, 4, 5, 6, 8, 1, 2, 3, 4, 5, 6, 7, 1, 2, 3, 4, 5, 6, 1, 2, 3, 4, 5, 6, 7, 8, 1, 2, 3, 4, 5, 6, 7, 8, 1, 2, 3, 4, 5, 6, 7, 1, 2, 3, 4, 5, 6, 7, 8, 1, 2, 3, 4, 5, 6, 7, 1, 2, 3, 4, 5, 6, 1, 2, 3, 4, 5, 6, 7, 1, 2, 3, 4, 5, 6, 8, 1, 2, 3, 4, 5, 6, 7, 8, 1, 2, 3, 4, 5, 6, 7, 8, 1, 2, 3, 4, 5, 6, 1, 2, 3, 4, 5, 6, 8, 1, 2, 3, 4, 5, 6, 8, 1, 2, 3, 4, 5, 6, 1, 2, 3, 4, 5, 6, 1, 2, 3, 4, 5, 6, 1, 2, 3, 4, 5, 6, 1, 2, 3, 4, 5, 6, 1, 2, 3, 4, 5, 6, 7, 8, 1, 2, 3, 4, 5, 6, 7, 8, 1, 2, 3, 4, 5, 6, 7, 1, 2, 3, 4, 5, 6, 7, 1, 2, 3, 4, 5, 6, 1, 2, 3, 4, 5, 6, 7, 8, 1, 2, 3, 4, 5, 6, 1, 2, 3, 4, 5, 6, 7, 8, 1, 2, 3, 4, 5, 6, 1, 2, 3, 4, 5, 6, 1, 2, 3, 4, 5, 6, 7, 8, 1, 2, 3, 4, 5, 6, 7, 1, 2, 3, 4, 5, 6, 8, 1, 2, 3, 4, 5, 6, 8, 1, 2, 3, 4, 5, 6, 8, 1, 2, 3, 4, 5, 6, 1, 2, 3, 4, 5, 6, 7, 1, 2, 3, 4, 5, 6, 1, 2, 3, 4, 5, 6, 1, 2, 3, 4, 5, 6, 7, 8, 1, 2, 3, 4, 5, 6, 7, 1, 2, 3, 4, 5, 6, 7, 8, 1, 2, 3, 4, 5, 6, 7, 8, 1, 2, 3, 4, 5, 6, 8, 1, 2, 3, 4, 5, 6, 1, 2, 3, 4, 5, 6, 7, 8, 1, 2, 3, 4, 5, 6, 1, 2, 3, 4, 5, 6, 7, 1, 2, 3, 4, 5, 6, 1, 2, 3, 4, 5, 6, 1, 2, 3, 4, 5, 6, 7, 1, 2, 3, 4, 5, 6, 1, 2, 3, 4, 5, 6, 1, 2, 3, 4, 5, 6, 1, 2, 3, 4, 5, 6, 1, 2, 3, 4, 5, 6, 1, 2, 3, 4, 5, 6, 7, 1, 2, 3, 4, 5, 6, 1, 2, 3, 4, 5, 6, 7, 1, 2, 3, 4, 5, 6, 1, 2, 3, 4, 5, 6, 1, 2, 3, 4, 5, 6, 1, 2, 3, 4, 5, 6, 1, 2, 3, 4, 5, 6, 7, 8, 1, 2, 3, 4, 5, 6, 1, 2, 3, 4, 5, 6, 1, 2, 3, 4, 5, 6, 1, 2, 3, 4, 5, 6, 1, 2, 3, 4, 5, 6, 1, 2, 3, 4, 5, 6, 1, 2, 3, 4, 5, 6, 7, 1, 2, 3, 4, 5, 6, 1, 2, 3, 4, 5, 6, 7, 1, 2, 3, 4, 5, 6, 1, 2, 3, 4, 5, 6, 7, 1, 2, 3, 4, 5, 6, 7, 1, 2, 3, 4, 5, 6, 7, 1, 2, 3, 4, 5, 6, 7, 1, 2, 3, 4, 5, 6, 7, 8, 1, 2, 3, 4, 5, 6, 7, 1, 2, 3, 4, 5, 6, 7, 1, 2, 3, 4, 5, 6, 1, 2, 3, 4, 5, 6, 1, 2, 3, 4, 5, 6, 1, 2, 3, 4, 5, 6, 1, 2, 3, 4, 5, 6, 1, 2, 3, 4, 5, 6, 1, 2, 3, 4, 5, 6, 1, 2, 3, 4, 5, 6, 1, 2, 3, 4, 5, 6, 8, 1, 2, 3, 4, 5, 6, 1, 2, 3, 4, 5, 6, 1, 2, 3, 4, 5, 6, 1, 2, 3, 4, 5, 6, 1, 2, 3, 4, 5, 6, 1, 2, 3, 4, 5, 6, 7, 8, 1, 2, 3, 4, 5, 6, 7, 1, 2, 3, 4, 5, 6, 1, 2, 3, 4, 5, 6, 1, 2, 3, 4, 5, 6, 7, 8, 1, 2, 3, 4, 5, 6, 7, 8, 1, 2, 3, 4, 5, 6, 1, 2, 3, 4, 5, 6, 7, 1, 2, 3, 4, 5, 6, 1, 2, 3, 4, 5, 6, 7, 1, 2, 3, 4, 5, 6, 1, 2, 3, 4, 5, 6, 1, 2, 3, 4, 5, 6, 1, 2, 3, 4, 5, 6, 7, 1, 2, 3, 4, 5, 6, 7, 8, 1, 2, 3, 4, 5, 6, 1, 2, 3, 4, 5, 6, 1, 2, 3, 4, 5, 6, 8, 1, 2, 3, 4, 5, 6, 7, 1, 2, 3, 4, 5, 6, 7, 1, 2, 3, 4, 5, 6, 1, 2, 3, 4, 5, 6, 7, 1, 2, 3, 4, 5, 6, 7, 8, 1, 2, 3, 4, 5, 6, 7, 1, 2, 3, 4, 5, 6, 7, 1, 2, 3, 4, 5, 6, 7, 1, 2, 3, 4, 5, 6, 8, 1, 2, 3, 4, 5, 6, 7, 8, 1, 2, 3, 4, 5, 6, 1, 2, 3, 4, 5, 6, 8, 1, 2, 3, 4, 5, 6, 7, 1, 2, 3, 4, 5, 6, 1, 2, 3, 4, 5, 6, 7, 1, 2, 3, 4, 5, 6, 8, 1, 2, 3, 4, 5, 6, 8, 1, 2, 3, 4, 5, 6, 7, 8, 1, 2, 3, 4, 5, 6, 7, 1, 2, 3, 4, 5, 6, 1, 2, 3, 4, 5, 6, 1, 2, 3, 4, 5, 6, 1, 2, 3, 4, 5, 6, 1, 2, 3, 4, 5, 6, 1, 2, 3, 4, 5, 6, 1, 2, 3, 4, 5, 6, 7, 1, 2, 3, 4, 5, 6, 1, 2, 3, 4, 5, 6, 8, 1, 2, 3, 4, 
5, 6, 7, 8, 1, 2, 3, 4, 5, 6, 1, 2, 3, 4, 5, 6, 1, 2, 3, 4, 5, 6, 7, 1, 2, 3, 4, 5, 6, 1, 2, 3, 4, 5, 6, 7, 1, 2, 3, 4, 5, 6, 7, 1, 2, 3, 4, 5, 6, 7, 1, 2, 3, 4, 5, 6, 7, 1, 2, 3, 4, 5, 6, 7, 1, 2, 3, 4, 5, 6, 7, 1, 2, 3, 4, 5, 6, 7, 1, 2, 3, 4, 5, 6, 1, 2, 3, 4, 5, 6, 1, 2, 3, 4, 5, 6, 1, 2, 3, 4, 5, 6, 7, 1, 2, 3, 4, 5, 6, 7, 1, 2, 3, 4, 5, 6, 1, 2, 3, 4, 5, 6, 1, 2, 3, 4, 5, 6, 1, 2, 3, 4, 5, 6, 1, 2, 3, 4, 5, 6, 1, 2, 3, 4, 5, 6, 1, 2, 3, 4, 5, 6, 7, 8, 1, 2, 3, 4, 5, 6, 1, 2, 3, 4, 5, 6, 1, 2, 3, 4, 5, 6, 1, 2, 3, 4, 5, 6, 1, 2, 3, 4, 5, 6, 1, 2, 3, 4, 5, 6, 1, 2, 3, 4, 5, 6, 7, 1, 2, 3, 4, 5, 6, 7, 1, 2, 3, 4, 5, 6, 1, 2, 3, 4, 5, 6, 8, 1, 2, 3, 4, 5, 6, 7, 8, 1, 2, 3, 4, 5, 6, 1, 2, 3, 4, 5, 6, 1, 2, 3, 4, 5, 6, 1, 2, 3, 4, 5, 6, 7, 1, 2, 3, 4, 5, 6, 7, 1, 2, 3, 4, 5, 6, 8, 1, 2, 3, 4, 5, 6, 1, 2, 3, 4, 5, 6, 1, 2, 3, 4, 5, 6, 1, 2, 3, 4, 5, 6, 7, 1, 2, 3, 4, 5, 6, 7, 8, 1, 2, 3, 4, 5, 6, 7, 1, 2, 3, 4, 5, 6, 7, 1, 2, 3, 4, 5, 6, 7, 1, 2, 3, 4, 5, 6, 1, 2, 3, 4, 5, 6, 1, 2, 3, 4, 5, 6, 7, 1, 2, 3, 4, 5, 6, 8, 1, 2, 3, 4, 5, 6, 8, 1, 2, 3, 4, 5, 6, 1, 2, 3, 4, 5, 6, 8, 1, 2, 3, 4, 5, 6, 1, 2, 3, 4, 5, 6, 1, 2, 3, 4, 5, 6, 7, 1, 2, 3, 4, 5, 6, 1, 2, 3, 4, 5, 6, 7, 1, 2, 3, 4, 5, 6, 1, 2, 3, 4, 5, 6], \"Freq\": [0.0992035854297759, 0.0992035854297759, 0.0992035854297759, 0.0992035854297759, 0.0992035854297759, 0.5952215125786554, 0.0854322291975428, 0.028477409732514263, 0.056954819465028526, 0.028477409732514263, 0.028477409732514263, 0.7973674725103994, 0.09947280780302956, 0.03315760260100985, 0.6963096546212069, 0.1326304104040394, 0.03315760260100985, 0.03315760260100985, 0.040543369086198155, 0.37840477813784945, 0.18920238906892473, 0.10811565089652841, 0.040543369086198155, 0.17568793270685867, 0.013514456362066052, 0.05405782544826421, 0.029136784172950087, 0.15539618225573382, 0.4467640239852347, 0.08741035251885027, 0.11654713669180035, 0.12625939808278372, 0.019424522781966727, 0.019424522781966727, 0.7329783322709797, 0.09281219142392276, 0.07139399340301751, 0.04759599560201167, 0.03331719692140817, 0.021418198020905253, 0.0023797997801005837, 0.10672219957162203, 0.10672219957162203, 0.10672219957162203, 0.10672219957162203, 0.6403331974297323, 0.10672219957162203, 0.07707557957443686, 0.15415115914887373, 0.2312267387233106, 0.07707557957443686, 0.07707557957443686, 0.07707557957443686, 0.38537789787218435, 0.5579754168630188, 0.1380942503273023, 0.10450375700444497, 0.08957464886095284, 0.048519601466349455, 0.0429211859125399, 0.01492910814349214, 0.0018661385179365175, 0.16454097501886428, 0.32908195003772855, 0.03290819500377285, 0.2632655600301828, 0.03290819500377285, 0.0658163900075457, 0.03290819500377285, 0.09872458501131856, 0.522297464997002, 0.18345158894522798, 0.0733806355780912, 0.07553888956568211, 0.10143793741677312, 0.03884857177663652, 0.006474761962772752, 0.0021582539875909176, 0.04506198691658421, 0.04506198691658421, 0.04506198691658421, 0.6759298037487632, 0.13518596074975264, 0.04506198691658421, 0.04467821628268026, 0.08935643256536052, 0.8042078930882447, 0.04467821628268026, 0.04467821628268026, 0.04467821628268026, 0.04435262150485825, 0.04435262150485825, 0.04435262150485825, 0.7983471870874485, 0.04435262150485825, 0.04435262150485825, 0.8715605466291215, 0.03005381195272833, 0.03005381195272833, 0.03005381195272833, 0.03005381195272833, 0.03005381195272833, 0.05537224152918334, 0.05537224152918334, 0.7752113814085668, 0.05537224152918334, 0.05537224152918334, 0.05537224152918334, 0.057782873058787786, 0.057782873058787786, 
0.6933944767054535, 0.11556574611757557, 0.057782873058787786, 0.057782873058787786, 0.10669478098584728, 0.10669478098584728, 0.10669478098584728, 0.10669478098584728, 0.6401686859150837, 0.10669478098584728, 0.4491563682254509, 0.2076288871985575, 0.09533979514219476, 0.11864507839917571, 0.03177993171406492, 0.08898380879938178, 0.00423732422854199, 0.002118662114270995, 0.053034224131061236, 0.053034224131061236, 0.053034224131061236, 0.053034224131061236, 0.053034224131061236, 0.7955133619659186, 0.03665946284360256, 0.14663785137441024, 0.05498919426540384, 0.5498919426540384, 0.1832973142180128, 0.01832973142180128, 0.9074680273943232, 0.015920491708672338, 0.015920491708672338, 0.015920491708672338, 0.015920491708672338, 0.015920491708672338, 0.0540379842629841, 0.0540379842629841, 0.0540379842629841, 0.0540379842629841, 0.8105697639447614, 0.0540379842629841, 0.04773182270274852, 0.8114409859467249, 0.04773182270274852, 0.04773182270274852, 0.04773182270274852, 0.04773182270274852, 0.04758301169702879, 0.8089111988494895, 0.04758301169702879, 0.04758301169702879, 0.04758301169702879, 0.04758301169702879, 0.06695648231751397, 0.022318827439171324, 0.022318827439171324, 0.022318827439171324, 0.022318827439171324, 0.8481154426885102, 0.6216339252348263, 0.06015812179691868, 0.01002635363281978, 0.02005270726563956, 0.2606851944533143, 0.02005270726563956, 0.0638876908121891, 0.0638876908121891, 0.0638876908121891, 0.0638876908121891, 0.7666522897462692, 0.0638876908121891, 0.09513491302084574, 0.09513491302084574, 0.09513491302084574, 0.09513491302084574, 0.6659443911459202, 0.09513491302084574, 0.11365743434399911, 0.11365743434399911, 0.11365743434399911, 0.11365743434399911, 0.5682871717199955, 0.11365743434399911, 0.06274638721533929, 0.06274638721533929, 0.06274638721533929, 0.06274638721533929, 0.06274638721533929, 0.7529566465840714, 0.19219346266331308, 0.19219346266331308, 0.19219346266331308, 0.19219346266331308, 0.19219346266331308, 0.19219346266331308, 0.19219346266331308, 0.05104568427385266, 0.05104568427385266, 0.7656852641077898, 0.05104568427385266, 0.05104568427385266, 0.05104568427385266, 0.8483034001199989, 0.0348617835665753, 0.011620594522191767, 0.0348617835665753, 0.023241189044383534, 0.011620594522191767, 0.0348617835665753, 0.05339846607860639, 0.8009769911790958, 0.05339846607860639, 0.05339846607860639, 0.05339846607860639, 0.05339846607860639, 0.051592385447726555, 0.051592385447726555, 0.051592385447726555, 0.7738857817158983, 0.051592385447726555, 0.051592385447726555, 0.04085121933485401, 0.04085121933485401, 0.8578756060319341, 0.04085121933485401, 0.04085121933485401, 0.04085121933485401, 0.1853619351567774, 0.1853619351567774, 0.1853619351567774, 0.1853619351567774, 0.1853619351567774, 0.1853619351567774, 0.1853619351567774, 0.16743417477516218, 0.16743417477516218, 0.16743417477516218, 0.16743417477516218, 0.16743417477516218, 0.16743417477516218, 0.33486834955032435, 0.1995913796365634, 0.1995913796365634, 0.1995913796365634, 0.1995913796365634, 0.1995913796365634, 0.1995913796365634, 0.1995913796365634, 0.8097172915085287, 0.05182190665654584, 0.04534416832447761, 0.01943321499620469, 0.05182190665654584, 0.00647773833206823, 0.00647773833206823, 0.07010015272952284, 0.07010015272952284, 0.07010015272952284, 0.7010015272952285, 0.07010015272952284, 0.07010015272952284, 0.16559462901621425, 0.16559462901621425, 0.16559462901621425, 0.16559462901621425, 0.16559462901621425, 0.16559462901621425, 0.3311892580324285, 0.14630410698274574, 
0.14630410698274574, 0.14630410698274574, 0.14630410698274574, 0.14630410698274574, 0.14630410698274574, 0.43891232094823723, 0.04092111326974398, 0.9002644919343675, 0.02046055663487199, 0.02046055663487199, 0.02046055663487199, 0.02046055663487199, 0.049558187303765745, 0.7929309968602519, 0.049558187303765745, 0.049558187303765745, 0.049558187303765745, 0.049558187303765745, 0.07351048698409901, 0.07351048698409901, 0.07351048698409901, 0.07351048698409901, 0.07351048698409901, 0.7351048698409901, 0.12154447722202029, 0.12154447722202029, 0.12154447722202029, 0.12154447722202029, 0.48617790888808116, 0.12154447722202029, 0.6509551103752147, 0.10415281766003436, 0.09547341618836483, 0.043397007358347645, 0.09113371545253006, 0.013019102207504294, 0.0021698503679173824, 0.0021698503679173824, 0.05177515331713179, 0.10355030663426358, 0.7248521464398451, 0.05177515331713179, 0.05177515331713179, 0.05177515331713179, 0.05177515331713179, 0.04981842769788569, 0.18266756822558086, 0.31551670875327603, 0.15775835437663802, 0.05812149898086664, 0.18266756822558086, 0.008303071282980948, 0.04151535641490474, 0.07237747987765886, 0.07237747987765886, 0.07237747987765886, 0.07237747987765886, 0.07237747987765886, 0.6513973188989297, 0.020678480048347284, 0.04135696009669457, 0.6410328814987658, 0.10339240024173642, 0.020678480048347284, 0.1240708802900837, 0.020678480048347284, 0.06203544014504185, 0.378771647951181, 0.45023799662121516, 0.035733174335017076, 0.05717307893602733, 0.035733174335017076, 0.028586539468013664, 0.007146634867003416, 0.125066141817724, 0.18065109373671243, 0.20844356969620664, 0.375198425453172, 0.01389623797974711, 0.01389623797974711, 0.06948118989873556, 0.45640377395444576, 0.03071948478539539, 0.14920892610049188, 0.078992960876731, 0.013165493479455167, 0.07021596522376089, 0.20187090001831257, 0.004388497826485056, 0.29942646974625403, 0.20959852882237784, 0.0898279409238762, 0.05988529394925081, 0.014971323487312702, 0.32936911672087943, 0.08658251957859392, 0.8658251957859393, 0.017316503915718786, 0.017316503915718786, 0.017316503915718786, 0.017316503915718786, 0.44253399898875706, 0.045977558336494244, 0.18391023334597698, 0.011494389584123561, 0.011494389584123561, 0.2988541291872126, 0.13312586874457605, 0.13312586874457605, 0.13312586874457605, 0.13312586874457605, 0.13312586874457605, 0.13312586874457605, 0.39937760623372814, 0.05565303140899418, 0.05565303140899418, 0.38957121986295923, 0.05565303140899418, 0.05565303140899418, 0.05565303140899418, 0.38957121986295923, 0.11627967798292428, 0.08720975848719321, 0.3633739936966384, 0.10174471823505875, 0.26162927546157966, 0.02906991949573107, 0.02906991949573107, 0.026576936907069554, 0.026576936907069554, 0.8770389179332952, 0.026576936907069554, 0.026576936907069554, 0.026576936907069554, 0.07492027189519658, 0.7492027189519658, 0.07492027189519658, 0.07492027189519658, 0.07492027189519658, 0.07492027189519658, 0.29528722649068817, 0.014764361324534408, 0.649631898279514, 0.014764361324534408, 0.014764361324534408, 0.014764361324534408, 0.08859147860753286, 0.08859147860753286, 0.04429573930376643, 0.04429573930376643, 0.7530275681640294, 0.04429573930376643, 0.38688133984427536, 0.12703566382946355, 0.028871741779423536, 0.2829430694383507, 0.08661522533827061, 0.08661522533827061, 0.0057743483558847074, 0.15774660402768342, 0.07887330201384171, 0.07887330201384171, 0.07887330201384171, 0.07887330201384171, 0.07887330201384171, 0.39436651006920853, 0.08596158471683922, 0.028653861572279744, 
0.6017310930178746, 0.15282059505215861, 0.07641029752607931, 0.038205148763039654, 0.009551287190759913, 0.37603341882271113, 0.10255456876983031, 0.41876448914347375, 0.0341848562566101, 0.01709242812830505, 0.008546214064152525, 0.0341848562566101, 0.008546214064152525, 0.15747167085198163, 0.15747167085198163, 0.15747167085198163, 0.15747167085198163, 0.15747167085198163, 0.15747167085198163, 0.31494334170396326, 0.1662456443662979, 0.1662456443662979, 0.1662456443662979, 0.1662456443662979, 0.1662456443662979, 0.1662456443662979, 0.3324912887325958, 0.057581683212407996, 0.057581683212407996, 0.057581683212407996, 0.057581683212407996, 0.7485618817613039, 0.057581683212407996, 0.19705695412345223, 0.11408560501884075, 0.04148567455230573, 0.07259993046653503, 0.07259993046653503, 0.49782809462766875, 0.010371418638076432, 0.8264908145003553, 0.03506324667577265, 0.05509938763335702, 0.02003614095758437, 0.05009035239396093, 0.005009035239396093, 0.005009035239396093, 0.053039743751827746, 0.053039743751827746, 0.7955961562774162, 0.053039743751827746, 0.053039743751827746, 0.053039743751827746, 0.24462962201295416, 0.032617282935060554, 0.04892592440259083, 0.19570369761036333, 0.40771603668825696, 0.032617282935060554, 0.032617282935060554, 0.06170337658971956, 0.06170337658971956, 0.06170337658971956, 0.12340675317943912, 0.3702202595383174, 0.06170337658971956, 0.3085168829485978, 0.047643710710518455, 0.047643710710518455, 0.8099430820788138, 0.047643710710518455, 0.047643710710518455, 0.047643710710518455, 0.888459498508474, 0.03359720792679104, 0.03359720792679104, 0.01866511551488391, 0.003733023102976782, 0.01866511551488391, 0.4215811723542964, 0.023421176241905355, 0.32789646738667494, 0.07026352872571606, 0.07026352872571606, 0.023421176241905355, 0.023421176241905355, 0.04684235248381071, 0.14485697715276463, 0.14485697715276463, 0.14485697715276463, 0.14485697715276463, 0.14485697715276463, 0.14485697715276463, 0.4345709314582939, 0.09023282082371356, 0.09023282082371356, 0.09023282082371356, 0.09023282082371356, 0.631629745765995, 0.09023282082371356, 0.34074636617715715, 0.1864461248893879, 0.0642917672032372, 0.30217130585521484, 0.03857506032194232, 0.04500423704226604, 0.00642917672032372, 0.01928753016097116, 0.14807577483284598, 0.14807577483284598, 0.14807577483284598, 0.14807577483284598, 0.14807577483284598, 0.14807577483284598, 0.29615154966569196, 0.5268320127576339, 0.1043483696766328, 0.07380738342981344, 0.10053074639578038, 0.12089140389365995, 0.07253484233619598, 0.0012725410936174732, 0.0012725410936174732, 0.3495943179880271, 0.15199752956001178, 0.16719728251601296, 0.06079901182400472, 0.12159802364800944, 0.03039950591200236, 0.10639827069200825, 0.11727335545452205, 0.05863667772726103, 0.7036401327271323, 0.05863667772726103, 0.05863667772726103, 0.05863667772726103, 0.3920781334948622, 0.030159856422681708, 0.030159856422681708, 0.09047956926804512, 0.030159856422681708, 0.09047956926804512, 0.3317584206494988, 0.14981195049729998, 0.024968658416216663, 0.09987463366486665, 0.6741537772378499, 0.024968658416216663, 0.024968658416216663, 0.024968658416216663, 0.4366083540674385, 0.1727754984010184, 0.1727754984010184, 0.07471372903827823, 0.06770931694093965, 0.06070490484360107, 0.00466960806489239, 0.00466960806489239, 0.033783703998923734, 0.033783703998923734, 0.6756740799784747, 0.033783703998923734, 0.13513481599569493, 0.033783703998923734, 0.033783703998923734, 0.06308502639230146, 0.06308502639230146, 0.18925507917690435, 
0.06308502639230146, 0.5046802111384117, 0.06308502639230146, 0.3089221542484707, 0.1249572758757859, 0.2516500694720688, 0.10413106322982159, 0.15619659484473236, 0.048594496173916736, 0.005206553161491079, 0.8601040925847849, 0.024930553408254636, 0.012465276704127318, 0.04986110681650927, 0.024930553408254636, 0.012465276704127318, 0.10698091986444662, 0.029954657562045055, 0.7360287286673928, 0.00855847358915573, 0.029954657562045055, 0.051350841534934376, 0.029954657562045055, 0.004279236794577865, 0.05941348663262188, 0.05941348663262188, 0.7723753262240844, 0.05941348663262188, 0.05941348663262188, 0.05941348663262188, 0.1921967637806143, 0.1921967637806143, 0.1921967637806143, 0.1921967637806143, 0.1921967637806143, 0.1921967637806143, 0.1921967637806143, 0.4353387034454752, 0.46064909318067726, 0.020248311788161638, 0.020248311788161638, 0.020248311788161638, 0.015186233841121229, 0.025310389735202048, 0.0050620779470404095, 0.07691319498145811, 0.07691319498145811, 0.07691319498145811, 0.07691319498145811, 0.07691319498145811, 0.6922187548331231, 0.23407721540496082, 0.6302078876287407, 0.01800593964653545, 0.01800593964653545, 0.01800593964653545, 0.01800593964653545, 0.05401781893960634, 0.2821294322975884, 0.05485850072453108, 0.44670493447118165, 0.023510786024799037, 0.03134771469973205, 0.11755393012399518, 0.03918464337466506, 0.46613583218494326, 0.06659083316927761, 0.06659083316927761, 0.06659083316927761, 0.06659083316927761, 0.06659083316927761, 0.26636333267711043, 0.886955308228366, 0.031676975293870216, 0.031676975293870216, 0.031676975293870216, 0.031676975293870216, 0.031676975293870216, 0.06215060531418489, 0.06215060531418489, 0.7458072637702187, 0.06215060531418489, 0.06215060531418489, 0.06215060531418489, 0.0606179851215208, 0.0606179851215208, 0.0606179851215208, 0.0606179851215208, 0.0606179851215208, 0.7274158214582496, 0.9278646741972936, 0.018557293483945872, 0.018557293483945872, 0.018557293483945872, 0.018557293483945872, 0.018557293483945872, 0.0672255473705745, 0.0672255473705745, 0.0672255473705745, 0.7394810210763194, 0.0672255473705745, 0.0672255473705745, 0.6166000850672979, 0.13646067456407412, 0.06064918869514405, 0.030324594347572027, 0.1061360802165021, 0.0353786934055007, 0.005054099057928671, 0.015162297173786013, 0.13530458410869411, 0.10147843808152059, 0.03382614602717353, 0.03382614602717353, 0.03382614602717353, 0.642696774516297, 0.03382614602717353, 0.7339286003993142, 0.08154762226659047, 0.06880580628743571, 0.04841890072078809, 0.030580358349971425, 0.02548363195830952, 0.010193452783323809, 0.6121250840309496, 0.08310307432208257, 0.0891837870773569, 0.11756044660197046, 0.06688784030801767, 0.01621523401406489, 0.012161425510548668, 0.8935077629638047, 0.008674832650134027, 0.017349665300268054, 0.0650612448760052, 0.008674832650134027, 0.004337416325067013, 0.11792833748553347, 0.4824341078953642, 0.16081136929845474, 0.0643245477193819, 0.15009061134522442, 0.02144151590646063, 0.5111141855248929, 0.10465671417890664, 0.17280527224889236, 0.07058243514391378, 0.09005345173533827, 0.046243664404633164, 0.0048677541478561225, 0.07372571802785831, 0.737257180278583, 0.07372571802785831, 0.07372571802785831, 0.07372571802785831, 0.07372571802785831, 0.06517314495956261, 0.06517314495956261, 0.06517314495956261, 0.06517314495956261, 0.06517314495956261, 0.7820777395147513, 0.058160988929429586, 0.8142538450120141, 0.058160988929429586, 0.058160988929429586, 0.058160988929429586, 0.058160988929429586, 0.05025706093008898, 
0.05025706093008898, 0.05025706093008898, 0.8041129748814236, 0.05025706093008898, 0.05025706093008898, 0.24171409044909206, 0.15836440408733618, 0.10835459227028266, 0.31672880817467236, 0.14169446681498502, 0.025004905908526765, 0.008334968636175588, 0.050301382884012764, 0.050301382884012764, 0.050301382884012764, 0.8048221261442042, 0.050301382884012764, 0.050301382884012764, 0.44148832589196046, 0.15767440210427158, 0.04730232063128147, 0.03153488042085432, 0.299581363998116, 0.01576744021042716, 0.06083945121332178, 0.7909128657731831, 0.06083945121332178, 0.06083945121332178, 0.06083945121332178, 0.06083945121332178, 0.23569788945758555, 0.07724552679702383, 0.3268079979874085, 0.14260712639450554, 0.09111010852982299, 0.11685861746216426, 0.009903272666285106, 0.19225435672482422, 0.19225435672482422, 0.19225435672482422, 0.19225435672482422, 0.19225435672482422, 0.19225435672482422, 0.19225435672482422, 0.11350064235640532, 0.11350064235640532, 0.11350064235640532, 0.11350064235640532, 0.5675032117820267, 0.11350064235640532, 0.27115199955042946, 0.11370890303727686, 0.40235457997805657, 0.061227870866226, 0.04373419347587571, 0.11370890303727686, 0.041211000805516096, 0.3846360075181502, 0.17171250335631705, 0.2678715052358546, 0.03434250067126341, 0.048079500939768774, 0.048079500939768774, 0.006868500134252682, 0.3887769981918656, 0.16509708142394294, 0.1633218439892769, 0.11183995838396135, 0.09231234660263476, 0.06923425995197607, 0.0035504748693321064, 0.00532571230399816, 0.23485186883679754, 0.04403472540689954, 0.17613890162759815, 0.22751274793564763, 0.06605208811034931, 0.24219098973794748, 0.007339120901149923, 0.007339120901149923, 0.0851780225909329, 0.0851780225909329, 0.0851780225909329, 0.6814241807274632, 0.0851780225909329, 0.0851780225909329, 0.4965621784328653, 0.17734363515459475, 0.09880573958613137, 0.08360485657288039, 0.05320309054637842, 0.08613833707508887, 0.005066961004416993, 0.0025334805022084964, 0.12069878904671005, 0.12069878904671005, 0.12069878904671005, 0.12069878904671005, 0.12069878904671005, 0.12069878904671005, 0.4827951561868402, 0.6742668867596805, 0.13129289214132392, 0.06675909769897827, 0.03560485210612174, 0.048956671645917396, 0.03560485210612174, 0.006675909769897827, 0.002225303256632609, 0.5453701075219014, 0.07436865102571383, 0.17352685239333226, 0.04957910068380922, 0.02478955034190461, 0.04957910068380922, 0.02478955034190461, 0.07436865102571383, 0.8541316429452651, 0.038824165588421146, 0.038824165588421146, 0.038824165588421146, 0.038824165588421146, 0.038824165588421146, 0.10661294526884338, 0.10661294526884338, 0.10661294526884338, 0.10661294526884338, 0.6396776716130603, 0.10661294526884338, 0.17726404615054134, 0.17726404615054134, 0.17726404615054134, 0.17726404615054134, 0.17726404615054134, 0.17726404615054134, 0.17726404615054134, 0.06509666824475027, 0.052077334595800225, 0.2603866729790011, 0.1432126701384506, 0.03905800094685016, 0.3645413421706016, 0.06509666824475027, 0.6046454938447671, 0.1920961478175508, 0.047566665173869725, 0.08964486898152371, 0.03933397312454612, 0.018294871220719124, 0.009147435610359562, 0.0009147435610359562, 0.03153027008854647, 0.03153027008854647, 0.06306054017709294, 0.6306054017709293, 0.03153027008854647, 0.15765135044273232, 0.03153027008854647, 0.09844079741976854, 0.02812594211993387, 0.05625188423986774, 0.16875565271960322, 0.08437782635980161, 0.5625188423986773, 0.25346501993401166, 0.3509515660624777, 0.058491927677079614, 0.18035011033766216, 0.12185818266058253, 
0.029245963838539807, 0.004874327306423301, 0.13210166028269382, 0.6869286334700078, 0.02642033205653876, 0.13210166028269382, 0.02642033205653876, 0.02642033205653876, 0.47257326294312835, 0.11814331573578209, 0.04922638155657587, 0.3248941182734007, 0.019690552622630347, 0.019690552622630347, 0.2664680390416266, 0.1421162874888675, 0.2309389671694097, 0.053293607808325316, 0.2664680390416266, 0.026646803904162658, 0.01776453593610844, 0.06755547531487442, 0.7431102284636186, 0.06755547531487442, 0.06755547531487442, 0.06755547531487442, 0.06755547531487442, 0.6024898946209297, 0.1338866432490955, 0.10286412834991482, 0.05061568220392634, 0.06857608556660988, 0.03428804278330494, 0.00653105576824856, 0.04814046232924103, 0.7702473972678565, 0.04814046232924103, 0.09628092465848206, 0.04814046232924103, 0.04814046232924103, 0.074636023640225, 0.074636023640225, 0.074636023640225, 0.074636023640225, 0.671724212762025, 0.074636023640225, 0.039054276592738066, 0.8591940850402374, 0.039054276592738066, 0.039054276592738066, 0.039054276592738066, 0.039054276592738066, 0.019671249139125723, 0.9442199586780348, 0.009835624569562862, 0.009835624569562862, 0.009835624569562862, 0.009835624569562862, 0.07469961311231409, 0.7469961311231409, 0.07469961311231409, 0.07469961311231409, 0.07469961311231409, 0.07469961311231409, 0.03409824093056503, 0.8524560232641257, 0.03409824093056503, 0.03409824093056503, 0.03409824093056503, 0.03409824093056503, 0.8798842049991141, 0.05027909742852081, 0.020111638971408323, 0.010055819485704161, 0.010055819485704161, 0.025139548714260405, 0.875554730451133, 0.04775753075187998, 0.015919176917293325, 0.015919176917293325, 0.03183835383458665, 0.015919176917293325, 0.05473998401887896, 0.05473998401887896, 0.05473998401887896, 0.05473998401887896, 0.05473998401887896, 0.7663597762643054, 0.07689481824498828, 0.07689481824498828, 0.07689481824498828, 0.07689481824498828, 0.07689481824498828, 0.6920533642048946, 0.0677556432431507, 0.0677556432431507, 0.0677556432431507, 0.0677556432431507, 0.0677556432431507, 0.7453120756746576, 0.1376947915197425, 0.1376947915197425, 0.1376947915197425, 0.1376947915197425, 0.1376947915197425, 0.1376947915197425, 0.275389583039485, 0.8343294182612938, 0.02620930109721342, 0.08299612014117583, 0.00873643369907114, 0.021841084247677847, 0.021841084247677847, 0.00436821684953557, 0.15114023325931777, 0.0705321088543483, 0.5138767930816804, 0.15114023325931777, 0.030228046651863558, 0.0705321088543483, 0.010076015550621186, 0.19091510571532042, 0.19091510571532042, 0.19091510571532042, 0.19091510571532042, 0.19091510571532042, 0.19091510571532042, 0.19091510571532042, 0.07018170847529764, 0.07018170847529764, 0.07018170847529764, 0.7018170847529764, 0.07018170847529764, 0.07018170847529764, 0.06620506203162076, 0.7282556823478283, 0.06620506203162076, 0.06620506203162076, 0.06620506203162076, 0.06620506203162076, 0.06273029594328605, 0.06273029594328605, 0.06273029594328605, 0.06273029594328605, 0.06273029594328605, 0.7527635513194326, 0.06975215454409264, 0.06975215454409264, 0.06975215454409264, 0.6975215454409265, 0.06975215454409264, 0.06975215454409264, 0.09513523622137368, 0.09513523622137368, 0.09513523622137368, 0.09513523622137368, 0.6659466535496158, 0.09513523622137368, 0.7615489378169099, 0.030873605587172025, 0.0823296148991254, 0.030873605587172025, 0.03601920651836736, 0.05660161024314871, 0.32322831843343985, 0.14918230081543377, 0.09945486721028918, 0.07459115040771688, 0.07459115040771688, 0.12431858401286147, 
0.14918230081543377, 0.2692307323910539, 0.32307687886926467, 0.05384614647821078, 0.05384614647821078, 0.05384614647821078, 0.05384614647821078, 0.2692307323910539, 0.14197789032986974, 0.49692261615454403, 0.14197789032986974, 0.07098894516493487, 0.07098894516493487, 0.07098894516493487, 0.07098894516493487, 0.14197789032986974, 0.14488470492704508, 0.14488470492704508, 0.14488470492704508, 0.14488470492704508, 0.14488470492704508, 0.14488470492704508, 0.43465411478113525, 0.03350767960219324, 0.7036612716460581, 0.11967028429354729, 0.043081302345677024, 0.038294490973935134, 0.06222854783264459, 0.06082587281089738, 0.7907363465416659, 0.06082587281089738, 0.06082587281089738, 0.06082587281089738, 0.06082587281089738, 0.14386303357468488, 0.14386303357468488, 0.14386303357468488, 0.14386303357468488, 0.14386303357468488, 0.14386303357468488, 0.4315891007240546, 0.0978576045778907, 0.1957152091557814, 0.1712508080113087, 0.07339320343341801, 0.26910841258919943, 0.04892880228894535, 0.14678640686683603, 0.17887135367285592, 0.17887135367285592, 0.17887135367285592, 0.17887135367285592, 0.17887135367285592, 0.17887135367285592, 0.17887135367285592, 0.19961349540984955, 0.19961349540984955, 0.19961349540984955, 0.19961349540984955, 0.19961349540984955, 0.19961349540984955, 0.19961349540984955, 0.12201872526768912, 0.11808263735582818, 0.2991426813014314, 0.07872175823721879, 0.09249806592873207, 0.27355810987433526, 0.015744351647443755, 0.06525569681677817, 0.06525569681677817, 0.06525569681677817, 0.06525569681677817, 0.06525569681677817, 0.7830683618013381, 0.43193469538410606, 0.27217802722834084, 0.06508604998938584, 0.0887537045309807, 0.07100296362478456, 0.03550148181239228, 0.011833827270797426, 0.01775074090619614, 0.3397828661775355, 0.30580457955978196, 0.0736196210051327, 0.05096742992663033, 0.18688057639764452, 0.005663047769625592, 0.03397828661775355, 0.32298490209479924, 0.15365301167616663, 0.28221981736438767, 0.05330818772438434, 0.04703663622739795, 0.11288792694575507, 0.025086205987945573, 0.0031357757484931966, 0.16233791313510895, 0.16233791313510895, 0.16233791313510895, 0.16233791313510895, 0.16233791313510895, 0.16233791313510895, 0.3246758262702179, 0.4915988701451641, 0.1591066882627502, 0.1591066882627502, 0.13054907754892325, 0.018358464030317333, 0.02447795204042311, 0.002039829336701926, 0.014278805356913482, 0.8676982465486441, 0.0279902660176982, 0.0279902660176982, 0.0279902660176982, 0.0279902660176982, 0.0279902660176982, 0.05961954222228468, 0.05961954222228468, 0.7750540488897009, 0.05961954222228468, 0.05961954222228468, 0.05961954222228468, 0.04936146673564584, 0.04936146673564584, 0.8391449345059793, 0.04936146673564584, 0.04936146673564584, 0.04936146673564584, 0.037972397404953735, 0.037972397404953735, 0.037972397404953735, 0.8353927429089822, 0.037972397404953735, 0.037972397404953735, 0.06712715444563047, 0.06712715444563047, 0.06712715444563047, 0.6712715444563047, 0.06712715444563047, 0.06712715444563047, 0.06712715444563047, 0.10037594805142128, 0.10037594805142128, 0.10037594805142128, 0.10037594805142128, 0.6022556883085277, 0.10037594805142128, 0.2993949565714039, 0.08420483153570735, 0.38359978810711126, 0.08420483153570735, 0.028068277178569117, 0.03742436957142549, 0.09356092392856373, 0.2615450180123051, 0.46114516333748534, 0.05506210905522213, 0.19960014532518022, 0.006882763631902766, 0.006882763631902766, 0.12188276824137166, 0.2166804768735496, 0.5687862517930677, 0.01354252980459685, 0.01354252980459685, 
… [pyLDAvis payload truncated: the remaining per-term topic probabilities and the repeated \"Term\" vocabulary arrays for the fitted topic model are omitted here. The data object ends with \"R\": 30, \"lambda.step\": 0.01, \"plot.opts\": {\"xlab\": \"PC1\", \"ylab\": \"PC2\"}, \"topic.order\": [5, 7, 6, 8, 1, 2, 4, 3], and is followed by the standard auto-generated pyLDAvis loader script, which loads d3 v5 and ldavis.v3.0.0.js (via require.js when available, otherwise by dynamic script injection) and instantiates the LDAvis visualization in the page.] …\n</script>"
  },
  {
    "path": "natural-language-processing/transformers-series/pyproject.toml",
    "content": "[tool.poetry]\nname = \"transformers-models\"\nversion = \"0.1.0\"\ndescription = \"\"\nauthors = [\"Shashank Kapadia <smhkapadia@gmail.com>\"]\nreadme = \"README.md\"\n\n[tool.poetry.dependencies]\npython = \">=3.9,<3.10\"\njupyterlab = \"^4.0.2\"\ntensorflow = [\n    { version = \"2.10.0\", platform = \"linux\" },\n]\ntensorflow-macos = [\n    { version = \"2.10.0\", platform = \"darwin\" },\n]\npandas = \"^2.0.3\"\nseaborn = \"^0.12.2\"\ntransformers = \"4.30.0\"\ntorch = \"^2.0.1\"\nscikit-learn = \"^1.3.0\"\ndatasets = \"^2.13.1\"\naccelerate = \"^0.20.3\"\n\n\n[build-system]\nrequires = [\"poetry-core\"]\nbuild-backend = \"poetry.core.masonry.api\"\n"
  },
  {
    "path": "natural-language-processing/transformers-series/sentiment_analysis_bert.ipynb",
    "content": "{\n \"cells\": [\n  {\n   \"cell_type\": \"markdown\",\n   \"id\": \"bb537d94-7c06-416c-823c-2111e595065f\",\n   \"metadata\": {},\n   \"source\": [\n    \"## Practical Introduction to Transformer Models: BERT\\n\",\n    \"\\n\",\n    \"In the field of NLP, the transformer model architecture has been a revolutionary discovery that greatly enhanced our ability to understand and generate human language.\\n\",\n    \"\\n\",\n    \"This tutorial provides a close look at BERT (Bidirectional Encoder Representations from Transformers), one of the most popular transformer-based models, and instructs on its use in practice for sentiment analysis by fine-tuning the base model.\\n\",\n    \"\\n\",\n    \"#### Introduction to BERT\\n\",\n    \"BERT, introduced by researchers at Google in 2018, is a powerful language model that uses transformer architecture. What sets BERT apart is its bidirectional nature. Traditional language models like LSTM and GRU are either unidirectional or sequentially bidirectional, but BERT takes into account the context from both past and future words simultaneously for a specific word. The key innovation in the transformer model is the \\\"attention\\\" mechanism, which allows the model to weight the importance of different words in a sentence when generating representations.\\n\",\n    \"This feature allows BERT to understand the complete context of a word and, consequently, produce more accurate language models. It is pre-trained on two NLP tasks:\\n\",\n    \"- Masked Language Model (MLM) and \\n\",\n    \"- Next Sentence Prediction (NSP), \\n\",\n    \"\\n\",\n    \"and is often used as a starting point for various downstream NLP tasks, like sentiment analysis, which will be explored in this tutorial.\\n\",\n    \"\\n\",\n    \"#### Pre-Training and Fine-Tuning\\n\",\n    \"The power of BERT comes from its two-step process: \\n\",\n    \"- Pre-training is the phase where BERT is trained on large amounts of data. As a result, it learns to predict masked words in a sentence (MLM task) and to predict if a sentence follows another one (NSP task). The output of this stage is a a pre-trained NLP model with a general-purpose \\\"understanding\\\" of the language\\n\",\n    \"- Fine-tuning is where the pre-trained BERT model is further trained on a specific task. The model is initialized with the pre-trained parameters, and the entire model is trained on a downstream task, allowing BERT to fine-tune its understanding of language to the specifics of the task at hand.\\n\",\n    \"\\n\",\n    \"#### Hands On: Using BERT for sentiment analysis\\n\",\n    \"Now, let's get our hands dirty. We're going to perform sentiment analysis on the IMDb dataset, which contains movie reviews labeled as either positive or negative. For this, we'll be using Hugging Face's transformers library.\\n\",\n    \"\\n\",\n    \"** **\\n\",\n    \"Let's load all the libraries\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": 1,\n   \"id\": \"9122a30a-9f2f-4d8a-9c87-2d11c0465308\",\n   \"metadata\": {},\n   \"outputs\": [\n    {\n     \"name\": \"stderr\",\n     \"output_type\": \"stream\",\n     \"text\": [\n      \"/Users/skapadia/Library/Caches/pypoetry/virtualenvs/transformers-models-r2uefcSj-py3.9/lib/python3.9/site-packages/tqdm/auto.py:21: TqdmWarning: IProgress not found. Please update jupyter and ipywidgets. 
See https://ipywidgets.readthedocs.io/en/stable/user_install.html\\n\",\n      \"  from .autonotebook import tqdm as notebook_tqdm\\n\"\n     ]\n    }\n   ],\n   \"source\": [\n    \"import pandas as pd\\n\",\n    \"import seaborn as sns\\n\",\n    \"import matplotlib.pyplot as plt\\n\",\n    \"from sklearn.metrics import confusion_matrix, roc_curve, auc\\n\",\n    \"from datasets import load_dataset\\n\",\n    \"from transformers import AutoTokenizer, AutoModelForSequenceClassification, TrainingArguments, Trainer\\n\",\n    \"\\n\",\n    \"# Variables to set the number of epochs and samples\\n\",\n    \"num_epochs = 10\\n\",\n    \"num_samples = 100  # set this to -1 to use all data\"\n   ]\n  },\n  {\n   \"cell_type\": \"markdown\",\n   \"id\": \"5fc553a9-96e5-45f3-8573-cf7445585de3\",\n   \"metadata\": {},\n   \"source\": [\n    \"** **\\n\",\n    \"First, we need to load the dataset and the model tokenizer.\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": 2,\n   \"id\": \"bf744b0d-6f8c-4711-ad75-efde168d978a\",\n   \"metadata\": {},\n   \"outputs\": [\n    {\n     \"name\": \"stderr\",\n     \"output_type\": \"stream\",\n     \"text\": [\n      \"Found cached dataset imdb (/Users/skapadia/.cache/huggingface/datasets/imdb/plain_text/1.0.0/d613c88cf8fa3bab83b4ded3713f1f74830d1100e171db75bbddb80b3345c9c0)\\n\",\n      \"100%|███████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 3/3 [00:00<00:00, 385.36it/s]\\n\"\n     ]\n    }\n   ],\n   \"source\": [\n    \"# Step 1: Load dataset and model tokenizer\\n\",\n    \"dataset = load_dataset('imdb')\\n\",\n    \"tokenizer = AutoTokenizer.from_pretrained('bert-base-uncased')\"\n   ]\n  },\n  {\n   \"cell_type\": \"markdown\",\n   \"id\": \"66f8f9f2-1c21-4717-b7f6-8210bbd25503\",\n   \"metadata\": {},\n   \"source\": [\n    \"** **\\n\",\n    \"Before we move on, let's explore our dataset a bit. 
We'll create a plot to see the distribution of the positive and negative classes.\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": 3,\n   \"id\": \"bb61016a-573c-475c-b95e-ec7964443106\",\n   \"metadata\": {},\n   \"outputs\": [\n    {\n     \"data\": {\n      \"image/png\": \"iVBORw0KGgoAAAANSUhEUgAAAk0AAAHHCAYAAACiOWx7AAAAOXRFWHRTb2Z0d2FyZQBNYXRwbG90bGliIHZlcnNpb24zLjcuMSwgaHR0cHM6Ly9tYXRwbG90bGliLm9yZy/bCgiHAAAACXBIWXMAAA9hAAAPYQGoP6dpAAA040lEQVR4nO3deXhU5d3/8c8kmIVlEpaQZH4GiEiBKIIGgciilJSg6GNaXChpQUVwSVSMBaTVsLigQZS1LFqFtlAptqCCRlIQKBBZgsiOFCPB0kmokIxEICE5vz98ch7GRLmJCTPB9+u6znVl7vs79/metCEfzzk547AsyxIAAAC+V4CvGwAAAKgPCE0AAAAGCE0AAAAGCE0AAAAGCE0AAAAGCE0AAAAGCE0AAAAGCE0AAAAGCE0AAAAGCE0AaqxNmza65557fN2GkQkTJsjhcHiNXaz+P//8czkcDi1YsMAeu+eee9S4ceM633clh8OhCRMmXLT9AZciQhOAKg4dOqQHHnhAV1xxhUJCQuR0OtWzZ09Nnz5dp06d8nV7PvXee+/5bfjw596AS0EDXzcAwL+sXLlSd955p4KDgzV06FBdffXVKi0t1YYNGzR69Gjt2bNH8+fP93WbteLAgQMKCLiw/3Z87733NHv27AsKJ61bt9apU6d02WWXXWCHF+b7ejt16pQaNOCffOCH4CcIgC0vL0+DBw9W69attWbNGkVHR9tzqamp+te//qWVK1f6sMPaFRwcXKfrnz17VhUVFQoKClJISEid7ut8fL1/4FLA5TkAtszMTJ08eVJ/+MMfvAJTpSuvvFKPPfbYd77/+PHj+s1vfqNOnTqpcePGcjqduvnmm/XJJ59UqZ05c6auuuoqNWzYUE2bNlXXrl21ePFie/6rr77SqFGj1KZNGwUHB6tly5b62c9+pu3bt5/3ODZs2KDrr79eISEhatu2rebNm1dt3bfvaSorK9PEiRPVrl07hYSEqHnz5urVq5eys7MlfXMf0uzZsyV9c49Q5Sb9331LL730kqZNm6a2bdsqODhYe/furfaepkqfffaZkpKS1KhRI7lcLk2aNEmWZdnza9eulcPh0Nq1a73e9+01v6+3yrFvn4H6+OOPdfPNN8vpdKpx48bq16+fPvroI6+aBQsWyOFwaOPGjUpPT1dERIQaNWqkn//85zp27Fj1/wMAlyjONAGwvfvuu7riiit0ww031Oj9n332mZYvX64777xTsbGxKigo0Lx583TjjTdq7969crlckqRXX31Vjz76qO644w499thjOn36tHbu3KnNmzdryJAhkqQHH3xQb731ltLS0hQXF6cvv/xSGzZs0L59+3Tdddd9Zw+7du1S//79FRERoQkTJujs2bMaP368IiMjz9v/hAkTNHnyZN1///3q1q2bPB6Ptm3bpu3bt+tnP/uZHnjgAR09elTZ2dn605/+VO0ab7zxhk6fPq2RI0cqODhYzZo1U0VFRbW15eXlGjBggHr06KHMzExlZWVp/PjxOnv2rCZNmnTefs9l0tu59uzZo969e8vpdGrMmDG67LLLNG/ePN10001at26dunfv7lX/yCOPqGnTpho/frw+//xzTZs2TWlpaVqyZMkF9QnUaxYAWJZVXFxsSbJuv/124/e0bt3aGjZsmP369OnTVnl5uVdNXl6eFRwcbE2aNMkeu/32262rrrrqe9cOCwuzUlNTjXuplJycbIWEhFiHDx+2x/bu3WsFBgZa3/4n79v9d+7c2Ro4cOD3rp+amlplHcv65jglWU6n0yosLKx27o033rDHhg0bZkmyHnnkEXusoqLCGjhwoBUUFGQdO3bMsizL+vDDDy1J1ocffnjeNb+rN8uyLEnW+PHj7dfJyclWUFCQdejQIXvs6NGjVpMmTaw+ffrYY2+88YYlyUpMTLQqKirs8ccff9wKDAy0ioqKqt0fcCni8hwASZLH45EkNWnSpMZrBAcH2zdWl5eX68svv1Tjxo3Vvn17r8tq4eHh+uKLL7R169bvXCs8PFybN2/W0aNHjfdfXl6uDz74QMnJyWrVqpU93rFjRyUlJZ33/eHh4dqzZ48OHjxovM9vGzRokCIiIozr09LS7K8dDofS0tJUWlqqf/zjHzXu4XzKy8u1atUqJScn64orrrDHo6OjNWTIEG3YsMH+/0OlkSNHel3u6927t8rLy3X48OE66xPwN4QmAJIkp9Mp6Zt7iWqqoqJCr7zyitq1a6fg4GC1aNFCERER2rlzp4qLi+26sWPHqnHjxurWrZvatWun1NRUbdy40WutzMxM7d69WzExMerWrZsmTJigzz777Hv3f+zYMZ06dUrt2rWrMte+ffvz9j9p0iQVFRXpJz/5iTp16qTRo0dr586dhkf/jdjYWOPagIAAr9AiST/5yU8kfXPPUl05duyYvv7662q/Jx07dlRFRYWOHDniNX5uCJWkpk2bSpJOnDhRZ30C/obQBEDSN6HJ5XJp9+7dNV7j+eefV3p6uvr06aM///nP+uCDD5Sdna2rrrrK676ejh076sCBA3rzzTfVq1cv/e1vf1OvXr00fvx4u+auu+7SZ599ppkzZ8rlcmnKlCm66qqr9P777/+g4/w+ffr00aFDh/T666/r6quv1muvvabrrrtOr732mvEaoaGhtdrTtx/IWam8vLxW93M+gYGB1Y5b59y0DlzqCE0AbLfeeqsOHTqknJycGr3/rbfeUt++ffWHP/xBgwcPVv/+/ZWYmKiioqIqtY0aNdLdd9+tN954Q/n5+Ro4cKCee+45nT592q6Jjo7Www8/rOXLlysvL0/NmzfXc8899537j4iIUGhoaLWX1w4cOGB0DM2aNdO9996rv/zlLzpy5IiuueYar786+64QUxMVFRVVzp59+umnkr75yz7p/87ofPt7WN1lMdPeIiIi1LBhw2q/J/v371dAQIBiYmKM1gJ+TAhNAGxjxoxRo0aNdP/996ugoKDK/KFDhzR9+vTvfH9gYGCVMw9Lly7Vv//9b6+xL7/80ut1UFCQ4uLiZFmWysrKVF5e7nU5T5Jatmwpl8ulM2fOfO/+k5KStHz5cuXn59vj+/bt0wcffPCd7/uuvho3bqwrr7zSa5+NGjWSVDXE1NSsWbPsry3L0qxZs3TZZZepX79+kr55MGZgYKDWr1/v9b7f//73VdYy7S0wMFD9+/fX22+/7XUZsKCgQIsXL1avXr3sy7UA/g+PHABga9u2rRYvXqy7775bH
Tt29Hoi+KZNm7R06dLv/ay2W2+9VZMmTdK9996rG264Qbt27dKiRYuq3LfTv39/RUVFqWfPnoqMjNS+ffs0a9YsDRw4UE2aNFFRUZEuv/xy3XHHHercubMaN26sf/zjH9q6daumTp36vccwceJEZWVlqXfv3nr44Yd19uxZ+5lQ57s/KS4uTjfddJPi4+PVrFkzbdu2zX7sQaX4+HhJ0qOPPqqkpCQFBgZq8ODB5/nOVi8kJERZWVkaNmyYunfvrvfff18rV67Ub3/7W/tm8rCwMN15552aOXOmHA6H2rZtqxUrVqiwsLDKehfS27PPPqvs7Gz16tVLDz/8sBo0aKB58+bpzJkzyszMrNHxAJc83/7xHgB/9Omnn1ojRoyw2rRpYwUFBVlNmjSxevbsac2cOdM6ffq0XVfdIweeeOIJKzo62goNDbV69uxp5eTkWDfeeKN144032nXz5s2z+vTpYzVv3twKDg622rZta40ePdoqLi62LMuyzpw5Y40ePdrq3Lmz1aRJE6tRo0ZW586drd///vdG/a9bt86Kj4+3goKCrCuuuMKaO3euNX78+PM+cuDZZ5+1unXrZoWHh1uhoaFWhw4drOeee84qLS21a86ePWs98sgjVkREhOVwOOw1Kx8BMGXKlCr9fNcjBxo1amQdOnTI6t+/v9WwYUMrMjLSGj9+fJXHNhw7dswaNGiQ1bBhQ6tp06bWAw88YO3evbvKmt/Vm2VVfeSAZVnW9u3braSkJKtx48ZWw4YNrb59+1qbNm3yqql85MDWrVu9xr/rUQjApcxhWdzFBwAAcD7c0wQAAGCA0AQAAGCA0AQAAGCA0AQAAGCA0AQAAGCA0AQAAGCAh1vWkoqKCh09elRNmjSp1Y9ZAAAAdceyLH311VdyuVwKCPj+c0mEplpy9OhRPqsJAIB66siRI7r88su/t4bQVEuaNGki6ZtvOp/ZBABA/eDxeBQTE2P/Hv8+hKZaUnlJzul0EpoAAKhnTG6t4UZwAAAAA4QmAAAAA4QmAAAAA4QmAAAAA4QmAAAAA4QmAAAAA4QmAAAAA4QmAAAAA4QmAAAAA4QmAAAAA4QmAAAAA4QmAAAAA4QmAAAAA4QmAAAAA4QmAAAAAw183QAuTPzoP/q6BcDv5E4Z6usWakX+pE6+bgHwO60ydvm6BRtnmgAAAAwQmgAAAAwQmgAAAAwQmgAAAAwQmgAAAAwQmgAAAAwQmgAAAAwQmgAAAAwQmgAAAAz4NDStX79et912m1wulxwOh5YvX27PlZWVaezYserUqZMaNWokl8uloUOH6ujRo15rHD9+XCkpKXI6nQoPD9fw4cN18uRJr5qdO3eqd+/eCgkJUUxMjDIzM6v0snTpUnXo0EEhISHq1KmT3nvvvTo5ZgAAUD/5NDSVlJSoc+fOmj17dpW5r7/+Wtu3b9fTTz+t7du36+9//7sOHDig//mf//GqS0lJ0Z49e5Sdna0VK1Zo/fr1GjlypD3v8XjUv39/tW7dWrm5uZoyZYomTJig+fPn2zWbNm3SL3/5Sw0fPlwff/yxkpOTlZycrN27d9fdwQMAgHrFYVmW5esmJMnhcGjZsmVKTk7+zpqtW7eqW7duOnz4sFq1aqV9+/YpLi5OW7duVdeuXSVJWVlZuuWWW/TFF1/I5XJpzpw5+t3vfie3262goCBJ0pNPPqnly5dr//79kqS7775bJSUlWrFihb2vHj16qEuXLpo7d65R/x6PR2FhYSouLpbT6azhd+H8+Ow5oCo+ew64dNX1Z89dyO/venVPU3FxsRwOh8LDwyVJOTk5Cg8PtwOTJCUmJiogIECbN2+2a/r06WMHJklKSkrSgQMHdOLECbsmMTHRa19JSUnKycn5zl7OnDkjj8fjtQEAgEtXvQlNp0+f1tixY/XLX/7SToJut1stW7b0qmvQoIGaNWsmt9tt10RGRnrVVL4+X03lfHUmT56ssLAwe4uJiflhBwgAAPxavQhNZWVluuuuu2RZlubMmePrdiRJ48aNU3Fxsb0dOXLE1y0BAIA61MDXDZxPZWA6fPiw1qxZ43W9MSoqSoWFhV71Z8+e1fHjxxUVFWXXFBQUeNVUvj5fTeV8dYKDgxUcHFzzAwMAAPWKX59pqgxMBw8e1D/+8Q81b97caz4hIUFFRUXKzc21x9asWaOKigp1797drlm/fr3KysrsmuzsbLVv315Nmza1a1avXu21dnZ2thISEurq0AAAQD3j09B08uRJ7dixQzt27JAk5eXlaceOHcrPz1dZWZnuuOMObdu2TYsWLVJ5ebncbrfcbrdKS0slSR07dtSAAQM0YsQIbdmyRRs3blRaWpoGDx4sl8slSRoyZIiCgoI0fPhw7dmzR0uWLNH06dOVnp5u9/HYY48pKytLU6dO1f79+zVhwgRt27ZNaWlpF/17AgAA/JNPQ9O2bdt07bXX6tprr5Ukpaen69prr1VGRob+/e9/65133tEXX3yhLl26KDo62t42bdpkr7Fo0SJ16NBB/fr10y233KJevXp5PYMpLCxMq1atUl5enuLj4/XEE08oIyPD61lON9xwgxYvXqz58+erc+fOeuutt7R8+XJdffXVF++bAQAA/JrfPKepvuM5TYDv8Jwm4NLFc5oAAADqGUITAACAAUITAACAAUITAACAAUITAACAAUITAACAAUITAACAAUITAACAAUITAACAAUITAACAAUITAACAAUITAACAAUITAACAAUITAACAAUITAACAAUITAACAAUITAACAAUITAACAAUITAACAAUITAACAAUITAACAAUITAACAAUITAACAAUITAACAAUITAACAAUITAACAAUITAACAAUITAACAAUITAACAAUITAACAAUITAACAAUITAACAAUITAACAAUITAACAAUITAACAAUITAACAAUITAACAAUITAACAAUITAACAAUITAACAAUITAACAAUITAACAAUITAACAAUITAACAAUITAACAAZ+GpvXr1+u2226Ty+WSw+HQ8uXLveYty1JGRoaio6MVGhqqxMREHTx40Kvm+PHjSklJkdPpVHh4uIYPH66TJ0961ezcuVO9e/dWSEiIYmJilJmZWaWXpUuXqkOHDgoJCVGnTp303nvv1frxAgCA+sunoamkpESdO3fW7Nmzq53PzMzUjBkzNHfuXG3evFmNGjVSUlKSTp8+bdekpKRoz549ys7O1ooVK7R+/XqNHDnSnvd4POrfv79at26t3NxcTZkyRRMmTND8+fPtmk2bNumXv/ylhg8fro8//ljJyclKTk7W7t276+7gAQBAveKwLMvydROS5HA4tGzZMiUnJ0v65iyTy+XSE088od/85jeSpOLiYkVGRmrBggUaPHiw9u3bp7i4OG3dulVdu3aVJGVlZemWW27RF198IZfLpTlz5uh3v/ud3G63goKCJElPPvmkli9frv3790uS7r77bpWUlGjFihV2Pz169FCXLl00d+5co/49Ho/CwsJUXFwsp9NZW9+WKuJH/7HO1gbqq9wpQ33dQq3In9TJ1y0AfqdV
xq46Xf9Cfn/77T1NeXl5crvdSkxMtMfCwsLUvXt35eTkSJJycnIUHh5uByZJSkxMVEBAgDZv3mzX9OnTxw5MkpSUlKQDBw7oxIkTds25+6msqdxPdc6cOSOPx+O1AQCAS5ffhia32y1JioyM9BqPjIy059xut1q2bOk136BBAzVr1syrpro1zt3Hd9VUzldn8uTJCgsLs7eYmJgLPUQAAFCP+G1o8nfjxo1TcXGxvR05csTXLQEAgDrkt6EpKipKklRQUOA1XlBQYM9FRUWpsLDQa/7s2bM6fvy4V011a5y7j++qqZyvTnBwsJxOp9cGAAAuXX4bmmJjYxUVFaXVq1fbYx6PR5s3b1ZCQoIkKSEhQUVFRcrNzbVr1qxZo4qKCnXv3t2uWb9+vcrKyuya7OxstW/fXk2bNrVrzt1PZU3lfgAAAHwamk6ePKkdO3Zox44dkr65+XvHjh3Kz8+Xw+HQqFGj9Oyzz+qdd97Rrl27NHToULlcLvsv7Dp27KgBAwZoxIgR2rJlizZu3Ki0tDQNHjxYLpdLkjRkyBAFBQVp+PDh2rNnj5YsWaLp06crPT3d7uOxxx5TVlaWpk6dqv3792vChAnatm2b0tLSLva3BAAA+KkGvtz5tm3b1LdvX/t1ZZAZNmyYFixYoDFjxqikpEQjR45UUVGRevXqpaysLIWEhNjvWbRokdLS0tSvXz8FBARo0KBBmjFjhj0fFhamVatWKTU1VfHx8WrRooUyMjK8nuV0ww03aPHixXrqqaf029/+Vu3atdPy5ct19dVXX4TvAgAAqA/85jlN9R3PaQJ8h+c0AZcuntMEAABQzxCaAAAADBCaAAAADBCaAAAADBCaAAAADBCaAAAADBCaAAAADBCaAAAADBCaAAAADBCaAAAADBCaAAAADBCaAAAADBCaAAAADBCaAAAADBCaAAAADBCaAAAADBCaAAAADBCaAAAADBCaAAAADBCaAAAADBCaAAAADBCaAAAADBCaAAAADBCaAAAADBCaAAAADBCaAAAADBCaAAAADBCaAAAADBCaAAAADBCaAAAADBCaAAAADBCaAAAADBCaAAAADBCaAAAADBCaAAAADBCaAAAADBCaAAAADBCaAAAADBCaAAAADBCaAAAADBCaAAAADBCaAAAADBCaAAAADBCaAAAADBCaAAAADPh1aCovL9fTTz+t2NhYhYaGqm3btnrmmWdkWZZdY1mWMjIyFB0drdDQUCUmJurgwYNe6xw/flwpKSlyOp0KDw/X8OHDdfLkSa+anTt3qnfv3goJCVFMTIwyMzMvyjECAID6wa9D04svvqg5c+Zo1qxZ2rdvn1588UVlZmZq5syZdk1mZqZmzJihuXPnavPmzWrUqJGSkpJ0+vRpuyYlJUV79uxRdna2VqxYofXr12vkyJH2vMfjUf/+/dW6dWvl5uZqypQpmjBhgubPn39RjxcAAPivBr5u4Pts2rRJt99+uwYOHChJatOmjf7yl79oy5Ytkr45yzRt2jQ99dRTuv322yVJf/zjHxUZGanly5dr8ODB2rdvn7KysrR161Z17dpVkjRz5kzdcssteumll+RyubRo0SKVlpbq9ddfV1BQkK666irt2LFDL7/8sle4AgAAP15+fabphhtu0OrVq/Xpp59Kkj755BNt2LBBN998syQpLy9PbrdbiYmJ9nvCwsLUvXt35eTkSJJycnIUHh5uByZJSkxMVEBAgDZv3mzX9OnTR0FBQXZNUlKSDhw4oBMnTlTb25kzZ+TxeLw2AABw6fLrM01PPvmkPB6POnTooMDAQJWXl+u5555TSkqKJMntdkuSIiMjvd4XGRlpz7ndbrVs2dJrvkGDBmrWrJlXTWxsbJU1KueaNm1apbfJkydr4sSJtXCUAACgPvDrM01//etftWjRIi1evFjbt2/XwoUL9dJLL2nhwoW+bk3jxo1TcXGxvR05csTXLQEAgDrk12eaRo8erSeffFKDBw+WJHXq1EmHDx/W5MmTNWzYMEVFRUmSCgoKFB0dbb+voKBAXbp0kSRFRUWpsLDQa92zZ8/q+PHj9vujoqJUUFDgVVP5urLm24KDgxUcHPzDDxIAANQLfn2m6euvv1ZAgHeLgYGBqqiokCTFxsYqKipKq1evtuc9Ho82b96shIQESVJCQoKKioqUm5tr16xZs0YVFRXq3r27XbN+/XqVlZXZNdnZ2Wrfvn21l+YAAMCPj1+Hpttuu03PPfecVq5cqc8//1zLli3Tyy+/rJ///OeSJIfDoVGjRunZZ5/VO++8o127dmno0KFyuVxKTk6WJHXs2FEDBgzQiBEjtGXLFm3cuFFpaWkaPHiwXC6XJGnIkCEKCgrS8OHDtWfPHi1ZskTTp09Xenq6rw4dAAD4Gb++PDdz5kw9/fTTevjhh1VYWCiXy6UHHnhAGRkZds2YMWNUUlKikSNHqqioSL169VJWVpZCQkLsmkWLFiktLU39+vVTQECABg0apBkzZtjzYWFhWrVqlVJTUxUfH68WLVooIyODxw0AAACbwzr38dqoMY/Ho7CwMBUXF8vpdNbZfuJH/7HO1gbqq9wpQ33dQq3In9TJ1y0AfqdVxq46Xf9Cfn/79eU5AAAAf0FoAgAAMEBoAgAAMEBoAgAAMEBoAgAAMEBoAgAAMEBoAgAAMEBoAgAAMEBoAgAAMEBoAgAAMEBoAgAAMEBoAgAAMEBoAgAAMEBoAgAAMEBoAgAAMEBoAgAAMEBoAgAAMFCj0PTTn/5URUVFVcY9Ho9++tOf/tCeAAAA/E6NQtPatWtVWlpaZfz06dP65z//+YObAgAA8DcNLqR4586d9td79+6V2+22X5eXlysrK0v/7//9v9rrDgAAwE9cUGjq0qWLHA6HHA5HtZfhQkNDNXPmzFprDgAAwF9cUGjKy8uTZVm64oortGXLFkVERNhzQUFBatmypQIDA2u9SQAAAF+7oNDUunVrSVJFRUWdNAMAAOCvLig0nevgwYP68MMPVVhYWCVEZWRk/ODGAAAA/EmNQtOrr76qhx56SC1atFBUVJQcDoc953A4CE0AAOCSU6PQ9Oyzz+q5557T2LFja7sfAAAAv1Sj5zSdOHFCd955Z233AgAA4LdqFJruvPNOrVq1qrZ7AQAA8Fs1ujx35ZVX6umnn9ZHH32kTp066bLLLvOaf/TRR2ulOQAAAH9Ro9A0f/58NW7cWOvWrdO6deu85hwOB6EJAABccmoUmvLy8mq7DwAAAL9Wo3uaAAAAfmxqdKbpvvvu+975119/vUbNAAAA+KsahaYTJ054vS4rK9Pu3btVVFRU7Qf5AgAA1Hc1Ck3Lli2rMlZRUaGHHnpIbdu2/cFNAQAA+Jtau6cpICBA6enpeuWVV2prSQAAAL9RqzeCHzp0SGfPnq3NJQEAAPxCjS7Ppaene722LEv/+c9/tHLlSg0bNqxWGgMAAPAnNQpNH3/8sdfrgIAARUREaOrUqef9yzoAAID6qEah6cMPP6ztPgAAAPxajUJTpWPHjunAgQOSpPbt2ysiIqJWmgIAAPA3Nbo
RvKSkRPfdd5+io6PVp08f9enTRy6XS8OHD9fXX39d2z0CAAD4XI1CU3p6utatW6d3331XRUVFKioq0ttvv61169bpiSeeqO0eAQAAfK5Gl+f+9re/6a233tJNN91kj91yyy0KDQ3VXXfdpTlz5tRWfwAAAH6hRmeavv76a0VGRlYZb9myJZfnAADAJalGoSkhIUHjx4/X6dOn7bFTp05p4sSJSkhIqLXmJOnf//63fvWrX6l58+YKDQ1Vp06dtG3bNnvesixlZGQoOjpaoaGhSkxM1MGDB73WOH78uFJSUuR0OhUeHq7hw4fr5MmTXjU7d+5U7969FRISopiYGGVmZtbqcQAAgPqtRpfnpk2bpgEDBujyyy9X586dJUmffPKJgoODtWrVqlpr7sSJE+rZs6f69u2r999/XxERETp48KCaNm1q12RmZmrGjBlauHChYmNj9fTTTyspKUl79+5VSEiIJCklJUX/+c9/lJ2drbKyMt17770aOXKkFi9eLEnyeDzq37+/EhMTNXfuXO3atUv33XefwsPDNXLkyFo7HgAAUH85LMuyavLGr7/+WosWLdL+/fslSR07dlRKSopCQ0Nrrbknn3xSGzdu1D//+c9q5y3Lksvl0hNPPKHf/OY3kqTi4mJFRkZqwYIFGjx4sPbt26e4uDht3bpVXbt2lSRlZWXplltu0RdffCGXy6U5c+bod7/7ndxut4KCgux9L1++3D6+8/F4PAoLC1NxcbGcTmctHH314kf/sc7WBuqr3ClDfd1Crcif1MnXLQB+p1XGrjpd/0J+f9fo8tzkyZP15ptvasSIEZo6daqmTp2q+++/X3/5y1/04osv1qjp6rzzzjvq2rWr7rzzTrVs2VLXXnutXn31VXs+Ly9PbrdbiYmJ9lhYWJi6d++unJwcSVJOTo7Cw8PtwCRJiYmJCggI0ObNm+2aPn362IFJkpKSknTgwAGdOHGi1o4HAADUXzUKTfPmzVOHDh2qjF911VWaO3fuD26q0meffaY5c+aoXbt2+uCDD/TQQw/p0Ucf1cKFCyVJbrdbkqrclB4ZGWnPud1utWzZ0mu+QYMGatasmVdNdWucu49vO3PmjDwej9cGAAAuXTW6p8ntdis6OrrKeEREhP7zn//84KYqVVRUqGvXrnr++eclSddee612796tuXPn+vyDgSdPnqyJEyf6tAcAAHDx1OhMU0xMjDZu3FhlfOPGjXK5XD+4qUrR0dGKi4vzGuvYsaPy8/MlSVFRUZKkgoICr5qCggJ7LioqSoWFhV7zZ8+e1fHjx71qqlvj3H1827hx41RcXGxvR44cqckhAgCAeqJGoWnEiBEaNWqU3njjDR0+fFiHDx/W66+/rscff1wjRoyoteZ69uxpf7ZdpU8//VStW7eWJMXGxioqKkqrV6+25z0ejzZv3mw/+iAhIUFFRUXKzc21a9asWaOKigp1797drlm/fr3KysrsmuzsbLVv397rL/XOFRwcLKfT6bUBAIBLV40uz40ePVpffvmlHn74YZWWlkqSQkJCNHbsWI0bN67Wmnv88cd1ww036Pnnn9ddd92lLVu2aP78+Zo/f74kyeFwaNSoUXr22WfVrl07+5EDLpdLycnJkr45MzVgwACNGDFCc+fOVVlZmdLS0jR48GD7rNiQIUM0ceJEDR8+XGPHjtXu3bs1ffp0vfLKK7V2LAAAoH6rUWhyOBx68cUX9fTTT2vfvn0KDQ1Vu3btFBwcXKvNXX/99Vq2bJnGjRunSZMmKTY2VtOmTVNKSopdM2bMGJWUlGjkyJEqKipSr169lJWVZT+jSZIWLVqktLQ09evXTwEBARo0aJBmzJhhz4eFhWnVqlVKTU1VfHy8WrRooYyMDJ7RBAAAbDV+ThO88ZwmwHd4ThNw6ar3z2kCAAD4sSE0AQAAGCA0AQAAGCA0AQAAGCA0AQAAGCA0AQAAGCA0AQAAGCA0AQAAGCA0AQAAGCA0AQAAGCA0AQAAGCA0AQAAGCA0AQAAGCA0AQAAGCA0AQAAGCA0AQAAGCA0AQAAGCA0AQAAGCA0AQAAGCA0AQAAGCA0AQAAGCA0AQAAGCA0AQAAGCA0AQAAGCA0AQAAGCA0AQAAGCA0AQAAGCA0AQAAGCA0AQAAGCA0AQAAGCA0AQAAGCA0AQAAGCA0AQAAGCA0AQAAGCA0AQAAGCA0AQAAGCA0AQAAGCA0AQAAGCA0AQAAGCA0AQAAGCA0AQAAGCA0AQAAGCA0AQAAGCA0AQAAGCA0AQAAGCA0AQAAGKhXoemFF16Qw+HQqFGj7LHTp08rNTVVzZs3V+PGjTVo0CAVFBR4vS8/P18DBw5Uw4YN1bJlS40ePVpnz571qlm7dq2uu+46BQcH68orr9SCBQsuwhEBAID6ot6Epq1bt2revHm65pprvMYff/xxvfvuu1q6dKnWrVuno0eP6he/+IU9X15eroEDB6q0tFSbNm3SwoULtWDBAmVkZNg1eXl5GjhwoPr27asdO3Zo1KhRuv/++/XBBx9ctOMDAAD+rV6EppMnTyolJUWvvvqqmjZtao8XFxfrD3/4g15++WX99Kc/VXx8vN544w1t2rRJH330kSRp1apV2rt3r/785z+rS5cuuvnmm/XMM89o9uzZKi0tlSTNnTtXsbGxmjp1qjp27Ki0tDTdcccdeuWVV3xyvAAAwP/Ui9CUmpqqgQMHKjEx0Ws8NzdXZWVlXuMdOnRQq1atlJOTI0nKyclRp06dFBkZadckJSXJ4/Foz549ds23105KSrLXqM6ZM2fk8Xi8NgAAcOlq4OsGzufNN9/U9u3btXXr1ipzbrdbQUFBCg8P9xqPjIyU2+22a84NTJXzlXPfV+PxeHTq1CmFhoZW2ffkyZM1ceLEGh8XAACoX/z6TNORI0f02GOPadGiRQoJCfF1O17GjRun4uJiezty5IivWwIAAHXIr0NTbm6uCgsLdd1116lBgwZq0KCB1q1bpxkzZqhBgwaKjIxUaWmpioqKvN5XUFCgqKgoSVJUVFSVv6arfH2+GqfTWe1ZJkkKDg6W0+n02gAAwKXLr0NTv379tGvXLu3YscPeunbtqpSUFPvryy67TKtXr7bfc+DAAeXn5yshIUGSlJCQoF27dqmwsNCuyc7OltPpVFxcnF1z7hqVNZVrAAAA+PU9TU2aNNHVV1/tNdaoUSM1b97cHh8+fLjS09PVrFkzOZ1OPfLII0pISFCPHj0kSf3791dcXJx+/etfKzMzU263W0899ZRSU1MVHBwsSXrwwQc1a9YsjRkzRvfdd5/WrFmjv/71r1q5cuXFPWAAAOC3/Do0mXjllVcUEBCgQYMG6cyZM0pKStLvf/97ez4wMFArVqzQQw89pISEBDVq1EjDhg3TpEmT7JrY2FitXLlSjz/+uKZPn67LL79cr732mpKSknxxSAAAwA85LMuyfN3EpcDj8SgsLEzFxcV1en9T/Og/1tnaQH2VO2Wor1uoFfmTOvm6BcDvtMrYVafrX8jvb7++pwkAAMBfEJoAAAAMEJoAAAAMEJoAAAAMEJoAAA
AMEJoAAAAMEJoAAAAMEJoAAAAMEJoAAAAMEJoAAAAMEJoAAAAMEJoAAAAMEJoAAAAMEJoAAAAMEJoAAAAMEJoAAAAMEJoAAAAMEJoAAAAMEJoAAAAMEJoAAAAMEJoAAAAMEJoAAAAMEJoAAAAMEJoAAAAMEJoAAAAMEJoAAAAMEJoAAAAMEJoAAAAMEJoAAAAMEJoAAAAMEJoAAAAMEJoAAAAMEJoAAAAMEJoAAAAMEJoAAAAMEJoAAAAMEJoAAAAMEJoAAAAMEJoAAAAMEJoAAAAMEJoAAAAMEJoAAAAMEJoAAAAMEJoAAAAMEJoAAAAM+HVomjx5sq6//no1adJELVu2VHJysg4cOOBVc/r0aaWmpqp58+Zq3LixBg0apIKCAq+a/Px8DRw4UA0bNlTLli01evRonT171qtm7dq1uu666xQcHKwrr7xSCxYsqOvDAwAA9Yhfh6Z169YpNTVVH330kbKzs1VWVqb+/furpKTErnn88cf17rvvaunSpVq3bp2OHj2qX/ziF/Z8eXm5Bg4cqNLSUm3atEkLFy7UggULlJGRYdfk5eVp4MCB6tu3r3bs2KFRo0bp/vvv1wcffHBRjxcAAPgvh2VZlq+bMHXs2DG1bNlS69atU58+fVRcXKyIiAgtXrxYd9xxhyRp//796tixo3JyctSjRw+9//77uvXWW3X06FFFRkZKkubOnauxY8fq2LFjCgoK0tixY7Vy5Urt3r3b3tfgwYNVVFSkrKwso948Ho/CwsJUXFwsp9NZ+wf/v+JH/7HO1gbqq9wpQ33dQq3In9TJ1y0AfqdVxq46Xf9Cfn/79ZmmbysuLpYkNWvWTJKUm5ursrIyJSYm2jUdOnRQq1atlJOTI0nKyclRp06d7MAkSUlJSfJ4PNqzZ49dc+4alTWVa1TnzJkz8ng8XhsAALh01ZvQVFFRoVGjRqlnz566+uqrJUlut1tBQUEKDw/3qo2MjJTb7bZrzg1MlfOVc99X4/F4dOrUqWr7mTx5ssLCwuwtJibmBx8jAADwX/UmNKWmpmr37t168803fd2KJGncuHEqLi62tyNHjvi6JQAAUIca+LoBE2lpaVqxYoXWr1+vyy+/3B6PiopSaWmpioqKvM42FRQUKCoqyq7ZsmWL13qVf113bs23/+KuoKBATqdToaGh1fYUHBys4ODgH3xsAACgfvDrM02WZSktLU3Lli3TmjVrFBsb6zUfHx+vyy67TKtXr7bHDhw4oPz8fCUkJEiSEhIStGvXLhUWFto12dnZcjqdiouLs2vOXaOypnINAAAAvz7TlJqaqsWLF+vtt99WkyZN7HuQwsLCFBoaqrCwMA0fPlzp6elq1qyZnE6nHnnkESUkJKhHjx6SpP79+ysuLk6//vWvlZmZKbfbraeeekqpqan2maIHH3xQs2bN0pgxY3TfffdpzZo1+utf/6qVK1f67NgBAIB/8eszTXPmzFFxcbFuuukmRUdH29uSJUvsmldeeUW33nqrBg0apD59+igqKkp///vf7fnAwECtWLFCgYGBSkhI0K9+9SsNHTpUkyZNsmtiY2O1cuVKZWdnq3Pnzpo6dapee+01JSUlXdTjBQAA/qtePafJn/GcJsB3eE4TcOniOU0AAAD1DKEJAADAAKEJAADAAKEJAADAAKEJAADAAKEJAADAAKEJAADAAKEJAADAAKEJAADAAKEJAADAAKEJAADAAKEJAADAAKEJAADAAKEJAADAAKEJAADAAKEJAADAAKEJAADAAKEJAADAAKEJAADAAKEJAADAAKEJAADAAKEJAADAAKEJAADAAKEJAADAAKEJAADAAKEJAADAAKEJAADAAKEJAADAAKEJAADAAKEJAADAAKEJAADAAKEJAADAAKEJAADAAKEJAADAAKEJAADAAKEJAADAAKEJAADAAKEJAADAAKEJAADAAKEJAADAAKEJAADAAKEJAADAAKEJAADAAKEJAADAAKEJAADAAKHpW2bPnq02bdooJCRE3bt315YtW3zdEgAA8AOEpnMsWbJE6enpGj9+vLZv367OnTsrKSlJhYWFvm4NAAD4GKHpHC+//LJGjBihe++9V3FxcZo7d64aNmyo119/3detAQAAHyM0/a/S0lLl5uYqMTHRHgsICFBiYqJycnJ82BkAAPAHDXzdgL/473//q/LyckVGRnqNR0ZGav/+/VXqz5w5ozNnztivi4uLJUkej6dO+yw/c6pO1wfqo7r+ubtYvjpd7usWAL9T1z/fletblnXeWkJTDU2ePFkTJ06sMh4TE+ODboAft7CZD/q6BQB1ZXLYRdnNV199pbCw798Xoel/tWjRQoGBgSooKPAaLygoUFRUVJX6cePGKT093X5dUVGh48ePq3nz5nI4HHXeL3zL4/EoJiZGR44ckdPp9HU7AGoRP98/LpZl6auvvpLL5TpvLaHpfwUFBSk+Pl6rV69WcnKypG+C0OrVq5WWllalPjg4WMHBwV5j4eHhF6FT+BOn08k/qsAlip/vH4/znWGqRGg6R3p6uoYNG6auXbuqW7dumjZtmkpKSnTvvff6ujUAAOBjhKZz3H333Tp27JgyMjLkdrvVpUsXZWVlVbk5HAAA/PgQmr4lLS2t2stxwLmCg4M1fvz4KpdoAdR//Hzjuzgsk7+xAwAA+JHj4ZYAAAAGCE0AAAAGCE0AAAAGCE0AAAAGCE1ADcyePVtt2rRRSEiIunfvri1btvi6JQA/0Pr163XbbbfJ5XLJ4XBo+fLlvm4JfobQBFygJUuWKD09XePHj9f27dvVuXNnJSUlqbCw0NetAfgBSkpK1LlzZ82ePdvXrcBP8cgB4AJ1795d119/vWbNmiXpm4/biYmJ0SOPPKInn3zSx90BqA0Oh0PLli2zP1YLkDjTBFyQ0tJS5ebmKjEx0R4LCAhQYmKicnJyfNgZAKCuEZqAC/Df//5X5eXlVT5aJzIyUm6320ddAQAuBkITAACAAUITcAFatGihwMBAFRQUeI0XFBQoKirKR10BAC4GQhNwAYKCghQfH6/Vq1fbYxUVFVq9erUSEhJ82BkAoK418HUDQH2Tnp6uYcOGqWvXrurWrZumTZumkpIS3Xvvvb5uDcAPcPLkSf3rX/+yX+fl5WnHjh1q1qyZWrVq5cPO4C945ABQA7NmzdKUKVPkdrvVpUsXzZgxQ927d/d1WwB+gLVr16pv375VxocNG6YFCxZc/IbgdwhNAAAABrinCQAAwAChCQAAwAChCQAAwAChCQAAwAChCQAAwAChCQAAwAChCQAAwAChCcCPxk033aRRo0YZ1a5du1YOh0NFRUU/aJ9t2rTRtGnTftAaAPwDoQkAAMAAoQkAAMAAoQnAj9Kf/vQnde3aVU2aNFFUVJSGDBmiwsLCKnUbN27UNddco5CQEPXo0UO7d+/2mt+wYYN69+6t0NBQxcTE6NFHH1VJScnFOgwAFxGhCcCPUllZmZ555hl98sknWr58uT7//HPdc889VepGjx6tqVOnauvWrYqIiNBtt92msrIySdKhQ4c0YMAADRo0SDt37
tSSJUu0YcMGpaWlXeSjAXAxNPB1AwDgC/fdd5/99RVXXKEZM2bo+uuv18mTJ9W4cWN7bvz48frZz34mSVq4cKEuv/xyLVu2THfddZcmT56slJQU++bydu3aacaMGbrxxhs1Z84chYSEXNRjAlC3ONME4EcpNzdXt912m1q1aqUmTZroxhtvlCTl5+d71SUkJNhfN2vWTO3bt9e+ffskSZ988okWLFigxo0b21tSUpIqKiqUl5d38Q4GwEXBmSYAPzolJSVKSkpSUlKSFi1apIiICOXn5yspKUmlpaXG65w8eVIPPPCAHn300SpzrVq1qs2WAfgBQhOAH539+/fryy+/1AsvvKCYmBhJ0rZt26qt/eijj+wAdOLECX366afq2LGjJOm6667T3r17deWVV16cxgH4FJfnAPzotGrVSkFBQZo5c6Y+++wzvfPOO3rmmWeqrZ00aZJWr16t3bt365577lGLFi2UnJwsSRo7dqw2bdqktLQ07dixQwcPHtTbb7/NjeDAJYrQBOBHJyIiQgsWLNDSpUsVFxenF154QS+99FK1tS+88IIee+wxxcfHy+12691331VQUJAk6ZprrtG6dev06aefqnfv3rr22muVkZEhl8t1MQ8HwEXisCzL8nUTAAAA/o4zTQAAAAYITQAAAAYITQAAAAYITQAAAAYITQAAAAYITQAAAAYITQAAAAYITQAAAAYITQAAAAYITQAAAAYITQAAAAYITQAAAAb+P4ULftHNdw8wAAAAAElFTkSuQmCC\",\n      \"text/plain\": [\n       \"<Figure size 640x480 with 1 Axes>\"\n      ]\n     },\n     \"metadata\": {},\n     \"output_type\": \"display_data\"\n    }\n   ],\n   \"source\": [\n    \"# Data Exploration\\n\",\n    \"train_df = pd.DataFrame(dataset[\\\"train\\\"])\\n\",\n    \"sns.countplot(x='label', data=train_df)\\n\",\n    \"plt.title('Class distribution')\\n\",\n    \"plt.show()\"\n   ]\n  },\n  {\n   \"cell_type\": \"markdown\",\n   \"id\": \"bcb1fcbc-42df-4fa8-b371-ea7daf0914ae\",\n   \"metadata\": {},\n   \"source\": [\n    \"It is essential to highlight that in practical applications, encountering a balanced class distribution is rarely the case. If the dataset for your use-case demonstrates class imbalance, it is strongly recommended to implement suitable strategies prior to training the model. Such procedures can significantly enhance the model's performance.\"\n   ]\n  },\n  {\n   \"cell_type\": \"markdown\",\n   \"id\": \"d368794c-8a56-4761-8cb5-7d735c21bfa7\",\n   \"metadata\": {},\n   \"source\": [\n    \"** **\\n\",\n    \"Next, we preprocess our dataset by tokenizing the texts. We use BERT's tokenizer, which will convert the text into tokens that correspond to BERT's vocabulary.\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": 4,\n   \"id\": \"d8849a36-68be-41fa-8e8a-658c5114fcb7\",\n   \"metadata\": {},\n   \"outputs\": [\n    {\n     \"name\": \"stderr\",\n     \"output_type\": \"stream\",\n     \"text\": [\n      \"Loading cached processed dataset at /Users/skapadia/.cache/huggingface/datasets/imdb/plain_text/1.0.0/d613c88cf8fa3bab83b4ded3713f1f74830d1100e171db75bbddb80b3345c9c0/cache-8d3f672c6bd459c5.arrow\\n\",\n      \"Loading cached processed dataset at /Users/skapadia/.cache/huggingface/datasets/imdb/plain_text/1.0.0/d613c88cf8fa3bab83b4ded3713f1f74830d1100e171db75bbddb80b3345c9c0/cache-a224c74dea2a378a.arrow\\n\",\n      \"Loading cached processed dataset at /Users/skapadia/.cache/huggingface/datasets/imdb/plain_text/1.0.0/d613c88cf8fa3bab83b4ded3713f1f74830d1100e171db75bbddb80b3345c9c0/cache-3771e6c862b361d6.arrow\\n\"\n     ]\n    }\n   ],\n   \"source\": [\n    \"# Step 2: Preprocess the dataset\\n\",\n    \"def tokenize_function(examples):\\n\",\n    \"    return tokenizer(examples[\\\"text\\\"], padding=\\\"max_length\\\", truncation=True)\\n\",\n    \"\\n\",\n    \"tokenized_datasets = dataset.map(tokenize_function, batched=True)\"\n   ]\n  },\n  {\n   \"cell_type\": \"markdown\",\n   \"id\": \"c893c1a4-a3bc-4ed3-96b6-dcce961ff823\",\n   \"metadata\": {},\n   \"source\": [\n    \"** **\\n\",\n    \"After that, we prepare our training and evaluation datasets. 
Remember, if you want to use all the data, you can set the num_samples variable to -1.\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": 5,\n   \"id\": \"dbf75222-73e3-4e3b-b2ba-edf51e0a1083\",\n   \"metadata\": {},\n   \"outputs\": [\n    {\n     \"name\": \"stderr\",\n     \"output_type\": \"stream\",\n     \"text\": [\n      \"Loading cached shuffled indices for dataset at /Users/skapadia/.cache/huggingface/datasets/imdb/plain_text/1.0.0/d613c88cf8fa3bab83b4ded3713f1f74830d1100e171db75bbddb80b3345c9c0/cache-84712cace8bef640.arrow\\n\",\n      \"Loading cached shuffled indices for dataset at /Users/skapadia/.cache/huggingface/datasets/imdb/plain_text/1.0.0/d613c88cf8fa3bab83b4ded3713f1f74830d1100e171db75bbddb80b3345c9c0/cache-89ec35fd52b28a4f.arrow\\n\"\n     ]\n    }\n   ],\n   \"source\": [\n    \"if num_samples == -1:\\n\",\n    \"    small_train_dataset = tokenized_datasets[\\\"train\\\"].shuffle(seed=42)\\n\",\n    \"    small_eval_dataset = tokenized_datasets[\\\"test\\\"].shuffle(seed=42)\\n\",\n    \"else:\\n\",\n    \"    small_train_dataset = tokenized_datasets[\\\"train\\\"].shuffle(seed=42).select(range(num_samples)) \\n\",\n    \"    small_eval_dataset = tokenized_datasets[\\\"test\\\"].shuffle(seed=42).select(range(num_samples))\"\n   ]\n  },\n  {\n   \"cell_type\": \"markdown\",\n   \"id\": \"5cd50a71-f337-43ef-b7b5-e9e048c996ae\",\n   \"metadata\": {},\n   \"source\": [\n    \"** **\\n\",\n    \"Then, we load the pre-trained BERT model. We'll use the AutoModelForSequenceClassification class, which loads BERT with a classification head on top for classification tasks.\\n\",\n    \"\\n\",\n    \"For this tutorial, we use the 'bert-base-uncased' version of BERT, which is trained on lower-case English text.\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": 6,\n   \"id\": \"8b4b002a-35b6-4f34-b34d-4fc0cd5fcca6\",\n   \"metadata\": {},\n   \"outputs\": [\n    {\n     \"name\": \"stderr\",\n     \"output_type\": \"stream\",\n     \"text\": [\n      \"Some weights of the model checkpoint at bert-base-uncased were not used when initializing BertForSequenceClassification: ['cls.predictions.bias', 'cls.seq_relationship.bias', 'cls.predictions.transform.LayerNorm.bias', 'cls.predictions.transform.dense.weight', 'cls.predictions.transform.LayerNorm.weight', 'cls.seq_relationship.weight', 'cls.predictions.transform.dense.bias']\\n\",\n      \"- This IS expected if you are initializing BertForSequenceClassification from the checkpoint of a model trained on another task or with another architecture (e.g. 
initializing a BertForSequenceClassification model from a BertForPreTraining model).\\n\",\n      \"- This IS NOT expected if you are initializing BertForSequenceClassification from the checkpoint of a model that you expect to be exactly identical (initializing a BertForSequenceClassification model from a BertForSequenceClassification model).\\n\",\n      \"Some weights of BertForSequenceClassification were not initialized from the model checkpoint at bert-base-uncased and are newly initialized: ['classifier.weight', 'classifier.bias']\\n\",\n      \"You should probably TRAIN this model on a down-stream task to be able to use it for predictions and inference.\\n\"\n     ]\n    }\n   ],\n   \"source\": [\n    \"# Step 3: Load pre-trained model\\n\",\n    \"model = AutoModelForSequenceClassification.from_pretrained('bert-base-uncased', num_labels=2)\"\n   ]\n  },\n  {\n   \"cell_type\": \"markdown\",\n   \"id\": \"64b1ff16-44bc-4aec-a466-359927157eca\",\n   \"metadata\": {},\n   \"source\": [\n    \"** **\\n\",\n    \"Now, we're ready to define our training arguments and create a Trainer instance to train our model.\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": 7,\n   \"id\": \"77dc2392-8a97-445c-a53b-6afdbc8c09bc\",\n   \"metadata\": {},\n   \"outputs\": [\n    {\n     \"name\": \"stderr\",\n     \"output_type\": \"stream\",\n     \"text\": [\n      \"/Users/skapadia/Library/Caches/pypoetry/virtualenvs/transformers-models-r2uefcSj-py3.9/lib/python3.9/site-packages/transformers/optimization.py:411: FutureWarning: This implementation of AdamW is deprecated and will be removed in a future version. Use the PyTorch implementation torch.optim.AdamW instead, or set `no_deprecation_warning=True` to disable this warning\\n\",\n      \"  warnings.warn(\\n\"\n     ]\n    },\n    {\n     \"data\": {\n      \"text/html\": [\n       \"\\n\",\n       \"    <div>\\n\",\n       \"      \\n\",\n       \"      <progress value='130' max='130' style='width:300px; height:20px; vertical-align: middle;'></progress>\\n\",\n       \"      [130/130 16:51, Epoch 10/10]\\n\",\n       \"    </div>\\n\",\n       \"    <table border=\\\"1\\\" class=\\\"dataframe\\\">\\n\",\n       \"  <thead>\\n\",\n       \" <tr style=\\\"text-align: left;\\\">\\n\",\n       \"      <th>Epoch</th>\\n\",\n       \"      <th>Training Loss</th>\\n\",\n       \"      <th>Validation Loss</th>\\n\",\n       \"    </tr>\\n\",\n       \"  </thead>\\n\",\n       \"  <tbody>\\n\",\n       \"    <tr>\\n\",\n       \"      <td>1</td>\\n\",\n       \"      <td>No log</td>\\n\",\n       \"      <td>0.673399</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <td>2</td>\\n\",\n       \"      <td>No log</td>\\n\",\n       \"      <td>0.577147</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <td>3</td>\\n\",\n       \"      <td>No log</td>\\n\",\n       \"      <td>0.523433</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <td>4</td>\\n\",\n       \"      <td>No log</td>\\n\",\n       \"      <td>0.736292</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <td>5</td>\\n\",\n       \"      <td>No log</td>\\n\",\n       \"      <td>0.903130</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <td>6</td>\\n\",\n       \"      <td>No log</td>\\n\",\n       \"      <td>0.924082</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <td>7</td>\\n\",\n       \"      <td>No 
log</td>\\n\",\n       \"      <td>1.438573</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <td>8</td>\\n\",\n       \"      <td>No log</td>\\n\",\n       \"      <td>1.406258</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <td>9</td>\\n\",\n       \"      <td>No log</td>\\n\",\n       \"      <td>1.299035</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <td>10</td>\\n\",\n       \"      <td>No log</td>\\n\",\n       \"      <td>1.258621</td>\\n\",\n       \"    </tr>\\n\",\n       \"  </tbody>\\n\",\n       \"</table><p>\"\n      ],\n      \"text/plain\": [\n       \"<IPython.core.display.HTML object>\"\n      ]\n     },\n     \"metadata\": {},\n     \"output_type\": \"display_data\"\n    },\n    {\n     \"data\": {\n      \"text/plain\": [\n       \"TrainOutput(global_step=130, training_loss=0.1417579357440655, metrics={'train_runtime': 1018.0087, 'train_samples_per_second': 0.982, 'train_steps_per_second': 0.128, 'total_flos': 263111055360000.0, 'train_loss': 0.1417579357440655, 'epoch': 10.0})\"\n      ]\n     },\n     \"execution_count\": 7,\n     \"metadata\": {},\n     \"output_type\": \"execute_result\"\n    }\n   ],\n   \"source\": [\n    \"# Step 4: Define training arguments\\n\",\n    \"training_args = TrainingArguments(\\\"test_trainer\\\", evaluation_strategy=\\\"epoch\\\", no_cuda=True, num_train_epochs=num_epochs)\\n\",\n    \"\\n\",\n    \"# Step 5: Create Trainer instance and train\\n\",\n    \"trainer = Trainer(\\n\",\n    \"    model=model, args=training_args, train_dataset=small_train_dataset, eval_dataset=small_eval_dataset\\n\",\n    \")\\n\",\n    \"\\n\",\n    \"trainer.train()\"\n   ]\n  },\n  {\n   \"cell_type\": \"markdown\",\n   \"id\": \"2352319c-7bbb-4cc7-9cbc-c6872722394d\",\n   \"metadata\": {},\n   \"source\": [\n    \"** **\\n\",\n    \"#### Interpreting Results\\n\",\n    \"Having trained our model, let's evaluate it. 
We'll calculate the confusion matrix and the ROC curve to understand how well our model performs.\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": 8,\n   \"id\": \"54f7f593-b59f-40cc-9149-2cab6b9b2ff2\",\n   \"metadata\": {},\n   \"outputs\": [\n    {\n     \"data\": {\n      \"text/html\": [],\n      \"text/plain\": [\n       \"<IPython.core.display.HTML object>\"\n      ]\n     },\n     \"metadata\": {},\n     \"output_type\": \"display_data\"\n    },\n    {\n     \"data\": {\n      \"image/png\": \"iVBORw0KGgoAAAANSUhEUgAAAf8AAAGzCAYAAAAhax6pAAAAOXRFWHRTb2Z0d2FyZQBNYXRwbG90bGliIHZlcnNpb24zLjcuMSwgaHR0cHM6Ly9tYXRwbG90bGliLm9yZy/bCgiHAAAACXBIWXMAAA9hAAAPYQGoP6dpAAAteklEQVR4nO3deXQUZfb/8U+HJB2WLIRAFtlBWURgRMSIbLLECAxIXBAdA6LI/CKjxDUuI6B+m0GHRVnUGQYQiDo4Aw7OKLJIIhoUoxHckEAUFRIEJwkEaGK6fn946LErAdLQSbdd75enzjFPVddzK+fA5d56qtpmGIYhAABgGSH+DgAAANQvkj8AABZD8gcAwGJI/gAAWAzJHwAAiyH5AwBgMSR/AAAshuQPAIDFkPwBALAYkj/wC7t27dKwYcMUHR0tm82mNWvW+PT8X3/9tWw2m5YuXerT8/6aDRw4UAMHDvR3GIClkPwRcHbv3q077rhD7du3V0REhKKiotS3b1/NmzdPx44dq9O509PTtWPHDj355JNavny5Lrnkkjqdrz6NHz9eNptNUVFRNf4ed+3aJZvNJpvNpqefftrr8+/bt0/Tpk1TQUGBD6IFUJdC/R0A8Ev//ve/dd1118lut+uWW25Rt27ddOLECW3ZskX33XefPvvsM73wwgt1MvexY8eUl5enhx9+WHfeeWedzNGmTRsdO3ZMYWFhdXL+MwkNDdXRo0e1du1aXX/99R77Vq5cqYiICB0/fvyszr1v3z5Nnz5dbdu2Vc+ePWv9ubfeeuus5gNw9kj+CBhFRUUaO3as2rRpo02bNikxMdG9LyMjQ4WFhfr3v/9dZ/P/8MMPkqSYmJg6m8NmsykiIqLOzn8mdrtdffv21UsvvVQt+WdnZ2v48OH6xz/+US+xHD16VI0aNVJ4eHi9zAfgf2j7I2DMmjVLR44c0eLFiz0S/0kdO3bUXXfd5f75p59+0uOPP64OHTrIbrerbdu2euihh+R0Oj0+17ZtW40YMUJbtmzRpZdeqoiICLVv314vvvii+5hp06apTZs2kqT77rtPNptNbdu2lfRzu/zk///StGnTZLPZPMbWr1+vK664QjExMWrSpIk6deqkhx56yL3/VPf8N23apH79+qlx48aKiYnRqFGj9MUXX9Q4X2FhocaPH6+YmBhFR0drwoQJOnr06Kl/sSbjxo3TG2+8odLSUvfYtm3btGvXLo0bN67a8T/++KPuvfdeXXTRRWrSpImioqKUmpqqTz75xH3M5s2b1bt3b0nShAkT3LcPTl7nwIED1a1bN+Xn56t///5q1KiR+/divuefnp6uiIiIatefkpKipk2bat++fbW+VgA1I/kjYKxdu1bt27fX5ZdfXqvjb7vtNv3xj3/UxRdfrDlz5mjAgAFyOBwaO3ZstWMLCwt17bXXaujQofrzn/+spk2bavz48frss88kSWPGjNGcOXMkSTfeeKOWL1+uuXPnehX/Z599phEjRsjpdGrGjBn685//rN/+9rd69913T/u5DRs2KCUlRQcOHNC0adOUmZmp9957T3379tXXX39d7fjrr79ehw8flsPh0PXXX6+lS5dq+vTptY5zzJgxstls+uc//+key87OVufOnXXxxRdXO37Pnj1as2aNRowYodmzZ+u+++7Tjh07NGDAAHci7tKli2bMmCFJmjRpkpYvX67ly5erf//+7vMcOnRIqamp6tmzp+bOnatBgwbVGN+8efPUvHlzpaenq6qqSpL0/PPP66233tKzzz6rpKSkWl8rgFMwgABQVlZmSDJGjRpVq+MLCgoMScZtt93mMX7vvfcakoxNmza5x9q0aWNIMnJzc91jBw4cMOx2u3HPPfe4x4qKigxJxlNPPeVxzvT0dKNNmzbVYnjssceMX/4RmjNnjiHJ+OGHH04Z98k5lixZ4h7r2bOn0aJFC+PQoUPusU8++cQICQkxbrnllmrz3XrrrR7nvOaaa4xmzZqdcs5fXkfjxo0NwzCMa6+91hg8eLBhGIZRVVVlJCQkGNOnT6/xd3D8+HGjqqqq2nXY7XZjxowZ7rFt27ZVu7aTBgwYYEgynnvuuRr3DRgwwGNs3bp1hiTjiSeeMPbs2WM0adLEGD169BmvEUDtUPkjIJSXl0uSIiMja3X8f/7zH0lSZmamx/g999wjSdXWBnTt2lX9+vVz/9y8eXN16tRJe/bsOeuYzU6uFXjttdfkcrlq9Zn9+/eroKBA48ePV2xsrHu8e/fuGjp0qPs6f2ny5MkeP/fr10+HDh1y/w5rY9y4cdq8ebOKi4u1adMmFRcX19jyl35eJxAS8vNfFVVVVTp06JD7lsZHH31U6zntdrsmTJhQq2OHDRumO+64QzNmzNCYMWMUERGh559/vtZzATg9kj8CQlRUlCTp8OHDtTr+m2++UUhIiDp27OgxnpCQoJiYGH3zzTce461bt652jqZNm+q///3vWUZc3Q033KC+ffvqtttuU3x8vMaOHau///3vp/2HwMk4O3XqVG1fly5ddPDgQVVUVHiMm6+ladOmkuTVtVx99dWKjIzUK6+8opUrV6p3797VfpcnuVwuzZkzR+eff77sdrvi4uLUvHlzbd++XWVlZbWe87zzzvNqcd/TTz+t2NhYFRQU6JlnnlGLFi1q/VkAp0fyR0CIiopSUlKSPv30U68+Z15wdyoNGjSocdwwjLOe4+T96JMaNmyo3NxcbdiwQb/73e+0fft23XDDDRo6dGi1Y8/FuVzLSXa7XWPGjNGyZcu0evXqU1b9kvR///d/yszMVP/+/bVixQqtW7dO69ev14UXXljrDof08+/HGx9//LEOHDggSdqxY4dXnwVweiR/BIwRI0Zo9+7dysvLO+Oxbdq0kcvl0q5duzzGS0pKVFpa6l657wtNmzb1WBl/krm7IEkhISEaPHiwZs+erc8//1xPPvmkNm3apLfffrvGc5+Mc+fOndX2ffnll4qLi1Pjxo3P7QJOYdy4cfr44491+PDhGhdJnvTq
q69q0KBBWrx4scaOHathw4ZpyJAh1X4ntf2HWG1UVFRowoQJ6tq1qyZNmqRZs2Zp27ZtPjs/YHUkfwSM+++/X40bN9Ztt92mkpKSavt3796tefPmSfq5bS2p2or82bNnS5KGDx/us7g6dOigsrIybd++3T22f/9+rV692uO4H3/8sdpnT77sxvz44UmJiYnq2bOnli1b5pFMP/30U7311lvu66wLgwYN0uOPP6758+crISHhlMc1aNCgWldh1apV+v777z3GTv4jpaZ/KHnrgQce0N69e7Vs2TLNnj1bbdu2VXp6+il/jwC8w0t+EDA6dOig7Oxs3XDDDerSpYvHG/7ee+89rVq1SuPHj5ck9ejRQ+np6XrhhRdUWlqqAQMG6IMPPtCyZcs0evToUz5GdjbGjh2rBx54QNdcc43+8Ic/6OjRo1q0aJEuuOACjwVvM2bMUG5uroYPH642bdrowIEDWrhwoVq2bKkrrrjilOd/6qmnlJqaquTkZE2cOFHHjh3Ts88+q+joaE2bNs1n12EWEhKiRx555IzHjRgxQjNmzNCECRN0+eWXa8eOHVq5cqXat2/vcVyHDh0UExOj5557TpGRkWrcuLH69Omjdu3aeRXXpk2btHDhQj322GPuRw+XLFmigQMH6tFHH9WsWbO8Oh+AGvj5aQOgmq+++sq4/fbbjbZt2xrh4eFGZGSk0bdvX+PZZ581jh8/7j6usrLSmD59utGuXTsjLCzMaNWqlZGVleVxjGH8/Kjf8OHDq81jfsTsVI/6GYZhvPXWW0a3bt2M8PBwo1OnTsaKFSuqPeq3ceNGY9SoUUZSUpIRHh5uJCUlGTfeeKPx1VdfVZvD/Djchg0bjL59+xoNGzY0oqKijJEjRxqff/65xzEn5zM/SrhkyRJDklFUVHTK36lheD7qdyqnetTvnnvuMRITE42GDRsaffv2NfLy8mp8RO+1114zunbtaoSGhnpc54ABA4wLL7ywxjl/eZ7y8nKjTZs2xsUXX2xUVlZ6HDd16lQjJCTEyMvLO+01ADgzm2F4sUoIAAD86nHPHwAAiyH5AwBgMSR/AAAshuQPAIDFkPwBALAYkj8AABZD8gcAwGIC5g1/R5+61d8hAAHnpXkn/B0CEJAmfreiTs9fedB3X/cdFtf+zAfVs4BJ/gAABAyX776JMxDR9gcAwGKo/AEAMDNc/o6gTpH8AQAwc5H8AQCwFCPIK3/u+QMAYDFU/gAAmNH2BwDAYmj7AwCAYELlDwCAWZC/5IfkDwCAGW1/AAAQTKj8AQAwY7U/AADWwkt+AABAUKHyBwDAjLY/AAAWE+Rtf5I/AABmQf6cP/f8AQCwGCp/AADMaPsDAGAxQb7gj7Y/AAAWQ+UPAIAZbX8AACyGtj8AAAgmVP4AAJgYRnA/50/yBwDALMjv+dP2BwDAYqj8AQAwC/IFfyR/AADMgrztT/IHAMCML/YBAAD1bebMmbLZbLr77rvdY8ePH1dGRoaaNWumJk2aKC0tTSUlJV6fm+QPAICZ4fLddha2bdum559/Xt27d/cYnzp1qtauXatVq1YpJydH+/bt05gxY7w+P8kfAAAzl8t3m5eOHDmim266SX/5y1/UtGlT93hZWZkWL16s2bNn68orr1SvXr20ZMkSvffee9q6datXc5D8AQCoQ06nU+Xl5R6b0+k85fEZGRkaPny4hgwZ4jGen5+vyspKj/HOnTurdevWysvL8yomkj8AAGY+bPs7HA5FR0d7bA6Ho8ZpX375ZX300Uc17i8uLlZ4eLhiYmI8xuPj41VcXOzV5bHaHwAAMx8+55+VlaXMzEyPMbvdXu24b7/9VnfddZfWr1+viIgIn81fE5I/AAB1yG6315jszfLz83XgwAFdfPHF7rGqqirl5uZq/vz5WrdunU6cOKHS0lKP6r+kpEQJCQlexUTyBwDAzA9v+Bs8eLB27NjhMTZhwgR17txZDzzwgFq1aqWwsDBt3LhRaWlpkqSdO3dq7969Sk5O9moukj8AACb++Fa/yMhIdevWzWOscePGatasmXt84sSJyszMVGxsrKKiojRlyhQlJyfrsssu82oukj8AAL8Sc+bMUUhIiNLS0uR0OpWSkqKFCxd6fR6SPwAAZgHyxT6bN2/2+DkiIkILFizQggULzum8JH8AAMz4Yh8AACwmQCr/usJLfgAAsBgqfwAAzGj7AwBgMbT9AQBAMKHyBwDAjLY/AAAWQ9sfAAAEEyp/AADMgrzyJ/kDAGAW5Pf8afsDAGAxVP4AAJjR9gcAwGKCvO1P8gcAwCzIK3/u+QMAYDFU/gAAmNH2BwDAYmj7AwCAYELlDwCAWZBX/iR/AADMDMPfEdQp2v4AAFgMlT8AAGa0/QEAsJggT/60/QEAsBgqfwAAzHjJDwAAFhPkbX+SPwAAZjzqBwAAggmVPwAAZrT9AQCwmCBP/rT9AQCwGCp/AADMeNQPAABrMVys9gcAAPVg0aJF6t69u6KiohQVFaXk5GS98cYb7v0DBw6UzWbz2CZPnuz1PFT+AACY+WnBX8uWLTVz5kydf/75MgxDy5Yt06hRo/Txxx/rwgsvlCTdfvvtmjFjhvszjRo18noekj8AAGZ+uuc/cuRIj5+ffPJJLVq0SFu3bnUn/0aNGikhIeGc5qHtDwBAHXI6nSovL/fYnE7nGT9XVVWll19+WRUVFUpOTnaPr1y5UnFxcerWrZuysrJ09OhRr2Mi+QMAYOYyfLY5HA5FR0d7bA6H45RT79ixQ02aNJHdbtfkyZO1evVqde3aVZI0btw4rVixQm+//baysrK0fPly3XzzzV5fns0wAuMFxkefutXfIQAB56V5J/wdAhCQJn63ok7Pf/TZ/+ezczWYNKdapW+322W322s8/sSJE9q7d6/Kysr06quv6q9//atycnLc/wD4pU2bNmnw4MEqLCxUhw4dah0T9/wBADDz4YK/0yX6moSHh6tjx46SpF69emnbtm2aN2+enn/++WrH9unTR5K8Tv60/QEACGAul+uUawQKCgokSYmJiV6dk8ofAAAzP90Rz8rKUmpqqlq3bq3Dhw8rOztbmzdv1rp167R7925lZ2fr6quvVrNmzbR9+3ZNnTpV/fv3V/fu3b2ah+QPAICZn57zP3DggG655Rbt379f0dHR6t69u9atW6ehQ4fq22+/1YYNGzR37lxVVFSoVatWSktL0yOPPOL1PCR/CwrtOVChPQfJFhUnSXId+l6V762Vq2hHtWPtaVPVoP1Fcq5+VlWFH9d3qEC9SujTSRdNHq5mF7VT44Sm2jBxjr5Zl+/eHxEXpd4PjdV5/S+SPbqRit/fqbxHl6m8qMSPUSOYLF68+JT7WrVqpZycHJ/Mwz1/CzIO/1cncl7V8Ren6/jyGXJ986Xs10yRrVmSx3GhvYZKCoiHQYB6EdrIrh8/36u8R5bVuH/o4qmKat1CGybO0ZqUR3Tku4NKfSlLoQ1rv5gLvxI+fNQvEJH8Lahq9ydyFe2QUXpAxn9LVLnln9KJ4wpJ+t9KUVuLVgrtnSLnm3/zY6RA/fr
u7e3Kf+pVffPmh9X2RbVLUIte5+vdh5bo4Cd7VLZnv97NWqIGEWFqPzq5hrPhV81w+W4LQF63/Q8ePKi//e1vysvLU3FxsSQpISFBl19+ucaPH6/mzZv7PEjUIZtNDTr1lsLscu3b/fNYaLjsw+9Q5YYVUkW5f+MDAkQD+89/XVY5K/83aBiqOvGT4ntfoK9e2uyfwICz4FXy37Ztm1JSUtSoUSMNGTJEF1xwgSSppKREzzzzjGbOnKl169bpkksuOe15nE5ntccWqn6qkj20gZfh42zZ4s5TxE0PS6Fh0gmnnGvmyzi0T5IUduVYufYVqqqwwL9BAgGktHC/jnx3UJc8eIPefXCxfjrqVLfbU9UkqZkatYjxd3jwtQBt1/uKV8l/ypQpuu666/Tcc8/JZrN57DMMQ5MnT9aUKVOUl5d32vM4HA5Nnz7dY+yhIT318LDfeBMOzoHxY7GOL5sm2Rsq9IJLZL/6Nh1/+U8KiWmhBq27/LwPgJvxU5U23D5X/Z6+Xb/77AW5fqrSvi2f6dtNBZLp70P8+hl+Wu1fX7xK/p988omWLl1aLfFLks1m09SpU/Wb35w5gWdlZSkzM9NjrGrBFG9CwblyVckoPSBJqiz5RiGJ7RTaa4hUWSlbTHM1/MN8j8PDR2XI9d1Xcr4yyx/RAgHh0I6vtSblYYVFNlSDsFAd//GwRq6dpoOfFPk7NMArXiX/hIQEffDBB+rcuXON+z/44APFx8ef8Tw1verwKC1/P7PJ1iBUJ95do5925HrsaTjhcVW+/bKqdhf4JzQgwFQePqZKSVHt4hXXvb0+eupVf4cEX6Pt/z/33nuvJk2apPz8fA0ePNid6EtKSrRx40b95S9/0dNPP10ngcJ3wvqlqapoh4zyQ1J4hEK7XKaQ1p3kXDVbqiiXUcMiP6P8kIyyg36IFqg/oY3simr7vwKmSavmiu3aWs7SClXsO6S2wy/V8R8Pq+L7g2rauZUum/47fbPuQ32f+6kfo0adCNBV+r7iVfLPyMhQXFyc5syZo4ULF6qqqkqS1KBBA/Xq1UtLly7V9ddfXyeBwndsjaIUfvVtsjWOlpzH5Dr4nZyrZsv1zef+Dg3wq7ge7TV81cPuny+b9vNXpX7191y9k/mCGsXHqM9jN6lhXLSOHSjVrle3qGDean+Fi7oU5JX/WX+lb2VlpQ4e/LkSjIuLU1hY2DkFwlf6AtXxlb5Azer6K30rZtzks3M1/uNKn53LV8769b5hYWFef4sQAAC/Cqz2BwDAYoK87c/rfQEAsBgqfwAAzFjtDwCAxdD2BwAAwYTKHwAAE97tDwCA1dD2BwAAwYTKHwAAsyCv/En+AACY8agfAAAWE+SVP/f8AQCwGCp/AABMjCCv/En+AACYBXnyp+0PAIDFUPkDAGDGG/4AALAY2v4AACCYUPkDAGAW5JU/yR8AABPDCO7kT9sfAACLofIHAMCMtj8AABYT5Mmftj8AACaGy/DZ5o1Fixape/fuioqKUlRUlJKTk/XGG2+49x8/flwZGRlq1qyZmjRporS0NJWUlHh9fSR/AAACRMuWLTVz5kzl5+frww8/1JVXXqlRo0bps88+kyRNnTpVa9eu1apVq5STk6N9+/ZpzJgxXs9D2x8AADM/tf1Hjhzp8fOTTz6pRYsWaevWrWrZsqUWL16s7OxsXXnllZKkJUuWqEuXLtq6dasuu+yyWs9D8gcAwMyHb/d1Op1yOp0eY3a7XXa7/bSfq6qq0qpVq1RRUaHk5GTl5+ersrJSQ4YMcR/TuXNntW7dWnl5eV4lf9r+AADUIYfDoejoaI/N4XCc8vgdO3aoSZMmstvtmjx5slavXq2uXbuquLhY4eHhiomJ8Tg+Pj5excXFXsVE5Q8AgIm3C/VOJysrS5mZmR5jp6v6O3XqpIKCApWVlenVV19Venq6cnJyfBaPRPIHAKA6Hyb/2rT4fyk8PFwdO3aUJPXq1Uvbtm3TvHnzdMMNN+jEiRMqLS31qP5LSkqUkJDgVUy0/QEACGAul0tOp1O9evVSWFiYNm7c6N63c+dO7d27V8nJyV6dk8ofAAAzHy7480ZWVpZSU1PVunVrHT58WNnZ2dq8ebPWrVun6OhoTZw4UZmZmYqNjVVUVJSmTJmi5ORkrxb7SSR/AACq8eU9f28cOHBAt9xyi/bv36/o6Gh1795d69at09ChQyVJc+bMUUhIiNLS0uR0OpWSkqKFCxd6PQ/JHwCAALF48eLT7o+IiNCCBQu0YMGCc5qH5A8AgJmf2v71heQPAICJv9r+9YXkDwCAWZBX/jzqBwCAxVD5AwBgYgR55U/yBwDALMiTP21/AAAshsofAAAT2v4AAFhNkCd/2v4AAFgMlT8AACa0/QEAsBiSPwAAFhPsyZ97/gAAWAyVPwAAZobN3xHUKZI/AAAmtP0BAEBQofIHAMDEcNH2BwDAUmj7AwCAoELlDwCAicFqfwAArIW2PwAACCpU/gAAmLDaHwAAizEMf0dQt0j+AACYBHvlzz1/AAAshsofAACTYK/8Sf4AAJgE+z1/2v4AAFgMlT8AACa0/QEAsJhgf70vbX8AACyGyh8AABPe7Q8AgMW4DJvPNm84HA717t1bkZGRatGihUaPHq2dO3d6HDNw4EDZbDaPbfLkyV7NQ/IHACBA5OTkKCMjQ1u3btX69etVWVmpYcOGqaKiwuO422+/Xfv373dvs2bN8moe2v4AAJj4a8Hfm2++6fHz0qVL1aJFC+Xn56t///7u8UaNGikhIeGs56HyBwDAxHDZfLY5nU6Vl5d7bE6ns1ZxlJWVSZJiY2M9xleuXKm4uDh169ZNWVlZOnr0qFfXR/IHAMDEMHy3ORwORUdHe2wOh+OMMbhcLt19993q27evunXr5h4fN26cVqxYobfffltZWVlavny5br75Zq+uj7Y/AAB1KCsrS5mZmR5jdrv9jJ/LyMjQp59+qi1btniMT5o0yf3/F110kRITEzV48GDt3r1bHTp0qFVMJH8AAEx8+YY/u91eq2T/S3feeadef/115ebmqmXLlqc9tk+fPpKkwsJCkj8AAGfL20f0fMUwDE2ZMkWrV6/W5s2b1a5duzN+pqCgQJKUmJhY63lI/gAABIiMjAxlZ2frtddeU2RkpIqLiyVJ0dHRatiwoXbv3q3s7GxdffXVatasmbZv366pU6eqf//+6t69e63nIfkDAGDir0f9Fi1aJOnnF/n80pIlSzR+/HiFh4drw4YNmjt3rioqKtSqVSulpaXpkUce8Woekj8AACaG4a95Tz9xq1atlJOTc87z8KgfAAAWQ+UPAICJvxb81ReSPwAAJv66519faPsDAGAxVP4AAJj4a8FffSH5AwBgwj3/ehL18Dp/hwAEnGP73vF3CIAlcc8fAAAElYCp/AEACBS0/QEAsJggX+9H2x8AAKuh8gcAwIS2PwAAFsNqfwAAEFSo/AEAMHH5O4A6RvIHAMDEEG1/AAAQRKj8AQAwcQX5g/4kfwAATFxB3vYn+QMAYMI9fwAAEFSo/AEAMOFRPwAALIa2Pw
AACCpU/gAAmND2BwDAYoI9+dP2BwDAYqj8AQAwCfYFfyR/AABMXMGd+2n7AwBgNVT+AACY8G5/AAAsJsi/1I/kDwCAGY/6AQCAoELyBwDAxGWz+WzzhsPhUO/evRUZGakWLVpo9OjR2rlzp8cxx48fV0ZGhpo1a6YmTZooLS1NJSUlXs1D8gcAwMTw4eaNnJwcZWRkaOvWrVq/fr0qKys1bNgwVVRUuI+ZOnWq1q5dq1WrViknJ0f79u3TmDFjvJrHZhhGQKxrCA0/z98hAAHn2L53/B0CEJDC4trX6flXJd7ks3Ndt3/lWX/2hx9+UIsWLZSTk6P+/furrKxMzZs3V3Z2tq699lpJ0pdffqkuXbooLy9Pl112Wa3OS+UPAICJy4eb0+lUeXm5x+Z0OmsVR1lZmSQpNjZWkpSfn6/KykoNGTLEfUznzp3VunVr5eXl1fr6SP4AAJi4bL7bHA6HoqOjPTaHw3HmGFwu3X333erbt6+6desmSSouLlZ4eLhiYmI8jo2Pj1dxcXGtr49H/QAAqENZWVnKzMz0GLPb7Wf8XEZGhj799FNt2bLF5zGR/AEAMPHlG/7sdnutkv0v3XnnnXr99deVm5urli1buscTEhJ04sQJlZaWelT/JSUlSkhIqPX5afsDAGDir9X+hmHozjvv1OrVq7Vp0ya1a9fOY3+vXr0UFhamjRs3usd27typvXv3Kjk5udbzUPkDABAgMjIylJ2drddee02RkZHu+/jR0dFq2LChoqOjNXHiRGVmZio2NlZRUVGaMmWKkpOTa73SXyL5AwBQjb++0nfRokWSpIEDB3qML1myROPHj5ckzZkzRyEhIUpLS5PT6VRKSooWLlzo1Tw85w8EMJ7zB2pW18/5Lz3vZp+da/z3K3x2Ll+h8gcAwCQgquI6xII/AAAshsofAAATf93zry8kfwAATFz+DqCO0fYHAMBiqPwBADAJ9sqf5A8AgIkR5Pf8afsDAGAxVP4AAJjQ9gcAwGKCPfnT9gcAwGKo/AEAMAn21/uS/AEAMOENfwAAWAz3/AEAQFCh8gcAwCTYK3+SPwAAJsG+4I+2PwAAFkPlDwCACav9AQCwmGC/50/bHwAAi6HyBwDAJNgX/JH8AQAwcQV5+qftDwCAxVD5AwBgEuwL/kj+AACYBHfTn+QPAEA1wV75c88fAACLofIHAMCEN/wBAGAxPOoHAACCCpU/AAAmwV33k/wBAKiG1f4AACCokPwBADBxyfDZ5o3c3FyNHDlSSUlJstlsWrNmjcf+8ePHy2azeWxXXXWV19dH8gcAwMTw4eaNiooK9ejRQwsWLDjlMVdddZX279/v3l566SUvZ+GePwAAASM1NVWpqamnPcZutyshIeGc5qHyBwDAxOXDzel0qry83GNzOp1nHdvmzZvVokULderUSb///e916NAhr89B8gcAwMSX9/wdDoeio6M9NofDcVZxXXXVVXrxxRe1ceNG/elPf1JOTo5SU1NVVVXl1Xlo+wMAYOLL5/yzsrKUmZnpMWa328/qXGPHjnX//0UXXaTu3burQ4cO2rx5swYPHlzr81D5AwBQh+x2u6Kiojy2s03+Zu3bt1dcXJwKCwu9+hyVPwAAJr+Wl/x89913OnTokBITE736HMkfAAATw08v+D1y5IhHFV9UVKSCggLFxsYqNjZW06dPV1pamhISErR7927df//96tixo1JSUryah+QPAECA+PDDDzVo0CD3zyfXCqSnp2vRokXavn27li1bptLSUiUlJWnYsGF6/PHHvb6NQPIHAMDEX23/gQMHyjBO3XVYt26dT+Yh+QMAYOLta3l/bVjtDwCAxVD5AwBgEtx1P8kfAIBqaPvDEu6YdIs+yl+vHw9+qR8Pfqktuf/SVSmDzvxBIEj9dfnf1a1vqmbOfa7aPsMwNPmeR9Wtb6o25r7nh+iAc0PlD0nS99/v18MPO7SrsEg2m023/O46/fMff9Mll6bo88+/8nd4QL3a8cVOrXrtP7qgY7sa9y9/ZY1s9RwT6tev5SU/Z4vKH5Kk1/+9Xm+8uUmFhUXatWuPHv3jn3TkSIX6XHqxv0MD6tXRo8f04PSnNO2BuxQV2aTa/i+/2q1lL/9Djz801Q/Rob4YPvwvEJH8UU1ISIiuv/63aty4kba+n+/vcIB69cSfF6h/cm8l9/5NtX3Hjh/X/dP/pIfvyVBcs1g/RIf64suv9A1EPk/+3377rW699dbTHlPTdxuf7qUGqB/dunVW6Y9f6eiRIi2cP1PXXnebvvhil7/DAurNfzZs1hdf7dbdkyfUuH/WMy+oZ7euurJfcj1HBviWz5P/jz/+qGXLlp32mJq+29hwHfZ1KPDSzp271av3MF3ed4Sef+FF/W3xXHXpcr6/wwLqxf6SHzRz7vOa+dj9stvDq+1/+52tej//Ez141x1+iA71Ldjb/jbDy5L7X//612n379mzR/fcc4+qqqpOeYzT6ZTT6fQYa9qss2w2ltAEknVvvKzde77R/8t4wN+hWNaxfe/4OwTL2Jj7nu7KelwNGvyvJqqqcslmsykkxKYbRg/XS/98XSEhNo/9ISEhurjHhVo6f5Y/wrassLj2dXr+9LZpPjvXsq//4bNz+YrXq/1Hjx4tm8122jb9mZK43W6v9iUEJP7AExISUmMFBASjy3r11OrlizzGHnlyttq1aaWJN1+nptFRum701R77r/nd73X/HyZpYN8+9RkqcM68Tv6JiYlauHChRo0aVeP+goIC9erV65wDQ/168okH9eabb2vvt98rMrKJbhw7WgMGJOvq4eP8HRpQLxo3bqTz27f1GGvYMEIxUZHu8ZoW+SXGN1fLpIR6iBD1yRXk69C8Tv69evVSfn7+KZP/mboCCEzNm8dpyd/mKTGxhcrKDmvHji909fBx2rCRtjMA6wn2LOb1Pf933nlHFRUVuuqqq2rcX1FRoQ8//FADBgzwKpDQ8PO8Oh6wAu75AzWr63v+N7cZ47Nzrfjmnz47l694Xfn369fvtPsbN27sdeIHACCQBPu7/Xm9LwAAJoH6iJ6v8IY/AAAshsofAACTQH0tr6+Q/AEAMOGePwAAFsM9fwAAEFSo/AEAMOGePwAAFhPsb6ql7Q8AgMVQ+QMAYMJqfwAALCbY7/nT9gcAwGKo/AEAMAn25/xJ/gAAmAT7PX/a/gAAWAyVPwAAJsH+nD/JHwAAk2Bf7U/yBwDAJNgX/HHPHwCAAJGbm6uRI0cqKSlJNptNa9as8dhvGIb++Mc/KjExUQ0bNtSQIUO0a9cur+ch+QMAYOKS4bPNGxUVFerRo4cWLFhQ4/5Zs2bpmWee0XPPPaf3339fjRs3VkpKio4fP+7VPLT9AQAw8deCv9TUVKWmpta4zzAMzZ07V4888ohGjRolSXrxxRcVHx+vNWvWaOzYsbWeh8ofAIA65HQ6VV5e7rE5nU6vz1NUVKTi4mINGTLEPRYdHa0+ffooLy/Pq3OR/AEAMPFl29/hcCg6OtpjczgcXsdUXFwsSYqPj/cYj4+Pd++rLdr+AACY+HK1f1ZWl
jIzMz3G7Ha7z85/Nkj+AADUIbvd7pNkn5CQIEkqKSlRYmKie7ykpEQ9e/b06ly0/QEAMHEZhs82X2nXrp0SEhK0ceNG91h5ebnef/99JScne3UuKn8AAEz89YqfI0eOqLCw0P1zUVGRCgoKFBsbq9atW+vuu+/WE088ofPPP1/t2rXTo48+qqSkJI0ePdqreUj+AAAEiA8//FCDBg1y/3xyrUB6erqWLl2q+++/XxUVFZo0aZJKS0t1xRVX6M0331RERIRX89iMAPn2gtDw8/wdAhBwju17x98hAAEpLK59nZ6/73lX+uxc736/yWfn8hUqfwAATLx9M9+vDckfAACTAGmK1xlW+wMAYDFU/gAAmND2BwDAYnz5hr9ARNsfAACLofIHAMAk2Bf8kfwBADAJ9nv+tP0BALAYKn8AAExo+wMAYDG0/QEAQFCh8gcAwCTYn/Mn+QMAYOLinj8AANYS7JU/9/wBALAYKn8AAExo+wMAYDG0/QEAQFCh8gcAwIS2PwAAFkPbHwAABBUqfwAATGj7AwBgMbT9AQBAUKHyBwDAxDBc/g6hTpH8AQAwcQV525/kDwCAiRHkC/645w8AgMVQ+QMAYELbHwAAi6HtDwAAggqVPwAAJrzhDwAAi+ENfwAAoF5MmzZNNpvNY+vcubPP56HyBwDAxJ8L/i688EJt2LDB/XNoqO9TNckfAAATfz7qFxoaqoSEhDqdg7Y/AAB1yOl0qry83GNzOp2nPH7Xrl1KSkpS+/btddNNN2nv3r0+j4nkDwCAiWEYPtscDoeio6M9NofDUeO8ffr00dKlS/Xmm29q0aJFKioqUr9+/XT48GGfXp/NCJA3GYSGn+fvEICAc2zfO/4OAQhIYXHt6/T8sZHn++xc+w9+Wq3St9vtstvtZ/xsaWmp2rRpo9mzZ2vixIk+i4l7/gAAmPiyLq5toq9JTEyMLrjgAhUWFvosHom2PwAAAevIkSPavXu3EhMTfXpekj8AACYuGT7bvHHvvfcqJydHX3/9td577z1dc801atCggW688UafXh9tfwAATPy1HO67777TjTfeqEOHDql58+a64oortHXrVjVv3tyn85D8AQAIEC+//HK9zEPyBwDAhC/2AQDAYvhiHwAAEFSo/AEAMKHtDwCAxQTIy2/rDG1/AAAshsofAACTYF/wR/IHAMAk2Nv+JH8AAEyCPflzzx8AAIuh8gcAwCS4637JZgR7bwNecTqdcjgcysrKOuvvnwaCDX8uEGxI/vBQXl6u6OholZWVKSoqyt/hAAGBPxcINtzzBwDAYkj+AABYDMkfAACLIfnDg91u12OPPcaiJuAX+HOBYMOCPwAALIbKHwAAiyH5AwBgMSR/AAAshuQPAIDFkPwBALAYkj/cFixYoLZt2yoiIkJ9+vTRBx984O+QAL/Kzc3VyJEjlZSUJJvNpjVr1vg7JMAnSP6QJL3yyivKzMzUY489po8++kg9evRQSkqKDhw44O/QAL+pqKhQjx49tGDBAn+HAvgUz/lDktSnTx/17t1b8+fPlyS5XC61atVKU6ZM0YMPPujn6AD/s9lsWr16tUaPHu3vUIBzRuUPnThxQvn5+RoyZIh7LCQkREOGDFFeXp4fIwMA1AWSP3Tw4EFVVVUpPj7eYzw+Pl7FxcV+igoAUFdI/gAAWAzJH4qLi1ODBg1UUlLiMV5SUqKEhAQ/RQUAqCskfyg8PFy9evXSxo0b3WMul0sbN25UcnKyHyMDANSFUH8HgMCQmZmp9PR0XXLJJbr00ks1d+5cVVRUaMKECf4ODfCbI0eOqLCw0P1zUVGRCgoKFBsbq9atW/sxMuDc8Kgf3ObPn6+nnnpKxcXF6tmzp5555hn16dPH32EBfrN582YNGjSo2nh6erqWLl1a/wEBPkLyBwDAYrjnDwCAxZD8AQCwGJI/AAAWQ/IHAMBiSP4AAFgMyR8AAIsh+QMAYDEkfwAALIbkDwCAxZD8AQCwGJI/AAAW8/8BiCgtkn5k6EMAAAAASUVORK5CYII=\",\n      \"text/plain\": [\n       \"<Figure size 640x480 with 2 Axes>\"\n      ]\n     },\n     \"metadata\": {},\n     \"output_type\": \"display_data\"\n    },\n    {\n     \"data\": {\n      \"image/png\": 
\"iVBORw0KGgoAAAANSUhEUgAAAsUAAAHWCAYAAACfYfSwAAAAOXRFWHRTb2Z0d2FyZQBNYXRwbG90bGliIHZlcnNpb24zLjcuMSwgaHR0cHM6Ly9tYXRwbG90bGliLm9yZy/bCgiHAAAACXBIWXMAAA9hAAAPYQGoP6dpAAB8oklEQVR4nO3dd1RU19oG8GfoHUGkimIvERsqwRIsRDCJsUWJvZfYxYYN7NjFbkxiidFYk2gsYIldY8cu2LsoIlXqzP7+8PN4J4AyOHAoz28t1r3znjLPcCC+7Nlnj0IIIUBEREREVITpyB2AiIiIiEhubIqJiIiIqMhjU0xERERERR6bYiIiIiIq8tgUExEREVGRx6aYiIiIiIo8NsVEREREVOSxKSYiIiKiIo9NMREREREVeWyKiShXuLi4oEePHnLHKBJ69OgBFxcXuWNkqXHjxqhWrZrcMfKdw4cPQ6FQ4PDhw1o539q1a6FQKHD//n2tnI+oqGFTTFQAvfvH792Xnp4enJyc0KNHDzx58kTueJQLnj59ismTJyMsLEzuKEXKzJkz8ddff8kdQ01+zERUGCiEEELuEESkmbVr16Jnz56YOnUqypQpg+TkZPz7779Yu3YtXFxccPXqVRgZGcmaMSUlBTo6OtDX15c1R2Fx7tw51K1bF2vWrMkwAp+WlgaVSgVDQ0N5wn1E48aNERUVhatXr8odRWNmZmb47rvvsHbtWq2fW6VSITU1FQYGBtDRyf4YVVaZlEol0tLSYGhoCIVCoeW0RIWfntwBiCjnWrRogTp16gAA+vTpAxsbG8yePRs7d+5Ehw4dZM0mR4OWnJyscYMhF21m5R8eQHp6OlQqFQwMDOSO8lH/e+21+cerrq4udHV1tXY+oqIm///LQUTZ1qhRIwDAnTt31Oo3b97Ed999B2traxgZGaFOnTrYuXNnhuNjYmIwYsQIuLi4wNDQECVLlkS3bt0QFRUl7ZOSkoLAwECUL18ehoaGcHZ2xpgxY5CSkqJ2rv+dU3zu3DkoFAqsW7cuw3OGhoZCoVBg165dUu3Jkyfo1asX7OzsYGhoiM8++wyrV69WO+7dfMxNmzZh4sSJcHJygomJCeLi4rL8/iQmJmLkyJFwdnaGoaEhKlWqhHnz5uG/b5gpFAoMHjwYGzZsQKVKlWBkZAQ3NzccPXo0wzk/NWt0dDRGjRoFV1dXmJmZwcLCAi1atMClS5fUjq9bty4AoGfPntK0mXcjhf+dU3z//n0oFArMmzcPq1atQrly5WBoaIi6devi7NmzGV7D1q1bUbVqVRgZGaFatWr4888/NZqnvHfvXnh6esLc3BwWFhaoW7cuNm7cmGG/69evo0mTJjAxMYGTkxPmzJmjtj01NRUBAQFwc3ODpaUlTE1N0ahRIxw6dEhtv/99fcHBwdLru379erbPAbwdqV20aBFcXV1hZGSEEiVKwMfHB+fOnQPw9ucgMTER69atk77n/ztK/6nXPrM5xbdu3UK7du1gb28PIyMjlCxZEt9//z1iY2M/mimrOcXZvT5ERR1HiokKkXf/GFpZWUm1a9euoUGDBnBycoK/vz9MTU2xZcsWtG7dGtu3b0ebNm0AAAkJCWjUqBFu3LiBXr16oXbt2oiKisLOnTvx+PFj2NjYQKVS4dtvv8Xx48fRr18/VKlSBVeuXMHChQsRERGR5TzHOnXqoGzZstiyZQu6d++utm3z5s2wsrKCt7c3ACAyMhKff/651JiWKFECe/fuRe/evREXF4fhw4erHT9t2jQYGBhg1KhRSElJyXKkUAiBb7/9FocOHULv3r1Rs2ZNhIaGYvTo0Xjy5AkWLlyotv+RI0ewefNmDB06FIaGhli+fDl8fHxw5swZ6aYxbWS9fv06/vrrL7Rv3x5lypRBZGQkfvzxR3h6euL69etwdHRElSpVMHXqVAQEBKBfv37SHz/169fP/Afh/23cuBHx8fHo378/FAoF5syZg7Zt2+Lu3bvS6PLu3bvh6+sLV1dXBAUF4fXr1+jduzecnJw+eO531q5di169euGzzz7DuHHjUKxYMVy8eBEhISHo1KmTtN/r16/h4+ODtm3bokOHDti2bRvGjh0LV1dXtGjRAgAQFxeHn3/+GR07dkTfvn0RHx+PX375Bd7e3jhz5gxq1qyp9txr1qxBcnIy+vXrB0NDQ1hbW2t0jt69e2Pt2rVo0aIF+vTpg/T0dBw7dgz//vsv6tSpg/Xr16NPnz6oV68e+vXrBwAoV66c1q79f6WmpsLb2xspKSkYMmQI7O3t8eTJE+zatQsxMTGwtLT8YKZPuT5EBEAQUYGzZs0aAUAcOHBAvHz5Ujx69Ehs27ZNlChRQhgaGopHjx5J+zZr1ky4urqK5ORkqaZSqUT9+vVFhQoVpFpAQIAAIP74448Mz6dSqYQQQqxfv17o6OiIY8eOqW1fuXKlACBOnDgh1UqXLi26d+8uPR43bpzQ19cX0dHRUi0lJUUUK1ZM9OrVS6r17t1bODg4iKioKLXn+P7774WlpaV48+aNEEKIQ4cOCQCibNmyUu1D/vrrLwFATJ8+Xa3+3XffCYVCIW7fvi3VAAgA4ty5c1LtwYMHwsjISLRp00arWZOTk4VSqVSr3bt3TxgaGoqpU6dKtbNnzwoAYs2aNRleW/fu3UXp0qXVjgcgihcvrvb93rFjhwAg/v77b6nm6uoqSpYsKeLj46Xa4cOHBQC1c2YmJiZGmJubC3d3d5GUlKS27d3PjBBCeHp6CgDi119/lWopKSnC3t5etGvXTqqlp6eLlJQUtfO8fv1a2NnZqf2MvHt9FhYW4sWLF2r7Z/cc//zzjwAghg4dmuF1/W92U1NTtZ/jd7Rx7d9tO3TokBBCiIsXLwoAYuvWrRme739llendfxfu3bsnhMj+9SGitzh9gqgA8/LyQokSJeDs7IzvvvsOpqam2LlzJ0qWLAkAiI6Oxj///IMOHTogPj4eUVFRiIqKwqtXr+Dt7Y1bt25Jq1Vs374dNWrUkEaO/9e7m3a2bt2KKlWqoHLlytK5oqKi0LRpUwDI9C3qd3x9fZGWloY//vhDqu3btw8xMTHw9fUF8HY0d/v27WjZsiWEEGrP4e3tjdjYWFy4cEHtvN27d4exsfFHv1d79uyBrq4uhg4dqlYfOXIkhBDYu3evWt3DwwNubm7S41KlSqFVq1YIDQ2FUqnUWlZDQ0NpXrFSqcSrV69gZmaGSpUqZTheU76+vmrvGrwbYb579y6AtytaXLlyBd26dYOZmZm0n6enJ1xdXT96/v379yM+Ph7+/v4Z5sb+90YvMzMzdOnSRXpsYGCAevXqSVmAt3Ni342gqlQqREdHIz09HXXq1Mn0e9GuXTuUKFFCrZbdc2zfvh0KhQKBgYEZzvuxm9Ry6+fU0tISwNspRW/evPngvtmhyfUhIk6fICrQli1bhooVKyI2NharV6/G0aNH1W5wu337NoQQmDRpEiZNmpTpOV68eAEnJyfcuXMH7dq1++Dz3bp1Czdu3MjQiPzvubJSo0YNVK5cGZs3b0bv3r0Bv
J06YWNjIzXVL1++RExMDFatWoVVq1Zl6znKlCnzwczvPHjwAI6OjjA3N1erV6lSRdr+vypUqJDhHBUrVsSbN2/w8uVL6OjoaCXru3mty5cvx71796BUKqVtxYsXz9Zry0qpUqXUHr9rkF+/fg3g/WsuX758hmPLly//0ab83dz17KxBXLJkyQyNmJWVFS5fvqxWW7duHebPn4+bN28iLS1Nqmf2vcvq2mfnHHfu3IGjoyOsra0/mv2/cuvntEyZMvDz88OCBQuwYcMGNGrUCN9++y26dOkiNcya0OT6EBGbYqICrV69etLqE61bt0bDhg3RqVMnhIeHw8zMDCqVCgAwatQoac7uf2XWEGVFpVLB1dUVCxYsyHS7s7PzB4/39fXFjBkzEBUVBXNzc+zcuRMdO3aEnp6edH4A6NKlS4a5x+9Ur15d7XF2Rolzg7ayzpw5E5MmTUKvXr0wbdo0WFtbQ0dHB8OHD5eeI6eyWolAyLASZ3ay/Pbbb+jRowdat26N0aNHw9bWFrq6uggKCspw8yiQ+fdT03PkRG7+nM6fPx89evTAjh07sG/fPgwdOhRBQUH4999/pXeAiCh3sCkmKiTe/cPfpEkTLF26FP7+/ihbtiyAt0t2eXl5ffD4cuXKfXQd2XLlyuHSpUto1qxZjt5+9fX1xZQpU7B9+3bY2dkhLi4O33//vbS9RIkSMDc3h1Kp/GheTZUuXRoHDhxAfHy82mjxzZs3pe3/69atWxnOERERARMTE2mkXBtZt23bhiZNmuCXX35Rq8fExMDGxkZ6nBtvd797zbdv386wLbPaf727wevq1asa/XGVlW3btqFs2bL4448/1F5vZlMcPvUc5cqVQ2hoKKKjoz84WpzZ9z03f04BwNXVFa6urpg4cSJOnjyJBg0aYOXKlZg+fXqWmTKj7etDVNhxTjFRIdK4cWPUq1cPwcHBSE5Ohq2tLRo3bowff/wRz549y7D/y5cvpf/frl07XLp0CX/++WeG/d6N5nXo0AFPnjzBTz/9lGGfpKQkJCYmfjBflSpV4Orqis2bN2Pz5s1wcHDAF198IW3X1dVFu3btsH379kwb9P/Nq6mvvvoKSqUSS5cuVasvXLgQCoVCWgHhnVOnTqlNH3j06BF27NiB5s2bS+vBaiOrrq5uhpHbrVu3ZvhkQlNTUwBvm2VtcXR0RLVq1fDrr78iISFBqh85cgRXrlz56PHNmzeHubk5goKCkJycrLYtJ6PR70aT//fY06dP49SpU1o/R7t27SCEwJQpUzKc43+PNTU1zfA9z62f07i4OKSnp6vVXF1doaOjo7bkYWaZMqPt60NU2HGkmKiQGT16NNq3b4+1a9diwIABWLZsGRo2bAhXV1f07dsXZcuWRWRkJE6dOoXHjx9L6+GOHj0a27ZtQ/v27dGrVy+4ubkhOjoaO3fuxMqVK1GjRg107doVW7ZswYABA3Do0CE0aNAASqUSN2/exJYtWxAaGipN58iKr68vAgICYGRkhN69e2f48IpZs2bh0KFDcHd3R9++fVG1alVER0fjwoULOHDgAKKjo3P0fWnZsiWaNGmCCRMm4P79+6hRowb27duHHTt2YPjw4RmWtapWrRq8vb3VlmQDoNZEaSPrN998g6lTp6Jnz56oX78+rly5gg0bNkij/O+UK1cOxYoVw8qVK2Fubg5TU1O4u7tne051VmbOnIlWrVqhQYMG6NmzJ16/fo2lS5eiWrVqao1yZiwsLLBw4UL06dMHdevWRadOnWBlZYVLly7hzZs3ma5L/SHffPMN/vjjD7Rp0wZff/017t27h5UrV6Jq1aofzaLpOZo0aYKuXbti8eLFuHXrFnx8fKBSqXDs2DE0adIEgwcPBgC4ubnhwIEDWLBgARwdHVGmTBm4u7vnys/pP//8g8GDB6N9+/aoWLEi0tPTsX79eqkJfyerTP+l7etDVOjl9XIXRPTp3i29dPbs2QzblEqlKFeunChXrpxIT08XQghx584d0a1bN2Fvby/09fWFk5OT+Oabb8S2bdvUjn316pUYPHiwcHJyEgYGBqJkyZKie/fuastOpaamitmzZ4vPPvtMGBoaCisrK+Hm5iamTJkiYmNjpf3+uyTbO7du3ZKWPDt+/Himry8yMlIMGjRIODs7C319fWFvby+aNWsmVq1aJe3zbjmrjy1f9b/i4+PFiBEjhKOjo9DX1xcVKlQQc+fOzbA8FQAxaNAg8dtvv4kKFSoIQ0NDUatWLWnpLG1mTU5OFiNHjhQODg7C2NhYNGjQQJw6dUp4enoKT09PtX137NghqlatKvT09NSWZ8tqSba5c+dmeD4AIjAwUK22adMmUblyZWFoaCiqVasmdu7cKdq1aycqV6784W/o/9u5c6eoX7++MDY2FhYWFqJevXri999/l7Z7enqKzz77LMNx/82tUqnEzJkzRenSpaXv+a5duzR6fdk9hxBvl2+bO3euqFy5sjAwMBAlSpQQLVq0EOfPn5f2uXnzpvjiiy+EsbGxAKD2M/2p1/6/S7LdvXtX9OrVS5QrV04YGRkJa2tr0aRJE3HgwAG147LK9N8l2d752PUhorcUQvA9FCKi/6VQKDBo0KAMUy2Kkpo1a6JEiRLYv3+/3FGIiPIE5xQTERVhaWlpGeaxHj58GJcuXULjxo3lCUVEJAPOKSYiKsKePHkCLy8vdOnSBY6Ojrh58yZWrlwJe3t7DBgwQO54RER5hk0xEVERZmVlBTc3N/z88894+fIlTE1N8fXXX2PWrFmf/OEhREQFCecUExEREVGRxznFRERERFTksSkmIiIioiKvyM0pVqlUePr0KczNzXPlY1OJiIiIKPcIIRAfHw9HR8cMHwD1KYpcU/z06VM4OzvLHYOIiIiIPsGjR49QsmRJrZ2vyDXF5ubmAN5+Iy0sLGROQ0RERESaiIuLg7Ozs9TTaUuRa4rfTZmwsLBgU0xERERUQGl7GixvtCMiIiKiIo9NMREREREVeWyKiYiIiKjIY1NMREREREUem2IiIiIiKvLYFBMRERFRkcemmIiIiIiKPDbFRERERFTksSkmIiIioiKPTTERERERFXlsiomIiIioyJO1KT569ChatmwJR0dHKBQK/PXXXx895vDhw6hduzYMDQ1Rvnx5rF27NtdzEhEREVHhJmtTnJiYiBo1amDZsmXZ2v/evXv4+uuv0aRJE4SFhWH48OHo06cPQkNDczkpERERERVmenI+eYsWLdCiRYts779y5UqUKVMG8+fPBwBUqVIFx48fx8KFC+Ht7Z1bMYmIiIgon0hJSc+V88raFGvq1KlT8PLyUqt5e3tj+PDhWR6TkpKClJQU6XFcXFxuxSMiIiLKXPhW4GQAkBovd5ICKz5ZHzP21sSWc465cv4C1RQ/f/4cdnZ2ajU7OzvExcUhKSkJxsbGGY4JCgrClClT8ioiERERUUYnA4Dom3KnKJCEADZcqI4xu7/EszhzAMm58jwFqinOiXHjxsHPz096HBcXB2dnZxkTERERUZHzboRYoQOYOsibpYCZuacmJu6sKz3W
11UhTan95ylQTbG9vT0iIyPVapGRkbCwsMh0lBgADA0NYWhomBfxiIiIiD7M1AHo/1juFAVK71YJmH1oCeLjU9GqVSVMnvw5atWao/XnKVBNsYeHB/bs2aNW279/Pzw8PGRKRERERETakpamxM2bUXB1fT9d1t7eDMuWfQVbW1N4e5fPtfvDZF2SLSEhAWFhYQgLCwPwdsm1sLAwPHz4EMDbqQ/dunWT9h8wYADu3r2LMWPG4ObNm1i+fDm2bNmCESNGyBGfiIiIiLRk//47qFFjJZo0WYfo6CS1bV271oC3d/lcfX5Zm+Jz586hVq1aqFWrFgDAz88PtWrVQkBAAADg2bNnUoMMAGXKlMHu3buxf/9+1KhRA/Pnz8fPP//M5diIiIiICqi7d1+jTZvNaN78N9y4EYVXr5IQGHgoz3PIOn2icePGEEJkuT2zT6tr3LgxLl68mIupiIiIiCi3JSamIijoOObNO4mUlPd3zn3+eUl061Yjz/MUqDnFRERE9Im4Xq48Ep/JnSDfEEJg8+ZrGD16Px4/fj8/2N7eDLNne6FLl+rQ0VHkeS42xUREREUJ18uVl4G53Alkde/ea3Tv/heOHXs/PVZfXwcjRnyOiRO/gLm5fCuGsSkmIiIqSrhernwMzIEG0+ROIStLSyNcu/ZSevz11xWwYIE3KlYsLmOqt9gUExERFUVcL5dkYG1tjOnTm2Dhwn+xcKE3vv66otyRJLKuPkFEREREhdPhw/fRqNEaREYmqNX79XPDlSs/5KuGGGBTTERERERa9OBBDDp02IomTdbh+PGHGD/+oNp2XV0dGBrmv8kK+S8RERERERU4SUlpmDPnBGbNOoHk5HSpfuXKC6SkpOfLRvh/5e90RERERJSvCSHwxx83MHLkPjx4ECvVS5QwQVBQM/TsWUuWJdY0xaaYiKig4nqzlBNcL5e06OrVFxg2LAT//HNPqunp6WDIkHoICPBEsWJGMqbTDJtiIqKCiuvN0qco4uvl0qdLS1OiRYsNah/A8eWXZREc7IOqVUvImCxn2BQTERVUXG+Wcorr5ZIW6OvrYvr0JujRYwfKlCmGhQu98e23laBQ5P+pEplhU0xEVNBxvVkiygPHjj1AmTJWKFnSQqp17VoDaWkqdOlSHUZGBbut5JJsRERERJSlx4/j0KnTdnzxxVqMGbNfbZuOjgJ9+tQu8A0xwKaYiIiIiDKRnJyOGTOOolKlpfj996sAgN9/v4rTpwvnO1MFv60nIiIiIq0RQmDnznCMGBGKe/dipHrx4saYMaMp6tRxlC9cLmJTTJQfcGktygkurUVEWnbjxksMHx6KffvuSDVdXQUGDqyLKVMaw8rKWL5wuYxNMVF+wKW16FNwaS0i0oLZs49j4sRDSE9XSbWmTctg0SIfVKtmK2OyvMGmmCg/4NJalFNcWouItKRsWSupIS5d2hLz5zdH27ZVCuwSa5piU0yUn3BpLSIiyiPp6Sro6b1fc+G776riq68qwN3dCaNH14exsb6M6fIem2IiIiKiIuTZs3j4+x9EXFwK/vzTV6orFArs2tWxyIwM/xebYiIiIqIiICUlHYsWnca0aUeRkJAKAAgNvQ1v7/LSPkW1IQbYFBMREREVenv23MLw4SG4dStaqllZGeH162QZU+UvbIqJiIiICqmIiFcYMSIUe/bckmoKBdC/vxumTWsKGxsTGdPlL2yKibIjt9cR5nqzRESkRfHxKZg+/SgWLvwXaWnvl1hr1KgUFi9ugZo17WVMlz+xKSbKjrxaR5jrzRIRkRYcP/4Qc+aclB47OZlj3rzm8PX9rEjPG/4QNsVE2ZEX6whzvVkiItKSFi0q4KuvKuDgwbsYPbo+/P0bwtTUQO5Y+RqbYiJNcB1hIiLKZ168SMRvv13GiBGfq40CL1v2FVQqgbJlrWRMV3CwKSYiIiIqgNLSlFi69AwmTz6CuLgUlC1rhdatK0vbXVyKyReuANL5+C5ERERElJ/s23cHNWqshJ/fPsTFpQAApkw5AiGEzMkKLo4UExERERUQd+++hp9fKHbsCJdqCgXQu3ctzJjRjDfRfQI2xURERET5XEJCKoKCjmH+/FNISVFKdQ+PkliypAXc3BxlTFc4sCmmoknTdYe5jjAREcnk1as3qFFjJZ48ef9vloODGebM+RKdO7tydFhL2BRT0ZTTdYe5jjAREeWx4sVN8PnnJbF9+w3o6+vAz88DEyY0grm5odzRChU2xVQ05WTdYa4jTEREeSA6OglWVkZqI8Dz5jWHEMCsWc1QoUJxGdMVXmyKqWjjusNERJRPpKersGLFWQQEHMayZV+hUydXaZuLSzFs395BxnSFH5dkIyIiIpLZP//cQ61aP2Lo0BDExCRj9Oj9SEhIlTtWkcKRYiIiIiKZ3L8fg1Gj9mH79htq9S+/LIvUVGUWR1FuYFNMRERElMfevEnDnDknMHv2CSQnp0v1unUdsWRJC7i7l5QxXdHEppiIiIgoD23bdh0jR+7Dw4exUs3W1hSzZjVD9+41oaPDJdbkwKaYCg9N1h7musNERCSTnTvDpYZYT08Hw4a5Y9KkL2BpaSRzsqKNTTEVHjlZe5jrDhMRUR6bNcsLf/55Ew0aOCM42AeVK9vIHYnAppgKE03XHua6w0RElIuUShV++ukCLCwM1ZZXc3Q0x+XLA+DiUoyfRpePsCmmwodrDxMRkcyOHn2AoUP34tKlSNjYmOCrryqgWLH30yPKlLGSMR1lhusUExEREWnJo0ex6NhxOzw91+LSpUgAQFTUG/z9d7jMyehjOFJMRERE9ImSk9Mxf/5JzJx5HG/epEn1WrXssXhxCzRsWErGdJQdbIqJiIiIckgIgR07wuHnF4p792KkevHixpg5sxl6964FXV2+MV8QsCkmIiIiyqHFi09j+PBQ6bGurgKDBtXF5MmNYWVlLGMy0hT/dCEiIiLKoc6dq0s30DVtWgZhYQOwaFELNsQFEEeKiYiIiLJBpRK4eTMKVauWkGo2NiZYurQFjI310aZNZS6xVoCxKSYiIiL6iJMnH2Ho0L24c+c1IiIGo0QJU2lb587VZUxG2sLpE0RERERZePo0Hl27/okGDVbj/PlniIlJxsSJ/8gdi3IBR4qJiIiI/iMlJR3Bwf9i2rSjSEx8v8Ra9ep2ap9OR4UHm2IiIiKi/7F7dwSGDw/F7dvRUs3KygjTpzdFv35u0NPjG+2FEZtiIiIiIgBPnsShb9+/sXfvbammo6PAgAFumDq1CYoXN5ExHeU2NsVEREREAExNDXDu3FPp8RdflMbixT6oUcNexlSUVzj+T0RERASgWDEjBAU1Q8mSFti0qR0OH+7OhrgIYVNMRERERc6ZM0/QrNmvePIkTq3es2ct3Lw5CL6+1bjmcBHDppiIiIiKjOfPE9Cr1w64u/+Mf/65B3//g2rbdXQUMDU1kCkdyYlziomIiKjQS01VYsmS05gy5Qji41Ol+sWLz5CYmMpGmNgUExERUeEWEnIbw4eHIDz8lVSztDTElCmNMXBgXejr68oXjvINNsVERERUKN2
+HQ0/v1D8/XeEVFMogD59amPGjKZqH9VMxKaYiIiICh2lUgUfn99w585rqVa/vjMWL/aBm5ujjMkov5L9Rrtly5bBxcUFRkZGcHd3x5kzZz64f3BwMCpVqgRjY2M4OztjxIgRSE5OzqO0lKfCtwJrqgA/lszeV+IzuRMTEVE+oaurg+nTmwIAHBzM8NtvbXD8eE82xJQlWUeKN2/eDD8/P6xcuRLu7u4IDg6Gt7c3wsPDYWtrm2H/jRs3wt/fH6tXr0b9+vURERGBHj16QKFQYMGCBTK8AspVJwOA6JuaH2dgrv0sRESUr128+AzW1sYoXbqYVPP1/QyvXyehS5fqMDc3lC8cFQgKIYSQ68nd3d1Rt25dLF26FACgUqng7OyMIUOGwN/fP8P+gwcPxo0bN3Dw4PvlU0aOHInTp0/j+PHj2XrOuLg4WFpaIjY2FhYWFtp5IZQ7fiwJJDwBFDqAqUP2jjEwBxpMAyp+l7vZiIgoX3j5MhETJ/6Dn366gDZtqmD79g5yR6Jcllu9nGwjxampqTh//jzGjRsn1XR0dODl5YVTp05lekz9+vXx22+/4cyZM6hXrx7u3r2LPXv2oGvXrlk+T0pKClJSUqTHcXFxWe5L+ZSpA9D/sdwpiIgoH0lLU2LFinMIDDyMmJi30yj/+OMGDh++j8aNXeQNRwWSbE1xVFQUlEol7Ozs1Op2dna4eTPzt8w7deqEqKgoNGzYEEIIpKenY8CAARg/fnyWzxMUFIQpU6ZoNTsRERHJ5+DBuxg2LATXrr2UaubmBggM9ET9+s4yJqOCTPYb7TRx+PBhzJw5E8uXL8eFCxfwxx9/YPfu3Zg2bVqWx4wbNw6xsbHS16NHj/IwMREREWnL/fsxaNduC7y81qs1xD161ERExBCMHFkfBgZcc5hyRraRYhsbG+jq6iIyMlKtHhkZCXt7+0yPmTRpErp27Yo+ffoAAFxdXZGYmIh+/fphwoQJ0NHJ2OMbGhrC0JCT64mIiAqyZcvOYNSo/UhOTpdqdes6YsmSFnB3LyljMiosZBspNjAwgJubm9pNcyqVCgcPHoSHh0emx7x58yZD46ur+/YvQhnvFyQiIqJcVrKkhdQQ29mZYs2aVvj33z5siElrZF2Szc/PD927d0edOnVQr149BAcHIzExET179gQAdOvWDU5OTggKCgIAtGzZEgsWLECtWrXg7u6O27dvY9KkSWjZsqXUHBMREVHBp1SqoKv7fiDs228r4ZtvKqJy5eKYNMkTFhZ8F5i0S9am2NfXFy9fvkRAQACeP3+OmjVrIiQkRLr57uHDh2ojwxMnToRCocDEiRPx5MkTlChRAi1btsSMGTPkeglERESkRa9evUFAwCE8eBCLv//uCIVCAQBQKBTYseN76OgoZE5IhZWs6xTLgesUFyDv1ik2c+KSbEREhVx6ugqrVp3HpEmHEB2dBADYufN7tGxZSeZklN8UunWKiYiIiADgyJH7GDo0BJcvv7/53tRUHy9eJMqYiooaNsVEREQki0ePYjF69H5s3nxNrd6lS3XMnu0FR0dzmZJRUcSmmIiIiPJUUlIa5s07iaCg40hKer/EWu3aDli82AcNGpSSMR0VVWyKiYiIKE+dOvUYAQGHpcc2NiYICmqGnj1rqq04QZSX+JNHREREeapp0zJo3boydHUVGDbMHRERg9GnT202xCQrjhTTe+FbgZMBQGq83EneSnwmdwIiIvpEMTHJWLcuDEOHukvLqwFAcLA3pk9vgs8+s5UxHdF7bIrpvZMBQPRNuVNkZMAbLYiIChqlUoXVqy9i/Ph/EBX1Bvb2ZvD1rSZtL126mHzhiDLBppjeezdCrNABTB3kzfKOgTnQYJrcKYiISAMnTjzE0KEhuHDh/Tt+gYGH0b79Z/zwDcq32BRTRqYO/LAMIiLS2JMncRg79gA2bLiiVvf1/Qxz5nzJhpjyNTbFRERE9ElSUtKxYMEpzJhxDImJaVK9enU7LF7sA09PF/nCEWUTm2IiIiLKsfj4FNSuvQq3b0dLNWtrY0yf3gR9+7pBT48rSlDBwKaYiIiIcszc3BD16jnh9u1o6Ogo8MMPdTB1ahNYWxvLHY1II2yKiYiIKNvi41NgamqgNj94zhwvvH6dhFmzvFC9up2M6Yhyjk1xYabpusNcF5iIiLKgUgn8+usl+Psf+P9Pn6slbXNyssCePZ1lTEf06dgUF2Y5XXeY6wITEdH/OHPmCYYM2YszZ54AAPz9D6Jt2yqwtDSSORmR9rApLsxysu4w1wUmIqL/9/x5AsaNO4i1a8PU6g0blkJSUjosLeXJRZQb2BQXBVx3mIiINJCaqsTixacxdeoRxMenSvWqVUtg0SIfeHmVlTEdUe5gU0xERESSkJDbGDYsBBERr6SapaUhpk5tgh9+qAN9fV0Z0xHlHjbFREREJNm27brUECsUQN++tTF9elOUKGEqczKi3MUVtYmIiEgyc2YzWFgYon59Z5w71w8//tiSDTEVCRwpLmg0WWaNS6wREVEWhBDYsOEKhBDo2rWGVLe1NcXZs31RoYI1FArFB85AVLiwKS5ocrLMGpdYIyKi/3H+/FMMHRqCkycfwcrKCF99VQHFi5tI2ytWLC5jOiJ5sCkuaDRdZo1LrBER0f978SIREyYcxC+/XIQQb2uvXydjy5Zr+OGHuvKGI5IZm+KCisusERFRNqWlKbF8+VkEBh5GbGyKVK9UqTgWLfKBt3d5GdMR5Q9siomIiAqxAwfuYtiwEFy//lKqWVgYIjDQE4MH14OBAZdYIwLYFBMRERVav/xyAX36/K1W69WrJmbObAY7OzOZUhHlT1ySjYiIqJBq27YKbGze3kDn7u6EM2f64JdfWrEhJsoER4qJiIgKASEEbt2KVls5wsrKGIsX+yAtTYUuXapDR4dLrBFlhU0xERFRAXfp0nMMHRqCCxeeISJiMBwc3i/F2bGjq4zJiAoOTp8gIiIqoF69eoOBA3ejdu1VOHr0ARISUjFu3EG5YxEVSBwpJiIiKmDS01VYteo8Jk78B69fJ0v18uWt8d13VWVMRlRwsSkmIiIqQA4fvo+hQ/fiypUXUs3UVB+TJn2B4cM/h6Eh/2knygn+5hARERUAL18mYvDgvdiy5ZpavWvX6pg1ywuOjuZZHElE2cGmmIiIqAAwNtbH8eMPpcdubg5YsqQFPDycZUxFVHh80o12ycnJH9+JiIiIPpmZmQHmzPFCiRIm+Pnnljhzpi8bYiIt0rgpVqlUmDZtGpycnGBmZoa7d+8CACZNmoRffvlF6wGJiIiKmqtXX+Cbbzbi4cNYtXqnTq64dWsIeveuzTWHibRM4+kT06dPx7p16zBnzhz07dtXqlerVg3BwcHo3bu3VgMWOOFbgZMBQGp87pw/8VnunJeIiGT3+nUSAgMPY/nys1AqBUaP3o/Nm7+TtisUClhaGsmYkKjw0rgp/vXXX7Fq1So0a9YMAwYMkOo1atTAzZs3tRquQDoZAETnwffBgDdUEBEVFkqlCr/8chHjxx/Eq1dJUv3cuaeIiUlGsWJshI
lym8ZN8ZMnT1C+fPkMdZVKhbS0NK2EKtDejRArdABTh9x5DgNzoMG03Dk3ERHlqRMnHmLIkL24ePG5VDMx0cf48Q0xcmR9GBnxnniivKDxb1rVqlVx7NgxlC5dWq2+bds21KpVS2vBCjxTB6D/Y7lTEBFRPvXkSRzGjDmAjRuvqNU7dqyGOXO+RMmSFjIlIyqaNG6KAwIC0L17dzx58gQqlQp//PEHwsPD8euvv2LXrl25kZGIiKhQEULAx2cDrl59/wEcNWrYYcmSFmjUqPQHjiSi3KLx6hOtWrXC33//jQMHDsDU1BQBAQG4ceMG/v77b3z55Ze5kZGIiKhQUSgUmDKlMQCgeHFjrFjxNc6f78eGmEhGOZqo1KhRI+zfv1/bWYiIiAqlmzejYGioizJlrKRamzaVsWiRD7p0qQ5ra2MZ0xERkIOR4rJly+LVq1cZ6jExMShbtqxWQhERERUGsbHJGDkyFK6uKzBkyF61bQqFAkOHurMhJsonNG6K79+/D6VSmaGekpKCJ0+eaCUUERFRQaZSCaxZcxEVKy7FggX/Ij1dhd27byE09Lbc0YgoC9mePrFz507p/4eGhsLS0lJ6rFQqcfDgQbi4uGg1HBERUUFz+vRjDBmyF2fPPpVqRkZ6GDu2AecME+Vj2W6KW7duDeDt2z3du3dX26avrw8XFxfMnz9fq+GIiIgKiufPE+DvfwDr1l1Sq3/3XVXMm/clSpcuJk8wIsqWbDfFKpUKAFCmTBmcPXsWNjY2uRaKiIioIFm3LgxDhuxFfHyqVKtWzRaLF/ugSZMyMiYjouzSePWJe/fu5UYOIiKiAsvOzkxqiIsVM8K0aU0wYEAd6OlpfOsOEckkR0uyJSYm4siRI3j48CFSU1PVtg0dOlQrwYiIiPIrlUpAR0chPfbxKY9WrSrB3t4M06c3hY2NiYzpiCgnNG6KL168iK+++gpv3rxBYmIirK2tERUVBRMTE9ja2rIpJiKiQis+PgUzZhxDWNhz7N3bGQrF+8Z4+/YO0NXlyDBRQaXxb++IESPQsmVLvH79GsbGxvj333/x4MEDuLm5Yd68ebmRkYiISFYqlcD69ZdQqdJSzJ59AqGhd7B9+w21fdgQExVsGv8Gh4WFYeTIkdDR0YGuri5SUlLg7OyMOXPmYPz48bmRkYiISDbnzj1Fw4ar0a3bX3j2LAEAYGCgi8eP42RORkTapPH0CX19fejovO2lbW1t8fDhQ1SpUgWWlpZ49OiR1gMSERHJ4cWLRIwffxCrV1+EEO/rrVpVwvz5zVGunLV84YhI6zRuimvVqoWzZ8+iQoUK8PT0REBAAKKiorB+/XpUq1YtNzISERHlmbQ0JZYuPYPJk48gLi5FqleubINFi3zQvHk5GdMRUW7RePrEzJkz4eDgAACYMWMGrKys8MMPP+Dly5f48ccftR6QiIgoL5058wR+fvukhtjCwhALFjTH5csD2BATFWIajxTXqVNH+v+2trYICQnRaiAiIiI5NWhQCh06fIatW6+hV69amDGjKezszOSORUS5TGu3yl64cAHffPONtk5HRESU6xITU7FixVmoVEKtPm/elzh9ug9+/vlbNsRERYRGTXFoaChGjRqF8ePH4+7duwCAmzdvonXr1qhbt670UdBERET5mRACmzZdReXKyzBw4B5s2HBZbbuzsyXq1nWSKR0RySHbTfEvv/yCFi1aYO3atZg9ezY+//xz/Pbbb/Dw8IC9vT2uXr2KPXv25GZWIiKiTxYW9hyenmvRseN2aVm1SZMOIT2dAztERVm2m+JFixZh9uzZiIqKwpYtWxAVFYXly5fjypUrWLlyJapUqZKbOYmIiD5JVNQb/PDDLri5rcKxYw+l+tdfV8D+/V2hp8cP3yAqyrJ9o92dO3fQvn17AEDbtm2hp6eHuXPnomTJkrkWjoiI6FOlp6uwcuU5BAQcwuvXyVK9QgVrLFzoja+/rihjOiLKL7LdFCclJcHExAQAoFAoYGhoKC3NRkRElB8lJaXh889/weXLkVLNzMwAAQFfYNiwz2FgoCtjOiLKTzRaku3nn3+Gmdnbu3DT09Oxdu1a2NjYqO0zdOhQjQIsW7YMc+fOxfPnz1GjRg0sWbIE9erVy3L/mJgYTJgwAX/88Qeio6NRunRpBAcH46uvvtLoeYmIqPAzNtZH7doOUlPcrVsNzJrVDA4O5jInI6L8RiGEEB/fDXBxcYFCofjwyRQKaVWK7Ni8eTO6deuGlStXwt3dHcHBwdi6dSvCw8Nha2ubYf/U1FQ0aNAAtra2GD9+PJycnPDgwQMUK1YMNWrUyNZzxsXFwdLSErGxsbCwsMh21mz7sSSQ8AQwcwL6P9b++YmIKEtJSWkwMNCFru77+cHPnyegc+c/MH16E3h4OMuYjoi0Ibd6uWw3xbnB3d0ddevWxdKlSwEAKpUKzs7OGDJkCPz9/TPsv3LlSsydOxc3b96Evr5+jp6TTTERUeEjhMD27TcwcuQ+jB/fEP371/n4QURUIOVWLyfbrbapqak4f/48vLy83ofR0YGXlxdOnTqV6TE7d+6Eh4cHBg0aBDs7O1SrVg0zZ86EUqnM8nlSUlIQFxen9kVERIXHlSuRaNbsV7RvvxUPH8ZiwoR/EB2dJHcsIipgZGuKo6KioFQqYWdnp1a3s7PD8+fPMz3m7t272LZtG5RKJfbs2YNJkyZh/vz5mD59epbPExQUBEtLS+nL2ZlvnRERFQbR0UkYMmQPatX6EYcO3Zfqbm6OSEhIlS8YERVIGt1oJzeVSgVbW1usWrUKurq6cHNzw5MnTzB37lwEBgZmesy4cePg5+cnPY6Li2NjTERUgCmVKvz00wVMnPgPXr16PyJctqwVFixojm+/rfTRe2CIiP5LtqbYxsYGurq6iIyMVKtHRkbC3t4+02McHBygr68PXd33S+hUqVIFz58/R2pqKgwMDDIcY2hoCENDQ+2GJyIiWRw79gBDh4YgLOz9O4omJvqYMKER/Pw8YGRUoMZ6iCgfkW36hIGBAdzc3HDw4EGpplKpcPDgQXh4eGR6TIMGDXD79m2oVO8/ijMiIgIODg6ZNsRERFS4bN58Ta0h7tTJFeHhgzF+fCM2xET0SXLUFN+5cwcTJ05Ex44d8eLFCwDA3r17ce3aNY3O4+fnh59++gnr1q3DjRs38MMPPyAxMRE9e/YEAHTr1g3jxo2T9v/hhx8QHR2NYcOGISIiArt378bMmTMxaNCgnLwMIiIqYKZObQJra2PUrGmPo0d7YMOGtihZMhdWEiKiIkfjP6uPHDmCFi1aoEGDBjh69ChmzJgBW1tbXLp0Cb/88gu2bduW7XP5+vri5cuXCAgIwPPnz1GzZk2EhIRIN989fPgQOjrv+3ZnZ2eEhoZixIgRqF69OpycnDBs2DCMHTtW05dBRET5mBACO3eGIyYmGd2715Tq1tbGOHasJypVKq62FjER0afSeJ1iDw8PtG/fHn5+fjA3N8elS5dQtmxZnDlzBm3btsXjx/l7bV6uU0xElL/duPESw4eHYt++O7CwMEREx
GDY2ZnJHYuI8ol8s07xlStX0KZNmwx1W1tbREVFaSUUEREVPbGxyfDzC0X16iuxb98dAEBcXArWrbskczIiKgo0nj5RrFgxPHv2DGXKlFGrX7x4EU5OTloLRkRERYNKJbBmzUWMG3cQL1++keqlSlli/vzmaNeuiozpiKio0Lgp/v777zF27Fhs3boVCoUCKpUKJ06cwKhRo9CtW7fcyEhERIXUqVOPMHRoCM6deyrVjIz04O/fAKNHN4CJib6M6YioKNG4KX632oOzszOUSiWqVq0KpVKJTp06YeLEibmRkYiICqFNm66iY8ftarX27ati7twvUbp0MXlCEVGRpXFTbGBggJ9++gmTJk3C1atXkZCQgFq1aqFChQq5kY+IiAqpr76qADs7U0RGJqJaNVssXuyDJk3KfPxAIqJcoHFTfPz4cTRs2BClSpVCqVKlciMTEREVQnfvvkbZslbSYwsLQyxa5IOoqDfo378O9PS4xBoRyUfj/wI1bdoUZcqUwfjx43H9+vXcyERERIVIRMQrfP31Rri6rsDjx3Fq23x9q2HQoHpsiIlIdhr/V+jp06cYOXIkjhw5gmrVqqFmzZqYO3duvl+fmIiI8lZcXArGjNmPatWWY8+eW3jzJg1jxuyXOxYRUaY0boptbGwwePBgnDhxAnfu3EH79u2xbt06uLi4oGnTprmRkYiIChCVSuDXXy+hUqWlmDv3JNLSVACAkiUt0KpVJZnTERFlTuM5xf+rTJky8Pf3R40aNTBp0iQcOXJEW7mIiKgAOnv2CYYODcG//75/99DQUBejR9eHv39DmJoayJiOiChrOW6KT5w4gQ0bNmDbtm1ITk5Gq1atEBQUpM1sRERUQMTEJGPUqH1YvfoihHhfb9OmMubNa652gx0RUX6kcVM8btw4bNq0CU+fPsWXX36JRYsWoVWrVjAxMcmNfEREVAAYGOhi3747UkNcpYoNFi3ywZdflpM3GBFRNmncFB89ehSjR49Ghw4dYGNjkxuZiIiogDEx0ce8ec3Rt+/fmDKlMQYNqgt9fV25YxERZZvGTfGJEydyIwcRERUQd+5Ew9//IGbNaoZy5aylevv2VdGsWRkUL853Domo4MlWU7xz5060aNEC+vr62Llz5wf3/fbbb7USjIiI8peEhFQEBR3DvHmnkJqqRGqqEjt2fC9tVygUbIiJqMDKVlPcunVrPH/+HLa2tmjdunWW+ykUCiiVSm1lIyKifEAIgd9/v4oxY/bjyZN4qX727BO8eJEIW1tTGdMREWlHtppilUqV6f8nIqLC7eLFZxg6NATHjz+Uavr6OvDz88CECY1gbm4oYzoiIu3R+MM7fv31V6SkpGSop6am4tdff9VKKCIiktfLl4no3/9vuLmtUmuIv/mmIq5dG4hZs7zYEBNRoaJxU9yzZ0/ExsZmqMfHx6Nnz55aCUVERPL65pvfsWrVBWmJtQoVrLF7dyf8/XdHVKhQXN5wRES5QOOmWAgBhUKRof748WNYWlpqJRQREckrIOALAICZmQHmzPHC1asD8dVXFWRORUSUe7K9JFutWrWgUCigUCjQrFkz6Om9P1SpVOLevXvw8fHJlZBERJR77t+PgUol1D517uuvK2LBgub4/vtqcHAwlzEdEVHeyHZT/G7VibCwMHh7e8PMzEzaZmBgABcXF7Rr107rAYmIKHe8eZOGOXNOYPbsE2jUqBRCQ7uovRM4YoSHjOmIiPJWtpviwMBAAICLiwt8fX1hZGSUa6GIiCj3CCGwbdt1jBq1Hw8fvr1HZP/+u/j77wh8+20lmdMREclD40+06969e27kICKiPHDlSiSGDg3B4cP3pZqeng6GDq0HT8/S8gUjIpJZtppia2trREREwMbGBlZWVpneaPdOdHS01sIREZF2REcnISDgEFasOAeVSkh1b+9yCA72QeXKNjKmIyKSX7aa4oULF8Lc3Fz6/x9qiomIKH/Ztu06+vffhejoJKlWtqwVgoO98c03FfnfdCIiZLMp/t8pEz169MitLERElAusrY2lhtjERB8TJzbCiBEeMDLSeAYdEVGhpfF/ES9cuAB9fX24uroCAHbs2IE1a9agatWqmDx5MgwMDLQekoiIsu+/68k3bVoG7dpVgaGhHmbP9kLJkhYypiMiyp80/vCO/v37IyIiAgBw9+5d+Pr6wsTEBFu3bsWYMWO0HpCIiLInOTkd06cfhY/PBggh1Lb9/ns7bNjQlg0xEVEWNG6KIyIiULNmTQDA1q1b4enpiY0bN2Lt2rXYvn27tvMREdFHCCHw1183UbXqMkyadAj79t3B779fVdtHX19XpnRERAWDxtMnhBBQqVQAgAMHDuCbb74BADg7OyMqKkq76YiI6IOuX3+JYcNCcODAXammq6vA3buvZUxFRFTwaNwU16lTB9OnT4eXlxeOHDmCFStWAADu3bsHOzs7rQckIqKMYmKSMWXKYSxZcgZK5fupEs2alcGiRT747DNbGdMRERU8GjfFwcHB6Ny5M/766y9MmDAB5cuXBwBs27YN9evX13pAIiJ6T6USWLPmIsaNO4iXL99IdReXYpg/vznatKnMJdaIiHJA46a4evXquHLlSob63LlzoavLOWtERLkpLOw5+vT5W3psbKyHceMaYtSo+jA21pcxGRFRwZbjRSrPnz+PGzduAACqVq2K2rVray0UERFlrnZtB3TrVgO//noJHTp8hrlzv0SpUpZyxyIiKvA0bopfvHgBX19fHDlyBMWKFQMAxMTEoEmTJti0aRNKlCih7YxEREVSSko61q+/jJ49a0JX9/1iQbNmNUOvXjXh6ekiXzgiokJG4yXZhgwZgoSEBFy7dg3R0dGIjo7G1atXERcXh6FDh+ZGRiKiIkUIgV27IlCt2gr07fs31qwJU9vu4GDOhpiISMs0bopDQkKwfPlyVKlSRapVrVoVy5Ytw969e7UajoioqAkPj8LXX29Ey5a/4/btaADApEmHkJKSLnMyIqLCTePpEyqVCvr6GW/m0NfXl9YvJiIizcTFpWDatCMIDj6N9PT3/y394ovSWLzYB4aGOb4FhIiIskHjkeKmTZti2LBhePr0qVR78uQJRowYgWbNmmk1HBFRYadSCaxdG4aKFZdg3rxTUkNcsqQFNm1qh8OHu6NGDXuZUxIRFX4aDz0sXboU3377LVxcXODs7AwAePToEapVq4bffvtN6wGJiAqrtDQlPD3X4tSpx1LN0FAXY8Y0wNixDWBqaiBjOiKiokXjptjZ2RkXLlzAwYMHpSXZqlSpAi8vL62HIyIqzPT1dVG9up3UFLdtWwXz5n2JMmWsZE5GRFT0aNQUb968GTt37kRqaiqaNWuGIUOG5FYuIqJCJzVVCR0dBfT03s9cmz69Ka5de4nAQE94eZWVMR0RUdGW7TnFK1asQMeOHXHu3DncunULgwYNwujRo3MzGxFRoRESchvVq6/AihVn1eo2NiY4dqwnG2IiIplluyleunQpAgMDER4ejrCwMKxbtw7Lly/PzWxERAXenTvRaNVqE1q02IDw8FcICDiM
ly8T5Y5FRET/ke2m+O7du+jevbv0uFOnTkhPT8ezZ89yJRgRUUGWkJCK8eMPomrV5di5M1yqV6lig9jYFBmTERFRZrI9pzglJQWmpqbSYx0dHRgYGCApKSlXghERFURCCGzceAVjxhzA06fxUt3BwQxz536JTp1coVAoZExIRESZ0ehGu0mTJsHExER6nJqaihkzZsDS0lKqLViwQHvpiIgKkAsXnmHo0L04ceKRVDMw0IWf3+cYP74RzM0NZUxHREQfku2m+IsvvkB4eLharX79+rh79670mKMfRFSU/fbbZbWGuGXLiliwwBvly1vLmIqIiLIj203x4cOHczEGEVHBFxjoiQ0brqBYMSMEB3ujRYsKckciIqJs0vjDO4iICDh48C4ePYpDjx41pZqlpREOHOiKSpVsYGCgK184IiLSGJtiIiIN3L8fg5Ej9+GPP27A1FQfX35ZFk5OFtJ2V1c7GdMREVFOZXtJNiKiouzNmzQEBh5ClSrL8Mcfbz/iPjExDatWnZc5GRERaQNHiomIPkAIga1br2PUqH149ChOqtvZmWLWLC9061ZDxnRERKQtbIqJiLJw+XIkhg7diyNHHkg1fX0dDBvmjkmTPGFhwSXWiIgKixxNnzh27Bi6dOkCDw8PPHnyBACwfv16HD9+XKvhiIjk8vff4ahV60e1htjHpzyuXPkBc+c2Z0NMRFTIaNwUb9++Hd7e3jA2NsbFixeRkvL240pjY2Mxc+ZMrQckIpJD06Zl4OBgBgAoV84Kf//dEXv2dEKlSjYyJyMiotygcVM8ffp0rFy5Ej/99BP09fWleoMGDXDhwgWthiMiyiuPHsWqPTY1NcCiRT4ICmqGa9cG4ptvKvIDioiICjGN5xSHh4fjiy++yFC3tLRETEyMNjIREeWZhw9jMXr0fuzcGY4bNwbBxaWYtK1du6ryBSMiojyl8Uixvb09bt++naF+/PhxlC1bViuhiIhyW1JSGqZNO4LKlZdiy5ZrSE5Ox6hR++SORUREMtF4pLhv374YNmwYVq9eDYVCgadPn+LUqVMYNWoUJk2alBsZiYi0RgiBP/+8iZEj9+H+/RipbmNjAh+f8hBCcJoEEVERpHFT7O/vD5VKhWbNmuHNmzf44osvYGhoiFGjRmHIkCG5kZGISCuuXXuBYcNCcPDgPammq6vAkCH1EBjYGMWKGcmYjoiI5KTx9AmFQoEJEyYgOjoaV69exb///ouXL19i2rRpOQ6xbNkyuLi4wMjICO7u7jhz5ky2jtu0aRMUCgVat26d4+cmosIvISEVw4btRY0aK9UaYi+vsrh8+QcsXOjDhpiIqIjL8Yd3GBgYoGrVT78JZfPmzfDz88PKlSvh7u6O4OBgeHt7Izw8HLa2tlked//+fYwaNQqNGjX65AxEVLjp6elg165bUCoFAMDFpRgWLvRGq1aVOFWCiIgAAAohhNDkgCZNmnzwH5F//vlHowDu7u6oW7culi5dCgBQqVRwdnbGkCFD4O/vn+kxSqUSX3zxBXr16oVjx44hJiYGf/31V7aeLy4uDpaWloiNjYWFhYVGWbPlx5JAwhPAzAno/1j75yeiHNmx4yY6dtyO8eMbYeRIDxgb63/8ICIiyndyq5fTePpEzZo1UaNGDemratWqSE1NxYULF+Dq6qrRuVJTU3H+/Hl4eXm9D6SjAy8vL5w6dSrL46ZOnQpbW1v07t37o8+RkpKCuLg4tS8iKryePIlD9+5/4ebNKLX6t99Wwr17wzBx4hdsiImIKAONp08sXLgw0/rkyZORkJCg0bmioqKgVCphZ2enVrezs8PNmzczPeb48eP45ZdfEBYWlq3nCAoKwpQpUzTKRUQFT0pKOhYu/BfTpx9FYmIaIiMTsHdvZ+mdLYVCATs7M5lTEhFRfqXxSHFWunTpgtWrV2vrdJmKj49H165d8dNPP8HGJnsftTpu3DjExsZKX48ePcrVjESUt4QQ+PvvcHz22XKMG3cQiYlpAICzZ5/i8WO+M0RERNmT4xvt/uvUqVMwMtLs7m0bGxvo6uoiMjJSrR4ZGQl7e/sM+9+5cwf3799Hy5YtpZpKpQIA6OnpITw8HOXKlVM7xtDQEIaGhhrlIqKCITw8CsOHhyIk5P0HCunoKPDDD3UwdWoTWFsby5iOiIgKEo2b4rZt26o9FkLg2bNnOHfunMYf3mFgYAA3NzccPHhQWlZNpVLh4MGDGDx4cIb9K1eujCtXrqjVJk6ciPj4eCxatAjOzs6avRgiKpDi4lIwdeoRLFp0GunpKqneuLELFi3yQfXqdh84moiIKCONm2JLS0u1xzo6OqhUqRKmTp2K5s2baxzAz88P3bt3R506dVCvXj0EBwcjMTERPXv2BAB069YNTk5OCAoKgpGREapVq6Z2fLFixQAgQ52ICq/WrTfh0KH70mNnZwvMn98c331XlUusERFRjmjUFCuVSvTs2ROurq6wsrLSSgBfX1+8fPkSAQEBeP78OWrWrImQkBDp5ruHDx9CR0drU5+JqBAYN64hDh26D0NDXYwd2wBjxzaEiQlXlCAiopzTeJ1iIyMj3LhxA2XKlMmtTLmK6xQTFSzPnycgMTEV5cpZq9Xnzj2B776rijJltPMHOhERFQz5Zp3iatWq4e7du1oLQESUmdRUJebNO4mKFZegV6+d+O/f76NHN2BDTEREWqNxUzx9+nSMGjUKu3btwrNnz/jBGESkdXv33oKr6wqMHr0f8fGpOHr0AbZsuSZ3LCIiKsSyPad46tSpGDlyJL766isAwLfffqt2Q4sQAgqFAkqlUvspiahIuH07GiNGhGLXrgipplAAffvWRtOmBXPKFhERFQzZboqnTJmCAQMG4NChQ7mZh4iKoPj4FMyYcQwLF/6L1NT3f1g3aOCMxYtboHZtBxnTERFRUZDtpvjdfD5PT89cC0NERc+ePbfQp89OPHv2/mPinZzMMWfOl+jYsRqXWCMiojyh0ZJs/MeJiLTNwsJQaogNDHQxapQHxo1rBDMzA5mTERFRUaJRU1yxYsWPNsbR0dGfFIiICrd39x+807BhKXTsWA1v3qRh/vzmGZZeIyIiygsaNcVTpkzJ8Il2RETZkZamxPLlZ7Fnz23s3dsZOjrvG+O1a1vDwEBXxnRERFTUadQUf//997C1tc2tLERUSB04cBfDhoXg+vWXAIBff72EHj1qStvZEBMRkdyy3RRzPjERaerevdcYOXIf/vzzplr9xo2XMiUiIiLKnMarTxARfUxiYipmzTqOuXNPIiXl/RJr7u5OWLy4BerVc5IxHRERUUbZbopVKlVu5iCiQkAIgc2br2H06P14/Pj9J1za25th9mwvdOlSXW0uMRERUX6h0ZxiIqIPuXEjCp06bce7N5b09XUwfPjnmDjxC1hYGMobjoiI6AN05A5ARIVH1aol0Lt3LQBAixblcfXqQMyZ8yUbYiIiyvfYFBNRjqSnq7B+/SWkp6tPrZoxoxn+/rsj9uzpjIoVi8uUjoiISDOcPkFEGjt8+D6GDt2LK1deIC4uBYMG1ZO22dqa4pt
vKsqYjoiISHMcKSaibHv4MBYdOmxFkybrcOXKCwBAQMBhJCamypyMiIjo03CkmIg+KikpDXPnnsSsWceRlJQu1d3cHLBkSQuYmhrImI6IiOjTsSkmoiwJIfDHHzcwcuQ+PHgQK9VLlDBBUFAz9OxZi0usERFRocCmmIgypVSq8PXXGxEaekeq6eoqMHSoOwICPFGsmJGM6YiIiLSLc4qJKFO6ujqoVOn96hFeXmVx+fIPWLDAmw0xEREVOhwpJiIAb0eGVSoBfX1dqTZ5cmOcOvUY48c3QqtWlaBQcKoEEREVThwpJiKcOPEQdev+hODgf9XqVlbGOH26D1q3rsyGmIiICjWOFBMVYU+exGHMmAPYuPEKAODWrWh06VIdDg7m0j5shomIqChgU0xUBCUnp2PBglOYOfMYEhPTpHq5clZ49SpJrSkmIiIqCtgUExUhQgj8/XcERowIxd27r6V68eLGmD69Kfr2rQ1dXc6qIiKioodNMVERcePGSwwfHop9+94vsaajo8DAgXUwZUoTWFsby5iOiIhIXmyKiYqIX3+9pNYQN2nigkWLfODqaidjKiIiovyB75MSFRHjxzeCg4MZSpWyxNat7XHwYDc2xERERP+PI8VEhdC//z7GzZtR6NGjplQzNzfE3r2dUaFCcZiY6MsXjoiIKB9iU0xUiDx7Fo9x4w5i3bpLMDbWQ9OmZVCqlKW0vUYNexnTERER5V+cPkFUCKSmKjF37glUrLgU69ZdAgAkJaVj8eLTMicjIiIqGDhSTFTA7dlzCyNGhCIi4pVUK1bMCNOmNcGAAXVkTEZERFRwsCkmKqBu3XqFESNCsXv3LammUAD9+rlh+vSmsLExkTEdERFRwcKmmKgAOnjwLlq02IC0NJVUa9iwFBYv9kGtWg4yJiMiIiqY2BQTFUD16zvD0dEcDx7EwsnJHHPnfonvv68GhUIhdzQiIqICiU0xUQHw/HkC7O3NpMfGxvpYtMgHZ88+hb9/Q5iZGciYjoiIqODj6hNE+diLF4no02cnXFyCcevWK7VtrVpVxvTpTdkQExERaQGbYqJ8KC1NieDgf1GhwhL88stFpKQoMWJEqNyxiIiICi1OnyDKZ/bvv4Nhw0Jw40aUVLOwMESzZmWgUgno6HDeMBERkbaxKSbKJ+7efY2RI/fhr79uSjWFAujVqxZmzGgKOzuzDxxNREREn4JNMZHMkpPTMWPGUcydexIpKUqp/vnnJbF4sQ/q1nWSMR0REVHRwKaYSGY6Ogps3Xpdaojt7c0we7YXunSpzqkSREREeYQ32hHJzMBAF8HBPtDX18GYMfURETEY3brVYENMRESUh9gUE+WhqKg3+OGHXbh69YVa3cenPO7eHYbZs7+EubmhTOmIiIiKLk6fIMoD6ekqrFx5DpMmHUJMTDIiIqJx4EBXtU+gK1nSQsaERERERRubYqJc9s8/9zBsWIja6PCZM09w585rlC9vLWMyIiIieofTJ4hyyYMHMWjffiuaNftVrSHu3r0GIiIGsyEmIiLKRzhSTKRlb96kYc6cE5g9+wSSk9Olep06jliypAU+/7ykjOmIiIgoM2yKibTM13cbdu2KkB7b2poiKKgZevSoyRUliIiI8ilOnyDSslGjPAAAeno68PP7HBERg9GrVy02xERERPkYR4qJPsHr10mIjk5CuXLv5wd7erpg9mwvtGxZEVWqlJAxHREREWUXR4qJckCpVOHHH8+hQoUl6Nr1T6hUQm37mDEN2BATEREVIGyKiTR0/PhD1KnzEwYM2I1Xr5Jw6tRjbNhwWe5YRERE9Ak4fYIomx4/jsOYMfvx++9X1eodO1ZDkyZlZEpFRERE2sCmmOgjkpPTMX/+ScyceRxv3qRJ9Zo17bF4sQ8aNSotYzoiIiLSBjbFRB9w+PB99Oq1A/fuxUi14sWNMWNGU/TpUxu6upyBREREVBiwKSb6ABMTfakh1tVVYODAupg8uTGsrY3lDUZERERaxaaY6APq1XNCjx418eBBDBYt8oGrq53ckYiIiCgXsCkmAqBSCaxZcxFbtlzHnj2d1KZFrFjxNQwNdaFQ8MM3iIiICitOiKQi79SpR3B3/xl9+vyNffvu4OefL6htNzLSY0NMRERUyHGk+GPCtwInA4DU+Oztn/gsd/OQ1jx7Fo+xYw9g/Xr1NYbDwp7LlIiIiIjkki9GipctWwYXFxcYGRnB3d0dZ86cyXLfn376CY0aNYKVlRWsrKzg5eX1wf0/2ckAIPomkPAke19C9fY4A/Pcy0SfJCUlHXPmnEDFikvVGmJXV1scOtQdK1Z8I2M6IiIikoPsI8WbN2+Gn58fVq5cCXd3dwQHB8Pb2xvh4eGwtbXNsP/hw4fRsWNH1K9fH0ZGRpg9ezaaN2+Oa9euwcnJSfsB340QK3QAU4fsHWNgDjSYpv0s9Ml2747A8OGhuH07WqpZWRlh2rQm6N+/DvT08sXfiURERJTHFEIIIWcAd3d31K1bF0uXLgUAqFQqODs7Y8iQIfD39//o8UqlElZWVli6dCm6dev20f3j4uJgaWmJ2NhYWFhYfDzgjyXfjgCbOQH9H398f8q37tyJRsWKS6FSvf2R19FRoH9/N0yb1gTFi5vInI6IiIiyQ+NeLptkHRZLTU3F+fPn4eXlJdV0dHTg5eWFU6dOZescb968QVpaGqytrTPdnpKSgri4OLUvKprKlbPGgAFuAIAvviiNCxf6Yfnyr9kQExERkbxNcVRUFJRKJezs1Nd+tbOzw/Pn2bvZaezYsXB0dFRrrP9XUFAQLC0tpS9nZ+dPzk35n0olsHXrNaSlKdXqU6c2waZN7XD4cHfUqGEvUzoiIiLKbwr0BMpZs2Zh06ZN+PPPP2FkZJTpPuPGjUNsbKz09ejRozxOSXnt7NknqF//F3TosA1Ll6rfhFm8uAl8fatxiTUiIiJSI2tTbGNjA11dXURGRqrVIyMjYW//4VG8efPmYdasWdi3bx+qV6+e5X6GhoawsLBQ+6LCKTIyAb1770C9ej/j9OknAIDJk48gNjZZ5mRERESU38naFBsYGMDNzQ0HDx6UaiqVCgcPHoSHh0eWx82ZMwfTpk1DSEgI6tSpkxdRKR9LTVViwYJTqFhxKVavDpPqVarYYNu29rC0zPxdBCIiIqJ3ZF+Szc/PD927d0edOnVQr149BAcHIzExET179gQAdOvWDU5OTggKCgIAzJ49GwEBAdi4cSNcXFykucdmZmYwMzOT7XWQPPbtu4Nhw0Jw82aUVLO0NMSUKY0xcGBd6OvryheOiIiICgzZm2JfX1+8fPkSAQEBeP78OWrWrImQkBDp5ruHDx9CR+f9gPaKFSuQmpqK7777Tu08gYGBmDx5cl5GJxkJIdChwzZs23ZdqikUQO/etTBjRjPY2prKmI6IiIgKGtmbYgAYPHgwBg8enOm2w4cPqz2+f/9+7geifE+hUMDFxVJ67OFREkuWtICbm6OMqYiIiKigyhdNMdHHCCGgVAq1T5ybNMkT//xzHyNGfI7OnV25ogQRERHlWIFeko2KhosXn+
GLL9Zi9uzjanULC0OcO9cXXbpUZ0NMREREn4QjxZRvRUW9wcSJ/2DVqvMQArhw4Rm6dasBZ+f30ybYDBMREZE2sCmmfCc9XYUVK84iIOAwYmLerzFcsqQFXrxIVGuKiYiIiLSBTTHlK//8cw/DhoXg6tUXUs3c3AABAZ4YOtQdBgZcYo2IiIi0j00x5Qv378dg1Kh92L79hlq9R4+aCApqBnt7rkFNREREuYdNMeULa9eGqTXEdes6YsmSFnB3LyljKiIiIioquPoE5QtjxjSAs7MFbG1NsXr1t/j33z5siImIiCjPcKSY8tyVK5E4f/4ZevSoKdVMTPSxY8f3KFvWCpaWRvKFIyIioiKJTTHlmejoJAQEHMKKFeegp6eDRo1KoVw5a2l7rVoOMqYjIiKioozTJyjXKZVvl1irUGEJli07C5VKIDVViblzT8odjYiIiAgAR4oplx09+gBDh+7FpUuRUs3UVB8TJjTCiBEeMiYjIiIieo9NMeWKR49iMXr0fmzefE2t3rmzK2bP9oKTk4VMyYiIiIgyYlNMWnfy5CN4ef2KpKR0qVa7tgMWL/ZBgwalZExGRERElDnOKSatc3NzkEaCbWxMsGrVNzhzpg8bYiIiIsq3OFJMnywq6g1sbEykx4aGeli0yAf79t1BYKAnrKyMZUxHRERE9HEcKaYci4lJxogRIXB2Xojr11+qbfvqqwoIDvZhQ0xEREQFApti0phSqcLPP19AxYpLEBx8GsnJ6Rg+PARCCLmjEREREeUIp0+QRk6efIShQ/fi/PlnUs3YWA8NG5aCSiWgq6uQMR0RERFRzrAppmx5+jQeY8cewG+/XVard+jwGebO/RKlSlnKlIyIiIjo07Eppg9KS1NiwYJTmDbtKBIT06S6q6stlixpAU9PF/nCEREREWkJm2L6IIVCgfXrL0sNsZWVEaZPb4p+/dygp8cp6URERFQ4sKuhD9LT08HixS2go6PAwIF1cOvWEAwcWJcNMRERERUqHCkmSVxcCqZPP4rvv6+G2rUdpHrTpmVw585QuLgUky8cERERUS5iU0xQqQTWr78Ef/+DeP48ASdPPsKxYz2hULxfSYINMRERERVmbIqLuLNnn2DIkL04ffqJVDt37imuXn0BV1c7GZMRERER5R1ODC2iIiMT0KvXDtSr97NaQ9y2bRXcuDGIDTEREREVKRwpLmJSU5VYuvQMpkw5gri4FKletWoJLFrkAy+vsjKmIyIiIpIHm+IipkePv/D771elx5aWhpgypTEGDqwLfX1d+YIRERERyYjTJ4qYIUPqAQAUCqBv39q4dWsIhg37nA0xERERFWkcKS7EEhJSERmZgHLlrKWah4czgoKa4csvy8LNzVHGdERERET5B0eKCyEhBDZsuIxKlZbC13cbVCqhtt3fvyEbYiIiIqL/waa4kDl//ikaNlyDLl3+xNOn8Th//hnWrLkodywiIiKifI3TJwqJly8TMWHCP/j55wsQ/zMw3LJlRXh6usiWi4iIiKggYFNcwKWlKbF8+VkEBh5GbOz7JdYqVSqO4GAf+PiUlzEdEVHhIoRAeno6lEql3FGICjV9fX3o6ubtIgBsiguw06cfo1evnbh+/aVUMzc3QGCgJ4YMcYeBAVeUICLSltTUVDx79gxv3ryROwpRoadQKFCyZEmYmZnl2XOyKS7ADAx0cePG+4a4Z8+amDmzGezt8+4HiIioKFCpVLh37x50dXXh6OgIAwMDKBQKuWMRFUpCCLx8+RKPHz9GhQoV8mzEmE1xAVarlgP69XNDWNhzLF7cAvXqOckdiYioUEpNTYVKpYKzszNMTEzkjkNU6JUoUQL3799HWloam2J6TwiBrVuvY+3aMOzc2RF6eu8XDVmwwBtGRnrQ0eGIBRFRbtPR4aJNRHlBjndi+Nudz12+HIkmTdbB13cb9u69jRUrzqptNzHRZ0NMRERE9InYFOdTr169waBBu1Gr1o84cuSBVD958rGMqYiIiIgKJzbF+Ux6ugrLl59FxYpLsXz5OenT6MqVs8Lff3fExo1tZU5IRERUNISHh8Pe3h7x8fFyRyl0Pv/8c2zfvl3uGGrYFOcjR47ch5vbKgwatAfR0UkAAFNTfQQFNcO1awPxzTcVebczERFppEePHlAoFFAoFNDX10eZMmUwZswYJCcnZ9h3165d8PT0hLm5OUxMTFC3bl2sXbs20/Nu374djRs3hqWlJczMzFC9enVMnToV0dHRufyK8s64ceMwZMgQmJubyx0l1yxbtgwuLi4wMjKCu7s7zpw589FjgoODUalSJRgbG8PZ2RkjRoxQ+3mKj4/H8OHDUbp0aRgbG6N+/fo4e1Z9+ufEiRPh7+8PlUql9deUU2yK84mnT+Ph5bUely9HSrUuXaojPHww/P0bwtCQ90QSEVHO+Pj44NmzZ7h79y4WLlyIH3/8EYGBgWr7LFmyBK1atUKDBg1w+vRpXL58Gd9//z0GDBiAUaNGqe07YcIE+Pr6om7duti7dy+uXr2K+fPn49KlS1i/fn2eva7U1NRcO/fDhw+xa9cu9OjR45POk5sZP9XmzZvh5+eHwMBAXLhwATVq1IC3tzdevHiR5TEbN26Ev78/AgMDcePGDfzyyy/YvHkzxo8fL+3Tp08f7N+/H+vXr8eVK1fQvHlzeHl54cmTJ9I+LVq0QHx8PPbu3Zurr1EjooiJjY0VAERsbGz2DljpJMQ8vP3fXDZiRIgAJovatX8Ux48/yPXnIyKi7ElKShLXr18XSUlJckfRWPfu3UWrVq3Uam3bthW1atWSHj98+FDo6+sLPz+/DMcvXrxYABD//vuvEEKI06dPCwAiODg40+d7/fp1llkePXokvv/+e2FlZSVMTEyEm5ubdN7Mcg4bNkx4enpKjz09PcWgQYPEsGHDRPHixUXjxo1Fx44dRYcOHdSOS01NFcWLFxfr1q0TQgihVCrFzJkzhYuLizAyMhLVq1cXW7duzTKnEELMnTtX1KlTR60WFRUlvv/+e+Ho6CiMjY1FtWrVxMaNG9X2ySyjEEJcuXJF+Pj4CFNTU2Frayu6dOkiXr58KR23d+9e0aBBA2FpaSmsra3F119/LW7fvv3BjJ+qXr16YtCgQdJjpVIpHB0dRVBQUJbHDBo0SDRt2lSt5ufnJxo0aCCEEOLNmzdCV1dX7Nq1S22f2rVriwkTJqjVevbsKbp06ZLp83zod07jXi6bOPwoAyEEdu2KQPPm5dRGgAMDPVGtmi26d68BXV0O4hMR5Xu/1QESn+ftc5raA13O5fjwq1ev4uTJkyhdurRU27ZtG9LS0jKMCANA//79MX78ePz+++9wd3fHhg0bYGZmhoEDB2Z6/mLFimVaT0hIgKenJ5ycnLBz507Y29vjwoULGr99vm7dOvzwww84ceIEAOD27dto3749EhISpE8/Cw0NxZs3b9CmTRsAQFBQEH777TesXLkSFSpUwNGjR9GlSxeUKFECnp6emT7PsWPHUKdOHbVacnIy3NzcMHbsWFhYWGD37t3o2rUrypUrh3r16mWZMSYmB
k2bNkWfPn2wcOFCJCUlYezYsejQoQP++ecfAEBiYiL8/PxQvXp1JCQkICAgAG3atEFYWFiWSwHOnDkTM2fO/OD36/r16yhVqlSGempqKs6fP49x48ZJNR0dHXh5eeHUqVNZnq9+/fr47bffcObMGdSrVw93797Fnj170LVrVwCQPgbdyMhI7ThjY2McP35crVavXj3MmjXrg/nzEpviPHbt2gsMGxaCgwfvISioGfz9G0rbLC2N0KtXLRnTERGRRhKfAwlPPr6fzHbt2gUzMzOkp6cjJSUFOjo6WLp0qbQ9IiIClpaWcHBwyHCsgYEBypYti4iICADArVu3ULZsWejr62uUYePGjXj58iXOnj0La2trAED58uU1fi0VKlTAnDlzpMflypWDqakp/vzzT6kx27hxI7799luYm5sjJSUFM2fOxIEDB+Dh4QEAKFu2LI4fP44ff/wxy6b4wYMHGZpiJycntT8chgwZgtDQUGzZskWtKf5vxunTp6NWrVpqDezq1avh7OyMiIgIVKxYEe3atVN7rtWrV6NEiRK4fv06qlWrlmnGAQMGoEOHDh/8fjk6OmZaj4qKglKphJ2dnVrdzs4ON2/ezPJ8nTp1QlRUFBo2bAghBNLT0zFgwABp+oS5uTk8PDwwbdo0VKlSBXZ2dvj9999x6tSpDNfb0dERjx49gkqlyhdrgLMpziOvXydh8uTDWLbsLJTKtytKTJ9+FL161YKtranM6YiIKEdM7QvEczZp0gQrVqxAYmIiFi5cCD09vQxNWHYJIXJ0XFhYGGrVqiU1xDnl5uam9lhPTw8dOnTAhg0b0LVrVyQmJmLHjh3YtGkTgLcjyW/evMGXX36pdlxqaipq1cp6ICopKSnDaKdSqcTMmTOxZcsWPHnyBKmpqUhJScnwKYf/zXjp0iUcOnRIGsn+X3fu3EHFihVx69YtBAQE4PTp04iKipJG0B8+fJhlU2xtbf3J309NHT58GDNnzsTy5cvh7u6O27dvY9iwYZg2bRomTZoEAFi/fj169eoFJycn6Orqonbt2ujYsSPOnz+vdi5jY2OoVCqkpKTA2Ng4T19HZtgU5zKlUoXVqy9i/Ph/EBX1Rqq7uBTDwoXeKFGCHxdKRFRgfcI0hrxkamoqjdKtXr0aNWrUwC+//ILevXsDACpWrIjY2Fg8ffo0w8hiamoq7ty5gyZNmkj7Hj9+HGlpaRqNFn+s6dHR0cnQcKelpWX6Wv6rc+fO8PT0xIsXL7B//34YGxvDx8cHwNtpGwCwe/duODk5qR1naGiYZR4bGxu8fv1arTZ37lwsWrQIwcHBcHV1hampKYYPH57hZrr/ZkxISEDLli0xe/bsDM/zbnS+ZcuWKF26NH766Sc4OjpCpVKhWrVqH7xR71OmT9jY2EBXVxeRkZFq9cjISNjbZ/2H16RJk9C1a1f06dMHAODq6orExET069cPEyZMgI6ODsqVK4cjR44gMTERcXFxcHBwgK+vL8qWLat2rujoaJiamuaLhhjg6hO56sSJh6hb9yf067dLaoiNjfUwbVoTXL8+EK1bV+YSa0RElKd0dHQwfvx4TJw4EUlJb5f/bNeuHfT19TF//vwM+69cuRKJiYno2LEjgLdvnyckJGD58uWZnj8mJibTevXq1REWFpblkm0lSpTAs2fP1GphYWHZek3169eHs7MzNm/ejA0bNqB9+/ZSw161alUYGhri4cOHKF++vNqXs7NzluesVasWrl+/rlY7ceIEWrVqhS5duqBGjRpq00o+pHbt2rh27RpcXFwyZDA1NcWrV68QHh6OiRMnolmzZqhSpUqGhjwzAwYMQFhY2Ae/spo+YWBgADc3Nxw8eFCqqVQqHDx4UJpmkpk3b95kmOqgq6sLIOO7CKampnBwcMDr168RGhqKVq1aqW2/evXqB0fr8xpHinOBEAJ9+uzE6tVhavXvv6+GOXO84OxsKU8wIiIiAO3bt8fo0aOxbNkyjBo1CqVKlcKcOXMwcuRIGBkZoWvXrtDX18eOHTswfvx4jBw5Eu7u7gAAd3d3jBkzBiNHjsSTJ0/Qpk0bODo64vbt21i5ciUaNmyIYcOGZXjOjh07YubMmWjdujWCgoLg4OCAixcvwtHRER4eHmjatCnmzp2LX3/9FR4eHvjtt980apo6deqElStXIiIiAocOHZLq5ubmGDVqFEaMGAGVSoWGDRsiNjYWJ06cgIWFBbp3757p+by9vdGnTx8olUqp6atQoQK2bduGkydPwsrKCgsWLEBkZCSqVq36wWyDBg3CTz/9hI4dO2LMmDGwtrbG7du3sWnTJvz888+wsrJC8eLFsWrVKjg4OODhw4fw9/f/6Gv+1OkTfn5+6N69O+rUqYN69eohODgYiYmJ6Nmzp7RPt27d4OTkhKCgIABvR7QXLFiAWrVqSdMnJk2ahJYtW0rfp9DQUAghUKlSJdy+fRujR49G5cqV1c4LvL2ZsXnz5jnOr3VaXcuiAMirJdnGjz8ggMkCmCyqV18hjhy5n4O0RESUHxS2JdmEECIoKEiUKFFCJCQkSLUdO3aIRo0aCVNTU2FkZCTc3NzE6tWrMz3v5s2bxRdffCHMzc2FqampqF69upg6deoHl2S7f/++aNeunbCwsBAmJiaiTp064vTp09L2gIAAYWdnJywtLcWIESPE4MGDMyzJNmzYsEzPff36dQFAlC5dWqhUKrVtKpVKBAcHi0qVKgl9fX1RokQJ4e3tLY4cOZJl1rS0NOHo6ChCQkKk2qtXr0SrVq2EmZmZsLW1FRMnThTdunVT+/5mlTEiIkK0adNGFCtWTBgbG4vKlSuL4cOHS1n3798vqlSpIgwNDUX16tXF4cOHBQDx559/ZplRG5YsWSJKlSolDAwMRL169aQl8v739XTv3l16nJaWJiZPnizKlSsnjIyMhLOzsxg4cKDadd+8ebMoW7asMDAwEPb29mLQoEEiJiZG7byPHz8W+vr64tGjR5nmkmNJNoUQOZwxX0DFxcXB0tISsbGxsLCw+PgBP5Z8e2exmRPQ/3GmuwghoFIJtWXUEhJS0aDBagwY4Ia+fd2gp8eZKkREBVVycjLu3buHMmXKZLj5igqvZcuWYefOnQgNDZU7SqEzduxYvH79GqtWrcp0+4d+5zTu5bKJ0yc+0c2bURg+PATu7k6YMqWJVDczM0BYWH/OGSYiIiqg+vfvj5iYGMTHxxfqj3qWg62tLfz8/OSOoYZNcQ7FxiZj2rSjWLToNNLTVThy5AF69qwFF5di0j5siImIiAouPT09TJgwQe4YhdLIkSPljpABm2INqVQC69aFwd//IF68SJTqJUqY4PHjOLWmmIiIiIgKBjbFGjh9+jGGDNmLs2efSjUjIz2MHdsAY8Y0gImJZp/uQ0RERET5A5vibHgeZwb/bZ5Y9+8vavV27apg3rzmHB0mIioiiti96USykeN3reg2xasrA8bZWBEi8RnWnquPdf9WlEqffVYCixe3QNOmZXIxIBER5RfvPgjizZs3+ebTt4gKs3ef5Pdu7eO8UHSb4sRngDJ7uw5v
9C9+Ovs5otNsMHVqY/zwQ10usUZEVITo6uqiWLFiePHiBQDAxMSEN1MT5RKVSoWXL1/CxMQEenp516oW3aZYoQDMMn704e0XFjhxxw7dPW5JNSMDc2z9sTqcG7ZFiRIZP3OdiIgKP3t7ewCQGmMiyj06OjooVapUnv7xWXSbYhN7tQ/jSEhIxYwZR7Fgwb9QqQTc/beicmUbaXttOTISEVG+oVAo4ODgAFtbW6Slpckdh6hQMzAwgI5O3r4rX3Sb4v8nhMCGDVcwduwBPH0aL9VnzDiG9evbyJiMiIjyI11d3Tyd50hEeSNfTIxdtmwZXFxcYGRkBHd3d5w5c+aD+2/duhWVK1eGkZERXF1dsWfPnhw97/nzT9Gw4Rp07fqn1BAbGOhi3LiGWLHi6xydk4iIiIgKHtmb4s2bN8PPzw+BgYG4cOECatSoAW9v7yznbJ08eRIdO3ZE7969cfHiRbRu3RqtW7fG1atXNXreoZs9ULfuTzh58pFU+/bbSrh+fSBmzmwGMzODT3pdRERERFRwKITMiy66u7ujbt26WLp0KYC3dxw6OztjyJAh8Pf3z7C/r68vEhMTsWvXLqn2+eefo2bNmli5cuVHny8uLg6WlpYA/AEYAQAqVSqORYt84O1dXiuviYiIiIhyx7teLjY2FhYWFlo7r6xzilNTU3H+/HmMGzdOquno6MDLywunTp3K9JhTp07Bz89Prebt7Y2//vor0/1TUlKQkpIiPY6NjX23BWZmBhg3rhH69XODgYEu4uLiPun1EBEREVHuetevaXtcV9amOCoqCkqlEnZ2dmp1Ozs73Lx5M9Njnj9/nun+z58/z3T/oKAgTJkyJZMtC5GQAEyY8PaLiIiIiAqOV69e/f+7/9pR6FefGDdunNrIckxMDEqXLo2HDx9q9RtJ+VtcXBycnZ3x6NEjrb7VQvkXr3nRw2teNPG6Fz2xsbEoVaoUrK2ttXpeWZtiGxsb6OrqIjIyUq0eGRkpLZL+X/b29hrtb2hoCENDwwx1S0tL/vIUQRYWFrzuRQyvedHDa1408boXPdpex1jW1ScMDAzg5uaGgwcPSjWVSoWDBw/Cw8Mj02M8PDzU9geA/fv3Z7k/EREREdHHyD59ws/PD927d0edOnVQr149BAcHIzExET179gQAdOvWDU5OTggKCgIADBs2DJ6enpg/fz6+/vprbNq0CefOncOqVavkfBlEREREVIDJ3hT7+vri5cuXCAgIwPPnz1GzZk2EhIRIN9M9fPhQbXi8fv362LhxIyZOnIjx48ejQoUK+Ouvv1CtWrVsPZ+hoSECAwMznVJBhReve9HDa1708JoXTbzuRU9uXXPZ1ykmIiIiIpKb7J9oR0REREQkNzbFRERERFTksSkmIiIioiKPTTERERERFXmFsiletmwZXFxcYGRkBHd3d5w5c+aD+2/duhWVK1eGkZERXF1dsWfPnjxKStqkyXX/6aef0KhRI1hZWcHKygpeXl4f/Tmh/EfT3/V3Nm3aBIVCgdatW+duQNI6Ta95TEwMBg0aBAcHBxgaGqJixYr8b3wBpOl1Dw4ORqVKlWBsbAxnZ2eMGDECycnJeZSWPtXRo0fRsmVLODo6QqFQ4K+//vroMYcPH0bt2rVhaGiI8uXLY+3atZo/sShkNm3aJAwMDMTq1avFtWvXRN++fUWxYsVEZGRkpvufOHFC6Orqijlz5ojr16+LiRMnCn19fXHlypU8Tk6fQtPr3qlTJ7Fs2TJx8eJFcePGDdGjRw9haWkpHj9+nMfJKac0vebv3Lt3Tzg5OYlGjRqJVq1a5U1Y0gpNr3lKSoqoU6eO+Oqrr8Tx48fFvXv3xOHDh0VYWFgeJ6dPoel137BhgzA0NBQbNmwQ9+7dE6GhocLBwUGMGDEij5NTTu3Zs0dMmDBB/PHHHwKA+PPPPz+4/927d4WJiYnw8/MT169fF0uWLBG6uroiJCREo+ctdE1xvXr1xKBBg6THSqVSODo6iqCgoEz379Chg/j666/Vau7u7qJ///65mpO0S9Pr/l/p6enC3NxcrFu3Lrcikpbl5Jqnp6eL+vXri59//ll0796dTXEBo+k1X7FihShbtqxITU3Nq4iUCzS97oMGDRJNmzZVq/n5+YkGDRrkak7KHdlpiseMGSM+++wztZqvr6/w9vbW6LkK1fSJ1NRUnD9/Hl5eXlJNR0cHXl5eOHXqVKbHnDp1Sm1/APD29s5yf8p/cnLd/+vNmzdIS0uDtbV1bsUkLcrpNZ86dSpsbW3Ru3fvvIhJWpSTa75z5054eHhg0KBBsLOzQ7Vq1TBz5kwolcq8ik2fKCfXvX79+jh//rw0xeLu3bvYs2cPvvrqqzzJTHlPW72c7J9op01RUVFQKpXSp+G9Y2dnh5s3b2Z6zPPnzzPd//nz57mWk7QrJ9f9v8aOHQtHR8cMv1SUP+Xkmh8/fhy//PILwsLC8iAhaVtOrvndu3fxzz//oHPnztizZw9u376NgQMHIi0tDYGBgXkRmz5RTq57p06dEBUVhYYNG0IIgfT0dAwYMADjx4/Pi8gkg6x6ubi4OCQlJcHY2Dhb5ylUI8VEOTFr1ixs2rQJf/75J4yMjOSOQ7kgPj4eXbt2xU8//QQbGxu541AeUalUsLW1xapVq+Dm5gZfX19MmDABK1eulDsa5aLDhw9j5syZWL58OS5cuIA//vgDu3fvxrRp0+SORvlcoRoptrGxga6uLiIjI9XqkZGRsLe3z/QYe3t7jfan/Ccn1/2defPmYdasWThw4ACqV6+emzFJizS95nfu3MH9+/fRsmVLqaZSqQAAenp6CA8PR7ly5XI3NH2SnPyeOzg4QF9fH7q6ulKtSpUqeP78OVJTU2FgYJCrmenT5eS6T5o0CV27dkWfPn0AAK6urkhMTES/fv0wYcIE6OhwPLCwyaqXs7CwyPYoMVDIRooNDAzg5uaGgwcPSjWVSoWDBw/Cw8Mj02M8PDzU9geA/fv3Z7k/5T85ue4AMGfOHEybNg0hISGoU6dOXkQlLdH0mleuXBlXrlxBWFiY9PXtt9+iSZMmCAsLg7Ozc17GpxzIye95gwYNcPv2bekPIACIiIiAg4MDG+ICIifX/c2bNxka33d/GL29b4sKG631cprdA5j/bdq0SRgaGoq1a9eK69evi379+olixYqJ58+fCyGE6Nq1q/D395f2P3HihNDT0xPz5s0TN27cEIGBgVySrQDS9LrPmjVLGBgYiG3btolnz55JX/Hx8XK9BNKQptf8v7j6RMGj6TV/+PChMDc3F4MHDxbh4eFi165dwtbWVkyfPl2ul0A5oOl1DwwMFObm5uL3338Xd+/eFfv27RPlypUTHTp0kOslkIbi4+PFxYsXxcWLFwUAsWDBAnHx4kXx4MEDIYQQ/v7+omvXrtL+75ZkGz16tLhx44ZYtmwZl2R7Z8mSJaJUqVLCwMBA1KtXT/z777/SNk9PT9G9e3e1/bds2SIqVqwoDAwMxGeffSZ2796dx4lJGzS57qVLlxYAMnwFBgbmfXDKMU1/1/8Xm+KCSdN
rfvLkSeHu7i4MDQ1F2bJlxYwZM0R6enoep6ZPpcl1T0tLE5MnTxblypUTRkZGwtnZWQwcOFC8fv0674NTjhw6dCjTf6PfXefu3bsLT0/PDMfUrFlTGBgYiLJly4o1a9Zo/LwKIfheAhEREREVbYVqTjERERERUU6wKSYiIiKiIo9NMREREREVeWyKiYiIiKjIY1NMREREREUem2IiIiIiKvLYFBMRERFRkcemmIiIiIiKPDbFRET/sXbtWhQrVkzuGDmmUCjw119/fXCfHj16oHXr1nmSh4ioIGBTTESFUo8ePaBQKDJ83b59W+5oWLt2rZRHR0cHJUuWRM+ePfHixQutnP/Zs2do0aIFAOD+/ftQKBQICwtT22fRokVYu3atVp4vK5MnT5Zep66uLpydndGvXz9ER0drdB428ESUF/TkDkBElFt8fHywZs0atVqJEiVkSqPOwsIC4eHhUKlUuHTpEnr27ImnT58iNDT0k89tb2//0X0sLS0/+Xmy47PPPsOBAwegVCpx48YN9OrVC7Gxsdi8eXOePD8RUXZxpJiICi1DQ0PY29urfenq6mLBggVwdXWFqakpnJ2dMXDgQCQkJGR5nkuXLqFJkyYwNzeHhYUF3NzccO7cOWn78ePH0ahRIxgbG8PZ2RlDhw5FYmLiB7MpFArY29vD0dERLVq0wNChQ3HgwAEkJSVBpVJh6tSpKFmyJAwNDVGzZk2EhIRIx6ampmLw4MFwcHCAkZERSpcujaCgILVzv5s+UaZMGQBArVq1oFAo0LhxYwDqo6+rVq2Co6MjVCqVWsZWrVqhV69e0uMdO3agdu3aMDIyQtmyZTFlyhSkp6d/8HXq6enB3t4eTk5O8PLyQvv27bF//35pu1KpRO/evVGmTBkYGxujUqVKWLRokbR98uTJWLduHXbs2CGNOh8+fBgA8OjRI3To0AHFihWDtbU1WrVqhfv3738wDxFRVtgUE1GRo6Ojg8WLF+PatWtYt24d/vnnH4wZMybL/Tt37oySJUvi7NmzOH/+PPz9/aGvrw8AuHPnDnx8fNCuXTtcvnwZmzdvxvHjxzF48GCNMhkbG0OlUiE9PR2LFi3C/PnzMW/ePFy+fBne3t749ttvcevWLQDA4sWLsXPnTmzZsgXh4eHYsGEDXFxcMj3vmTNnAAAHDhzAs2fP8Mcff2TYp3379nj16hUOHTok1aKjoxESEoLOnTsDAI4dO4Zu3bph2LBhuH79On788UesXbsWM2bMyPZrvH//PkJDQ2FgYCDVVCoVSpYsia1bt+L69esICAjA+PHjsWXLFgDAqFGj0KFDB/j4+ODZs2d49uwZ6tevj7S0NHh7e8Pc3BzHjh3DiRMnYGZmBh8fH6SmpmY7ExGRRBARFULdu3cXurq6wtTUVPr67rvvMt1369atonjx4tLjNWvWCEtLS+mxubm5WLt2babH9u7dW/Tr10+tduzYMaGjoyOSkpIyPea/54+IiBAVK1YUderUEUII4ejoKGbMmKF2TN26dcXAgQOFEEIMGTJENG3aVKhUqkzPD0D8+eefQggh7t27JwCIixcvqu3TvXt30apVK+lxq1atRK9evaTHP/74o3B0dBRKpVIIIUSzZs3EzJkz1c6xfv164eDgkGkGIYQIDAwUOjo6wtTUVBgZGQkAAoBYsGBBlscIIcSgQYNEu3btssz67rkrVaqk9j1ISUkRxsbGIjQ09IPnJyLKDOcUE1Gh1aRJE6xYsUJ6bGpqCuDtqGlQUBBu3ryJuLg4pKenIzk5GW/evIGJiUmG8/j5+aFPnz5Yv369NAWgXLlyAN5Orbh8+TI2bNgg7S+EgEqlwr1791ClSpVMs8XGxsLMzAwqlQrJyclo2LAhfv75Z8TFxeHp06do0KCB2v4NGjTApUuXALyd+vDll1+iUqVK8PHxwTfffIPmzZt/0veqc+fO6Nu3L5YvXw5DQ0Ns2LAB33//PXR0dKTXeeLECbWRYaVS+cHvGwBUqlQJO3fuRHJyMn777TeEhYVhyJAhavssW7YMq1evxsOHD5GUlITU1FTUrFnzg3kvXbqE27dvw9zcXK2enJyMO3fu5OA7QERFHZtiIiq0TE1NUb58ebXa/fv38c033+CHH37AjBkzYG1tjePHj6N3795ITU3NtLmbPHkyOnXqhN27d2Pv3r0IDAzEpk2b0KZNGyQkJKB///4YOnRohuNKlSqVZTZzc3NcuHABOjo6cHBwgLGxMQAgLi7uo6+rdu3auHfvHvbu3YsDBw6gQ4cO8PLywrZt2z56bFZatmwJIQR2796NunXr4tixY1i4cKG0PSEhAVOmTEHbtm0zHGtkZJTleQ0MDKRrMGvWLHz99deYMmUKpk2bBgDYtGkTRo0ahfnz58PDwwPm5uaYO3cuTp8+/cG8CQkJcHNzU/tj5J38cjMlERUsbIqJqEg5f/48VCoV5s+fL42Cvpu/+iEVK1ZExYoVMWLECHTs2BFr1qxBmzZtULt2bVy/fj1D8/0xOjo6mR5jYWEBR0dHnDhxAp6enlL9xIkTqFevntp+vr6+8PX1xXfffQcfHx9ER0fD2tpa7Xzv5u8qlcoP5jEyMkLbtm2xYcMG3L59G5UqVULt2rWl7bVr10Z4eLjGr/O/Jk6ciKZNm+KHH36QXmf9+vUxcOBAaZ//jvQaGBhkyF+7dm1s3rwZtra2sLCw+KRMREQAb7QjoiKmfPnySEtLw5IlS3D37l2sX78eK1euzHL/pKQkDB48GIcPH8aDBw9w4sQJnD17VpoWMXbsWJw8eRKDBw9GWFgYbt26hR07dmh8o93/Gj16NGbPno3NmzcjPDwc/v7+CAsLw7BhwwAACxYswO+//46bN28iIiICW7duhb29faYfOGJrawtjY2OEhIQgMjISsbGxWT5v586dsXv3bqxevVq6we6dgIAA/Prrr5gyZQquXbuGGzduYNOmTZg4caJGr83DwwPVq1fHzJkzAQAVKlTAuXPnEBoaioiICEyaNAlnz55VO8bFxQWXL19GeHg4oqKikJaWhs6dO8PGxgatWrXCsWPHcO/ePRw+fBhDhw7F48ePNcpERASwKSaiIqZGjRpYsGABZs+ejWrVqmHDhg1qy5n9l66uLl69eoVu3bqhYsWK6NChA1q0aIEpU6YAAKpXr44jR44gIiICjRo1Qq1atRAQEABHR8ccZxw6dCj8/PwwcuRIuLq6IiQkBDt37kSFChUAvJ16MWfOHNSpUwd169bF/fv3sWfPHmnk+3/p6elh8eLF+PHHH+Ho6IhWrVpl+bxNmzaFtbU1wsPD0alTJ7Vt3t7e2LVrF/bt24e6devi888/x8KFC1G6dGmNX9+IESPw888/49GjR+jfvz/atm0LX19fuLu749WrV2qjxgDQt29fVKpUCXXq1EGJEiVw4sQJmJiY4OjRoyhVqhTatm2LKlWqoHfv3khOTubIMRHliEIIIeQOQUREREQkJ44UExEREVGRx6aYiIiIiIo8NsVEREREVOSxKSYiIiKiIo9NMREREREVeWyKiYiIiKjIY1NMRE
REREUem2IiIiIiKvLYFBMRERFRkcemmIiIiIiKPDbFRERERFTk/R8yF8qsJhFsEgAAAABJRU5ErkJggg==\",\n      \"text/plain\": [\n       \"<Figure size 809x500 with 1 Axes>\"\n      ]\n     },\n     \"metadata\": {},\n     \"output_type\": \"display_data\"\n    }\n   ],\n   \"source\": [\n    \"# Step 6: Evaluation\\n\",\n    \"predictions = trainer.predict(small_eval_dataset)\\n\",\n    \"\\n\",\n    \"# Confusion matrix\\n\",\n    \"cm = confusion_matrix(small_eval_dataset['label'], predictions.predictions.argmax(-1))\\n\",\n    \"sns.heatmap(cm, annot=True, fmt='d')\\n\",\n    \"plt.title('Confusion Matrix')\\n\",\n    \"plt.show()\\n\",\n    \"\\n\",\n    \"# ROC Curve\\n\",\n    \"fpr, tpr, _ = roc_curve(small_eval_dataset['label'], predictions.predictions[:, 1])\\n\",\n    \"roc_auc = auc(fpr, tpr)\\n\",\n    \"\\n\",\n    \"plt.figure(figsize=(1.618 * 5, 5))\\n\",\n    \"plt.plot(fpr, tpr, color='darkorange', lw=2, label='ROC curve (area = %0.2f)' % roc_auc)\\n\",\n    \"plt.plot([0, 1], [0, 1], color='navy', lw=2, linestyle='--')\\n\",\n    \"plt.xlim([0.0, 1.0])\\n\",\n    \"plt.ylim([0.0, 1.05])\\n\",\n    \"plt.xlabel('False Positive Rate')\\n\",\n    \"plt.ylabel('True Positive Rate')\\n\",\n    \"plt.title('Receiver operating characteristic')\\n\",\n    \"plt.legend(loc=\\\"lower right\\\")\\n\",\n    \"plt.show()\"\n   ]\n  },\n  {\n   \"cell_type\": \"markdown\",\n   \"id\": \"40df8f20-6218-4576-ad59-eb55be73cd0c\",\n   \"metadata\": {},\n   \"source\": [\n    \"Interpreting these results calls for a basic understanding of evaluation metrics. \\n\",\n    \"\\n\",\n    \"The confusion matrix gives a detailed breakdown of how our predictions measure up to the actual labels, while the ROC curve shows us the trade-off between the true positive rate (sensitivity) and the false positive rate (1 - specificity) at various threshold settings. The area under the ROC curve (AUC-ROC) is a measure of the overall performance of the model, with 1.0 representing perfect classification and 0.5 indicating a model no better than random guessing.\\n\",\n    \"** **\\n\",\n    \"Finally, to see our model in action, let's use it to infer the sentiment of a sample text.\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": 9,\n   \"id\": \"ef6449a2-1295-4c3f-b831-4a61da9ce39c\",\n   \"metadata\": {},\n   \"outputs\": [\n    {\n     \"name\": \"stdout\",\n     \"output_type\": \"stream\",\n     \"text\": [\n      \"Positive sentiment\\n\"\n     ]\n    }\n   ],\n   \"source\": [\n    \"# Step 7: Inference on a new sample\\n\",\n    \"sample_text = \\\"This is a fantastic movie. I really enjoyed it.\\\"\\n\",\n    \"sample_inputs = tokenizer(sample_text, padding=\\\"max_length\\\", truncation=True, max_length=512, return_tensors=\\\"pt\\\")\\n\",\n    \"\\n\",\n    \"# Move inputs to device (if GPU available)\\n\",\n    \"sample_inputs.to(training_args.device)\\n\",\n    \"\\n\",\n    \"# Make prediction\\n\",\n    \"predictions = model(**sample_inputs)\\n\",\n    \"predicted_class = predictions.logits.argmax(-1).item()\\n\",\n    \"\\n\",\n    \"if predicted_class == 1:\\n\",\n    \"    print(\\\"Positive sentiment\\\")\\n\",\n    \"else:\\n\",\n    \"    print(\\\"Negative sentiment\\\")\"\n   ]\n  },\n  {\n   \"cell_type\": \"markdown\",\n   \"id\": \"168ebb28-0be4-43eb-9942-133296ceb56b\",\n   \"metadata\": {},\n   \"source\": [\n    \"** **\\n\",\n    \"#### Closing Thoughts\\n\",\n    \"BERT represents an enormous leap forward in our ability to model and understand natural language. 
It's become a cornerstone for NLP tasks due to its superior handling of context within language. With this tutorial, my goal was to give you a practical introduction to BERT, detailing its basic theory and showing its application in Python using Hugging Face's transformers library.\\n\",\n    \"\\n\",\n    \"By walking through an example of sentiment analysis on IMDb movie reviews, I hope you've gained a clear understanding of how to apply BERT to real-world NLP problems. The Python code I've included here can be adjusted and extended to tackle different tasks and datasets, paving the way for even more sophisticated and accurate language models.\"\n   ]\n  }\n ],\n \"metadata\": {\n  \"kernelspec\": {\n   \"display_name\": \"Python 3 (ipykernel)\",\n   \"language\": \"python\",\n   \"name\": \"python3\"\n  },\n  \"language_info\": {\n   \"codemirror_mode\": {\n    \"name\": \"ipython\",\n    \"version\": 3\n   },\n   \"file_extension\": \".py\",\n   \"mimetype\": \"text/x-python\",\n   \"name\": \"python\",\n   \"nbconvert_exporter\": \"python\",\n   \"pygments_lexer\": \"ipython3\",\n   \"version\": \"3.9.16\"\n  }\n },\n \"nbformat\": 4,\n \"nbformat_minor\": 5\n}\n"
  },
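The evaluation above leans on the confusion matrix and ROC-AUC; per-class precision, recall, and F1 round out the picture. Below is a minimal sketch, assuming the `predictions` object returned by `trainer.predict(small_eval_dataset)` in the evaluation cell above, that scikit-learn and SciPy are available, and that labels 0/1 correspond to negative/positive reviews:

```python
# A minimal sketch, assuming `predictions = trainer.predict(small_eval_dataset)`
# from the evaluation cell above; scikit-learn and SciPy are assumed available.
import numpy as np
from scipy.special import softmax
from sklearn.metrics import classification_report

y_true = np.array(small_eval_dataset["label"])
probs = softmax(predictions.predictions, axis=-1)  # convert logits to class probabilities
y_pred = probs.argmax(-1)                          # predicted class per review

# Per-class precision, recall and F1, plus overall accuracy
# (assumes label 0 = negative, label 1 = positive)
print(classification_report(y_true, y_pred, target_names=["negative", "positive"]))
```

Because `classification_report` works on hard class predictions, it complements the threshold-free view given by the ROC curve rather than replacing it.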
  {
    "path": "recommender/published_notebooks/recommendation_python_lightfm.ipynb",
    "content": "{\n \"cells\": [\n  {\n   \"cell_type\": \"markdown\",\n   \"metadata\": {},\n   \"source\": [\n    \"## Recommendation in Python: LighFM\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": 1,\n   \"metadata\": {},\n   \"outputs\": [\n    {\n     \"name\": \"stderr\",\n     \"output_type\": \"stream\",\n     \"text\": [\n      \"/usr/local/lib/python2.7/site-packages/sklearn/ensemble/weight_boosting.py:29: DeprecationWarning: numpy.core.umath_tests is an internal NumPy module and should not be imported. It will be removed in a future NumPy release.\\n\",\n      \"  from numpy.core.umath_tests import inner1d\\n\"\n     ]\n    }\n   ],\n   \"source\": [\n    \"# import dependent libraries\\n\",\n    \"import pandas as pd\\n\",\n    \"import os\\n\",\n    \"from scipy.sparse import csr_matrix\\n\",\n    \"import numpy as np\\n\",\n    \"from IPython.display import display_html\\n\",\n    \"import warnings\\n\",\n    \"\\n\",\n    \"import matplotlib.pyplot as plt\\n\",\n    \"from matplotlib.gridspec import GridSpec\\n\",\n    \"import seaborn as sns\\n\",\n    \"%matplotlib inline\\n\",\n    \"\\n\",\n    \"from lightfm.cross_validation import random_train_test_split\\n\",\n    \"from lightfm.evaluation import auc_score, precision_at_k, recall_at_k\\n\",\n    \"from lightfm import LightFM\\n\",\n    \"from skopt import forest_minimize\\n\",\n    \"\\n\",\n    \"def display_side_by_side(*args):\\n\",\n    \"    html_str = ''\\n\",\n    \"    for df in args:\\n\",\n    \"        html_str += df.to_html()\\n\",\n    \"    display_html(html_str.replace(\\n\",\n    \"        'table', 'table style=\\\"display:inline\\\"'), raw=True)\\n\",\n    \"\\n\",\n    \"\\n\",\n    \"# update the working directory to the root of the project\\n\",\n    \"os.chdir('..')\\n\",\n    \"warnings.filterwarnings(\\\"ignore\\\")\"\n   ]\n  },\n  {\n   \"cell_type\": \"markdown\",\n   \"metadata\": {},\n   \"source\": [\n    \"** **\\n\",\n    \"### Goodreads Data\\n\",\n    \"\\n\",\n    \"The datasets were collected in late 2017 from goodreads.com, where we only scraped users' public shelves, i.e. everyone can see it on web without login. User IDs and review IDs are anonymized. \\n\",\n    \"\\n\",\n    \"We collected these datasets for academic use only. Please do not redistribute them or use for commercial purposes. \\n\",\n    \"\\n\",\n    \"\\n\",\n    \"There are three groups of datasets: (1) meta-data of the books, (2) user-book interactions (users' public shelves) and (3) users' detailed book reviews. These datasets can be merged together by matching book/user/review ids. For the purposes of this tutorial, we'll be using only the former two.\\n\",\n    \"\\n\",\n    \"You can download the dataset using in this article from here:\\n\",\n    \"1. Books Metadata: https://drive.google.com/uc?id=1H6xUV48D5sa2uSF_BusW-IBJ7PCQZTS1\\n\",\n    \"2. 
User-Book Interactions: https://drive.google.com/uc?id=17G5_MeSWuhYnD4fGJMvKRSOlBqCCimxJ\\n\",\n    \"\\n\",\n    \"#### Load Raw Data\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": 2,\n   \"metadata\": {},\n   \"outputs\": [\n    {\n     \"name\": \"stdout\",\n     \"output_type\": \"stream\",\n     \"text\": [\n      \"CPU times: user 2min 10s, sys: 12.8 s, total: 2min 23s\\n\",\n      \"Wall time: 2min 31s\\n\"\n     ]\n    }\n   ],\n   \"source\": [\n    \"%%time\\n\",\n    \"books_metadata = pd.read_json('./data/goodreads_books_poetry.json', lines=True)\\n\",\n    \"interactions = pd.read_json('./data/goodreads_interactions_poetry.json', lines=True)\"\n   ]\n  },\n  {\n   \"cell_type\": \"markdown\",\n   \"metadata\": {},\n   \"source\": [\n    \"** **\\n\",\n    \"#### Data Inspection & Preparation: Books Metadata\\n\",\n    \"\\n\",\n    \"Let's start by inspecting the books' metadata information. To develop a reliable and robust ML model, it is essential to get a thorough understanding of the available data.\\n\",\n    \"\\n\",\n    \"As the first step, let's take a look at all the available fields, and sample data\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": 3,\n   \"metadata\": {},\n   \"outputs\": [\n    {\n     \"data\": {\n      \"text/plain\": [\n       \"array([u'asin', u'authors', u'average_rating', u'book_id',\\n\",\n       \"       u'country_code', u'description', u'edition_information', u'format',\\n\",\n       \"       u'image_url', u'is_ebook', u'isbn', u'isbn13', u'kindle_asin',\\n\",\n       \"       u'language_code', u'link', u'num_pages', u'popular_shelves',\\n\",\n       \"       u'publication_day', u'publication_month', u'publication_year',\\n\",\n       \"       u'publisher', u'ratings_count', u'series', u'similar_books',\\n\",\n       \"       u'text_reviews_count', u'title', u'title_without_series', u'url',\\n\",\n       \"       u'work_id'], dtype=object)\"\n      ]\n     },\n     \"execution_count\": 3,\n     \"metadata\": {},\n     \"output_type\": \"execute_result\"\n    }\n   ],\n   \"source\": [\n    \"books_metadata.columns.values\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": 4,\n   \"metadata\": {},\n   \"outputs\": [\n    {\n     \"data\": {\n      \"text/html\": [\n       \"<div>\\n\",\n       \"<style scoped>\\n\",\n       \"    .dataframe tbody tr th:only-of-type {\\n\",\n       \"        vertical-align: middle;\\n\",\n       \"    }\\n\",\n       \"\\n\",\n       \"    .dataframe tbody tr th {\\n\",\n       \"        vertical-align: top;\\n\",\n       \"    }\\n\",\n       \"\\n\",\n       \"    .dataframe thead th {\\n\",\n       \"        text-align: right;\\n\",\n       \"    }\\n\",\n       \"</style>\\n\",\n       \"<table border=\\\"1\\\" class=\\\"dataframe\\\">\\n\",\n       \"  <thead>\\n\",\n       \"    <tr style=\\\"text-align: right;\\\">\\n\",\n       \"      <th></th>\\n\",\n       \"      <th>asin</th>\\n\",\n       \"      <th>authors</th>\\n\",\n       \"      <th>average_rating</th>\\n\",\n       \"      <th>book_id</th>\\n\",\n       \"      <th>country_code</th>\\n\",\n       \"      <th>description</th>\\n\",\n       \"      <th>edition_information</th>\\n\",\n       \"      <th>format</th>\\n\",\n       \"      <th>image_url</th>\\n\",\n       \"      <th>is_ebook</th>\\n\",\n       \"      <th>...</th>\\n\",\n       \"      <th>publication_year</th>\\n\",\n       \"      <th>publisher</th>\\n\",\n       \"      <th>ratings_count</th>\\n\",\n    
   \"      <th>series</th>\\n\",\n       \"      <th>similar_books</th>\\n\",\n       \"      <th>text_reviews_count</th>\\n\",\n       \"      <th>title</th>\\n\",\n       \"      <th>title_without_series</th>\\n\",\n       \"      <th>url</th>\\n\",\n       \"      <th>work_id</th>\\n\",\n       \"    </tr>\\n\",\n       \"  </thead>\\n\",\n       \"  <tbody>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>3086</th>\\n\",\n       \"      <td></td>\\n\",\n       \"      <td>[{u'author_id': u'5031312', u'role': u''}, {u'...</td>\\n\",\n       \"      <td>4.06</td>\\n\",\n       \"      <td>3656020</td>\\n\",\n       \"      <td>US</td>\\n\",\n       \"      <td>A landmark of world literature, The Divine Com...</td>\\n\",\n       \"      <td></td>\\n\",\n       \"      <td>Hardcover</td>\\n\",\n       \"      <td>https://images.gr-assets.com/books/1397524730m...</td>\\n\",\n       \"      <td>false</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>2008</td>\\n\",\n       \"      <td>Barnes and Noble</td>\\n\",\n       \"      <td>462</td>\\n\",\n       \"      <td>[444230]</td>\\n\",\n       \"      <td>[38154, 51799, 3311228, 13767037, 138144, 8756...</td>\\n\",\n       \"      <td>56</td>\\n\",\n       \"      <td>The Divine Comedy</td>\\n\",\n       \"      <td>The Divine Comedy</td>\\n\",\n       \"      <td>https://www.goodreads.com/book/show/3656020-th...</td>\\n\",\n       \"      <td>809248</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>18621</th>\\n\",\n       \"      <td></td>\\n\",\n       \"      <td>[{u'author_id': u'152483', u'role': u'Editor'}]</td>\\n\",\n       \"      <td>3.30</td>\\n\",\n       \"      <td>260876</td>\\n\",\n       \"      <td>US</td>\\n\",\n       \"      <td></td>\\n\",\n       \"      <td></td>\\n\",\n       \"      <td>Hardcover</td>\\n\",\n       \"      <td>https://s.gr-assets.com/assets/nophoto/book/11...</td>\\n\",\n       \"      <td>false</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>1973</td>\\n\",\n       \"      <td>OUP Oxford</td>\\n\",\n       \"      <td>33</td>\\n\",\n       \"      <td>[]</td>\\n\",\n       \"      <td>[]</td>\\n\",\n       \"      <td>4</td>\\n\",\n       \"      <td>The Homeric Hymn to Demeter</td>\\n\",\n       \"      <td>The Homeric Hymn to Demeter</td>\\n\",\n       \"      <td>https://www.goodreads.com/book/show/260876.The...</td>\\n\",\n       \"      <td>252847</td>\\n\",\n       \"    </tr>\\n\",\n       \"  </tbody>\\n\",\n       \"</table>\\n\",\n       \"<p>2 rows × 29 columns</p>\\n\",\n       \"</div>\"\n      ],\n      \"text/plain\": [\n       \"      asin                                            authors  average_rating  \\\\\\n\",\n       \"3086        [{u'author_id': u'5031312', u'role': u''}, {u'...            4.06   \\n\",\n       \"18621         [{u'author_id': u'152483', u'role': u'Editor'}]            3.30   \\n\",\n       \"\\n\",\n       \"       book_id country_code  \\\\\\n\",\n       \"3086   3656020           US   \\n\",\n       \"18621   260876           US   \\n\",\n       \"\\n\",\n       \"                                             description edition_information  \\\\\\n\",\n       \"3086   A landmark of world literature, The Divine Com...                       
\\n\",\n       \"18621                                                                          \\n\",\n       \"\\n\",\n       \"          format                                          image_url is_ebook  \\\\\\n\",\n       \"3086   Hardcover  https://images.gr-assets.com/books/1397524730m...    false   \\n\",\n       \"18621  Hardcover  https://s.gr-assets.com/assets/nophoto/book/11...    false   \\n\",\n       \"\\n\",\n       \"       ... publication_year         publisher ratings_count    series  \\\\\\n\",\n       \"3086   ...             2008  Barnes and Noble           462  [444230]   \\n\",\n       \"18621  ...             1973        OUP Oxford            33        []   \\n\",\n       \"\\n\",\n       \"                                           similar_books text_reviews_count  \\\\\\n\",\n       \"3086   [38154, 51799, 3311228, 13767037, 138144, 8756...                 56   \\n\",\n       \"18621                                                 []                  4   \\n\",\n       \"\\n\",\n       \"                             title         title_without_series  \\\\\\n\",\n       \"3086             The Divine Comedy            The Divine Comedy   \\n\",\n       \"18621  The Homeric Hymn to Demeter  The Homeric Hymn to Demeter   \\n\",\n       \"\\n\",\n       \"                                                     url work_id  \\n\",\n       \"3086   https://www.goodreads.com/book/show/3656020-th...  809248  \\n\",\n       \"18621  https://www.goodreads.com/book/show/260876.The...  252847  \\n\",\n       \"\\n\",\n       \"[2 rows x 29 columns]\"\n      ]\n     },\n     \"execution_count\": 4,\n     \"metadata\": {},\n     \"output_type\": \"execute_result\"\n    }\n   ],\n   \"source\": [\n    \"books_metadata.sample(2)\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": 5,\n   \"metadata\": {},\n   \"outputs\": [\n    {\n     \"data\": {\n      \"text/plain\": [\n       \"(36514, 29)\"\n      ]\n     },\n     \"execution_count\": 5,\n     \"metadata\": {},\n     \"output_type\": \"execute_result\"\n    }\n   ],\n   \"source\": [\n    \"books_metadata.shape\"\n   ]\n  },\n  {\n   \"cell_type\": \"markdown\",\n   \"metadata\": {},\n   \"source\": [\n    \"** **\\n\",\n    \"While all the available information is vital to extract contextual information to be able to train a better recommendation system, for this example, we'll only focus on the selected fields that require minimal manipulation.\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": 6,\n   \"metadata\": {},\n   \"outputs\": [\n    {\n     \"data\": {\n      \"text/html\": [\n       \"<div>\\n\",\n       \"<style scoped>\\n\",\n       \"    .dataframe tbody tr th:only-of-type {\\n\",\n       \"        vertical-align: middle;\\n\",\n       \"    }\\n\",\n       \"\\n\",\n       \"    .dataframe tbody tr th {\\n\",\n       \"        vertical-align: top;\\n\",\n       \"    }\\n\",\n       \"\\n\",\n       \"    .dataframe thead th {\\n\",\n       \"        text-align: right;\\n\",\n       \"    }\\n\",\n       \"</style>\\n\",\n       \"<table border=\\\"1\\\" class=\\\"dataframe\\\">\\n\",\n       \"  <thead>\\n\",\n       \"    <tr style=\\\"text-align: right;\\\">\\n\",\n       \"      <th></th>\\n\",\n       \"      <th>book_id</th>\\n\",\n       \"      <th>average_rating</th>\\n\",\n       \"      <th>is_ebook</th>\\n\",\n       \"      <th>num_pages</th>\\n\",\n       \"      <th>publication_year</th>\\n\",\n       \"      <th>ratings_count</th>\\n\",\n       \"      
<th>language_code</th>\\n\",\n       \"    </tr>\\n\",\n       \"  </thead>\\n\",\n       \"  <tbody>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>7326</th>\\n\",\n       \"      <td>333171</td>\\n\",\n       \"      <td>4.02</td>\\n\",\n       \"      <td>false</td>\\n\",\n       \"      <td>176</td>\\n\",\n       \"      <td>1999</td>\\n\",\n       \"      <td>14</td>\\n\",\n       \"      <td></td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>33952</th>\\n\",\n       \"      <td>32940040</td>\\n\",\n       \"      <td>4.00</td>\\n\",\n       \"      <td>false</td>\\n\",\n       \"      <td>110</td>\\n\",\n       \"      <td>2017</td>\\n\",\n       \"      <td>38</td>\\n\",\n       \"      <td></td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>11012</th>\\n\",\n       \"      <td>702390</td>\\n\",\n       \"      <td>4.19</td>\\n\",\n       \"      <td>false</td>\\n\",\n       \"      <td>336</td>\\n\",\n       \"      <td>2002</td>\\n\",\n       \"      <td>53</td>\\n\",\n       \"      <td></td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>31968</th>\\n\",\n       \"      <td>1271413</td>\\n\",\n       \"      <td>4.19</td>\\n\",\n       \"      <td>false</td>\\n\",\n       \"      <td></td>\\n\",\n       \"      <td>1992</td>\\n\",\n       \"      <td>19</td>\\n\",\n       \"      <td>eng</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>31512</th>\\n\",\n       \"      <td>955328</td>\\n\",\n       \"      <td>4.54</td>\\n\",\n       \"      <td>false</td>\\n\",\n       \"      <td>106</td>\\n\",\n       \"      <td>1994</td>\\n\",\n       \"      <td>28</td>\\n\",\n       \"      <td></td>\\n\",\n       \"    </tr>\\n\",\n       \"  </tbody>\\n\",\n       \"</table>\\n\",\n       \"</div>\"\n      ],\n      \"text/plain\": [\n       \"        book_id  average_rating is_ebook num_pages publication_year  \\\\\\n\",\n       \"7326     333171            4.02    false       176             1999   \\n\",\n       \"33952  32940040            4.00    false       110             2017   \\n\",\n       \"11012    702390            4.19    false       336             2002   \\n\",\n       \"31968   1271413            4.19    false                       1992   \\n\",\n       \"31512    955328            4.54    false       106             1994   \\n\",\n       \"\\n\",\n       \"       ratings_count language_code  \\n\",\n       \"7326              14                \\n\",\n       \"33952             38                \\n\",\n       \"11012             53                \\n\",\n       \"31968             19           eng  \\n\",\n       \"31512             28                \"\n      ]\n     },\n     \"execution_count\": 6,\n     \"metadata\": {},\n     \"output_type\": \"execute_result\"\n    }\n   ],\n   \"source\": [\n    \"# Limit the books metadata to selected fields\\n\",\n    \"books_metadata_selected = books_metadata[['book_id', 'average_rating', 'is_ebook', 'num_pages', \\n\",\n    \"                                          'publication_year', 'ratings_count', 'language_code']]\\n\",\n    \"books_metadata_selected.sample(5)\"\n   ]\n  },\n  {\n   \"cell_type\": \"markdown\",\n   \"metadata\": {},\n   \"source\": [\n    \"** **\\n\",\n    \"Now that we have the data with selected fields, next, we'll run it through pandas profiler to perform preliminary exploratory data analysis to help us better understand the available data\"\n   ]\n  },\n  {\n   
\"cell_type\": \"code\",\n   \"execution_count\": 7,\n   \"metadata\": {},\n   \"outputs\": [],\n   \"source\": [\n    \"import pandas_profiling\\n\",\n    \"import numpy as np\\n\",\n    \"\\n\",\n    \"# replace blank cells with NaN\\n\",\n    \"books_metadata_selected.replace('', np.nan, inplace=True)\\n\",\n    \"\\n\",\n    \"# not taking book_id into the profiler report\\n\",\n    \"profile = pandas_profiling.ProfileReport(books_metadata_selected[['average_rating', 'is_ebook', 'num_pages', \\n\",\n    \"                                                                  'publication_year', 'ratings_count']])\\n\",\n    \"profile.to_file('./results/profiler_books_metadata_1.html')\"\n   ]\n  },\n  {\n   \"cell_type\": \"markdown\",\n   \"metadata\": {},\n   \"source\": [\n    \"** **\\n\",\n    \"\\n\",\n    \"Considering the results from the profiler, we'll perform following transformations to the dataset:\\n\",\n    \"- Replace the missing value of categorical values with another value to create a new category\\n\",\n    \"- Convert bin values for numeric variables into discrete intervals\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": 8,\n   \"metadata\": {},\n   \"outputs\": [],\n   \"source\": [\n    \"# using pandas cut method to convert fields into discrete intervals\\n\",\n    \"books_metadata_selected['num_pages'].replace(np.nan, -1, inplace=True)\\n\",\n    \"books_metadata_selected['num_pages'] = pd.to_numeric(books_metadata_selected['num_pages'])\\n\",\n    \"books_metadata_selected['num_pages'] = pd.cut(books_metadata_selected['num_pages'], bins=25)\\n\",\n    \"\\n\",\n    \"# rounding ratings to neares .5 score\\n\",\n    \"books_metadata_selected['average_rating'] = books_metadata_selected['average_rating'].apply(lambda x: round(x*2)/2)\\n\",\n    \"\\n\",\n    \"# using pandas qcut method to convert fields into quantile-based discrete intervals\\n\",\n    \"books_metadata_selected['ratings_count'] = pd.qcut(books_metadata_selected['ratings_count'], 25)\\n\",\n    \"\\n\",\n    \"# replacing missing values to year 2100\\n\",\n    \"books_metadata_selected['publication_year'].replace(np.nan, 2100, inplace=True)\\n\",\n    \"\\n\",\n    \"# replacing missing values to 'unknown'\\n\",\n    \"books_metadata_selected['language_code'].replace(np.nan, 'unknown', inplace=True)\\n\",\n    \"\\n\",\n    \"\\n\",\n    \"# convert is_ebook column into 1/0 where true=1 and false=0\\n\",\n    \"books_metadata_selected['is_ebook'] = books_metadata_selected.is_ebook.map(\\n\",\n    \"    lambda x: 1.0*(x == 'true'))\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": 9,\n   \"metadata\": {},\n   \"outputs\": [],\n   \"source\": [\n    \"profile = pandas_profiling.ProfileReport(books_metadata_selected[['average_rating', 'is_ebook', 'num_pages', \\n\",\n    \"                                                        'publication_year', 'ratings_count']])\\n\",\n    \"profile.to_file('./results/profiler_books_metadata_2.html')\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": 10,\n   \"metadata\": {},\n   \"outputs\": [\n    {\n     \"data\": {\n      \"text/html\": [\n       \"<div>\\n\",\n       \"<style scoped>\\n\",\n       \"    .dataframe tbody tr th:only-of-type {\\n\",\n       \"        vertical-align: middle;\\n\",\n       \"    }\\n\",\n       \"\\n\",\n       \"    .dataframe tbody tr th {\\n\",\n       \"        vertical-align: top;\\n\",\n       \"    }\\n\",\n       \"\\n\",\n       \"    .dataframe thead th 
{\\n\",\n       \"        text-align: right;\\n\",\n       \"    }\\n\",\n       \"</style>\\n\",\n       \"<table border=\\\"1\\\" class=\\\"dataframe\\\">\\n\",\n       \"  <thead>\\n\",\n       \"    <tr style=\\\"text-align: right;\\\">\\n\",\n       \"      <th></th>\\n\",\n       \"      <th>book_id</th>\\n\",\n       \"      <th>average_rating</th>\\n\",\n       \"      <th>is_ebook</th>\\n\",\n       \"      <th>num_pages</th>\\n\",\n       \"      <th>publication_year</th>\\n\",\n       \"      <th>ratings_count</th>\\n\",\n       \"      <th>language_code</th>\\n\",\n       \"    </tr>\\n\",\n       \"  </thead>\\n\",\n       \"  <tbody>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>36017</th>\\n\",\n       \"      <td>25517162</td>\\n\",\n       \"      <td>3.5</td>\\n\",\n       \"      <td>0.0</td>\\n\",\n       \"      <td>(-11.961, 437.44]</td>\\n\",\n       \"      <td>2100</td>\\n\",\n       \"      <td>(12.0, 14.0]</td>\\n\",\n       \"      <td>ara</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>18331</th>\\n\",\n       \"      <td>7135851</td>\\n\",\n       \"      <td>3.5</td>\\n\",\n       \"      <td>0.0</td>\\n\",\n       \"      <td>(-11.961, 437.44]</td>\\n\",\n       \"      <td>2009</td>\\n\",\n       \"      <td>(21.0, 25.0]</td>\\n\",\n       \"      <td>unknown</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>12530</th>\\n\",\n       \"      <td>25606981</td>\\n\",\n       \"      <td>3.0</td>\\n\",\n       \"      <td>0.0</td>\\n\",\n       \"      <td>(-11.961, 437.44]</td>\\n\",\n       \"      <td>2014</td>\\n\",\n       \"      <td>(14.0, 16.0]</td>\\n\",\n       \"      <td>unknown</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>30351</th>\\n\",\n       \"      <td>28604835</td>\\n\",\n       \"      <td>4.0</td>\\n\",\n       \"      <td>0.0</td>\\n\",\n       \"      <td>(-11.961, 437.44]</td>\\n\",\n       \"      <td>2100</td>\\n\",\n       \"      <td>(10.0, 12.0]</td>\\n\",\n       \"      <td>eng</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>14159</th>\\n\",\n       \"      <td>11247623</td>\\n\",\n       \"      <td>2.5</td>\\n\",\n       \"      <td>0.0</td>\\n\",\n       \"      <td>(-11.961, 437.44]</td>\\n\",\n       \"      <td>2008</td>\\n\",\n       \"      <td>(7.0, 8.0]</td>\\n\",\n       \"      <td>ara</td>\\n\",\n       \"    </tr>\\n\",\n       \"  </tbody>\\n\",\n       \"</table>\\n\",\n       \"</div>\"\n      ],\n      \"text/plain\": [\n       \"        book_id  average_rating  is_ebook          num_pages publication_year  \\\\\\n\",\n       \"36017  25517162             3.5       0.0  (-11.961, 437.44]             2100   \\n\",\n       \"18331   7135851             3.5       0.0  (-11.961, 437.44]             2009   \\n\",\n       \"12530  25606981             3.0       0.0  (-11.961, 437.44]             2014   \\n\",\n       \"30351  28604835             4.0       0.0  (-11.961, 437.44]             2100   \\n\",\n       \"14159  11247623             2.5       0.0  (-11.961, 437.44]             2008   \\n\",\n       \"\\n\",\n       \"      ratings_count language_code  \\n\",\n       \"36017  (12.0, 14.0]           ara  \\n\",\n       \"18331  (21.0, 25.0]       unknown  \\n\",\n       \"12530  (14.0, 16.0]       unknown  \\n\",\n       \"30351  (10.0, 12.0]           eng  \\n\",\n       \"14159    (7.0, 8.0]           ara  \"\n      ]\n     },\n     \"execution_count\": 10,\n     
\"metadata\": {},\n     \"output_type\": \"execute_result\"\n    }\n   ],\n   \"source\": [\n    \"books_metadata_selected.sample(5)\"\n   ]\n  },\n  {\n   \"cell_type\": \"markdown\",\n   \"metadata\": {},\n   \"source\": [\n    \"** **\\n\",\n    \"#### Data Inspection & Preparation: Interactions data\\n\",\n    \"\\n\",\n    \"As the first step, let's take a look at all the available fields, and sample data\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": 11,\n   \"metadata\": {},\n   \"outputs\": [\n    {\n     \"data\": {\n      \"text/plain\": [\n       \"array([u'book_id', u'date_added', u'date_updated', u'is_read', u'rating',\\n\",\n       \"       u'read_at', u'review_id', u'review_text_incomplete', u'started_at',\\n\",\n       \"       u'user_id'], dtype=object)\"\n      ]\n     },\n     \"execution_count\": 11,\n     \"metadata\": {},\n     \"output_type\": \"execute_result\"\n    }\n   ],\n   \"source\": [\n    \"interactions.columns.values\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": 12,\n   \"metadata\": {},\n   \"outputs\": [\n    {\n     \"data\": {\n      \"text/html\": [\n       \"<div>\\n\",\n       \"<style scoped>\\n\",\n       \"    .dataframe tbody tr th:only-of-type {\\n\",\n       \"        vertical-align: middle;\\n\",\n       \"    }\\n\",\n       \"\\n\",\n       \"    .dataframe tbody tr th {\\n\",\n       \"        vertical-align: top;\\n\",\n       \"    }\\n\",\n       \"\\n\",\n       \"    .dataframe thead th {\\n\",\n       \"        text-align: right;\\n\",\n       \"    }\\n\",\n       \"</style>\\n\",\n       \"<table border=\\\"1\\\" class=\\\"dataframe\\\">\\n\",\n       \"  <thead>\\n\",\n       \"    <tr style=\\\"text-align: right;\\\">\\n\",\n       \"      <th></th>\\n\",\n       \"      <th>book_id</th>\\n\",\n       \"      <th>date_added</th>\\n\",\n       \"      <th>date_updated</th>\\n\",\n       \"      <th>is_read</th>\\n\",\n       \"      <th>rating</th>\\n\",\n       \"      <th>read_at</th>\\n\",\n       \"      <th>review_id</th>\\n\",\n       \"      <th>review_text_incomplete</th>\\n\",\n       \"      <th>started_at</th>\\n\",\n       \"      <th>user_id</th>\\n\",\n       \"    </tr>\\n\",\n       \"  </thead>\\n\",\n       \"  <tbody>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>1347680</th>\\n\",\n       \"      <td>439414</td>\\n\",\n       \"      <td>Mon Feb 21 15:54:21 -0800 2011</td>\\n\",\n       \"      <td>Mon Mar 07 21:57:46 -0800 2011</td>\\n\",\n       \"      <td>True</td>\\n\",\n       \"      <td>4</td>\\n\",\n       \"      <td></td>\\n\",\n       \"      <td>d5822adad9c6a2489c92d0b225c1798f</td>\\n\",\n       \"      <td></td>\\n\",\n       \"      <td></td>\\n\",\n       \"      <td>170f95b0339fb6056300f78e5d92d288</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>2444877</th>\\n\",\n       \"      <td>175626</td>\\n\",\n       \"      <td>Mon Feb 03 17:21:39 -0800 2014</td>\\n\",\n       \"      <td>Mon Feb 03 17:21:39 -0800 2014</td>\\n\",\n       \"      <td>False</td>\\n\",\n       \"      <td>0</td>\\n\",\n       \"      <td></td>\\n\",\n       \"      <td>b2d7fc05f7ec173e92899436f26c42f7</td>\\n\",\n       \"      <td></td>\\n\",\n       \"      <td></td>\\n\",\n       \"      <td>b4043087709ce387d8293b1fd5a68af1</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>864186</th>\\n\",\n       \"      <td>158008</td>\\n\",\n       \"      <td>Tue Jun 23 00:14:03 -0700 2009</td>\\n\",\n      
 \"      <td>Tue Jun 23 00:14:03 -0700 2009</td>\\n\",\n       \"      <td>True</td>\\n\",\n       \"      <td>4</td>\\n\",\n       \"      <td></td>\\n\",\n       \"      <td>46696d8f43e97004be17e95e8ea4e010</td>\\n\",\n       \"      <td></td>\\n\",\n       \"      <td></td>\\n\",\n       \"      <td>b656a86781f91808753cd64f0d26aae5</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>2693971</th>\\n\",\n       \"      <td>238389</td>\\n\",\n       \"      <td>Sat Sep 02 03:44:52 -0700 2017</td>\\n\",\n       \"      <td>Sat Sep 02 03:44:53 -0700 2017</td>\\n\",\n       \"      <td>False</td>\\n\",\n       \"      <td>0</td>\\n\",\n       \"      <td></td>\\n\",\n       \"      <td>9e5f913fb46aebd974017937616fe33f</td>\\n\",\n       \"      <td></td>\\n\",\n       \"      <td></td>\\n\",\n       \"      <td>1dd9334eb5fd263a14da7bebcee9163b</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>1296876</th>\\n\",\n       \"      <td>820905</td>\\n\",\n       \"      <td>Fri Oct 09 05:41:44 -0700 2015</td>\\n\",\n       \"      <td>Mon Nov 14 15:58:09 -0800 2016</td>\\n\",\n       \"      <td>True</td>\\n\",\n       \"      <td>5</td>\\n\",\n       \"      <td>Sun Nov 01 00:00:00 -0700 2015</td>\\n\",\n       \"      <td>4e226da1b805157de0b8bb7248ce6e17</td>\\n\",\n       \"      <td>Fabulous!</td>\\n\",\n       \"      <td>Fri Oct 09 00:00:00 -0700 2015</td>\\n\",\n       \"      <td>f17efdd7949a3d13324a7d12c1c762ce</td>\\n\",\n       \"    </tr>\\n\",\n       \"  </tbody>\\n\",\n       \"</table>\\n\",\n       \"</div>\"\n      ],\n      \"text/plain\": [\n       \"         book_id                      date_added  \\\\\\n\",\n       \"1347680   439414  Mon Feb 21 15:54:21 -0800 2011   \\n\",\n       \"2444877   175626  Mon Feb 03 17:21:39 -0800 2014   \\n\",\n       \"864186    158008  Tue Jun 23 00:14:03 -0700 2009   \\n\",\n       \"2693971   238389  Sat Sep 02 03:44:52 -0700 2017   \\n\",\n       \"1296876   820905  Fri Oct 09 05:41:44 -0700 2015   \\n\",\n       \"\\n\",\n       \"                           date_updated  is_read  rating  \\\\\\n\",\n       \"1347680  Mon Mar 07 21:57:46 -0800 2011     True       4   \\n\",\n       \"2444877  Mon Feb 03 17:21:39 -0800 2014    False       0   \\n\",\n       \"864186   Tue Jun 23 00:14:03 -0700 2009     True       4   \\n\",\n       \"2693971  Sat Sep 02 03:44:53 -0700 2017    False       0   \\n\",\n       \"1296876  Mon Nov 14 15:58:09 -0800 2016     True       5   \\n\",\n       \"\\n\",\n       \"                                read_at                         review_id  \\\\\\n\",\n       \"1347680                                  d5822adad9c6a2489c92d0b225c1798f   \\n\",\n       \"2444877                                  b2d7fc05f7ec173e92899436f26c42f7   \\n\",\n       \"864186                                   46696d8f43e97004be17e95e8ea4e010   \\n\",\n       \"2693971                                  9e5f913fb46aebd974017937616fe33f   \\n\",\n       \"1296876  Sun Nov 01 00:00:00 -0700 2015  4e226da1b805157de0b8bb7248ce6e17   \\n\",\n       \"\\n\",\n       \"        review_text_incomplete                      started_at  \\\\\\n\",\n       \"1347680                                                          \\n\",\n       \"2444877                                                          \\n\",\n       \"864186                                                           \\n\",\n       \"2693971                                                          \\n\",\n       \"1296876   
           Fabulous!  Fri Oct 09 00:00:00 -0700 2015   \\n\",\n       \"\\n\",\n       \"                                  user_id  \\n\",\n       \"1347680  170f95b0339fb6056300f78e5d92d288  \\n\",\n       \"2444877  b4043087709ce387d8293b1fd5a68af1  \\n\",\n       \"864186   b656a86781f91808753cd64f0d26aae5  \\n\",\n       \"2693971  1dd9334eb5fd263a14da7bebcee9163b  \\n\",\n       \"1296876  f17efdd7949a3d13324a7d12c1c762ce  \"\n      ]\n     },\n     \"execution_count\": 12,\n     \"metadata\": {},\n     \"output_type\": \"execute_result\"\n    }\n   ],\n   \"source\": [\n    \"interactions.sample(5)\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": 13,\n   \"metadata\": {},\n   \"outputs\": [\n    {\n     \"data\": {\n      \"text/plain\": [\n       \"(2734350, 10)\"\n      ]\n     },\n     \"execution_count\": 13,\n     \"metadata\": {},\n     \"output_type\": \"execute_result\"\n    }\n   ],\n   \"source\": [\n    \"interactions.shape\"\n   ]\n  },\n  {\n   \"cell_type\": \"markdown\",\n   \"metadata\": {},\n   \"source\": [\n    \"** **\\n\",\n    \"While all the available information is vital to extract contextual information to be able to train a better recommendation system, for this example, we'll only focus on the selected fields that require minimal manipulation.\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": 14,\n   \"metadata\": {},\n   \"outputs\": [\n    {\n     \"data\": {\n      \"text/html\": [\n       \"<div>\\n\",\n       \"<style scoped>\\n\",\n       \"    .dataframe tbody tr th:only-of-type {\\n\",\n       \"        vertical-align: middle;\\n\",\n       \"    }\\n\",\n       \"\\n\",\n       \"    .dataframe tbody tr th {\\n\",\n       \"        vertical-align: top;\\n\",\n       \"    }\\n\",\n       \"\\n\",\n       \"    .dataframe thead th {\\n\",\n       \"        text-align: right;\\n\",\n       \"    }\\n\",\n       \"</style>\\n\",\n       \"<table border=\\\"1\\\" class=\\\"dataframe\\\">\\n\",\n       \"  <thead>\\n\",\n       \"    <tr style=\\\"text-align: right;\\\">\\n\",\n       \"      <th></th>\\n\",\n       \"      <th>user_id</th>\\n\",\n       \"      <th>book_id</th>\\n\",\n       \"      <th>is_read</th>\\n\",\n       \"      <th>rating</th>\\n\",\n       \"    </tr>\\n\",\n       \"  </thead>\\n\",\n       \"  <tbody>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>1953585</th>\\n\",\n       \"      <td>00fa285b98be274a5795117c7eacbefb</td>\\n\",\n       \"      <td>159304</td>\\n\",\n       \"      <td>false</td>\\n\",\n       \"      <td>0</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>1470153</th>\\n\",\n       \"      <td>c8b35bdfe636ecfad71ad94073ac8d09</td>\\n\",\n       \"      <td>6017893</td>\\n\",\n       \"      <td>false</td>\\n\",\n       \"      <td>0</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>1600133</th>\\n\",\n       \"      <td>c525c7b37653b3323858a08b277104e3</td>\\n\",\n       \"      <td>31602</td>\\n\",\n       \"      <td>false</td>\\n\",\n       \"      <td>0</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>850988</th>\\n\",\n       \"      <td>fc7899ccbdff5743c974ba04b2aa0f41</td>\\n\",\n       \"      <td>461938</td>\\n\",\n       \"      <td>true</td>\\n\",\n       \"      <td>5</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>479841</th>\\n\",\n       \"      <td>d6fdb107be00c81e76c42f5a678283e7</td>\\n\",\n       \"      
<td>27418</td>\\n\",\n       \"      <td>true</td>\\n\",\n       \"      <td>3</td>\\n\",\n       \"    </tr>\\n\",\n       \"  </tbody>\\n\",\n       \"</table>\\n\",\n       \"</div>\"\n      ],\n      \"text/plain\": [\n       \"                                  user_id  book_id is_read  rating\\n\",\n       \"1953585  00fa285b98be274a5795117c7eacbefb   159304   false       0\\n\",\n       \"1470153  c8b35bdfe636ecfad71ad94073ac8d09  6017893   false       0\\n\",\n       \"1600133  c525c7b37653b3323858a08b277104e3    31602   false       0\\n\",\n       \"850988   fc7899ccbdff5743c974ba04b2aa0f41   461938    true       5\\n\",\n       \"479841   d6fdb107be00c81e76c42f5a678283e7    27418    true       3\"\n      ]\n     },\n     \"execution_count\": 14,\n     \"metadata\": {},\n     \"output_type\": \"execute_result\"\n    }\n   ],\n   \"source\": [\n    \"# Limit the books metadata to selected fields\\n\",\n    \"interactions_selected = interactions[['user_id', 'book_id', 'is_read', 'rating']]\\n\",\n    \"\\n\",\n    \"# mapping boolean to string\\n\",\n    \"booleanDictionary = {True: 'true', False: 'false'}\\n\",\n    \"interactions_selected['is_read'] = interactions_selected['is_read'].replace(booleanDictionary)\\n\",\n    \"\\n\",\n    \"interactions_selected.sample(5)\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": 15,\n   \"metadata\": {},\n   \"outputs\": [],\n   \"source\": [\n    \"profile = pandas_profiling.ProfileReport(interactions_selected[['is_read', 'rating']])\\n\",\n    \"profile.to_file('./results/profiler_interactions.html')\"\n   ]\n  },\n  {\n   \"cell_type\": \"markdown\",\n   \"metadata\": {},\n   \"source\": [\n    \"** **\\n\",\n    \"\\n\",\n    \"Considering the results from the profiler, we'll perform following transformations to the dataset:\\n\",\n    \"- Convert is_read column to 1/0\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": 16,\n   \"metadata\": {},\n   \"outputs\": [],\n   \"source\": [\n    \"# convert is_read column into 1/0 where true=1 and false=0\\n\",\n    \"interactions_selected['is_read'] = interactions_selected.is_read.map(\\n\",\n    \"    lambda x: 1.0*(x == 'true'))\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": 17,\n   \"metadata\": {},\n   \"outputs\": [\n    {\n     \"data\": {\n      \"text/html\": [\n       \"<div>\\n\",\n       \"<style scoped>\\n\",\n       \"    .dataframe tbody tr th:only-of-type {\\n\",\n       \"        vertical-align: middle;\\n\",\n       \"    }\\n\",\n       \"\\n\",\n       \"    .dataframe tbody tr th {\\n\",\n       \"        vertical-align: top;\\n\",\n       \"    }\\n\",\n       \"\\n\",\n       \"    .dataframe thead th {\\n\",\n       \"        text-align: right;\\n\",\n       \"    }\\n\",\n       \"</style>\\n\",\n       \"<table border=\\\"1\\\" class=\\\"dataframe\\\">\\n\",\n       \"  <thead>\\n\",\n       \"    <tr style=\\\"text-align: right;\\\">\\n\",\n       \"      <th></th>\\n\",\n       \"      <th>user_id</th>\\n\",\n       \"      <th>book_id</th>\\n\",\n       \"      <th>is_read</th>\\n\",\n       \"      <th>rating</th>\\n\",\n       \"    </tr>\\n\",\n       \"  </thead>\\n\",\n       \"  <tbody>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>1565125</th>\\n\",\n       \"      <td>1f9f847ce20c58c12ac7f1e815df5d7f</td>\\n\",\n       \"      <td>22267492</td>\\n\",\n       \"      <td>0.0</td>\\n\",\n       \"      <td>0</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      
<th>38635</th>\\n\",\n       \"      <td>9cd8fb7c611544b2e09ef5226ce8dbcb</td>\\n\",\n       \"      <td>24874353</td>\\n\",\n       \"      <td>1.0</td>\\n\",\n       \"      <td>3</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>847749</th>\\n\",\n       \"      <td>f2bac05b3932fe7c68960041744e5058</td>\\n\",\n       \"      <td>310336</td>\\n\",\n       \"      <td>0.0</td>\\n\",\n       \"      <td>0</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>1046835</th>\\n\",\n       \"      <td>700a4402bc8f09166fc98bc48bc0c525</td>\\n\",\n       \"      <td>11347806</td>\\n\",\n       \"      <td>0.0</td>\\n\",\n       \"      <td>0</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>486934</th>\\n\",\n       \"      <td>adfc7584a1fef507364c722ad6e3c106</td>\\n\",\n       \"      <td>12966360</td>\\n\",\n       \"      <td>0.0</td>\\n\",\n       \"      <td>0</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>860513</th>\\n\",\n       \"      <td>5a1c12910a59122c8653307c06cf130f</td>\\n\",\n       \"      <td>27494</td>\\n\",\n       \"      <td>1.0</td>\\n\",\n       \"      <td>5</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>656911</th>\\n\",\n       \"      <td>787f452029e97df574375ace96c5a782</td>\\n\",\n       \"      <td>119234</td>\\n\",\n       \"      <td>1.0</td>\\n\",\n       \"      <td>3</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>1800541</th>\\n\",\n       \"      <td>ec728fff5c2c096888e400118c0bc1b0</td>\\n\",\n       \"      <td>76542</td>\\n\",\n       \"      <td>0.0</td>\\n\",\n       \"      <td>0</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>498215</th>\\n\",\n       \"      <td>c8ff09eaf35e3d6b519cd5594139224c</td>\\n\",\n       \"      <td>69547</td>\\n\",\n       \"      <td>0.0</td>\\n\",\n       \"      <td>0</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>12693</th>\\n\",\n       \"      <td>38c2feb6b72d473f1b710516e018244e</td>\\n\",\n       \"      <td>493428</td>\\n\",\n       \"      <td>0.0</td>\\n\",\n       \"      <td>0</td>\\n\",\n       \"    </tr>\\n\",\n       \"  </tbody>\\n\",\n       \"</table>\\n\",\n       \"</div>\"\n      ],\n      \"text/plain\": [\n       \"                                  user_id   book_id  is_read  rating\\n\",\n       \"1565125  1f9f847ce20c58c12ac7f1e815df5d7f  22267492      0.0       0\\n\",\n       \"38635    9cd8fb7c611544b2e09ef5226ce8dbcb  24874353      1.0       3\\n\",\n       \"847749   f2bac05b3932fe7c68960041744e5058    310336      0.0       0\\n\",\n       \"1046835  700a4402bc8f09166fc98bc48bc0c525  11347806      0.0       0\\n\",\n       \"486934   adfc7584a1fef507364c722ad6e3c106  12966360      0.0       0\\n\",\n       \"860513   5a1c12910a59122c8653307c06cf130f     27494      1.0       5\\n\",\n       \"656911   787f452029e97df574375ace96c5a782    119234      1.0       3\\n\",\n       \"1800541  ec728fff5c2c096888e400118c0bc1b0     76542      0.0       0\\n\",\n       \"498215   c8ff09eaf35e3d6b519cd5594139224c     69547      0.0       0\\n\",\n       \"12693    38c2feb6b72d473f1b710516e018244e    493428      0.0       0\"\n      ]\n     },\n     \"execution_count\": 17,\n     \"metadata\": {},\n     \"output_type\": \"execute_result\"\n    }\n   ],\n   \"source\": [\n    \"interactions_selected.sample(10)\"\n   ]\n  },\n  {\n   \"cell_type\": 
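A side note on the `is_read` handling shown above: the raw column holds Python booleans, which were first mapped to the strings 'true'/'false' (presumably to match the string-typed `is_ebook` field in the books metadata) and then converted to 1.0/0.0 with a lambda. A minimal sketch on a toy series (not the actual data) showing that a direct float cast gives the same result in one step:

```python
import pandas as pd

is_read = pd.Series([True, False, True])   # toy stand-in for the raw boolean column

# the two-step route used here: booleans -> 'true'/'false' strings -> 1.0/0.0
two_step = is_read.replace({True: 'true', False: 'false'}).map(lambda x: 1.0 * (x == 'true'))

# an equivalent one-step cast
one_step = is_read.astype(float)

print(two_step.equals(one_step))   # True
```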
\"markdown\",\n   \"metadata\": {},\n   \"source\": [\n    \"** **\\n\",\n    \"Since we have two fields denoting interaction between a user and a book, `is_read` and `rating` - let's see how many data points we have where the user hasn't read the book but have given the ratings.\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": 18,\n   \"metadata\": {},\n   \"outputs\": [\n    {\n     \"data\": {\n      \"text/html\": [\n       \"<div>\\n\",\n       \"<style scoped>\\n\",\n       \"    .dataframe tbody tr th:only-of-type {\\n\",\n       \"        vertical-align: middle;\\n\",\n       \"    }\\n\",\n       \"\\n\",\n       \"    .dataframe tbody tr th {\\n\",\n       \"        vertical-align: top;\\n\",\n       \"    }\\n\",\n       \"\\n\",\n       \"    .dataframe thead th {\\n\",\n       \"        text-align: right;\\n\",\n       \"    }\\n\",\n       \"</style>\\n\",\n       \"<table border=\\\"1\\\" class=\\\"dataframe\\\">\\n\",\n       \"  <thead>\\n\",\n       \"    <tr style=\\\"text-align: right;\\\">\\n\",\n       \"      <th>rating</th>\\n\",\n       \"      <th>0</th>\\n\",\n       \"      <th>1</th>\\n\",\n       \"      <th>2</th>\\n\",\n       \"      <th>3</th>\\n\",\n       \"      <th>4</th>\\n\",\n       \"      <th>5</th>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>is_read</th>\\n\",\n       \"      <th></th>\\n\",\n       \"      <th></th>\\n\",\n       \"      <th></th>\\n\",\n       \"      <th></th>\\n\",\n       \"      <th></th>\\n\",\n       \"      <th></th>\\n\",\n       \"    </tr>\\n\",\n       \"  </thead>\\n\",\n       \"  <tbody>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>0.0</th>\\n\",\n       \"      <td>1420740.0</td>\\n\",\n       \"      <td>NaN</td>\\n\",\n       \"      <td>NaN</td>\\n\",\n       \"      <td>NaN</td>\\n\",\n       \"      <td>NaN</td>\\n\",\n       \"      <td>NaN</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>1.0</th>\\n\",\n       \"      <td>84551.0</td>\\n\",\n       \"      <td>20497.0</td>\\n\",\n       \"      <td>64084.0</td>\\n\",\n       \"      <td>237942.0</td>\\n\",\n       \"      <td>405565.0</td>\\n\",\n       \"      <td>500971.0</td>\\n\",\n       \"    </tr>\\n\",\n       \"  </tbody>\\n\",\n       \"</table>\\n\",\n       \"</div>\"\n      ],\n      \"text/plain\": [\n       \"rating           0        1        2         3         4         5\\n\",\n       \"is_read                                                           \\n\",\n       \"0.0      1420740.0      NaN      NaN       NaN       NaN       NaN\\n\",\n       \"1.0        84551.0  20497.0  64084.0  237942.0  405565.0  500971.0\"\n      ]\n     },\n     \"execution_count\": 18,\n     \"metadata\": {},\n     \"output_type\": \"execute_result\"\n    }\n   ],\n   \"source\": [\n    \"interactions_selected.groupby(['rating', 'is_read']).size().reset_index().pivot(columns='rating', index='is_read', values=0)\"\n   ]\n  },\n  {\n   \"cell_type\": \"markdown\",\n   \"metadata\": {},\n   \"source\": [\n    \"From the above results, we can conclusively infer that users with ratings >= 1 have all read the book. 
Therefore, we'll use the `ratings` as the final score, drop interactions where `is_read` is false, and limit interactions from random 500 users to limit the data size for further analysis \"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": 19,\n   \"metadata\": {},\n   \"outputs\": [\n    {\n     \"data\": {\n      \"text/html\": [\n       \"<div>\\n\",\n       \"<style scoped>\\n\",\n       \"    .dataframe tbody tr th:only-of-type {\\n\",\n       \"        vertical-align: middle;\\n\",\n       \"    }\\n\",\n       \"\\n\",\n       \"    .dataframe tbody tr th {\\n\",\n       \"        vertical-align: top;\\n\",\n       \"    }\\n\",\n       \"\\n\",\n       \"    .dataframe thead th {\\n\",\n       \"        text-align: right;\\n\",\n       \"    }\\n\",\n       \"</style>\\n\",\n       \"<table border=\\\"1\\\" class=\\\"dataframe\\\">\\n\",\n       \"  <thead>\\n\",\n       \"    <tr style=\\\"text-align: right;\\\">\\n\",\n       \"      <th></th>\\n\",\n       \"      <th>user_id</th>\\n\",\n       \"      <th>book_id</th>\\n\",\n       \"      <th>rating</th>\\n\",\n       \"    </tr>\\n\",\n       \"  </thead>\\n\",\n       \"  <tbody>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>368671</th>\\n\",\n       \"      <td>08184d08ae08d26bacd5c00230141ce5</td>\\n\",\n       \"      <td>6130588</td>\\n\",\n       \"      <td>5</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>1755032</th>\\n\",\n       \"      <td>cae938e69bbc27a393153aed6861b9fe</td>\\n\",\n       \"      <td>8744427</td>\\n\",\n       \"      <td>5</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>2675075</th>\\n\",\n       \"      <td>26ef759d338ff493bdcd501fb05127db</td>\\n\",\n       \"      <td>1420</td>\\n\",\n       \"      <td>5</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>2613547</th>\\n\",\n       \"      <td>44af3a5167e24e9fe4761ef20487e675</td>\\n\",\n       \"      <td>662635</td>\\n\",\n       \"      <td>5</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>794869</th>\\n\",\n       \"      <td>e471985fd3f272455e048aa7f7f05b73</td>\\n\",\n       \"      <td>1420</td>\\n\",\n       \"      <td>5</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>341820</th>\\n\",\n       \"      <td>d5708c6364fa01c5c0c2a31427c10664</td>\\n\",\n       \"      <td>6555075</td>\\n\",\n       \"      <td>5</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>1160855</th>\\n\",\n       \"      <td>b8ee26f9150c1e6d46a900c423b44d7b</td>\\n\",\n       \"      <td>133619</td>\\n\",\n       \"      <td>5</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>1015659</th>\\n\",\n       \"      <td>4e7d97a0afddd934f01e7d7c22eb3ef1</td>\\n\",\n       \"      <td>240258</td>\\n\",\n       \"      <td>4</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>917119</th>\\n\",\n       \"      <td>c6a203dbca8acc76a49bc68808ccce33</td>\\n\",\n       \"      <td>11030407</td>\\n\",\n       \"      <td>0</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>1024204</th>\\n\",\n       \"      <td>cc8abbdd380a5dc5c0d068f99b4eab1c</td>\\n\",\n       \"      <td>160959</td>\\n\",\n       \"      <td>0</td>\\n\",\n       \"    </tr>\\n\",\n       \"  </tbody>\\n\",\n       \"</table>\\n\",\n       \"</div>\"\n      ],\n      \"text/plain\": [\n       \" 
                                 user_id   book_id  rating\\n\",\n       \"368671   08184d08ae08d26bacd5c00230141ce5   6130588       5\\n\",\n       \"1755032  cae938e69bbc27a393153aed6861b9fe   8744427       5\\n\",\n       \"2675075  26ef759d338ff493bdcd501fb05127db      1420       5\\n\",\n       \"2613547  44af3a5167e24e9fe4761ef20487e675    662635       5\\n\",\n       \"794869   e471985fd3f272455e048aa7f7f05b73      1420       5\\n\",\n       \"341820   d5708c6364fa01c5c0c2a31427c10664   6555075       5\\n\",\n       \"1160855  b8ee26f9150c1e6d46a900c423b44d7b    133619       5\\n\",\n       \"1015659  4e7d97a0afddd934f01e7d7c22eb3ef1    240258       4\\n\",\n       \"917119   c6a203dbca8acc76a49bc68808ccce33  11030407       0\\n\",\n       \"1024204  cc8abbdd380a5dc5c0d068f99b4eab1c    160959       0\"\n      ]\n     },\n     \"execution_count\": 19,\n     \"metadata\": {},\n     \"output_type\": \"execute_result\"\n    }\n   ],\n   \"source\": [\n    \"import random\\n\",\n    \"\\n\",\n    \"interactions_selected = interactions_selected.loc[interactions_selected['is_read']==1, ['user_id', 'book_id', 'rating']]\\n\",\n    \"\\n\",\n    \"interactions_selected = interactions_selected[interactions_selected['user_id'].isin(random.sample(list(interactions_selected['user_id'].unique()), \\n\",\n    \"                                                                                                  k=5000))]\\n\",\n    \"\\n\",\n    \"interactions_selected.sample(10)\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": 20,\n   \"metadata\": {},\n   \"outputs\": [\n    {\n     \"data\": {\n      \"text/plain\": [\n       \"(23989, 3)\"\n      ]\n     },\n     \"execution_count\": 20,\n     \"metadata\": {},\n     \"output_type\": \"execute_result\"\n    }\n   ],\n   \"source\": [\n    \"interactions_selected.shape\"\n   ]\n  },\n  {\n   \"cell_type\": \"markdown\",\n   \"metadata\": {},\n   \"source\": [\n    \"** **\\n\",\n    \"#### Data Preprocessing\\n\",\n    \"\\n\",\n    \"Now, let's transform the available data into CSR sparse matrix that can be used for matrix operations. We will start by the process by creating books_metadata matrix which is np.float64 csr_matrix of shape ([n_books, n_books_features]) – Each row contains that book's weights over features. 
However, before we create a sparse matrix, we'll first create a item dictionar for future references\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": 21,\n   \"metadata\": {},\n   \"outputs\": [],\n   \"source\": [\n    \"item_dict ={}\\n\",\n    \"df = books_metadata[['book_id', 'title']].sort_values('book_id').reset_index()\\n\",\n    \"\\n\",\n    \"for i in range(df.shape[0]):\\n\",\n    \"    item_dict[(df.loc[i,'book_id'])] = df.loc[i,'title']\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": 22,\n   \"metadata\": {},\n   \"outputs\": [\n    {\n     \"data\": {\n      \"text/html\": [\n       \"<div>\\n\",\n       \"<style scoped>\\n\",\n       \"    .dataframe tbody tr th:only-of-type {\\n\",\n       \"        vertical-align: middle;\\n\",\n       \"    }\\n\",\n       \"\\n\",\n       \"    .dataframe tbody tr th {\\n\",\n       \"        vertical-align: top;\\n\",\n       \"    }\\n\",\n       \"\\n\",\n       \"    .dataframe thead th {\\n\",\n       \"        text-align: right;\\n\",\n       \"    }\\n\",\n       \"</style>\\n\",\n       \"<table border=\\\"1\\\" class=\\\"dataframe\\\">\\n\",\n       \"  <thead>\\n\",\n       \"    <tr style=\\\"text-align: right;\\\">\\n\",\n       \"      <th></th>\\n\",\n       \"      <th>book_id</th>\\n\",\n       \"      <th>average_rating_0.0</th>\\n\",\n       \"      <th>average_rating_1.0</th>\\n\",\n       \"      <th>average_rating_1.5</th>\\n\",\n       \"      <th>average_rating_2.0</th>\\n\",\n       \"      <th>average_rating_2.5</th>\\n\",\n       \"      <th>average_rating_3.0</th>\\n\",\n       \"      <th>average_rating_3.5</th>\\n\",\n       \"      <th>average_rating_4.0</th>\\n\",\n       \"      <th>average_rating_4.5</th>\\n\",\n       \"      <th>...</th>\\n\",\n       \"      <th>language_code_tel</th>\\n\",\n       \"      <th>language_code_tgl</th>\\n\",\n       \"      <th>language_code_tha</th>\\n\",\n       \"      <th>language_code_tlh</th>\\n\",\n       \"      <th>language_code_tur</th>\\n\",\n       \"      <th>language_code_ukr</th>\\n\",\n       \"      <th>language_code_unknown</th>\\n\",\n       \"      <th>language_code_urd</th>\\n\",\n       \"      <th>language_code_vie</th>\\n\",\n       \"      <th>language_code_zho</th>\\n\",\n       \"    </tr>\\n\",\n       \"  </thead>\\n\",\n       \"  <tbody>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>0</th>\\n\",\n       \"      <td>234</td>\\n\",\n       \"      <td>0</td>\\n\",\n       \"      <td>0</td>\\n\",\n       \"      <td>0</td>\\n\",\n       \"      <td>0</td>\\n\",\n       \"      <td>0</td>\\n\",\n       \"      <td>0</td>\\n\",\n       \"      <td>0</td>\\n\",\n       \"      <td>1</td>\\n\",\n       \"      <td>0</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>0</td>\\n\",\n       \"      <td>0</td>\\n\",\n       \"      <td>0</td>\\n\",\n       \"      <td>0</td>\\n\",\n       \"      <td>0</td>\\n\",\n       \"      <td>0</td>\\n\",\n       \"      <td>1</td>\\n\",\n       \"      <td>0</td>\\n\",\n       \"      <td>0</td>\\n\",\n       \"      <td>0</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>1</th>\\n\",\n       \"      <td>236</td>\\n\",\n       \"      <td>0</td>\\n\",\n       \"      <td>0</td>\\n\",\n       \"      <td>0</td>\\n\",\n       \"      <td>0</td>\\n\",\n       \"      <td>0</td>\\n\",\n       \"      <td>0</td>\\n\",\n       \"      <td>0</td>\\n\",\n       \"      <td>1</td>\\n\",\n       \"      <td>0</td>\\n\",\n 
      \"      <td>...</td>\\n\",\n       \"      <td>0</td>\\n\",\n       \"      <td>0</td>\\n\",\n       \"      <td>0</td>\\n\",\n       \"      <td>0</td>\\n\",\n       \"      <td>0</td>\\n\",\n       \"      <td>0</td>\\n\",\n       \"      <td>1</td>\\n\",\n       \"      <td>0</td>\\n\",\n       \"      <td>0</td>\\n\",\n       \"      <td>0</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>2</th>\\n\",\n       \"      <td>241</td>\\n\",\n       \"      <td>0</td>\\n\",\n       \"      <td>0</td>\\n\",\n       \"      <td>0</td>\\n\",\n       \"      <td>0</td>\\n\",\n       \"      <td>0</td>\\n\",\n       \"      <td>0</td>\\n\",\n       \"      <td>1</td>\\n\",\n       \"      <td>0</td>\\n\",\n       \"      <td>0</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>0</td>\\n\",\n       \"      <td>0</td>\\n\",\n       \"      <td>0</td>\\n\",\n       \"      <td>0</td>\\n\",\n       \"      <td>0</td>\\n\",\n       \"      <td>0</td>\\n\",\n       \"      <td>1</td>\\n\",\n       \"      <td>0</td>\\n\",\n       \"      <td>0</td>\\n\",\n       \"      <td>0</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>3</th>\\n\",\n       \"      <td>244</td>\\n\",\n       \"      <td>0</td>\\n\",\n       \"      <td>0</td>\\n\",\n       \"      <td>0</td>\\n\",\n       \"      <td>0</td>\\n\",\n       \"      <td>0</td>\\n\",\n       \"      <td>0</td>\\n\",\n       \"      <td>0</td>\\n\",\n       \"      <td>1</td>\\n\",\n       \"      <td>0</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>0</td>\\n\",\n       \"      <td>0</td>\\n\",\n       \"      <td>0</td>\\n\",\n       \"      <td>0</td>\\n\",\n       \"      <td>0</td>\\n\",\n       \"      <td>0</td>\\n\",\n       \"      <td>0</td>\\n\",\n       \"      <td>0</td>\\n\",\n       \"      <td>0</td>\\n\",\n       \"      <td>0</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>4</th>\\n\",\n       \"      <td>254</td>\\n\",\n       \"      <td>0</td>\\n\",\n       \"      <td>0</td>\\n\",\n       \"      <td>0</td>\\n\",\n       \"      <td>0</td>\\n\",\n       \"      <td>0</td>\\n\",\n       \"      <td>0</td>\\n\",\n       \"      <td>0</td>\\n\",\n       \"      <td>1</td>\\n\",\n       \"      <td>0</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>0</td>\\n\",\n       \"      <td>0</td>\\n\",\n       \"      <td>0</td>\\n\",\n       \"      <td>0</td>\\n\",\n       \"      <td>0</td>\\n\",\n       \"      <td>0</td>\\n\",\n       \"      <td>1</td>\\n\",\n       \"      <td>0</td>\\n\",\n       \"      <td>0</td>\\n\",\n       \"      <td>0</td>\\n\",\n       \"    </tr>\\n\",\n       \"  </tbody>\\n\",\n       \"</table>\\n\",\n       \"<p>5 rows × 358 columns</p>\\n\",\n       \"</div>\"\n      ],\n      \"text/plain\": [\n       \"   book_id  average_rating_0.0  average_rating_1.0  average_rating_1.5  \\\\\\n\",\n       \"0      234                   0                   0                   0   \\n\",\n       \"1      236                   0                   0                   0   \\n\",\n       \"2      241                   0                   0                   0   \\n\",\n       \"3      244                   0                   0                   0   \\n\",\n       \"4      254                   0                   0                   0   \\n\",\n       \"\\n\",\n       \"   average_rating_2.0  average_rating_2.5  average_rating_3.0  \\\\\\n\",\n       \"0                  
 0                   0                   0   \\n\",\n       \"1                   0                   0                   0   \\n\",\n       \"2                   0                   0                   0   \\n\",\n       \"3                   0                   0                   0   \\n\",\n       \"4                   0                   0                   0   \\n\",\n       \"\\n\",\n       \"   average_rating_3.5  average_rating_4.0  average_rating_4.5  ...  \\\\\\n\",\n       \"0                   0                   1                   0  ...   \\n\",\n       \"1                   0                   1                   0  ...   \\n\",\n       \"2                   1                   0                   0  ...   \\n\",\n       \"3                   0                   1                   0  ...   \\n\",\n       \"4                   0                   1                   0  ...   \\n\",\n       \"\\n\",\n       \"   language_code_tel  language_code_tgl  language_code_tha  language_code_tlh  \\\\\\n\",\n       \"0                  0                  0                  0                  0   \\n\",\n       \"1                  0                  0                  0                  0   \\n\",\n       \"2                  0                  0                  0                  0   \\n\",\n       \"3                  0                  0                  0                  0   \\n\",\n       \"4                  0                  0                  0                  0   \\n\",\n       \"\\n\",\n       \"   language_code_tur  language_code_ukr  language_code_unknown  \\\\\\n\",\n       \"0                  0                  0                      1   \\n\",\n       \"1                  0                  0                      1   \\n\",\n       \"2                  0                  0                      1   \\n\",\n       \"3                  0                  0                      0   \\n\",\n       \"4                  0                  0                      1   \\n\",\n       \"\\n\",\n       \"   language_code_urd  language_code_vie  language_code_zho  \\n\",\n       \"0                  0                  0                  0  \\n\",\n       \"1                  0                  0                  0  \\n\",\n       \"2                  0                  0                  0  \\n\",\n       \"3                  0                  0                  0  \\n\",\n       \"4                  0                  0                  0  \\n\",\n       \"\\n\",\n       \"[5 rows x 358 columns]\"\n      ]\n     },\n     \"execution_count\": 22,\n     \"metadata\": {},\n     \"output_type\": \"execute_result\"\n    }\n   ],\n   \"source\": [\n    \"# dummify categorical features\\n\",\n    \"books_metadata_selected_transformed = pd.get_dummies(books_metadata_selected, columns = ['average_rating', 'is_ebook', 'num_pages', \\n\",\n    \"                                                                                         'publication_year', 'ratings_count', \\n\",\n    \"                                                                                         'language_code'])\\n\",\n    \"\\n\",\n    \"books_metadata_selected_transformed = books_metadata_selected_transformed.sort_values('book_id').reset_index().drop('index', axis=1)\\n\",\n    \"books_metadata_selected_transformed.head(5)\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": 23,\n   \"metadata\": {},\n   \"outputs\": [\n    {\n     \"data\": {\n      \"text/plain\": [\n       \"<36514x357 
sparse matrix of type '<type 'numpy.uint8'>'\\n\",\n       \"\\twith 219084 stored elements in Compressed Sparse Row format>\"\n      ]\n     },\n     \"execution_count\": 23,\n     \"metadata\": {},\n     \"output_type\": \"execute_result\"\n    }\n   ],\n   \"source\": [\n    \"# convert to csr matrix\\n\",\n    \"books_metadata_csr = csr_matrix(books_metadata_selected_transformed.drop('book_id', axis=1).values)\\n\",\n    \"books_metadata_csr\"\n   ]\n  },\n  {\n   \"cell_type\": \"markdown\",\n   \"metadata\": {},\n   \"source\": [\n    \"** **\\n\",\n    \"Next we'll create a iteractions matrix which is np.float64 csr_matrix of shape ([n_users, n_books]). We'll also create a user dictionary for future use cases\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": 24,\n   \"metadata\": {},\n   \"outputs\": [\n    {\n     \"data\": {\n      \"text/html\": [\n       \"<div>\\n\",\n       \"<style scoped>\\n\",\n       \"    .dataframe tbody tr th:only-of-type {\\n\",\n       \"        vertical-align: middle;\\n\",\n       \"    }\\n\",\n       \"\\n\",\n       \"    .dataframe tbody tr th {\\n\",\n       \"        vertical-align: top;\\n\",\n       \"    }\\n\",\n       \"\\n\",\n       \"    .dataframe thead th {\\n\",\n       \"        text-align: right;\\n\",\n       \"    }\\n\",\n       \"</style>\\n\",\n       \"<table border=\\\"1\\\" class=\\\"dataframe\\\">\\n\",\n       \"  <thead>\\n\",\n       \"    <tr style=\\\"text-align: right;\\\">\\n\",\n       \"      <th>book_id</th>\\n\",\n       \"      <th>234</th>\\n\",\n       \"      <th>236</th>\\n\",\n       \"      <th>254</th>\\n\",\n       \"      <th>284</th>\\n\",\n       \"      <th>289</th>\\n\",\n       \"      <th>290</th>\\n\",\n       \"      <th>291</th>\\n\",\n       \"      <th>292</th>\\n\",\n       \"      <th>459</th>\\n\",\n       \"      <th>462</th>\\n\",\n       \"      <th>...</th>\\n\",\n       \"      <th>35663570</th>\\n\",\n       \"      <th>35668923</th>\\n\",\n       \"      <th>35670989</th>\\n\",\n       \"      <th>35704999</th>\\n\",\n       \"      <th>35878020</th>\\n\",\n       \"      <th>35887236</th>\\n\",\n       \"      <th>36070215</th>\\n\",\n       \"      <th>36096745</th>\\n\",\n       \"      <th>36122873</th>\\n\",\n       \"      <th>36295400</th>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>user_id</th>\\n\",\n       \"      <th></th>\\n\",\n       \"      <th></th>\\n\",\n       \"      <th></th>\\n\",\n       \"      <th></th>\\n\",\n       \"      <th></th>\\n\",\n       \"      <th></th>\\n\",\n       \"      <th></th>\\n\",\n       \"      <th></th>\\n\",\n       \"      <th></th>\\n\",\n       \"      <th></th>\\n\",\n       \"      <th></th>\\n\",\n       \"      <th></th>\\n\",\n       \"      <th></th>\\n\",\n       \"      <th></th>\\n\",\n       \"      <th></th>\\n\",\n       \"      <th></th>\\n\",\n       \"      <th></th>\\n\",\n       \"      <th></th>\\n\",\n       \"      <th></th>\\n\",\n       \"      <th></th>\\n\",\n       \"      <th></th>\\n\",\n       \"    </tr>\\n\",\n       \"  </thead>\\n\",\n       \"  <tbody>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>001404f6349ae5aa020fbd9e30196067</th>\\n\",\n       \"      <td>0.0</td>\\n\",\n       \"      <td>0.0</td>\\n\",\n       \"      <td>0.0</td>\\n\",\n       \"      <td>0.0</td>\\n\",\n       \"      <td>0.0</td>\\n\",\n       \"      <td>0.0</td>\\n\",\n       \"      <td>0.0</td>\\n\",\n       \"      <td>0.0</td>\\n\",\n       \"      
<td>0.0</td>\\n\",\n       \"      <td>0.0</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>0.0</td>\\n\",\n       \"      <td>0.0</td>\\n\",\n       \"      <td>0.0</td>\\n\",\n       \"      <td>0.0</td>\\n\",\n       \"      <td>0.0</td>\\n\",\n       \"      <td>0.0</td>\\n\",\n       \"      <td>0.0</td>\\n\",\n       \"      <td>0.0</td>\\n\",\n       \"      <td>0.0</td>\\n\",\n       \"      <td>0.0</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>002071c96681a45ca8f8dac10d080275</th>\\n\",\n       \"      <td>0.0</td>\\n\",\n       \"      <td>0.0</td>\\n\",\n       \"      <td>0.0</td>\\n\",\n       \"      <td>0.0</td>\\n\",\n       \"      <td>0.0</td>\\n\",\n       \"      <td>0.0</td>\\n\",\n       \"      <td>0.0</td>\\n\",\n       \"      <td>0.0</td>\\n\",\n       \"      <td>0.0</td>\\n\",\n       \"      <td>0.0</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>0.0</td>\\n\",\n       \"      <td>0.0</td>\\n\",\n       \"      <td>0.0</td>\\n\",\n       \"      <td>0.0</td>\\n\",\n       \"      <td>0.0</td>\\n\",\n       \"      <td>0.0</td>\\n\",\n       \"      <td>0.0</td>\\n\",\n       \"      <td>0.0</td>\\n\",\n       \"      <td>0.0</td>\\n\",\n       \"      <td>0.0</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>003299b767208cf9f83950e311e6856d</th>\\n\",\n       \"      <td>0.0</td>\\n\",\n       \"      <td>0.0</td>\\n\",\n       \"      <td>0.0</td>\\n\",\n       \"      <td>0.0</td>\\n\",\n       \"      <td>0.0</td>\\n\",\n       \"      <td>0.0</td>\\n\",\n       \"      <td>0.0</td>\\n\",\n       \"      <td>0.0</td>\\n\",\n       \"      <td>0.0</td>\\n\",\n       \"      <td>0.0</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>0.0</td>\\n\",\n       \"      <td>0.0</td>\\n\",\n       \"      <td>0.0</td>\\n\",\n       \"      <td>0.0</td>\\n\",\n       \"      <td>0.0</td>\\n\",\n       \"      <td>0.0</td>\\n\",\n       \"      <td>0.0</td>\\n\",\n       \"      <td>0.0</td>\\n\",\n       \"      <td>0.0</td>\\n\",\n       \"      <td>0.0</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>0035eb991e74d5411f6f3ee88c6baff1</th>\\n\",\n       \"      <td>0.0</td>\\n\",\n       \"      <td>0.0</td>\\n\",\n       \"      <td>0.0</td>\\n\",\n       \"      <td>0.0</td>\\n\",\n       \"      <td>0.0</td>\\n\",\n       \"      <td>0.0</td>\\n\",\n       \"      <td>0.0</td>\\n\",\n       \"      <td>0.0</td>\\n\",\n       \"      <td>0.0</td>\\n\",\n       \"      <td>0.0</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>0.0</td>\\n\",\n       \"      <td>0.0</td>\\n\",\n       \"      <td>0.0</td>\\n\",\n       \"      <td>0.0</td>\\n\",\n       \"      <td>0.0</td>\\n\",\n       \"      <td>0.0</td>\\n\",\n       \"      <td>0.0</td>\\n\",\n       \"      <td>0.0</td>\\n\",\n       \"      <td>0.0</td>\\n\",\n       \"      <td>0.0</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>003c3cbe1f0bf247fc1bae43984e333b</th>\\n\",\n       \"      <td>0.0</td>\\n\",\n       \"      <td>0.0</td>\\n\",\n       \"      <td>0.0</td>\\n\",\n       \"      <td>0.0</td>\\n\",\n       \"      <td>0.0</td>\\n\",\n       \"      <td>0.0</td>\\n\",\n       \"      <td>0.0</td>\\n\",\n       \"      <td>0.0</td>\\n\",\n       \"      <td>0.0</td>\\n\",\n       \"      <td>0.0</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>0.0</td>\\n\",\n       \"      
<td>0.0</td>\\n\",\n       \"      <td>0.0</td>\\n\",\n       \"      <td>0.0</td>\\n\",\n       \"      <td>0.0</td>\\n\",\n       \"      <td>0.0</td>\\n\",\n       \"      <td>0.0</td>\\n\",\n       \"      <td>0.0</td>\\n\",\n       \"      <td>0.0</td>\\n\",\n       \"      <td>0.0</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>0059f2af7ba41747be006788caa26f78</th>\\n\",\n       \"      <td>0.0</td>\\n\",\n       \"      <td>0.0</td>\\n\",\n       \"      <td>0.0</td>\\n\",\n       \"      <td>0.0</td>\\n\",\n       \"      <td>0.0</td>\\n\",\n       \"      <td>0.0</td>\\n\",\n       \"      <td>0.0</td>\\n\",\n       \"      <td>0.0</td>\\n\",\n       \"      <td>0.0</td>\\n\",\n       \"      <td>0.0</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>0.0</td>\\n\",\n       \"      <td>0.0</td>\\n\",\n       \"      <td>0.0</td>\\n\",\n       \"      <td>0.0</td>\\n\",\n       \"      <td>0.0</td>\\n\",\n       \"      <td>0.0</td>\\n\",\n       \"      <td>0.0</td>\\n\",\n       \"      <td>0.0</td>\\n\",\n       \"      <td>0.0</td>\\n\",\n       \"      <td>0.0</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>005d83a471aed1691c8447b52ce4baaa</th>\\n\",\n       \"      <td>0.0</td>\\n\",\n       \"      <td>0.0</td>\\n\",\n       \"      <td>0.0</td>\\n\",\n       \"      <td>0.0</td>\\n\",\n       \"      <td>0.0</td>\\n\",\n       \"      <td>0.0</td>\\n\",\n       \"      <td>0.0</td>\\n\",\n       \"      <td>0.0</td>\\n\",\n       \"      <td>0.0</td>\\n\",\n       \"      <td>0.0</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>0.0</td>\\n\",\n       \"      <td>0.0</td>\\n\",\n       \"      <td>0.0</td>\\n\",\n       \"      <td>0.0</td>\\n\",\n       \"      <td>0.0</td>\\n\",\n       \"      <td>0.0</td>\\n\",\n       \"      <td>0.0</td>\\n\",\n       \"      <td>0.0</td>\\n\",\n       \"      <td>0.0</td>\\n\",\n       \"      <td>0.0</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>006142c2fdbc566078193da9d3c11a4a</th>\\n\",\n       \"      <td>0.0</td>\\n\",\n       \"      <td>0.0</td>\\n\",\n       \"      <td>0.0</td>\\n\",\n       \"      <td>0.0</td>\\n\",\n       \"      <td>0.0</td>\\n\",\n       \"      <td>0.0</td>\\n\",\n       \"      <td>0.0</td>\\n\",\n       \"      <td>0.0</td>\\n\",\n       \"      <td>0.0</td>\\n\",\n       \"      <td>0.0</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>0.0</td>\\n\",\n       \"      <td>0.0</td>\\n\",\n       \"      <td>0.0</td>\\n\",\n       \"      <td>0.0</td>\\n\",\n       \"      <td>0.0</td>\\n\",\n       \"      <td>0.0</td>\\n\",\n       \"      <td>0.0</td>\\n\",\n       \"      <td>0.0</td>\\n\",\n       \"      <td>0.0</td>\\n\",\n       \"      <td>0.0</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>006679ea5ba690fe5238d11238643a5c</th>\\n\",\n       \"      <td>0.0</td>\\n\",\n       \"      <td>0.0</td>\\n\",\n       \"      <td>0.0</td>\\n\",\n       \"      <td>0.0</td>\\n\",\n       \"      <td>0.0</td>\\n\",\n       \"      <td>0.0</td>\\n\",\n       \"      <td>0.0</td>\\n\",\n       \"      <td>0.0</td>\\n\",\n       \"      <td>0.0</td>\\n\",\n       \"      <td>0.0</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>0.0</td>\\n\",\n       \"      <td>0.0</td>\\n\",\n       \"      <td>0.0</td>\\n\",\n       \"      <td>0.0</td>\\n\",\n       \"      <td>0.0</td>\\n\",\n       \"      
<td>0.0</td>\\n\",\n       \"      <td>0.0</td>\\n\",\n       \"      <td>0.0</td>\\n\",\n       \"      <td>0.0</td>\\n\",\n       \"      <td>0.0</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>006cace4ca0fb1c344e7148f5e63f22a</th>\\n\",\n       \"      <td>0.0</td>\\n\",\n       \"      <td>0.0</td>\\n\",\n       \"      <td>0.0</td>\\n\",\n       \"      <td>0.0</td>\\n\",\n       \"      <td>0.0</td>\\n\",\n       \"      <td>0.0</td>\\n\",\n       \"      <td>0.0</td>\\n\",\n       \"      <td>0.0</td>\\n\",\n       \"      <td>0.0</td>\\n\",\n       \"      <td>0.0</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>0.0</td>\\n\",\n       \"      <td>0.0</td>\\n\",\n       \"      <td>0.0</td>\\n\",\n       \"      <td>0.0</td>\\n\",\n       \"      <td>0.0</td>\\n\",\n       \"      <td>0.0</td>\\n\",\n       \"      <td>0.0</td>\\n\",\n       \"      <td>0.0</td>\\n\",\n       \"      <td>0.0</td>\\n\",\n       \"      <td>0.0</td>\\n\",\n       \"    </tr>\\n\",\n       \"  </tbody>\\n\",\n       \"</table>\\n\",\n       \"<p>10 rows × 7396 columns</p>\\n\",\n       \"</div>\"\n      ],\n      \"text/plain\": [\n       \"book_id                           234       236       254       284       \\\\\\n\",\n       \"user_id                                                                    \\n\",\n       \"001404f6349ae5aa020fbd9e30196067       0.0       0.0       0.0       0.0   \\n\",\n       \"002071c96681a45ca8f8dac10d080275       0.0       0.0       0.0       0.0   \\n\",\n       \"003299b767208cf9f83950e311e6856d       0.0       0.0       0.0       0.0   \\n\",\n       \"0035eb991e74d5411f6f3ee88c6baff1       0.0       0.0       0.0       0.0   \\n\",\n       \"003c3cbe1f0bf247fc1bae43984e333b       0.0       0.0       0.0       0.0   \\n\",\n       \"0059f2af7ba41747be006788caa26f78       0.0       0.0       0.0       0.0   \\n\",\n       \"005d83a471aed1691c8447b52ce4baaa       0.0       0.0       0.0       0.0   \\n\",\n       \"006142c2fdbc566078193da9d3c11a4a       0.0       0.0       0.0       0.0   \\n\",\n       \"006679ea5ba690fe5238d11238643a5c       0.0       0.0       0.0       0.0   \\n\",\n       \"006cace4ca0fb1c344e7148f5e63f22a       0.0       0.0       0.0       0.0   \\n\",\n       \"\\n\",\n       \"book_id                           289       290       291       292       \\\\\\n\",\n       \"user_id                                                                    \\n\",\n       \"001404f6349ae5aa020fbd9e30196067       0.0       0.0       0.0       0.0   \\n\",\n       \"002071c96681a45ca8f8dac10d080275       0.0       0.0       0.0       0.0   \\n\",\n       \"003299b767208cf9f83950e311e6856d       0.0       0.0       0.0       0.0   \\n\",\n       \"0035eb991e74d5411f6f3ee88c6baff1       0.0       0.0       0.0       0.0   \\n\",\n       \"003c3cbe1f0bf247fc1bae43984e333b       0.0       0.0       0.0       0.0   \\n\",\n       \"0059f2af7ba41747be006788caa26f78       0.0       0.0       0.0       0.0   \\n\",\n       \"005d83a471aed1691c8447b52ce4baaa       0.0       0.0       0.0       0.0   \\n\",\n       \"006142c2fdbc566078193da9d3c11a4a       0.0       0.0       0.0       0.0   \\n\",\n       \"006679ea5ba690fe5238d11238643a5c       0.0       0.0       0.0       0.0   \\n\",\n       \"006cace4ca0fb1c344e7148f5e63f22a       0.0       0.0       0.0       0.0   \\n\",\n       \"\\n\",\n       \"book_id                           459       462       ...  
35663570  35668923  \\\\\\n\",\n       \"user_id                                               ...                       \\n\",\n       \"001404f6349ae5aa020fbd9e30196067       0.0       0.0  ...       0.0       0.0   \\n\",\n       \"002071c96681a45ca8f8dac10d080275       0.0       0.0  ...       0.0       0.0   \\n\",\n       \"003299b767208cf9f83950e311e6856d       0.0       0.0  ...       0.0       0.0   \\n\",\n       \"0035eb991e74d5411f6f3ee88c6baff1       0.0       0.0  ...       0.0       0.0   \\n\",\n       \"003c3cbe1f0bf247fc1bae43984e333b       0.0       0.0  ...       0.0       0.0   \\n\",\n       \"0059f2af7ba41747be006788caa26f78       0.0       0.0  ...       0.0       0.0   \\n\",\n       \"005d83a471aed1691c8447b52ce4baaa       0.0       0.0  ...       0.0       0.0   \\n\",\n       \"006142c2fdbc566078193da9d3c11a4a       0.0       0.0  ...       0.0       0.0   \\n\",\n       \"006679ea5ba690fe5238d11238643a5c       0.0       0.0  ...       0.0       0.0   \\n\",\n       \"006cace4ca0fb1c344e7148f5e63f22a       0.0       0.0  ...       0.0       0.0   \\n\",\n       \"\\n\",\n       \"book_id                           35670989  35704999  35878020  35887236  \\\\\\n\",\n       \"user_id                                                                    \\n\",\n       \"001404f6349ae5aa020fbd9e30196067       0.0       0.0       0.0       0.0   \\n\",\n       \"002071c96681a45ca8f8dac10d080275       0.0       0.0       0.0       0.0   \\n\",\n       \"003299b767208cf9f83950e311e6856d       0.0       0.0       0.0       0.0   \\n\",\n       \"0035eb991e74d5411f6f3ee88c6baff1       0.0       0.0       0.0       0.0   \\n\",\n       \"003c3cbe1f0bf247fc1bae43984e333b       0.0       0.0       0.0       0.0   \\n\",\n       \"0059f2af7ba41747be006788caa26f78       0.0       0.0       0.0       0.0   \\n\",\n       \"005d83a471aed1691c8447b52ce4baaa       0.0       0.0       0.0       0.0   \\n\",\n       \"006142c2fdbc566078193da9d3c11a4a       0.0       0.0       0.0       0.0   \\n\",\n       \"006679ea5ba690fe5238d11238643a5c       0.0       0.0       0.0       0.0   \\n\",\n       \"006cace4ca0fb1c344e7148f5e63f22a       0.0       0.0       0.0       0.0   \\n\",\n       \"\\n\",\n       \"book_id                           36070215  36096745  36122873  36295400  \\n\",\n       \"user_id                                                                   \\n\",\n       \"001404f6349ae5aa020fbd9e30196067       0.0       0.0       0.0       0.0  \\n\",\n       \"002071c96681a45ca8f8dac10d080275       0.0       0.0       0.0       0.0  \\n\",\n       \"003299b767208cf9f83950e311e6856d       0.0       0.0       0.0       0.0  \\n\",\n       \"0035eb991e74d5411f6f3ee88c6baff1       0.0       0.0       0.0       0.0  \\n\",\n       \"003c3cbe1f0bf247fc1bae43984e333b       0.0       0.0       0.0       0.0  \\n\",\n       \"0059f2af7ba41747be006788caa26f78       0.0       0.0       0.0       0.0  \\n\",\n       \"005d83a471aed1691c8447b52ce4baaa       0.0       0.0       0.0       0.0  \\n\",\n       \"006142c2fdbc566078193da9d3c11a4a       0.0       0.0       0.0       0.0  \\n\",\n       \"006679ea5ba690fe5238d11238643a5c       0.0       0.0       0.0       0.0  \\n\",\n       \"006cace4ca0fb1c344e7148f5e63f22a       0.0       0.0       0.0       0.0  \\n\",\n       \"\\n\",\n       \"[10 rows x 7396 columns]\"\n      ]\n     },\n     \"execution_count\": 24,\n     \"metadata\": {},\n     \"output_type\": \"execute_result\"\n    }\n   ],\n   \"source\": [\n    
\"user_book_interaction = pd.pivot_table(interactions_selected, index='user_id', columns='book_id', values='rating')\\n\",\n    \"\\n\",\n    \"# fill missing values with 0\\n\",\n    \"user_book_interaction = user_book_interaction.fillna(0)\\n\",\n    \"\\n\",\n    \"user_book_interaction.head(10)\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": 25,\n   \"metadata\": {},\n   \"outputs\": [],\n   \"source\": [\n    \"user_id = list(user_book_interaction.index)\\n\",\n    \"user_dict = {}\\n\",\n    \"counter = 0 \\n\",\n    \"for i in user_id:\\n\",\n    \"    user_dict[i] = counter\\n\",\n    \"    counter += 1\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": 26,\n   \"metadata\": {},\n   \"outputs\": [\n    {\n     \"data\": {\n      \"text/plain\": [\n       \"<5000x7396 sparse matrix of type '<type 'numpy.float64'>'\\n\",\n       \"\\twith 22395 stored elements in Compressed Sparse Row format>\"\n      ]\n     },\n     \"execution_count\": 26,\n     \"metadata\": {},\n     \"output_type\": \"execute_result\"\n    }\n   ],\n   \"source\": [\n    \"# convert to csr matrix\\n\",\n    \"user_book_interaction_csr = csr_matrix(user_book_interaction.values)\\n\",\n    \"user_book_interaction_csr\"\n   ]\n  },\n  {\n   \"cell_type\": \"markdown\",\n   \"metadata\": {},\n   \"source\": [\n    \"** **\\n\",\n    \"### Model Training\\n\",\n    \"\\n\",\n    \"Ideally, we would build, train, and evaluate several models for our recommender system to determine which model holds the most promise for further optimization (hyper-parameter tuning).\\n\",\n    \"\\n\",\n    \"However, for this tutorial, we'll train the base model, with randomly selected input parameters for demonstrations.\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": 27,\n   \"metadata\": {},\n   \"outputs\": [],\n   \"source\": [\n    \"model = LightFM(loss='warp',\\n\",\n    \"                random_state=2016,\\n\",\n    \"                learning_rate=0.90,\\n\",\n    \"                no_components=150,\\n\",\n    \"                user_alpha=0.000005)\\n\",\n    \"\\n\",\n    \"model = model.fit(user_book_interaction_csr,\\n\",\n    \"                  epochs=100,\\n\",\n    \"                  num_threads=16, verbose=False)\"\n   ]\n  },\n  {\n   \"cell_type\": \"markdown\",\n   \"metadata\": {},\n   \"source\": [\n    \"** **\\n\",\n    \"#### Top n Recommendations\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": 28,\n   \"metadata\": {},\n   \"outputs\": [],\n   \"source\": [\n    \"def sample_recommendation_user(model, interactions, user_id, user_dict, \\n\",\n    \"                               item_dict,threshold = 0,nrec_items = 5, show = True):\\n\",\n    \"    \\n\",\n    \"    n_users, n_items = interactions.shape\\n\",\n    \"    user_x = user_dict[user_id]\\n\",\n    \"    scores = pd.Series(model.predict(user_x,np.arange(n_items), item_features=books_metadata_csr))\\n\",\n    \"    scores.index = interactions.columns\\n\",\n    \"    scores = list(pd.Series(scores.sort_values(ascending=False).index))\\n\",\n    \"    \\n\",\n    \"    known_items = list(pd.Series(interactions.loc[user_id,:] \\\\\\n\",\n    \"                                 [interactions.loc[user_id,:] > threshold].index).sort_values(ascending=False))\\n\",\n    \"    \\n\",\n    \"    scores = [x for x in scores if x not in known_items]\\n\",\n    \"    return_score_list = scores[0:nrec_items]\\n\",\n    \"    known_items = 
list(pd.Series(known_items).apply(lambda x: item_dict[x]))\\n\",\n    \"    scores = list(pd.Series(return_score_list).apply(lambda x: item_dict[x]))\\n\",\n    \"    if show == True:\\n\",\n    \"        print (\\\"User: \\\" + str(user_id))\\n\",\n    \"        print(\\\"Known Likes:\\\")\\n\",\n    \"        counter = 1\\n\",\n    \"        for i in known_items:\\n\",\n    \"            print(str(counter) + '- ' + i)\\n\",\n    \"            counter+=1\\n\",\n    \"\\n\",\n    \"        print(\\\"\\\\n Recommended Items:\\\")\\n\",\n    \"        counter = 1\\n\",\n    \"        for i in scores:\\n\",\n    \"            print(str(counter) + '- ' + i)\\n\",\n    \"            counter+=1\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": 171,\n   \"metadata\": {},\n   \"outputs\": [\n    {\n     \"name\": \"stdout\",\n     \"output_type\": \"stream\",\n     \"text\": [\n      \"User: ff52b7331f2ccab0582678644fed9d85\\n\",\n      \"Known Likes:\\n\",\n      \"1- Brown Girl Dreaming\\n\",\n      \"2- The Crossover\\n\",\n      \"3- Love, Dishonor, Marry, Die, Cherish, Perish\\n\",\n      \"4- Odysséen\\n\",\n      \"5- Iliaden\\n\",\n      \"6- The Weight of Water\\n\",\n      \"7- Fänrik Ståls sägner\\n\",\n      \"8- Eddan: De nordiska guda- och hjältesångerna\\n\",\n      \"9- V.\\n\",\n      \"10- Aniara: An Epic Science Fiction Poem\\n\",\n      \"11- The Melancholy Death of Oyster Boy and Other Stories\\n\",\n      \"12- Paradise Regained by John Milton\\n\",\n      \"13- The Tent\\n\",\n      \"14- Paradise Lost\\n\",\n      \"15- Hamlet\\n\",\n      \"\\n\",\n      \" Recommended Items:\\n\",\n      \"1- Bronx Masquerade\\n\",\n      \"2- La Navidad para un niño en Gales\\n\",\n      \"3- How We Fare\\n\",\n      \"4- Maya Angelou: The Complete Poetry\\n\",\n      \"5- Shakespeare's Love Sonnets\\n\"\n     ]\n    }\n   ],\n   \"source\": [\n    \"sample_recommendation_user(model, user_book_interaction, 'ff52b7331f2ccab0582678644fed9d85', user_dict, item_dict)\"\n   ]\n  }\n ],\n \"metadata\": {\n  \"kernelspec\": {\n   \"display_name\": \"Python 2\",\n   \"language\": \"python\",\n   \"name\": \"python2\"\n  },\n  \"language_info\": {\n   \"codemirror_mode\": {\n    \"name\": \"ipython\",\n    \"version\": 2\n   },\n   \"file_extension\": \".py\",\n   \"mimetype\": \"text/x-python\",\n   \"name\": \"python\",\n   \"nbconvert_exporter\": \"python\",\n   \"pygments_lexer\": \"ipython2\",\n   \"version\": \"2.7.13\"\n  }\n },\n \"nbformat\": 4,\n \"nbformat_minor\": 4\n}\n"
  },
  {
    "path": "recommender/results/profiler_books_metadata_1.html",
    "content": "<!doctype html>\n\n<html lang=\"en\">\n<head>\n  <meta charset=\"utf-8\">\n\n  <title>Profile report</title>\n  <meta name=\"description\" content=\"Profile report generated by pandas-profiling. See GitHub.\">\n  <meta name=\"author\" content=\"pandas-profiling\">\n    <script src=\"https://ajax.googleapis.com/ajax/libs/jquery/1.11.3/jquery.min.js\"></script>\n\n    <link rel=\"stylesheet\" href=\"https://maxcdn.bootstrapcdn.com/bootstrap/3.3.6/css/bootstrap.min.css\"\n          integrity=\"sha384-1q8mTJOASx8j1Au+a5WDVnPi2lkFfwwEAa8hDDdjZlpLegxhjVME1fgjWPGmkzs7\" crossorigin=\"anonymous\">\n    <link rel=\"stylesheet\" href=\"https://maxcdn.bootstrapcdn.com/bootstrap/3.3.6/css/bootstrap-theme.min.css\"\n          integrity=\"sha384-fLW2N01lMqjakBkx3l/M9EahuwpSfeNvV63J5ezn3uZzapT0u7EYsXMjQV+0En5r\" crossorigin=\"anonymous\">\n    <script src=\"https://maxcdn.bootstrapcdn.com/bootstrap/3.3.6/js/bootstrap.min.js\" integrity=\"sha384-0mSbJDEHialfmuBBQP6A4Qrprq5OVfW37PRR3j5ELqxss1yVqOtnepnHVP9aJ7xS\" crossorigin=\"anonymous\"></script>\n    <script>\n       $(function () {\n              $('[data-toggle=\"tooltip\"]').tooltip()\n        })\n    </script>\n</head>\n\n<body>\n    <meta charset=\"UTF-8\">\n\n<style>\n\n        .variablerow {\n            border: 1px solid #e1e1e8;\n            border-top: hidden;\n            padding-top: 2em;\n            padding-bottom: 2em;\n            padding-left: 1em;\n            padding-right: 1em;\n        }\n\n        .headerrow {\n            border: 1px solid #e1e1e8;\n            background-color: #f5f5f5;\n            padding: 2em;\n        }\n        .namecol {\n            margin-top: -1em;\n            overflow-x: auto;\n        }\n\n        .dl-horizontal dt {\n            text-align: left;\n            padding-right: 1em;\n            white-space: normal;\n        }\n\n        .dl-horizontal dd {\n            margin-left: 0;\n        }\n\n        .ignore {\n            opacity: 0.4;\n        }\n\n        .container.pandas-profiling {\n            max-width:975px;\n        }\n\n        .col-md-12 {\n            padding-left: 2em;\n        }\n\n        .indent {\n            margin-left: 1em;\n        }\n\n        /* Table example_values */\n            table.example_values {\n                border: 0;\n            }\n\n            .example_values th {\n                border: 0;\n                padding: 0 ;\n                color: #555;\n                font-weight: 600;\n            }\n\n            .example_values tr, .example_values td{\n                border: 0;\n                padding: 0;\n                color: #555;\n            }\n\n        /* STATS */\n            table.stats {\n                border: 0;\n            }\n\n            .stats th {\n                border: 0;\n                padding: 0 2em 0 0;\n                color: #555;\n                font-weight: 600;\n            }\n\n            .stats tr {\n                border: 0;\n            }\n\n            .stats tr:hover{\n                text-decoration: underline;\n            }\n\n            .stats td{\n                color: #555;\n                padding: 1px;\n                border: 0;\n            }\n\n\n        /* Sample table */\n            table.sample {\n                border: 0;\n                margin-bottom: 2em;\n                margin-left:1em;\n            }\n            .sample tr {\n                border:0;\n            }\n            .sample td, .sample th{\n                padding: 0.5em;\n                white-space: 
nowrap;\n                border: none;\n\n            }\n\n            .sample thead {\n                border-top: 0;\n                border-bottom: 2px solid #ddd;\n            }\n\n            .sample td {\n                width:100%;\n            }\n\n\n        /* There is no good solution available to make the divs equal height and then center ... */\n            .histogram {\n                margin-top: 3em;\n            }\n        /* Freq table */\n\n            table.freq {\n                margin-bottom: 2em;\n                border: 0;\n            }\n            table.freq th, table.freq tr, table.freq td {\n                border: 0;\n                padding: 0;\n            }\n\n            .freq thead {\n                font-weight: 600;\n                white-space: nowrap;\n                overflow: hidden;\n                text-overflow: ellipsis;\n\n            }\n\n            td.fillremaining{\n                width:auto;\n                max-width: none;\n            }\n\n            td.number, th.number {\n                text-align:right ;\n            }\n\n        /* Freq mini */\n            .freq.mini td{\n                width: 50%;\n                padding: 1px;\n                font-size: 12px;\n\n            }\n            table.freq.mini {\n                 width:100%;\n            }\n            .freq.mini th {\n                overflow: hidden;\n                text-overflow: ellipsis;\n                white-space: nowrap;\n                max-width: 5em;\n                font-weight: 400;\n                text-align:right;\n                padding-right: 0.5em;\n            }\n\n            .missing {\n                color: #a94442;\n            }\n            .alert, .alert > th, .alert > td {\n                color: #a94442;\n            }\n\n\n        /* Bars in tables */\n            .freq .bar{\n                float: left;\n                width: 0;\n                height: 100%;\n                line-height: 20px;\n                color: #fff;\n                text-align: center;\n                background-color: #337ab7;\n                border-radius: 3px;\n                margin-right: 4px;\n            }\n            .other .bar {\n                background-color: #999;\n            }\n            .missing .bar{\n                background-color: #a94442;\n            }\n            .tooltip-inner {\n                width: 100%;\n                white-space: nowrap;\n                text-align:left;\n            }\n\n            .extrapadding{\n                padding: 2em;\n            }\n\n</style>\n\n<div class=\"container pandas-profiling\">\n    <div class=\"row headerrow highlight\">\n        <h1>Overview</h1>\n    </div>\n    <div class=\"row variablerow\">\n    <div class=\"col-md-6 namecol\">\n        <p class=\"h4\">Dataset info</p>\n        <table class=\"stats\" style=\"margin-left: 1em;\">\n            <tbody>\n            <tr>\n                <th>Number of variables</th>\n                <td>5 </td>\n            </tr>\n            <tr>\n                <th>Number of observations</th>\n                <td>36514 </td>\n            </tr>\n            <tr>\n                <th>Total Missing (%)</th>\n                <td>7.3% </td>\n            </tr>\n            <tr>\n                <th>Total size in memory</th>\n                <td>1.4 MiB </td>\n            </tr>\n            <tr>\n                <th>Average record size in memory</th>\n                <td>40.0 B </td>\n            </tr>\n            </tbody>\n        
</table>\n    </div>\n    <div class=\"col-md-6 namecol\">\n        <p class=\"h4\">Variables types</p>\n        <table class=\"stats\" style=\"margin-left: 1em;\">\n            <tbody>\n            <tr>\n                <th>Numeric</th>\n                <td>2 </td>\n            </tr>\n            <tr>\n                <th>Categorical</th>\n                <td>3 </td>\n            </tr>\n            <tr>\n                <th>Date</th>\n                <td>0 </td>\n            </tr>\n            <tr>\n                <th>Text (Unique)</th>\n                <td>0 </td>\n            </tr>\n            <tr>\n                <th>Rejected</th>\n                <td>0 </td>\n            </tr>\n            </tbody>\n        </table>\n    </div>\n    <div class=\"col-md-12\" style=\"padding-left: 1em;\">\n        <p class=\"h4\">Warnings</p>\n        <ul class=\"list-unstyled\"><li><code>num_pages</code> has 7505 / 20.6% missing values <span class=\"label label-default\">Missing</span></l><li><code>num_pages</code> has a high cardinality: 1059 distinct values  <span class=\"label label-warning\">Warning</span></l><li><code>publication_year</code> has 5816 / 15.9% missing values <span class=\"label label-default\">Missing</span></l><li><code>publication_year</code> has a high cardinality: 202 distinct values  <span class=\"label label-warning\">Warning</span></l><li><code>ratings_count</code> is highly skewed (γ1 = 99.17)</l><li>Dataset has 1158 duplicate rows <span class=\"label label-warning\">Warning</span></l> </ul>\n    </div>\n</div>\n    <div class=\"row headerrow highlight\">\n        <h1>Variables</h1>\n    </div>\n    <div class=\"row variablerow\">\n    <div class=\"col-md-3 namecol\">\n        <p class=\"h4\">average_rating<br/>\n            <small>Numeric</small>\n        </p>\n    </div><div class=\"col-md-6\">\n    <div class=\"row\">\n        <div class=\"col-sm-6\">\n            <table class=\"stats \">\n                <tr>\n                    <th>Distinct count</th>\n                    <td>282</td>\n                </tr>\n                <tr>\n                    <th>Unique (%)</th>\n                    <td>0.8%</td>\n                </tr>\n                <tr class=\"ignore\">\n                    <th>Missing (%)</th>\n                    <td>0.0%</td>\n                </tr>\n                <tr class=\"ignore\">\n                    <th>Missing (n)</th>\n                    <td>0</td>\n                </tr>\n                <tr class=\"ignore\">\n                    <th>Infinite (%)</th>\n                    <td>0.0%</td>\n                </tr>\n                <tr class=\"ignore\">\n                    <th>Infinite (n)</th>\n                    <td>0</td>\n                </tr>\n            </table>\n\n        </div>\n        <div class=\"col-sm-6\">\n            <table class=\"stats \">\n\n                <tr>\n                    <th>Mean</th>\n                    <td>4.0638</td>\n                </tr>\n                <tr>\n                    <th>Minimum</th>\n                    <td>0</td>\n                </tr>\n                <tr>\n                    <th>Maximum</th>\n                    <td>5</td>\n                </tr>\n                <tr class=\"ignore\">\n                    <th>Zeros (%)</th>\n                    <td>0.0%</td>\n                </tr>\n            </table>\n        </div>\n    </div>\n</div>\n<div class=\"col-md-3 collapse in\" id=\"minihistogram4156873781702074801\">\n    <img 
src=\"data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAAAMgAAABLCAYAAAA1fMjoAAAABHNCSVQICAgIfAhkiAAAAAlwSFlzAAAPYQAAD2EBqD%2BnaQAAADl0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uIDIuMi40LCBodHRwOi8vbWF0cGxvdGxpYi5vcmcv7US4rQAAARhJREFUeJzt3cEJwkAQQFEjlmQR9uTZnizCntYG5INCzGreuwfm8pnsYZNljDEOwEvHrQeAmZ22HoDfcL7e337mcbusMMl32SAQBAJBIBAEAkEgEAQCQSAQBAJBIBAEAkEgEAQCQSAQBAJBIBAEAkEgEAQCwZXbHfrk%2Buxe2SAQBAJBIBAEAkEgEAQCQSAQBAJBIBAEAkEgEAQCQSAQBAJBIBAEAsGFKVbzD/81tEEgCASCQCAIBIJD%2Bh/wlZL12CAQBAJBIBAEAsEhfTIO3HNZxhhj6yFgVl6xIAgEgkAgCASCQCAIBIJAIAgEgkAgCASCQCAIBIJAIAgEgkAgCASCQCAIBIJAIAgEgkAgCASCQCAIBIJAIDwBL6ASzFY6yiEAAAAASUVORK5CYII%3D\">\n\n</div>\n<div class=\"col-md-12 text-right\">\n    <a role=\"button\" data-toggle=\"collapse\" data-target=\"#descriptives4156873781702074801,#minihistogram4156873781702074801\"\n       aria-expanded=\"false\" aria-controls=\"collapseExample\">\n        Toggle details\n    </a>\n</div>\n<div class=\"row collapse col-md-12\" id=\"descriptives4156873781702074801\">\n    <ul class=\"nav nav-tabs\" role=\"tablist\">\n        <li role=\"presentation\" class=\"active\"><a href=\"#quantiles4156873781702074801\"\n                                                  aria-controls=\"quantiles4156873781702074801\" role=\"tab\"\n                                                  data-toggle=\"tab\">Statistics</a></li>\n        <li role=\"presentation\"><a href=\"#histogram4156873781702074801\" aria-controls=\"histogram4156873781702074801\"\n                                   role=\"tab\" data-toggle=\"tab\">Histogram</a></li>\n        <li role=\"presentation\"><a href=\"#common4156873781702074801\" aria-controls=\"common4156873781702074801\"\n                                   role=\"tab\" data-toggle=\"tab\">Common Values</a></li>\n        <li role=\"presentation\"><a href=\"#extreme4156873781702074801\" aria-controls=\"extreme4156873781702074801\"\n                                   role=\"tab\" data-toggle=\"tab\">Extreme Values</a></li>\n\n    </ul>\n\n    <div class=\"tab-content\">\n        <div role=\"tabpanel\" class=\"tab-pane active row\" id=\"quantiles4156873781702074801\">\n            <div class=\"col-md-4 col-md-offset-1\">\n                <p class=\"h4\">Quantile statistics</p>\n                <table class=\"stats indent\">\n                    <tr>\n                        <th>Minimum</th>\n                        <td>0</td>\n                    </tr>\n                    <tr>\n                        <th>5-th percentile</th>\n                        <td>3.39</td>\n                    </tr>\n                    <tr>\n                        <th>Q1</th>\n                        <td>3.84</td>\n                    </tr>\n                    <tr>\n                        <th>Median</th>\n                        <td>4.1</td>\n                    </tr>\n                    <tr>\n                        <th>Q3</th>\n                        <td>4.31</td>\n                    </tr>\n                    <tr>\n                        <th>95-th percentile</th>\n                        <td>4.67</td>\n                    </tr>\n                    <tr>\n                        <th>Maximum</th>\n                        <td>5</td>\n                    </tr>\n                    <tr>\n                        <th>Range</th>\n                        <td>5</td>\n                    </tr>\n                    <tr>\n                        <th>Interquartile range</th>\n                        <td>0.47</td>\n                    </tr>\n                </table>\n            </div>\n         
   <div class=\"col-md-4 col-md-offset-2\">\n                <p class=\"h4\">Descriptive statistics</p>\n                <table class=\"stats indent\">\n                    <tr>\n                        <th>Standard deviation</th>\n                        <td>0.39996</td>\n                    </tr>\n                    <tr>\n                        <th>Coef of variation</th>\n                        <td>0.09842</td>\n                    </tr>\n                    <tr>\n                        <th>Kurtosis</th>\n                        <td>5.0568</td>\n                    </tr>\n                    <tr>\n                        <th>Mean</th>\n                        <td>4.0638</td>\n                    </tr>\n                    <tr>\n                        <th>MAD</th>\n                        <td>0.29887</td>\n                    </tr>\n                    <tr class=\"\">\n                        <th>Skewness</th>\n                        <td>-1.0022</td>\n                    </tr>\n                    <tr>\n                        <th>Sum</th>\n                        <td>148390</td>\n                    </tr>\n                    <tr>\n                        <th>Variance</th>\n                        <td>0.15997</td>\n                    </tr>\n                    <tr>\n                        <th>Memory size</th>\n                        <td>285.3 KiB</td>\n                    </tr>\n                </table>\n            </div>\n        </div>\n        <div role=\"tabpanel\" class=\"tab-pane col-md-8 col-md-offset-2\" id=\"histogram4156873781702074801\">\n            <img src=\"data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAAAlgAAAGQCAYAAAByNR6YAAAABHNCSVQICAgIfAhkiAAAAAlwSFlzAAAPYQAAD2EBqD%2BnaQAAADl0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uIDIuMi40LCBodHRwOi8vbWF0cGxvdGxpYi5vcmcv7US4rQAAIABJREFUeJzt3Xt0VOXd/v8ryZBADpMDkKgxBeQQyUFUxNhICQcRzwqCIX0UqVgBozERLAJSRFHwIRgt4QGilUp12amBKkQFGhXFlrZqKw4RPARYKF8wg8wYEsMhyfz%2B8OfUEZQg93Yyw/u11qzAfc/c%2B7M/a6/kWnvv7IR5vV6vAAAAYEx4oAsAAAAINQQsAAAAwwhYAAAAhhGwAAAADCNgAQAAGEbAAgAAMIyABQAAYBgBCwAAwDACFgAAgGEELAAAAMMIWAAAAIYRsAAAAAwjYAEAABhGwAIAADCMgAUAAGAYAQsAAMAwAhYAAIBhBCwAAADDCFgAAACGEbAAAAAMI2ABAAAYRsACAAAwjIAFAABgGAELAADAMAIWAACAYQQsAAAAwwhYAAAAhhGwAAAADCNgAQAAGEbAAgAAMIyABQAAYBgBCwAAwDACFgAAgGEELAAAAMMIWAAAAIYRsAAAAAwjYAEAABhGwAIAADCMgAUAAGCYLdAFnCpcrgPG1wwPD1NSUoz2729Ua6vX%2BPqnMnprHXprLfprHXprHSt727VrnNH12oozWEEsPDxMYWFhCg8PC3QpIYfeWofeWov%2BWofeWicUe0vAAgAAMIyABQAAYBgBCwAAwDACFgAAgGEELAAAAMMIWAAAAIYRsAAAAAwjYAEAABhGwAIAADCMgAUAAGAYAQsAAMAwAhYAAIBhBCwAAADDbIEuAACAYHHBzLWBLuGEvFJ8caBLOGVxBgsAAMAwAhYAAIBhBCwAAADDCFgAAACGEbAAAAAMI2ABAAAYRsACAAAwjIAFAABgGAELAADAMAIWAACAYQQsAAAAwwhYAAAAhhGwAAAADCNgAQAAGNauA9bGjRuVm5urkpISv/H77rtP2dnZfq%2BMjAxNnz5dknTvvfcqIyPDb/6CCy7wfd7j8ai4uFi5ubkaOHCgZs6cqYMHD/rmt27dqhtvvFH9%2B/fXpZdeqqeeeuqn2WEAABAS2m3AeuKJJzR37lx169btqLm5c%2BfK6XT6Xv/5z3901lln6bLLLvO9Z/LkyX7veeedd3xzs2bNUlNTk6qqqrRy5UrV1taqtLRUknTw4EFNnDhRF110kTZu3KiysjItW7ZM69evt36nAQBASGi3ASsqKkqVlZXHDFjf9fTTT%2BuMM85QXl7ecd%2B7b98%2BVVdXq6SkRElJSUpJSdHtt9%2BulStX6siRI9qwYYOOHDmiyZMnKzo6WpmZmRozZowcDoeJ3QIAAKeAdhuwxo0bp7i4uOO%2Br76%2BXkuXLtU999zjN/6Pf/xD1113nc477zyNHj1aW7ZskfT15b%2BIiAilp6f73puZmamvvvpK27dvV01NjdLT0xUREeGbz8jI8H0eAADgeGyBLuBkPfPMMxowYIB69%2B7tG0tLS1N4eLjuuusuxcTEqLy8XLfccovWrVsnj8ej2NhYhYWF%2Bd4fHx8vSXK73fJ4PLLb7X7bSEhIkMfjUWtrq8LDj59J6%2Brq5HK5/MZstmglJyefzK4eJSIi3O8rzKG31qG31qK/1gnGntpswVFzKB63QR2wWlpa9Oyzz2rhwoV%2B44WFhX7/v%2Beee1RVVaXq6mp17NhRXq/3hLf17UB2PA6HQ%2BXl5UfVVFRUdMLbbQu7vZMl64LeWoneWov%2BQpISE2MCXcIJCaXjNqgD1ttvv63Dhw/7/YbgsURERO
j0009XXV2dzj33XDU0NKilpcV3GdDj8UiSOnfurKSkJO3cudPv8x6PRwkJCW06eyVJ%2Bfn5Gjp0qN%2BYzRYtt7uxjXvWNhER4bLbO6m%2BvkktLa1G1z7V0Vvr0Ftr0V/rBOPZFdM/d6xi5XEbqJAZ1AHr1Vdf1UUXXSSb7b%2B74fV6NX/%2BfI0cOVJnn322JOnw4cPatWuX0tLS1LdvX3m9Xm3btk2ZmZmSJKfTKbvdrh49eigrK0vPPfecmpubfes6nU7169evzXUlJycfdTnQ5Tqg5mZrvtm1tLRatvapjt5ah95ai/5CUtAdA6F03AZfHP%2BWrVu36swzz/QbCwsL02effaY5c%2Bbo888/V2Njo0pLS9WhQwddcsklSkpK0ogRI/TYY49p//792rt3rxYvXqzRo0fLZrMpLy9PsbGxWrJkiZqamrR582ZVVlaqoKAgQHsJAACCTbs9g5WdnS1Jam5uliRVV1dL%2Bvps0jdcLpe6dOly1GcfeughPfLIIxo1apQaGhp0zjnn6Omnn1Z0dLQk6YEHHtDs2bM1bNgwdejQQVdddZXvYaaRkZFaunSpZs%2BerYqKCnXp0kUlJSUaPHiwlbsLAABCSJj3x9zxjRPmch0wvqbNFq7ExBi53Y0hc0q1vaC31qG31qK/1rHZwjW8dGOgyzghrxRfHOgS2sTK47Zr1%2BM/8skKQX2JEAAAoD0iYAEAABhGwAIAADCMgAUAAGAYAQsAAMAwAhYAAIBhBCwAAADDCFgAAACGEbAAAAAMI2ABAAAYRsACAAAwjIAFAABgGAELAADAMAIWAACAYQQsAAAAwwhYAAAAhhGwAAAADCNgAQAAGEbAAgAAMIyABQAAYBgBCwAAwDACFgAAgGEELAAAAMMIWAAAAIYRsAAAAAwjYAEAABhGwAIAADCMgAUAAGAYAQsAAMAwAhYAAIBh7Tpgbdy4Ubm5uSopKfEbX7Vqlc4%2B%2B2xlZ2f7vd5//31JUmtrq8rKyjRs2DANGDBAEyZM0Keffur7vMfjUXFxsXJzczVw4EDNnDlTBw8e9M1v3bpVN954o/r3769LL71UTz311E%2BzwwAAICS024D1xBNPaO7cuerWrdsx5wcMGCCn0%2Bn3OueccyRJzz77rNasWaOKigq9/vrr6t69uwoLC%2BX1eiVJs2bNUlNTk6qqqrRy5UrV1taqtLRUknTw4EFNnDhRF110kTZu3KiysjItW7ZM69ev/2l2HAAABL12G7CioqJUWVn5vQHrhzgcDo0fP149e/ZUbGysSkpKVFtbq82bN2vfvn2qrq5WSUmJkpKSlJKSottvv10rV67UkSNHtGHDBh05ckSTJ09WdHS0MjMzNWbMGDkcDgv2EgAAhKJ2G7DGjRunuLi4753fs2ePfvWrX2nAgAEaNmyYXnzxRUlfn4H65JNPlJGR4XtvbGysunXrJqfTqa1btyoiIkLp6em%2B%2BczMTH311Vfavn27ampqlJ6eroiICN98RkaGtmzZYsFeAgCAUGQLdAE/RlJSkrp37667775bvXr10l//%2Blf95je/UXJyss466yx5vV7Fx8f7fSY%2BPl5ut1sJCQmKjY1VWFiY35wkud1ueTwe2e12v88mJCTI4/GotbVV4eHHz6R1dXVyuVx%2BYzZbtJKTk3/sLh9TRES431eYQ2%2BtQ2%2BtRX%2BtE4w9tdmCo%2BZQPG6DMmANHjxYgwcP9v3/yiuv1F//%2BletWrVKU6dOlSTf/VbH8kNz3%2Bfbgex4HA6HysvL/cYKCwtVVFR0wtttC7u9kyXrgt5aid5ai/5CkhITYwJdwgkJpeM2KAPWsaSmpmrLli1KSEhQeHi4PB6P37zH41Hnzp2VlJSkhoYGtbS0%2BC4DfvPeb%2BZ37tx51Ge/Wbct8vPzNXToUL8xmy1abnfjj9y7Y4uICJfd3kn19U1qaWk1uvapjt5ah95ai/5aJxjPrpj%2BuWMVK4/bQIXMoAxYzz33nOLj43XFFVf4xmpra5WWlqaoqCj17t1bNTU1uvDCCyVJ9fX12rVrl8455xylpqbK6/Vq27ZtyszMlCQ5nU7Z7Xb16NFDWVlZeu6559Tc3Cybzeab79evX5vrS05OPupyoMt1QM3N1nyza2lptWztUx29tQ69tRb9haSgOwZC6bgNvjgu6fDhw3rwwQfldDp15MgRVVVV6c0339TYsWMlSQUFBVqxYoVqa2vV0NCg0tJS9e3bV9nZ2UpKStKIESP02GOPaf/%2B/dq7d68WL16s0aNHy2azKS8vT7GxsVqyZImampq0efNmVVZWqqCgIMB7DQAAgkW7PYOVnZ0tSWpubpYkVVdXS/r6bNK4cePU2Niou%2B66Sy6XS2eeeaYWL16srKwsSdLYsWPlcrl00003qbGxUTk5OX73RD3wwAOaPXu2hg0bpg4dOuiqq67yPcw0MjJSS5cu1ezZs1VRUaEuXbqopKTE754vAACAHxLm/TF3fOOEuVwHjK9ps4UrMTFGbndjyJxSbS/orXXorbXor3VstnANL90Y6DJOyCvFFwe6hDax8rjt2vX7H/lkpaC8RAgAANCeEbAAAAAMI2ABAAAYRsACAAAwjIAFAABgGAELAADAMAIWAACAYQQsAAAAwwhYAAAAhhGwAAAADCNgAQAAGEbAAgAAMIyABQAAYBgBCwAAwDACFgAAgGEELAAAAMMIWAAAAIYRsAAAAAwjYAEAABhGwAIAADCMgAUAAGAYAQsAAMAwAhYAAIBhBCwAAADDCFgAAACGEbAAAAAMI2ABAAAYRsACAAAwjIAFAABgGAELAADAsHYdsDZu3Kjc3FyVlJQcNbd%2B/Xpdc801Ou%2B88zRixAj9%2Bc9/9s0tWrRIffv2VXZ2tt9r3759kqRDhw7pt7/9rQYNGqScnBwVFRXJ7Xb7Pr97927ddtttysnJ0ZAhQ7RgwQK1trZav8MAACAktNuA9cQTT2ju3Lnq1q3bUXPvv/%2B%2Bpk6dqqKiIr399tuaMWOGHnjgAb3zzju%2B91x77bVyOp1%2Bry5dukiSysrKVFNTI4fDoXXr1snr9Wr69Om%2Bz955551KSUlRdXW1li9frurqaj399NPW7zQAAAgJ7TZgRUVFqbKy8pgBy%2BPxaOLEibrkkktks9mUl5enPn36%2BAWs79Pc3KzKykrdfvvtOv3005WQkKDi4mJt2LBBn3/%2BuZxOp7Zt26apU6cqLi5O3bt31/jx4%2BVwOKzYTQAAEIJsgS7g%2B4wbN%2B575wYNGqRBgwb5/t/c3CyXy6WUlBTf2IcffqixY8fqo48%2B0umnn67p06dr4MCB2rVrlw4cOKDMzEzfe3v27KmOHTuqpqZGdXV1Sk1NVXx8vG8%2BMzNTO3bsUENDg2JjY49be11dnVwul9%2BYzRat5OTkNu17W0VEhPt9hTn01jr01lr01zrB2FObLThqDsXjtt0GrBNRWlqq6OhoXXHFFZKk0047TWlpaZoyZYqSk5Plc
Dg0adIkrV69Wh6PR5Jkt9v91rDb7XK73fJ4PEfNfRO23G53mwKWw%2BFQeXm531hhYaGKiop%2B9D7%2BELu9kyXrgt5aid5ai/5CkhITYwJdwgkJpeM2qAOW1%2BtVaWmpqqqqtGLFCkVFRUmSxowZozFjxvjeN378eL300ktavXq178yX1%2Bv9wXVPRn5%2BvoYOHeo3ZrNFy%2B1uPKl1vysiIlx2eyfV1zeppYWb8E2it9aht9aiv9YJxrMrpn/uWMXK4zZQITNoA1Zra6umT5%2Bu999/X88995zS0tJ%2B8P2pqamqq6tTUlKSpK/v44qJ%2BW/Tv/zyS3Xu3FktLS2%2Bs1zf8Hg8CgsL8332eJKTk4%2B6HOhyHVBzszXf7FpaWi1b%2B1RHb61Db61FfyEp6I6BUDpugy%2BO//8efvhhffzxx8cMV//3f/%2BnTZs2%2BY3V1tYqLS1NaWlpio%2BPV01NjW/uo48%2B0uHDh5WVlaWsrCzt2bNH%2B/fv9807nU716tXLL5ABAAB8n6AMWO%2B%2B%2B65Wr16tiooKJSQkHDXv8Xg0Z84cbd%2B%2BXYcOHdJTTz2lXbt2aeTIkYqIiNANN9ygpUuXas%2BePXK73Xr00Uc1fPhwdenSRRkZGcrOztbChQvV0NCg2tpaLV%2B%2BXAUFBQHYUwAAEIza7SXC7OxsSV//hqAkVVdXS/r6bNLKlSt14MABDRkyxO8zAwYM0FNPPaUpU6ZI%2BvreK4/Ho169eukPf/iDTjvtNElSUVGRGhsbde2116q5uVlDhgzR/fff71vnd7/7nWbNmqWLL75YsbGxGjt2rH75y19avcsAACBEhHlP9o5utInLdcD4mjZbuBITY%2BR2N4bMNev2gt5ah95ai/5ax2YL1/DSjYEu44S8UnxxoEtoEyuP265d44yu11ZBeYkQAACgPSNgAQAAGEbAAgAAMIyABQAAYJjxgDV06FCVl5drz549ppcGAAAICsYD1vXXX6%2BXX35Zl1xyiW699VatX7/e96gFAACAU4HxgFVYWKiXX35Zf/7zn9W7d289/PDDysvL04IFC7Rjxw7TmwMAAGh3LLsHKzMzU9OmTdPrr7%2BuGTNm6M9//rOuuOIKTZgwQe%2B//75VmwUAAAg4ywLWkSNH9PLLL%2BvXv/61pk2bppSUFE2fPl19%2B/bV%2BPHjtWbNGqs2DQAAEFDG/1RObW2tKisr9cILL6ixsVEjRozQ008/rf79%2B/veM2DAAN1///26%2BuqrTW8eAAAg4IwHrCuvvFI9evTQxIkTdd111x3zjzHn5eVp//79pjcNAADQLhgPWCtWrNCFF1543Pdt3rzZ9KYBAADaBeP3YKWnp2vSpEmqrq72jf3hD3/Qr3/9a3k8HtObAwAAaHeMB6x58%2BbpwIED6tWrl29s8ODBam1t1fz5801vDgAAoN0xfonwrbfe0po1a5SYmOgb6969u0pLS3XVVVeZ3hwAAEC7Y/wM1sGDBxUVFXX0hsLD1dTUZHpzAAAA7Y7xgDVgwADNnz9fX375pW/s888/15w5c/we1QAAABCqjF8inDFjhm655Rb9/Oc/V2xsrFpbW9XY2Ki0tDT98Y9/NL05AACAdsd4wEpLS9NLL72kN998U7t27VJ4eLh69OihgQMHKiIiwvTmAABB7PLH/hboEgBLGA9YkhQZGalLLrnEiqUBAADaPeMB69NPP9XChQv18ccf6%2BDBg0fNv/rqq6Y3CQAA0K5Ycg9WXV2dBg4cqOjoaNPLAwAAtHvGA9aWLVv06quvKikpyfTSAAAAQcH4Yxo6d%2B7MmSsAAHBKMx6wJk6cqPLycnm9XtNLAwAABAXjlwjffPNN/fvf/9aqVat05plnKjzcP8P96U9/Mr1JAACAdsV4wIqNjdWgQYNMLwsAABA0jAesefPmmV4SAAAgqBi/B0uStm/frkWLFmn69Om%2Bsf/85z9WbAoAAKDdMR6wNm3apGuuuUbr169XVVWVpK8fPjpu3DgeMgoAAE4JxgNWWVmZ7rnnHq1Zs0ZhYWGSvv77hPPnz9fixYtPaK2NGzcqNzdXJSUlR829/PLLuvrqq3Xeeedp1KhReuutt3xzra2tKisr07BhwzRgwABNmDBBn376qW/e4/GouLhYubm5GjhwoGbOnOn31PmtW7fqxhtvVP/%2B/XXppZfqqaeeOtE2AACAU5jxgPXRRx%2BpoKBAknwBS5Iuu%2Bwy1dbWtnmdJ554QnPnzlW3bt2Omtu6daumTZumqVOn6h//%2BIfGjx%2BvO%2B64Q3v37pUkPfvss1qzZo0qKir0%2Buuvq3v37iosLPQ9OmLWrFlqampSVVWVVq5cqdraWpWWlkqSDh48qIkTJ%2Bqiiy7Sxo0bVVZWpmXLlmn9%2BvU/uicAAODUYjxgxcXFHfNvENbV1SkyMrLN60RFRamysvKYAev5559XXl6e8vLyFBUVpWuuuUZ9%2BvTR6tWrJUkOh0Pjx49Xz549FRsbq5KSEtXW1mrz5s3at2%2BfqqurVVJSoqSkJKWkpOj222/XypUrdeTIEW3YsEFHjhzR5MmTFR0drczMTI0ZM0YOh%2BPHNwUAAJxSjP8W4fnnn6%2BHH35Y9913n29sx44dmj17tn7%2B85%2B3eZ1x48Z971xNTY3y8vL8xjIyMuR0OnXw4EF98sknysjI8M3FxsaqW7ducjqdOnDggCIiIpSenu6bz8zM1FdffaXt27erpqZG6enpioiI8Fv7%2Beefb3PtdXV1crlcfmM2W7SSk5PbvEZbRESE%2B32FOfTWOvTWWvQX32azBcdxEIrHrfGANX36dN18883KyclRS0uLzj//fDU1Nal3796aP3%2B%2BkW14PB7Fx8f7jcXHx%2BuTTz7Rl19%2BKa/Xe8x5t9uthIQExcbG%2Bl2%2B/Oa9brdbHo9Hdrvd77MJCQnyeDxqbW096sGpx%2BJwOFReXu43VlhYqKKiohPaz7ay2ztZsi7orZXorbXoLyQpMTEm0CWckFA6bo0HrNNOO01VVVV64403tGPHDnXs2FE9evTQxRdf7BdqTtbx/hTPD83/mD/jcyK15%2Bfna%2BjQoX5jNlu03O7GE97uD4mICJfd3kn19U1qaWk1uvapjt5ah95ai/7i20z/3LGKlcdtoEKm8YAlSR06dNAll1xixdKSpMTERHk8Hr8xj8ejpKQkJSQkKDw8/JjznTt3VlJSkhoaGtTS0uK7DPjNe7%2BZ37lz51Gf/WbdtkhOTj7qcqDLdUDNzdZ8s2tpabVs7VMdvbUOvbUW/YWkoDsGQum4NR6whg4d%2BoNne0w8CysrK0tbtmzxG3M6nbryyisVFRWl3r17q6amRhdeeKEkqb6%2BXrt27dI555yj1NRUeb1ebdu2TZmZmb7P2u129ejRQ1lZWXruuefU3Nwsm83mm%2B/Xr99J1w0AAE4Nxu8mu%2BKKK/xeI0aMUJ8%2BfXTo0CGNHTvWyD
ZuuOEG/f3vf9eGDRt06NAhVVZWaufOnbrmmmskSQUFBVqxYoVqa2vV0NCg0tJS9e3bV9nZ2UpKStKIESP02GOPaf/%2B/dq7d68WL16s0aNHy2azKS8vT7GxsVqyZImampq0efNmVVZW%2Bh49AQAAcDzGz2BNnTr1mOPr1q3TP//5zzavk52dLUlqbm6WJFVXV0v6%2BmxSnz59VFpaqnnz5mn37t3q1auXli1bpq5du0qSxo4dK5fLpZtuukmNjY3Kycnxu%2Bn8gQce0OzZszVs2DB16NBBV111le9hppGRkVq6dKlmz56tiooKdenSRSUlJRo8ePAJ9wIAAJyawrw/5o7vH6GlpUW5ubknFLJCict1wPiaNlu4EhNj5HY3hsw16/aC3lqH3lor2Pp7%2BWN/C3QJIe2V4osDXUKbWHncdu0aZ3S9tvrJHjjxwQcf/Kjf3gMAAAg2xi8RHus%2Bq6amJtXW1urSSy81vTkAAIB2x3jA6t69%2B1G/RRgVFaXRo0drzJgxpjcHAADQ7hgPWKae1g4AABCsjAesF154oc3vve6660xvHgAAIOCMB6yZM2eqtbX1qBvaw8LC/MbCwsIIWAAAICQZD1hPPvmknnrqKU2aNEnp6enyer368MMP9cQTT%2BjGG29UTk6O6U0CAAC0K5bcg1VRUaGUlBTf2AUXXKC0tDRNmDBBVVVVpjcJAADQrhh/DtbOnTsVHx9/1Ljdbtfu3btNbw4AAKDdMR6wUlNTNX/%2BfLndbt9YfX29Fi5cqJ/97GemNwcAANDuGL9EOGPGDE2ZMkUOh0MxMTEKDw9XQ0ODOnbsqMWLF5veHAAAQLtjPGANHDhQGzZs0BtvvKG9e/fK6/UqJSVFv/jFLxQXF5i/BwQAAPBTMh6wJKlTp04aNmyY9u7dq7S0NCs2AQAA0G4Zvwfr4MGDmjZtms477zxdfvnlkr6%2BB%2BvWW29VfX296c0BAAC0O8YD1oIFC7R161aVlpYqPPy/y7e0tKi0tNT05gAAANod4wFr3bp1%2Bt3vfqfLLrvM90ef7Xa75s2bp/Xr15veHAAAQLtjPGA1Njaqe/fuR40nJSXpq6%2B%2BMr05AACAdsd4wPrZz36mf/7zn5Lk97cH165dqzPOOMP05gAAANod479F%2BMtf/lJ33nmnrr/%2BerW2tmr58uXasmWL1q1bp5kzZ5reHAAAQLtjPGDl5%2BfLZrPpmWeeUUREhJYuXaoePXqotLRUl112menNAQAAtDvGA9b%2B/ft1/fXX6/rrrze9NAAAQFAwfg/WsGHD/O69AgAAONUYD1g5OTl65ZVXTC8LAAAQNIxfIjz99NP10EMPqaKiQj/72c/UoUMHv/mFCxea3iQAAEC7YjxgffLJJzrrrLMkSW632/TyAAAA7Z6xgFVSUqKysjL98Y9/9I0tXrxYhYWFpjYBAAAQFIzdg/Xaa68dNVZRUWFqeQAAgKBhLGAd6zcH%2BW1CAABwKjIWsL75w87HGwMAAAh1xm9y/ym8/fbbuuWWW/zGvF6vjhw5ohUrVmjcuHGKjIz0m//f//1fXX755ZKkFStW6Nlnn5XL5VJ6erpmzpyprKwsSdKhQ4f00EMPacOGDTp06JBycnI0Z84cJSYm/jQ7BwAAgl5QBqwBAwbI6XT6jS1dulTbtm2TJKWmph7znjDp63vFFi1apCeffFLp6elasWKFJk2apPXr1ys6OlplZWWqqamRw%2BFQp06dNGvWLE2fPl1Lly61fL8AAEBoMBawjhw5oilTphx3zIrnYP2///f/tHz5cv3lL3/Rp59%2B%2BoPvdTgcGjVqlPr16ydJuvXWW7VixQq9/vrrGjFihCorK/XII4/o9NNPlyQVFxfryiuv1Oeff66UlBTjtQMAgNBj7B6s/v37q66uzu91rDErPP7447r%2B%2But1xhlnSJIaGxtVWFionJwc/eIXv9Dy5ct9N9zX1NQoIyPD99nw8HD17dtXTqdTu3bt0oEDB5SZmemb79mzpzp27KiamhpLagcAAKHH2Bmsbz//6qf02Wefaf369Vq/fr0kKTY2Vn369NHNN9%2BssrIy/etf/9Jdd92luLg4jR49Wh6PR/Hx8X5rxMfHy%2B12y%2BPxSJLsdrvfvN1uP6GHptbV1cnlcvmN2WzRSk5O/jG7%2BL0iIsL9vsIcemsdemst%2Botvs9mC4zgIxeM2KO/B%2BrZnn31Wl156qbp27SpJyszM9At7AwcO1NixY7Vq1SqNHj1a0vEfH3Gyj5dwOBwqLy/3GyssLFRRUdFJrft97PZOlqwLemslemst%2BgtJSkyMCXQJJySUjtugD1jr1q3TtGnTfvA9qampWrdunSQpMTHRd6bqGx6PR71791ZSUpLv/zEx/z0ov/zyS3Xu3LmnlYbeAAATxElEQVTNNeXn52vo0KF%2BYzZbtNzuxjav0RYREeGy2zupvr5JLS2tRtc%2B1dFb69Bba9FffJvpnztWsfK4DVTIDOqAtXXrVu3evVsXX3yxb%2ByVV16R2%2B3WL3/5S9/Y9u3blZaWJknKyspSTU2NRo4cKUlqaWnRBx98oNGjRystLU3x8fGqqalRamqqJOmjjz7S4cOHfY9xaIvk5OSjLge6XAfU3GzNN7uWllbL1j7V0Vvr0Ftr0V9ICrpjIJSO26C%2B2PnBBx8oISFBsbGxvrEOHTrokUce0VtvvaUjR47ob3/7m1auXKmCggJJUkFBgV544QW99957ampq0pIlSxQZGanBgwcrIiJCN9xwg5YuXao9e/bI7Xbr0Ucf1fDhw9WlS5dA7SYAAAgyQX0Ga9%2B%2Bfb57r75xySWXaMaMGXrwwQe1Z88edenSRTNmzNCll14qSRo0aJDuvvtuFRcX64svvlB2drYqKirUsWNHSVJRUZEaGxt17bXXqrm5WUOGDNH999//U%2B8aAAAIYmFe/mDgT8LlOmB8TZstXImJMXK7G0PmlGp7QW%2BtQ2%2BtFWz9vfyxvwW6hJD2SvHFx39TO2Dlcdu1a5zR9doqqC8RAgAAtEcELAAAAMMIWAAAAIYRsAAAAAwjYAEAABhGwAIAADCMgAUAAGAYAQsAAMAwAhYAAIBhBCwAAADDCFgAAACGEbAAAAAMI2ABAAAYRsACAAAwjIAFAABgGAELAADAMAIWAACAYQQsAAAAwwhYAAAAhhGwAAAADCNgAQAAGEbAAgAAMIyABQAAYBgBCwAAwDACFgAAgGEELAAAAMMIWAAAAIYRsAAAAAwjYAEAABhGwAIAADAsaANWenq6srKylJ2d7Xs9%2BOCDkqRNmzZp9OjROv/883XllVdq9erVfp9dsWKFRowYofPPP18FBQXasmWLb%2B7QoUP67W9/q0GDBiknJ0dFRUVyu90/6b4BAIDgZgt0ASdj7dq1OvPMM/3G6urqdPvtt2vmzJm6%2Buqr9e6772ry5Mnq0aOHsrOz9dprr2nRokV68sknlZ6erhUrVmjSpElav369oqOjVVZWppqaG
jkcDnXq1EmzZs3S9OnTtXTp0gDtJQAACDZBewbr%2B6xZs0bdu3fX6NGjFRUVpdzcXA0dOlTPP/%2B8JMnhcGjUqFHq16%2BfOnbsqFtvvVWS9Prrr6u5uVmVlZW6/fbbdfrppyshIUHFxcXasGGDPv/880DuFgAACCJBHbAWLlyowYMH64ILLtCsWbPU2NiompoaZWRk%2BL0vIyPDdxnwu/Ph4eHq27evnE6ndu3apQMHDigzM9M337NnT3Xs2FE1NTU/zU4BAICgF7SXCM8991zl5ubqkUce0aeffqri4mLNmTNHHo9HKSkpfu9NSEjw3Ufl8XgUHx/vNx8fHy%2B32y2PxyNJstvtfvN2u/2E7sOqq6uTy%2BXyG7PZopWcnNzmNdoiIiLc7yvMobfWobfWor/4NpstOI6DUDxugzZgORwO37979uypqVOnavLkyerfv/9xP%2Bv1ek9qvi21lZeX%2B40VFhaqqKjopNb9PnZ7J0vWBb21Er21Fv2FJCUmxgS6hBMSSsdt0Aas7zrzzDPV0tKi8PBw35mob7jdbiUlJUmSEhMTj5r3eDzq3bu37z0ej0cxMf89KL/88kt17ty5zbXk5%2Bdr6NChfmM2W7Tc7sYT2qfjiYgIl93eSfX1TWppaTW69qmO3lqH3lqL/uLbTP/csYqVx22gQmZQBqwPPvhAq1ev1r333usbq62tVWRkpPLy8vSXv/zF7/1btmxRv379JElZWVmqqanRyJEjJUktLS364IMPNHr0aKWlpSk%2BPl41NTVKTU2VJH300Uc6fPiwsrKy2lxfcnLyUZcDXa4Dam625ptdS0urZWuf6uitdeittegvJAXdMRBKx21QXuzs3LmzHA6HKioqdPjwYe3YsUOPP/648vPzde2112r37t16/vnndejQIb3xxht64403dMMNN0iSCgoK9MILL%2Bi9995TU1OTlixZosjISA0ePFgRERG64YYbtHTpUu3Zs0dut1uPPvqohg8fri5dugR4rwEAQLAIyjNYKSkpqqio0MKFC30BaeTIkSopKVFUVJSWLVumuXPnas6cOUpNTdWCBQt09tlnS5IGDRqku%2B%2B%2BW8XFxfriiy%2BUnZ2tiooKdezYUZJUVFSkxsZGXXvttWpubtaQIUN0//33B3BvAQBAsAnznuwd3WgTl%2BuA8TVttnAlJsbI7W4MmVOq7QW9tQ69tVaw9ffyx/4W6BJC2ivFFwe6hDax8rjt2jXO6HptFZSXCAEAANozAhYAAIBhBCwAAADDCFgAAACGEbAAAAAMI2ABAAAYFpTPwQIAAMcXTI/BeOehywJdglGcwQIAADCMgAUAAGAYAQsAAMAwAhYAAIBhBCwAAADDCFgAAACGEbAAAAAMI2ABAAAYRsACAAAwjIAFAABgGAELAADAMAIWAACAYQQsAAAAwwhYAAAAhhGwAAAADCNgAQAAGEbAAgAAMIyABQAAYBgBCwAAwDACFgAAgGEELAAAAMMIWAAAAIYRsAAAAAwL2oC1e/duFRYWKicnR7m5ubr33ntVX1%2Bvzz77TOnp6crOzvZ7/f73v/d99uWXX9bVV1%2Bt8847T6NGjdJbb73lm2ttbVVZWZmGDRumAQMGaMKECfr0008DsYsAACBIBW3AmjRpkux2u1577TWtWrVKH3/8sR555BHfvNPp9HtNmDBBkrR161ZNmzZNU6dO1T/%2B8Q%2BNHz9ed9xxh/bu3StJevbZZ7VmzRpVVFTo9ddfV/fu3VVYWCiv1xuQ/QQAAMEnKANWfX29srKyNGXKFMXExOi0007TyJEj9c477xz3s88//7zy8vKUl5enqKgoXXPNNerTp49Wr14tSXI4HBo/frx69uyp2NhYlZSUqLa2Vps3b7Z6twAAQIiwBbqAH8Nut2vevHl%2BY3v27FFycrLv/7/5zW/097//Xc3NzRozZoyKiorUoUMH1dTUKC8vz%2B%2BzGRkZcjqdOnjwoD755BNlZGT45mJjY9WtWzc5nU6de%2B65baqvrq5OLpfLb8xmi/arz4SIiHC/rzCH3lqH3lqL/iKYhdJxG5QB67ucTqeeeeYZLVmyRJGRkTrvvPM0fPhwPfTQQ9q6davuvPNO2Ww23XXXXfJ4PIqPj/f7fHx8vD755BN9%2BeWX8nq9x5x3u91trsfhcKi8vNxvrLCwUEVFRT9%2BJ3%2BA3d7JknVBb61Eb61FfxGMQum4DfqA9e6772ry5MmaMmWKcnNzJUl/%2BtOffPPnnHOOJk6cqGXLlumuu%2B6SpOPeT3Wy91vl5%2Bdr6NChfmM2W7Tc7saTWve7IiLCZbd3Un19k1paWo2ufaqjt9aht9aivwhmVhy3iYkxRtdrq6AOWK%2B99pruuecezZo1S9ddd933vi81NVX79u2T1%2BtVYmKiPB6P37zH41FSUpISEhIUHh5%2BzPnOnTu3ua7k5OSjLge6XAfU3GzNN7uWllbL1j7V0Vvr0Ftr0V8Eo1A6boP2Yue///1vTZs2TY8//rhfuNq0aZOWLFni997t27crNTVVYWFhysrK0pYtW/zmnU6n%2BvXrp6ioKPXu3Vs1NTW%2Bufr6eu3atUvnnHOOtTsEAABCRlAGrObmZt13332aOnWqBg4c6DcXFxenxYsX68UXX9SRI0fkdDr1%2B9//XgUFBZKkG264QX//%2B9%2B1YcMGHTp0SJWVldq5c6euueYaSVJBQYFWrFih2tpaNTQ0qLS0VH379lV2dvZPvp8AACA4BeUlwvfee0%2B1tbWaO3eu5s6d6ze3du1alZWVqby8XL/97W8VFxenm266STfffLMkqU%2BfPiotLdW8efO0e/du9erVS8uWLVPXrl0lSWPHjpXL5dJNN92kxsZG5eTkHHXDOgAAwA8J8/IEzZ%2BEy3XA%2BJo2W7gSE2PkdjeGzDXr9oLeWofeWstmC9fw0o2BLgM4Ye88dJkl3xe6do0zul5bBeUlQgAAgPaMgAUAAGAYAQsAAMAwAhYAAIBhBCwAAADDCFgAAACGEbAAAAAMI2ABAAAYRsACAAAwjIAFAABgGAELAADAMAIWAACAYQQsAAAAwwhYAAAAhhGwAAAADCNgAQAAGEbAAgAAMIyABQAAYBgBCwAAwDACFgAAgGEELAAAAMMIWAAAAIYRsAAAAAwjYAEAABhGwAIAADCMgAUAAGAYAQsAAMAwAhYAAIBhtkAXAADt3eWP/S3QJQAIMpzBAgAAMIyAdQy7d%2B/WbbfdppycHA0ZMkQLFixQa2troMsCAABBgkuEx3DnnXcqMzNT1dXV%2BuKLLzRx4kR16dJFv/rVrwJdGgAACAKcwfoOp9Opbdu2aerUqYqLi1P37t01fvx4ORyOQJcGAACCBGewvqOmpkapqamKj4/3jWVmZmrHjh1qaGhQbGzscdeoq6uTy%2BXyG7PZopWcnGy01oiIcL%2BvMIfeWiciIlwXzFwb6DIAtEOh9D2XgPUdHo9Hdrvdb%2BybsOV2u9sUsBwOh8rLy/
3G7rjjDt15553mCtXXQe7pp59Ufn6%2B8fB2qqO31qmrq9PNp31Mby1SV1cnh8NBfy1Ab61TV1enRYsWhVRvQycqGuT1ek/q8/n5%2BVq1apXfKz8/31B1/%2BVyuVReXn7U2TKcPHprHXprLfprHXprnVDsLWewviMpKUkej8dvzOPxKCwsTElJSW1aIzk5OWQSOAAAOHGcwfqOrKws7dmzR/v37/eNOZ1O9erVSzExMQGsDAAABAsC1ndkZGQoOztbCxcuVENDg2pra7V8%2BXIVFBQEujQAABAkIu6///77A11Ee/OLX/xCVVVVevDBB/XSSy9p9OjRmjBhgsLCwgJd2lFiYmJ04YUXcnbNAvTWOvTWWvTXOvTWOqHW2zDvyd7RDQAAAD9cIgQAADCMgAUAAGAYAQsAAMAwAhYAAIBhBCwAAADDCFgAAACGEbAAAAAMI2ABAAAYRsACAAAwjIAVhHbv3q3bbrtNOTk5GjJkiBYsWKDW1tZAlxUyNm7cqNzcXJWUlAS6lJCze/duFRYWKicnR7m5ubr33ntVX18f6LJCwrZt23TzzTerf//%2Bys3NVXFxsVwuV6DLCjkPP/yw0tPTA11GSElPT1dWVpays7N9rwcffDDQZZ00AlYQuvPOO5WSkqLq6motX75c1dXVevrppwNdVkh44oknNHfuXHXr1i3QpYSkSZMmyW6367XXXtOqVav08ccf65FHHgl0WUHv8OHDuuWWW3ThhRdq06ZNqqqq0hdffCH%2B1KxZW7du1YsvvhjoMkLS2rVr5XQ6fa9Zs2YFuqSTRsAKMk6nU9u2bdPUqVMVFxen7t27a/z48XI4HIEuLSRERUWpsrKSgGWB%2Bvp6ZWVlacqUKYqJidFpp52mkSNH6p133gl0aUGvqalJJSUlmjhxoiIjI5WUlKThw4fr448/DnRpIaO1tVWzZ8/W%2BPHjA10KggQBK8jU1NQoNTVV8fHxvrHMzEzt2LFDDQ0NAawsNIwbN05xcXGBLiMk2e12zZs3T126dPGN7dmzR8nJyQGsKjTEx8drzJgxstlskqTt27frL3/5iy6//PIAVxY6/vSnPykqKkpXX311oEsJSQsXLtTgwYN1wQUXaNasWWpsbAx0SSeNgBVkPB6P7Ha739g3YcvtdgeiJOBHcTqdeuaZZzR58uRAlxIydu/eraysLF1xxRXKzs5WUVFRoEsKCfv27dOiRYs0e/bsQJcSks4991zl5uZq/fr1cjgceu%2B99zRnzpxAl3XSCFhByOv1BroE4KS8%2B%2B67mjBhgqZMmaLc3NxAlxMyUlNT5XQ6tXbtWu3cuVO/%2Bc1vAl1SSJg3b55GjRqlXr16BbqUkORwODRmzBhFRkaqZ8%2Bemjp1qqqqqnT48OFAl3ZSCFhBJikpSR6Px2/M4/EoLCxMSUlJAaoKaLvXXntNt912m2bMmKFx48YFupyQExYWpu7du6ukpERVVVXav39/oEsKaps2bdJ//vMfFRYWBrqUU8aZZ56plpYWffHFF4Eu5aQQsIJMVlaW9uzZ4/dN0%2Bl0qlevXoqJiQlgZcDx/fvf/9a0adP0%2BOOP67rrrgt0OSFj06ZNGjFihN/jWsLDv/723qFDh0CVFRJWr16tL774QkOGDFFOTo5GjRolScrJydFLL70U4OqC3wcffKD58%2Bf7jdXW1ioyMjLo788kYAWZjIwMZWdna%2BHChWpoaFBtba2WL1%2BugoKCQJcG/KDm5mbdd999mjp1qgYOHBjockJKVlaWGhoatGDBAjU1NWn//v1atGiRLrjgAn5p4yTde%2B%2B9WrdunV588UW9%2BOKLqqiokCS9%2BOKLGjp0aICrC36dO3eWw%2BFQRUWFDh8%2BrB07dujxxx9Xfn6%2BIiIiAl3eSQnzckNP0Nm7d69mzZqlf/3rX4qNjdXYsWN1xx13KCwsLNClBb3s7GxJX4cBSb7fynI6nQGrKVS88847%2Bp//%2BR9FRkYeNbd27VqlpqYGoKrQ8eGHH2ru3Ll6//33FR0drYsuukj33nuvUlJSAl1aSPnss880bNgwffjhh4EuJWS8/fbbWrhwoT788ENFRkZq5MiRKikpUVRUVKBLOykELAAAAMO4RAgAAGAYAQsAAMAwAhYAAIBhBCwAAADDCFgAAACGEbAAAAAMI2ABAAAYRsACAAAwjIAFAABgGAELAADAMAIWAACAYQQsAAAAwwhYAAAAhv1/kH7hZ/EnQSsAAAAASUVORK5CYII%3D\"/>\n        </div>\n        <div role=\"tabpanel\" class=\"tab-pane col-md-12\" id=\"common4156873781702074801\">\n            \n<table class=\"freq table table-hover\">\n    <thead>\n    <tr>\n        <td class=\"fillremaining\">Value</td>\n        <td class=\"number\">Count</td>\n        <td class=\"number\">Frequency (%)</td>\n        <td style=\"min-width:200px\">&nbsp;</td>\n    </tr>\n    </thead>\n    <tr class=\"\">\n        <td class=\"fillremaining\">4.0</td>\n        <td class=\"number\">1246</td>\n        <td class=\"number\">3.4%</td>\n        <td>\n            <div class=\"bar\" style=\"width:5%\">&nbsp;</div>\n        </td>\n</tr><tr class=\"\">\n        <td class=\"fillremaining\">4.33</td>\n        <td class=\"number\">627</td>\n        <td class=\"number\">1.7%</td>\n        <td>\n            <div class=\"bar\" style=\"width:3%\">&nbsp;</div>\n        </td>\n</tr><tr class=\"\">\n        <td class=\"fillremaining\">4.5</td>\n        <td class=\"number\">601</td>\n        <td class=\"number\">1.6%</td>\n        <td>\n            <div class=\"bar\" style=\"width:2%\">&nbsp;</div>\n        </td>\n</tr><tr class=\"\">\n        <td class=\"fillremaining\">4.25</td>\n        <td class=\"number\">525</td>\n        <td 
class=\"number\">1.4%</td>\n        <td>\n            <div class=\"bar\" style=\"width:2%\">&nbsp;</div>\n        </td>\n</tr><tr class=\"\">\n        <td class=\"fillremaining\">4.23</td>\n        <td class=\"number\">515</td>\n        <td class=\"number\">1.4%</td>\n        <td>\n            <div class=\"bar\" style=\"width:2%\">&nbsp;</div>\n        </td>\n</tr><tr class=\"\">\n        <td class=\"fillremaining\">4.2</td>\n        <td class=\"number\">512</td>\n        <td class=\"number\">1.4%</td>\n        <td>\n            <div class=\"bar\" style=\"width:2%\">&nbsp;</div>\n        </td>\n</tr><tr class=\"\">\n        <td class=\"fillremaining\">4.18</td>\n        <td class=\"number\">494</td>\n        <td class=\"number\">1.4%</td>\n        <td>\n            <div class=\"bar\" style=\"width:2%\">&nbsp;</div>\n        </td>\n</tr><tr class=\"\">\n        <td class=\"fillremaining\">4.29</td>\n        <td class=\"number\">493</td>\n        <td class=\"number\">1.4%</td>\n        <td>\n            <div class=\"bar\" style=\"width:2%\">&nbsp;</div>\n        </td>\n</tr><tr class=\"\">\n        <td class=\"fillremaining\">4.06</td>\n        <td class=\"number\">467</td>\n        <td class=\"number\">1.3%</td>\n        <td>\n            <div class=\"bar\" style=\"width:2%\">&nbsp;</div>\n        </td>\n</tr><tr class=\"\">\n        <td class=\"fillremaining\">4.17</td>\n        <td class=\"number\">467</td>\n        <td class=\"number\">1.3%</td>\n        <td>\n            <div class=\"bar\" style=\"width:2%\">&nbsp;</div>\n        </td>\n</tr><tr class=\"other\">\n        <td class=\"fillremaining\">Other values (272)</td>\n        <td class=\"number\">30567</td>\n        <td class=\"number\">83.7%</td>\n        <td>\n            <div class=\"bar\" style=\"width:100%\">&nbsp;</div>\n        </td>\n</tr>\n</table>\n        </div>\n        <div role=\"tabpanel\" class=\"tab-pane col-md-12\"  id=\"extreme4156873781702074801\">\n            <p class=\"h4\">Minimum 5 values</p>\n            \n<table class=\"freq table table-hover\">\n    <thead>\n    <tr>\n        <td class=\"fillremaining\">Value</td>\n        <td class=\"number\">Count</td>\n        <td class=\"number\">Frequency (%)</td>\n        <td style=\"min-width:200px\">&nbsp;</td>\n    </tr>\n    </thead>\n    <tr class=\"\">\n        <td class=\"fillremaining\">0.0</td>\n        <td class=\"number\">11</td>\n        <td class=\"number\">0.0%</td>\n        <td>\n            <div class=\"bar\" style=\"width:100%\">&nbsp;</div>\n        </td>\n</tr><tr class=\"\">\n        <td class=\"fillremaining\">1.0</td>\n        <td class=\"number\">3</td>\n        <td class=\"number\">0.0%</td>\n        <td>\n            <div class=\"bar\" style=\"width:27%\">&nbsp;</div>\n        </td>\n</tr><tr class=\"\">\n        <td class=\"fillremaining\">1.33</td>\n        <td class=\"number\">1</td>\n        <td class=\"number\">0.0%</td>\n        <td>\n            <div class=\"bar\" style=\"width:10%\">&nbsp;</div>\n        </td>\n</tr><tr class=\"\">\n        <td class=\"fillremaining\">1.5</td>\n        <td class=\"number\">1</td>\n        <td class=\"number\">0.0%</td>\n        <td>\n            <div class=\"bar\" style=\"width:10%\">&nbsp;</div>\n        </td>\n</tr><tr class=\"\">\n        <td class=\"fillremaining\">1.55</td>\n        <td class=\"number\">1</td>\n        <td class=\"number\">0.0%</td>\n        <td>\n            <div class=\"bar\" style=\"width:10%\">&nbsp;</div>\n        </td>\n</tr>\n</table>\n            <p class=\"h4\">Maximum 
5 values</p>\n            \n<table class=\"freq table table-hover\">\n    <thead>\n    <tr>\n        <td class=\"fillremaining\">Value</td>\n        <td class=\"number\">Count</td>\n        <td class=\"number\">Frequency (%)</td>\n        <td style=\"min-width:200px\">&nbsp;</td>\n    </tr>\n    </thead>\n    <tr class=\"\">\n        <td class=\"fillremaining\">4.94</td>\n        <td class=\"number\">7</td>\n        <td class=\"number\">0.0%</td>\n        <td>\n            <div class=\"bar\" style=\"width:2%\">&nbsp;</div>\n        </td>\n</tr><tr class=\"\">\n        <td class=\"fillremaining\">4.95</td>\n        <td class=\"number\">3</td>\n        <td class=\"number\">0.0%</td>\n        <td>\n            <div class=\"bar\" style=\"width:1%\">&nbsp;</div>\n        </td>\n</tr><tr class=\"\">\n        <td class=\"fillremaining\">4.96</td>\n        <td class=\"number\">3</td>\n        <td class=\"number\">0.0%</td>\n        <td>\n            <div class=\"bar\" style=\"width:1%\">&nbsp;</div>\n        </td>\n</tr><tr class=\"\">\n        <td class=\"fillremaining\">4.97</td>\n        <td class=\"number\">1</td>\n        <td class=\"number\">0.0%</td>\n        <td>\n            <div class=\"bar\" style=\"width:1%\">&nbsp;</div>\n        </td>\n</tr><tr class=\"\">\n        <td class=\"fillremaining\">5.0</td>\n        <td class=\"number\">415</td>\n        <td class=\"number\">1.1%</td>\n        <td>\n            <div class=\"bar\" style=\"width:100%\">&nbsp;</div>\n        </td>\n</tr>\n</table>\n        </div>\n    </div>\n</div>\n</div><div class=\"row variablerow\">\n    <div class=\"col-md-3 namecol\">\n        <p class=\"h4\">is_ebook<br/>\n            <small>Categorical</small>\n        </p>\n    </div><div class=\"col-md-3\">\n    <table class=\"stats \">\n        <tr class=\"\">\n            <th>Distinct count</th>\n            <td>2</td>\n        </tr>\n        <tr>\n            <th>Unique (%)</th>\n            <td>0.0%</td>\n        </tr>\n        <tr class=\"ignore\">\n            <th>Missing (%)</th>\n            <td>0.0%</td>\n        </tr>\n        <tr class=\"ignore\">\n            <th>Missing (n)</th>\n            <td>0</td>\n        </tr>\n    </table>\n</div>\n<div class=\"col-md-6 collapse in\" id=\"minifreqtable-8506970005563105745\">\n    <table class=\"mini freq\">\n        <tr class=\"\">\n    <th>false</th>\n    <td>\n        <div class=\"bar\" style=\"width:100%\" data-toggle=\"tooltip\" data-placement=\"right\" data-html=\"true\"\n             data-delay=500 title=\"Percentage: 92.2%\">\n            33658\n        </div>\n        \n    </td>\n</tr><tr class=\"\">\n    <th>true</th>\n    <td>\n        <div class=\"bar\" style=\"width:9%\" data-toggle=\"tooltip\" data-placement=\"right\" data-html=\"true\"\n             data-delay=500 title=\"Percentage: 7.8%\">\n            &nbsp;\n        </div>\n        2856\n    </td>\n</tr>\n    </table>\n</div>\n<div class=\"col-md-12 text-right\">\n    <a role=\"button\" data-toggle=\"collapse\" data-target=\"#freqtable-8506970005563105745, #minifreqtable-8506970005563105745\"\n       aria-expanded=\"true\" aria-controls=\"collapseExample\">\n        Toggle details\n    </a>\n</div>\n<div class=\"col-md-12 extrapadding collapse\" id=\"freqtable-8506970005563105745\">\n    \n<table class=\"freq table table-hover\">\n    <thead>\n    <tr>\n        <td class=\"fillremaining\">Value</td>\n        <td class=\"number\">Count</td>\n        <td class=\"number\">Frequency (%)</td>\n        <td style=\"min-width:200px\">&nbsp;</td>\n    
</tr>\n    </thead>\n    <tr class=\"\">\n        <td class=\"fillremaining\">false</td>\n        <td class=\"number\">33658</td>\n        <td class=\"number\">92.2%</td>\n        <td>\n            <div class=\"bar\" style=\"width:100%\">&nbsp;</div>\n        </td>\n</tr><tr class=\"\">\n        <td class=\"fillremaining\">true</td>\n        <td class=\"number\">2856</td>\n        <td class=\"number\">7.8%</td>\n        <td>\n            <div class=\"bar\" style=\"width:9%\">&nbsp;</div>\n        </td>\n</tr>\n</table>\n</div>\n</div><div class=\"row variablerow\">\n    <div class=\"col-md-3 namecol\">\n        <p class=\"h4\">num_pages<br/>\n            <small>Categorical</small>\n        </p>\n    </div><div class=\"col-md-3\">\n    <table class=\"stats \">\n        <tr class=\"alert\">\n            <th>Distinct count</th>\n            <td>1059</td>\n        </tr>\n        <tr>\n            <th>Unique (%)</th>\n            <td>3.7%</td>\n        </tr>\n        <tr class=\"alert\">\n            <th>Missing (%)</th>\n            <td>20.6%</td>\n        </tr>\n        <tr class=\"alert\">\n            <th>Missing (n)</th>\n            <td>7505</td>\n        </tr>\n    </table>\n</div>\n<div class=\"col-md-6 collapse in\" id=\"minifreqtable401320848774450276\">\n    <table class=\"mini freq\">\n        <tr class=\"\">\n    <th>96</th>\n    <td>\n        <div class=\"bar\" style=\"width:5%\" data-toggle=\"tooltip\" data-placement=\"right\" data-html=\"true\"\n             data-delay=500 title=\"Percentage: 3.5%\">\n            &nbsp;\n        </div>\n        1287\n    </td>\n</tr><tr class=\"\">\n    <th>80</th>\n    <td>\n        <div class=\"bar\" style=\"width:4%\" data-toggle=\"tooltip\" data-placement=\"right\" data-html=\"true\"\n             data-delay=500 title=\"Percentage: 2.8%\">\n            &nbsp;\n        </div>\n        1012\n    </td>\n</tr><tr class=\"\">\n    <th>32</th>\n    <td>\n        <div class=\"bar\" style=\"width:3%\" data-toggle=\"tooltip\" data-placement=\"right\" data-html=\"true\"\n             data-delay=500 title=\"Percentage: 2.0%\">\n            &nbsp;\n        </div>\n        714\n    </td>\n</tr><tr class=\"other\">\n    <th>Other values (1055)</th>\n    <td>\n        <div class=\"bar\" style=\"width:100%\" data-toggle=\"tooltip\" data-placement=\"right\" data-html=\"true\"\n             data-delay=500 title=\"Percentage: 71.2%\">\n            25996\n        </div>\n        \n    </td>\n</tr><tr class=\"missing\">\n    <th>(Missing)</th>\n    <td>\n        <div class=\"bar\" style=\"width:29%\" data-toggle=\"tooltip\" data-placement=\"right\" data-html=\"true\"\n             data-delay=500 title=\"Percentage: 20.6%\">\n            7505\n        </div>\n        \n    </td>\n</tr>\n    </table>\n</div>\n<div class=\"col-md-12 text-right\">\n    <a role=\"button\" data-toggle=\"collapse\" data-target=\"#freqtable401320848774450276, #minifreqtable401320848774450276\"\n       aria-expanded=\"true\" aria-controls=\"collapseExample\">\n        Toggle details\n    </a>\n</div>\n<div class=\"col-md-12 extrapadding collapse\" id=\"freqtable401320848774450276\">\n    \n<table class=\"freq table table-hover\">\n    <thead>\n    <tr>\n        <td class=\"fillremaining\">Value</td>\n        <td class=\"number\">Count</td>\n        <td class=\"number\">Frequency (%)</td>\n        <td style=\"min-width:200px\">&nbsp;</td>\n    </tr>\n    </thead>\n    <tr class=\"\">\n        <td class=\"fillremaining\">96</td>\n        <td class=\"number\">1287</td>\n        <td 
class=\"number\">3.5%</td>\n        <td>\n            <div class=\"bar\" style=\"width:6%\">&nbsp;</div>\n        </td>\n</tr><tr class=\"\">\n        <td class=\"fillremaining\">80</td>\n        <td class=\"number\">1012</td>\n        <td class=\"number\">2.8%</td>\n        <td>\n            <div class=\"bar\" style=\"width:5%\">&nbsp;</div>\n        </td>\n</tr><tr class=\"\">\n        <td class=\"fillremaining\">32</td>\n        <td class=\"number\">714</td>\n        <td class=\"number\">2.0%</td>\n        <td>\n            <div class=\"bar\" style=\"width:4%\">&nbsp;</div>\n        </td>\n</tr><tr class=\"\">\n        <td class=\"fillremaining\">64</td>\n        <td class=\"number\">674</td>\n        <td class=\"number\">1.8%</td>\n        <td>\n            <div class=\"bar\" style=\"width:4%\">&nbsp;</div>\n        </td>\n</tr><tr class=\"\">\n        <td class=\"fillremaining\">128</td>\n        <td class=\"number\">660</td>\n        <td class=\"number\">1.8%</td>\n        <td>\n            <div class=\"bar\" style=\"width:3%\">&nbsp;</div>\n        </td>\n</tr><tr class=\"\">\n        <td class=\"fillremaining\">112</td>\n        <td class=\"number\">643</td>\n        <td class=\"number\">1.8%</td>\n        <td>\n            <div class=\"bar\" style=\"width:3%\">&nbsp;</div>\n        </td>\n</tr><tr class=\"\">\n        <td class=\"fillremaining\">88</td>\n        <td class=\"number\">573</td>\n        <td class=\"number\">1.6%</td>\n        <td>\n            <div class=\"bar\" style=\"width:3%\">&nbsp;</div>\n        </td>\n</tr><tr class=\"\">\n        <td class=\"fillremaining\">160</td>\n        <td class=\"number\">462</td>\n        <td class=\"number\">1.3%</td>\n        <td>\n            <div class=\"bar\" style=\"width:3%\">&nbsp;</div>\n        </td>\n</tr><tr class=\"\">\n        <td class=\"fillremaining\">48</td>\n        <td class=\"number\">441</td>\n        <td class=\"number\">1.2%</td>\n        <td>\n            <div class=\"bar\" style=\"width:2%\">&nbsp;</div>\n        </td>\n</tr><tr class=\"\">\n        <td class=\"fillremaining\">72</td>\n        <td class=\"number\">436</td>\n        <td class=\"number\">1.2%</td>\n        <td>\n            <div class=\"bar\" style=\"width:2%\">&nbsp;</div>\n        </td>\n</tr><tr class=\"other\">\n        <td class=\"fillremaining\">Other values (1048)</td>\n        <td class=\"number\">22107</td>\n        <td class=\"number\">60.5%</td>\n        <td>\n            <div class=\"bar\" style=\"width:100%\">&nbsp;</div>\n        </td>\n</tr><tr class=\"missing\">\n        <td class=\"fillremaining\">(Missing)</td>\n        <td class=\"number\">7505</td>\n        <td class=\"number\">20.6%</td>\n        <td>\n            <div class=\"bar\" style=\"width:34%\">&nbsp;</div>\n        </td>\n</tr>\n</table>\n</div>\n</div><div class=\"row variablerow\">\n    <div class=\"col-md-3 namecol\">\n        <p class=\"h4\">publication_year<br/>\n            <small>Categorical</small>\n        </p>\n    </div><div class=\"col-md-3\">\n    <table class=\"stats \">\n        <tr class=\"alert\">\n            <th>Distinct count</th>\n            <td>202</td>\n        </tr>\n        <tr>\n            <th>Unique (%)</th>\n            <td>0.7%</td>\n        </tr>\n        <tr class=\"alert\">\n            <th>Missing (%)</th>\n            <td>15.9%</td>\n        </tr>\n        <tr class=\"alert\">\n            <th>Missing (n)</th>\n            <td>5816</td>\n        </tr>\n    </table>\n</div>\n<div class=\"col-md-6 collapse in\" 
id=\"minifreqtable-1127510253279857554\">\n    <table class=\"mini freq\">\n        <tr class=\"\">\n    <th>2013</th>\n    <td>\n        <div class=\"bar\" style=\"width:7%\" data-toggle=\"tooltip\" data-placement=\"right\" data-html=\"true\"\n             data-delay=500 title=\"Percentage: 4.7%\">\n            &nbsp;\n        </div>\n        1719\n    </td>\n</tr><tr class=\"\">\n    <th>2014</th>\n    <td>\n        <div class=\"bar\" style=\"width:7%\" data-toggle=\"tooltip\" data-placement=\"right\" data-html=\"true\"\n             data-delay=500 title=\"Percentage: 4.6%\">\n            &nbsp;\n        </div>\n        1669\n    </td>\n</tr><tr class=\"\">\n    <th>2015</th>\n    <td>\n        <div class=\"bar\" style=\"width:7%\" data-toggle=\"tooltip\" data-placement=\"right\" data-html=\"true\"\n             data-delay=500 title=\"Percentage: 4.3%\">\n            &nbsp;\n        </div>\n        1570\n    </td>\n</tr><tr class=\"other\">\n    <th>Other values (198)</th>\n    <td>\n        <div class=\"bar\" style=\"width:100%\" data-toggle=\"tooltip\" data-placement=\"right\" data-html=\"true\"\n             data-delay=500 title=\"Percentage: 70.5%\">\n            25740\n        </div>\n        \n    </td>\n</tr><tr class=\"missing\">\n    <th>(Missing)</th>\n    <td>\n        <div class=\"bar\" style=\"width:23%\" data-toggle=\"tooltip\" data-placement=\"right\" data-html=\"true\"\n             data-delay=500 title=\"Percentage: 15.9%\">\n            5816\n        </div>\n        \n    </td>\n</tr>\n    </table>\n</div>\n<div class=\"col-md-12 text-right\">\n    <a role=\"button\" data-toggle=\"collapse\" data-target=\"#freqtable-1127510253279857554, #minifreqtable-1127510253279857554\"\n       aria-expanded=\"true\" aria-controls=\"collapseExample\">\n        Toggle details\n    </a>\n</div>\n<div class=\"col-md-12 extrapadding collapse\" id=\"freqtable-1127510253279857554\">\n    \n<table class=\"freq table table-hover\">\n    <thead>\n    <tr>\n        <td class=\"fillremaining\">Value</td>\n        <td class=\"number\">Count</td>\n        <td class=\"number\">Frequency (%)</td>\n        <td style=\"min-width:200px\">&nbsp;</td>\n    </tr>\n    </thead>\n    <tr class=\"\">\n        <td class=\"fillremaining\">2013</td>\n        <td class=\"number\">1719</td>\n        <td class=\"number\">4.7%</td>\n        <td>\n            <div class=\"bar\" style=\"width:11%\">&nbsp;</div>\n        </td>\n</tr><tr class=\"\">\n        <td class=\"fillremaining\">2014</td>\n        <td class=\"number\">1669</td>\n        <td class=\"number\">4.6%</td>\n        <td>\n            <div class=\"bar\" style=\"width:11%\">&nbsp;</div>\n        </td>\n</tr><tr class=\"\">\n        <td class=\"fillremaining\">2015</td>\n        <td class=\"number\">1570</td>\n        <td class=\"number\">4.3%</td>\n        <td>\n            <div class=\"bar\" style=\"width:10%\">&nbsp;</div>\n        </td>\n</tr><tr class=\"\">\n        <td class=\"fillremaining\">2012</td>\n        <td class=\"number\">1545</td>\n        <td class=\"number\">4.2%</td>\n        <td>\n            <div class=\"bar\" style=\"width:10%\">&nbsp;</div>\n        </td>\n</tr><tr class=\"\">\n        <td class=\"fillremaining\">2009</td>\n        <td class=\"number\">1440</td>\n        <td class=\"number\">3.9%</td>\n        <td>\n            <div class=\"bar\" style=\"width:9%\">&nbsp;</div>\n        </td>\n</tr><tr class=\"\">\n        <td class=\"fillremaining\">2011</td>\n        <td class=\"number\">1393</td>\n        <td 
class=\"number\">3.8%</td>\n        <td>\n            <div class=\"bar\" style=\"width:9%\">&nbsp;</div>\n        </td>\n</tr><tr class=\"\">\n        <td class=\"fillremaining\">2016</td>\n        <td class=\"number\">1382</td>\n        <td class=\"number\">3.8%</td>\n        <td>\n            <div class=\"bar\" style=\"width:9%\">&nbsp;</div>\n        </td>\n</tr><tr class=\"\">\n        <td class=\"fillremaining\">2010</td>\n        <td class=\"number\">1334</td>\n        <td class=\"number\">3.7%</td>\n        <td>\n            <div class=\"bar\" style=\"width:9%\">&nbsp;</div>\n        </td>\n</tr><tr class=\"\">\n        <td class=\"fillremaining\">2008</td>\n        <td class=\"number\">1332</td>\n        <td class=\"number\">3.6%</td>\n        <td>\n            <div class=\"bar\" style=\"width:9%\">&nbsp;</div>\n        </td>\n</tr><tr class=\"\">\n        <td class=\"fillremaining\">2007</td>\n        <td class=\"number\">1303</td>\n        <td class=\"number\">3.6%</td>\n        <td>\n            <div class=\"bar\" style=\"width:9%\">&nbsp;</div>\n        </td>\n</tr><tr class=\"other\">\n        <td class=\"fillremaining\">Other values (191)</td>\n        <td class=\"number\">16011</td>\n        <td class=\"number\">43.8%</td>\n        <td>\n            <div class=\"bar\" style=\"width:100%\">&nbsp;</div>\n        </td>\n</tr><tr class=\"missing\">\n        <td class=\"fillremaining\">(Missing)</td>\n        <td class=\"number\">5816</td>\n        <td class=\"number\">15.9%</td>\n        <td>\n            <div class=\"bar\" style=\"width:36%\">&nbsp;</div>\n        </td>\n</tr>\n</table>\n</div>\n</div><div class=\"row variablerow\">\n    <div class=\"col-md-3 namecol\">\n        <p class=\"h4\">ratings_count<br/>\n            <small>Numeric</small>\n        </p>\n    </div><div class=\"col-md-6\">\n    <div class=\"row\">\n        <div class=\"col-sm-6\">\n            <table class=\"stats \">\n                <tr>\n                    <th>Distinct count</th>\n                    <td>1725</td>\n                </tr>\n                <tr>\n                    <th>Unique (%)</th>\n                    <td>4.7%</td>\n                </tr>\n                <tr class=\"ignore\">\n                    <th>Missing (%)</th>\n                    <td>0.0%</td>\n                </tr>\n                <tr class=\"ignore\">\n                    <th>Missing (n)</th>\n                    <td>0</td>\n                </tr>\n                <tr class=\"ignore\">\n                    <th>Infinite (%)</th>\n                    <td>0.0%</td>\n                </tr>\n                <tr class=\"ignore\">\n                    <th>Infinite (n)</th>\n                    <td>0</td>\n                </tr>\n            </table>\n\n        </div>\n        <div class=\"col-sm-6\">\n            <table class=\"stats \">\n\n                <tr>\n                    <th>Mean</th>\n                    <td>279.69</td>\n                </tr>\n                <tr>\n                    <th>Minimum</th>\n                    <td>0</td>\n                </tr>\n                <tr>\n                    <th>Maximum</th>\n                    <td>1029527</td>\n                </tr>\n                <tr class=\"ignore\">\n                    <th>Zeros (%)</th>\n                    <td>0.3%</td>\n                </tr>\n            </table>\n        </div>\n    </div>\n</div>\n<div class=\"col-md-3 collapse in\" id=\"minihistogram4614937497148659063\">\n    <img 
src=\"data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAAAMgAAABLCAYAAAA1fMjoAAAABHNCSVQICAgIfAhkiAAAAAlwSFlzAAAPYQAAD2EBqD%2BnaQAAADl0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uIDIuMi40LCBodHRwOi8vbWF0cGxvdGxpYi5vcmcv7US4rQAAAPtJREFUeJzt1bEJAlEQRVFXLGmLsCdje7IIexpzkQsbyN/gnHzgJZfZZmYuwE/X1QPgzG6rB3zbH6/DN%2B/n/Q9LwAeBJBAIAoEgEAgCgSAQCAKBIBAIAoEgEAgCgSAQCAKBIBAIAoEgEAgCgSAQCAKBIBAIAoEgEAgCgSAQCAKBIBAIAoEgEAgCgSAQCAKBIBAIAoEgEAgCgSAQCAKBIBAIAoEgEAgCgSAQCAKBIBAIAoEgEAjbzMzqEXBWPggEgUAQCASBQBAIBIFAEAgEgUAQCASBQBAIBIFAEAgEgUAQCASBQBAIBIFAEAgEgUAQCASBQBAIBIFAEAiED9obC49m6JdFAAAAAElFTkSuQmCC\">\n\n</div>\n<div class=\"col-md-12 text-right\">\n    <a role=\"button\" data-toggle=\"collapse\" data-target=\"#descriptives4614937497148659063,#minihistogram4614937497148659063\"\n       aria-expanded=\"false\" aria-controls=\"collapseExample\">\n        Toggle details\n    </a>\n</div>\n<div class=\"row collapse col-md-12\" id=\"descriptives4614937497148659063\">\n    <ul class=\"nav nav-tabs\" role=\"tablist\">\n        <li role=\"presentation\" class=\"active\"><a href=\"#quantiles4614937497148659063\"\n                                                  aria-controls=\"quantiles4614937497148659063\" role=\"tab\"\n                                                  data-toggle=\"tab\">Statistics</a></li>\n        <li role=\"presentation\"><a href=\"#histogram4614937497148659063\" aria-controls=\"histogram4614937497148659063\"\n                                   role=\"tab\" data-toggle=\"tab\">Histogram</a></li>\n        <li role=\"presentation\"><a href=\"#common4614937497148659063\" aria-controls=\"common4614937497148659063\"\n                                   role=\"tab\" data-toggle=\"tab\">Common Values</a></li>\n        <li role=\"presentation\"><a href=\"#extreme4614937497148659063\" aria-controls=\"extreme4614937497148659063\"\n                                   role=\"tab\" data-toggle=\"tab\">Extreme Values</a></li>\n\n    </ul>\n\n    <div class=\"tab-content\">\n        <div role=\"tabpanel\" class=\"tab-pane active row\" id=\"quantiles4614937497148659063\">\n            <div class=\"col-md-4 col-md-offset-1\">\n                <p class=\"h4\">Quantile statistics</p>\n                <table class=\"stats indent\">\n                    <tr>\n                        <th>Minimum</th>\n                        <td>0</td>\n                    </tr>\n                    <tr>\n                        <th>5-th percentile</th>\n                        <td>2</td>\n                    </tr>\n                    <tr>\n                        <th>Q1</th>\n                        <td>9</td>\n                    </tr>\n                    <tr>\n                        <th>Median</th>\n                        <td>23</td>\n                    </tr>\n                    <tr>\n                        <th>Q3</th>\n                        <td>69</td>\n                    </tr>\n                    <tr>\n                        <th>95-th percentile</th>\n                        <td>477.7</td>\n                    </tr>\n                    <tr>\n                        <th>Maximum</th>\n                        <td>1029527</td>\n                    </tr>\n                    <tr>\n                        <th>Range</th>\n                        <td>1029527</td>\n                    </tr>\n                    <tr>\n                        <th>Interquartile range</th>\n                        <td>60</td>\n                    </tr>\n                </table>\n            </div>\n            <div class=\"col-md-4 
col-md-offset-2\">\n                <p class=\"h4\">Descriptive statistics</p>\n                <table class=\"stats indent\">\n                    <tr>\n                        <th>Standard deviation</th>\n                        <td>7633.4</td>\n                    </tr>\n                    <tr>\n                        <th>Coef of variation</th>\n                        <td>27.293</td>\n                    </tr>\n                    <tr>\n                        <th>Kurtosis</th>\n                        <td>11514</td>\n                    </tr>\n                    <tr>\n                        <th>Mean</th>\n                        <td>279.69</td>\n                    </tr>\n                    <tr>\n                        <th>MAD</th>\n                        <td>438.12</td>\n                    </tr>\n                    <tr class=\"alert\">\n                        <th>Skewness</th>\n                        <td>99.17</td>\n                    </tr>\n                    <tr>\n                        <th>Sum</th>\n                        <td>10212535</td>\n                    </tr>\n                    <tr>\n                        <th>Variance</th>\n                        <td>58269000</td>\n                    </tr>\n                    <tr>\n                        <th>Memory size</th>\n                        <td>285.3 KiB</td>\n                    </tr>\n                </table>\n            </div>\n        </div>\n        <div role=\"tabpanel\" class=\"tab-pane col-md-8 col-md-offset-2\" id=\"histogram4614937497148659063\">\n            <img src=\"data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAAAlgAAAGQCAYAAAByNR6YAAAABHNCSVQICAgIfAhkiAAAAAlwSFlzAAAPYQAAD2EBqD%2BnaQAAADl0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uIDIuMi40LCBodHRwOi8vbWF0cGxvdGxpYi5vcmcv7US4rQAAIABJREFUeJzt3Xl4VOXd//FPkpEgJJNkwER/kQpliWQBASE2IGERUHFFtliLVFCWSEyECoIIrmgBUYEHia08IramgFWhIhZZ1Err44aTAVQCFEyDCWSGLCSQ5fz%2B8GEeR9BEuWFm4P26rlzW%2B3vOue/zbTJ%2B5pyTSYhlWZYAAABgTKi/FwAAAHC2IWABAAAYRsACAAAwjIAFAABgGAELAADAMAIWAACAYQQsAAAAwwhYAAAAhhGwAAAADCNgAQAAGEbAAgAAMIyABQAAYBgBCwAAwDACFgAAgGEELAAAAMMIWAAAAIYRsAAAAAwjYAEAABhGwAIAADCMgAUAAGAYAQsAAMAwAhYAAIBhBCwAAADDCFgAAACGEbAAAAAMI2ABAAAYRsACAAAwjIAFAABgGAELAADAMAIWAACAYQQsAAAAwwhYAAAAhhGwAAAADCNgAQAAGEbAAgAAMIyABQAAYBgBCwAAwDACFgAAgGEELAAAAMNs/l7AuaKkpNz4MUNDQ%2BRwNFdpaaXq6y3jxz8X0VOz6Kd59NQ8empeIPX0ggsi/TIvV7CCWGhoiEJCQhQaGuLvpZw16KlZ9NM8emoePTWPnhKwAAAAjCNgAQAAGEbAAgAAMIyABQAAYBgBCwAAwDACFgAAgGEELAAAAMMIWAAAAIYRsAAAAAwjYAEAABhGwAIAADCMgAUAAGAYAQsAAMAwm78XgFNz%2BYy3/L2ERluX3dPfSwAA4IzgChYAAIBhBCwAAADDCFgAAACGEbAAAAAMI2ABAAAYRsACAAAwjIAFAABgWMAGrJ07d%2Br2229Xt27dlJaWpuzsbJWUlOhf//qXEhISlJKS4vO1bt06777Lly/XoEGD1LVrV2VkZCg/P99bO3r0qB588EH17t1bqampysrKktvt9tYLCwt11113KTU1VX379tXcuXNVX19/Rs8dAAAEt4AMWMeOHdMdd9yhHj16aOvWrVq7dq0OHTqk2bNnS5Li4%2BPldDp9vq655hpJ0saNG7Vw4UL9/ve/1wcffKC%2Bfftq/PjxOnLkiCRpwYIFcrlcysvL0/r162VZlu6//37v3JMmTVJcXJw2bNigZcuWacOGDXrxxRfPeA8AAEDwCsiAVVVVpZycHI0bN05NmjSRw%2BHQgAED9NVXXzW4b15enoYMGaLOnTuradOmGjt2rCRp06ZNqq2t1apVqzRx4kRddNFFio6OVnZ2tjZv3qxvvvlGTqdTO3fu1JQpUxQZGanWrVtr9OjRysvLO92nDAAAziIBGbCioqI0bNgw2Wzf/iWf3bt3669//av3KlVlZaUyMzOVmpqqK6%2B8UsuWLZNlWZIkl8ulxMRE77FCQ0PVsWNHOZ1O7du3T%2BXl5UpKSvLW27Ztq6ZNm8rlcsnlcik%2BPl5RUVHeelJSkvbs2aOKioozceoAAOAsENB/i7CwsFCDBg1SbW2thg8frqysLO3cuVMdOnTQ7bffrgULFujDDz/UPffco8jISA0dOlQej8cnIEnfBja32y2PxyNJstvtPnW73e6tf792/Fhut1sRERGNWndxcbFKSkp8xmy2ZoqNjf1J59%2BQsLCAzMc/yGYL/PUe72mw9TZQ0U/z6Kl59NQ8ehrgAev4s1b//ve/9eCDD%2Bq%2B%2B%2B7T/Pnz9dJLL3m36dWrl0aOHKlXX31VQ4cOlSTv1awf8mP1hvZtjLy8PC1atMhnLDMzU1lZWad87GAWE9Pc3
0toNLv9fH8v4axCP82jp%2BbRU/PO5Z4GdMCSpJCQELVu3Vo5OTkaOXKkZsyYIYfD4bNNfHy81q9fL0mKiYnxXqk6zuPxqH379t79PB6Pmjf/v//YHz58WC1atFBdXd1J9w0JCTlhzh8zYsQI9evXz2fMZmsmt7uy0cdojGB7Z2D6/E%2BHsLBQ2e3nq6ysSnV1/PboqaKf5tFT8%2BipeYHUU3%2B9uQ/IgLV161bNnj1b69atU2jotyHi%2BD%2B3bNmiqqoq3Xrrrd7td%2B/erVatWkmSkpOT5XK5dPPNN0uS6urqtH37dg0dOlStWrVSVFSU91krSfryyy917NgxJScnq7i4WEVFRSotLfUGKqfTqXbt2vkEsobExsaecDuwpKRctbXn9g9uMJ1/XV19UK030NFP8%2BipefTUvHO5pwF5CSQ5OVkVFRWaO3euqqqqVFpaqoULF%2Bryyy9XZGSknnzySb3//vuqqanRP/7xD61evVoZGRmSpIyMDL322mv67LPPVFVVpSVLlqhJkybq06ePwsLCNHz4cD333HMqKiqS2%2B3WU089pQEDBqhly5ZKTExUSkqK5s%2Bfr4qKChUUFGjZsmXeYwMAADRGQF7BioyM1AsvvKBHH31UV1xxhZo1a6YrrrhCjz32mOLi4jR9%2BnQ98sgjKioqUsuWLTV9%2BnQNHDhQktS7d2/de%2B%2B9ys7O1qFDh5SSkqLc3Fw1bdpUkpSVlaXKykrdeOONqq2tVd%2B%2Bfb2fryVJzz77rGbOnKmePXsqIiJCI0eO9LlaBgAA0JAQy8RT3WhQSUm58WPabKEaMO8948c9XdZl9/T3Ehpks4UqJqa53O7Kc/aytkn00zx6ah49NS%2BQenrBBZF%2BmTcgbxECAAAEMwIWAACAYQQsAAAAwwhYAAAAhhGwAAAADCNgAQAAGEbAAgAAMIyABQAAYBgBCwAAwDACFgAAgGEELAAAAMMIWAAAAIYRsAAAAAwjYAEAABhGwAIAADCMgAUAAGAYAQsAAMAwAhYAAIBhBCwAAADDCFgAAACGEbAAAAAMI2ABAAAYRsACAAAwjIAFAABgGAELAADAMAIWAACAYQQsAAAAwwhYAAAAhhGwAAAADCNgAQAAGEbAAgAAMCxgA9bOnTt1%2B%2B23q1u3bkpLS1N2drZKSkokSVu3btXQoUPVtWtXDR48WG%2B88YbPvsuXL9egQYPUtWtXZWRkKD8/31s7evSoHnzwQfXu3VupqanKysqS2%2B321gsLC3XXXXcpNTVVffv21dy5c1VfX39mThoAAJwVAjJgHTt2THfccYd69OihrVu3au3atTp06JBmz56t4uJiTZw4USNHjtTWrVs1Y8YMzZw5U06nU5K0ceNGLVy4UL///e/1wQcfqG/fvho/fryOHDkiSVqwYIFcLpfy8vK0fv16WZal%2B%2B%2B/3zv3pEmTFBcXpw0bNmjZsmXasGGDXnzxRb/0AQAABKeADFhVVVXKycnRuHHj1KRJEzkcDg0YMEBfffWV1qxZo9atW2vo0KEKDw9XWlqa%2BvXrp5UrV0qS8vLyNGTIEHXu3FlNmzbV2LFjJUmbNm1SbW2tVq1apYkTJ%2Bqiiy5SdHS0srOztXnzZn3zzTdyOp3auXOnpkyZosjISLVu3VqjR49WXl6eP9sBAACCTEAGrKioKA0bNkw2m02StHv3bv31r3/VNddcI5fLpcTERJ/tExMTvbcBv18PDQ1Vx44d5XQ6tW/fPpWXlyspKclbb9u2rZo2bSqXyyWXy6X4%2BHhFRUV560lJSdqzZ48qKipO5ykDAICziM3fC/gxhYWFGjRokGprazV8%2BHBlZWXpzjvvVFxcnM920dHR3ueoPB6PT0CSvg1sbrdbHo9HkmS3233qdrvdW/9%2B7fix3G63IiIiGrXu4uJi7/Nix9lszRQbG9uo/RsrLCwg8/EPstkCf73HexpsvQ1U9NM8emoePTWPngZ4wIqPj5fT6dS///1vPfjgg7rvvvsatZ9lWT%2B73tC%2BjZGXl6dFixb5jGVmZiorK%2BuUjx3MYmKa%2B3sJjWa3n%2B/vJZxV6Kd59NQ8emreudzTgA5YkhQSEqLWrVsrJydHI0eOVHp6uvdK1HFut1sOh0OSFBMTc0Ld4/Goffv23m08Ho%2BaN/%2B//9gfPnxYLVq0UF1d3Un3DQkJ8e7bGCNGjFC/fv18xmy2ZnK7Kxt9jMYItncGps//dAgLC5Xdfr7KyqpUV8dvj54q%2BmkePTWPnpoXSD3115v7gAxYW7du1ezZs7Vu3TqFhn4bIo7/s1OnTlq/fr3P9vn5%2BercubMkKTk5WS6XSzfffLMkqa6uTtu3b9fQoUPVqlUrRUVFeZ%2B1kqQvv/xSx44dU3JysoqLi1VUVKTS0lJvoHI6nWrXrp1PIGtIbGzsCbcDS0rKVVt7bv/gBtP519XVB9V6Ax39NI%2BemkdPzTuXexqQl0CSk5NVUVGhuXPnqqqqSqWlpVq4cKEuv/xyZWRkqLCwUCtXrtTRo0e1ZcsWbdmyRcOHD5ckZWRk6LXXXtNnn32mqqoqLVmyRE2aNFGfPn0UFham4cOH67nnnlNRUZHcbreeeuopDRgwQC1btlRiYqJSUlI0f/58VVRUqKCgQMuWLVNGRoafOwIAAIJJQAasyMhIvfDCC8rPz9cVV1yhwYMHKzIyUk899ZRatGihpUuXasWKFerWrZsef/xxzZ07V5deeqkkqXfv3rr33nuVnZ2tHj166IMPPlBubq6aNm0qScrKylLnzp114403qn///mrevLkee%2Bwx79zPPvusiouL1bNnT40aNUo33XSTbr31Vr/0AQAABKcQy8RT3WhQSUm58WPabKEaMO8948c9XdZl9/T3Ehpks4UqJqa53O7Kc/aytkn00zx6ah49NS%2BQenrBBZF%2BmTcgr2ABAAAEMwIWAACAYQQsAAAAwwhYAAAAhhGwAAAADCNgAQAAGEbAAgAAMIyABQAAYBgBCwAAwDACFgAAgGEELAAAAMMIWAAAAIYRsAAAAAwjYAEAABhGwAIAADCMgAUAAGAYAQsAAMAwAhYAAIBhBCwAAADDCFgAAACGEbAAAAAMI2ABAAAYRsACAAAwjIAFAABgGAELAADAMAIWAACAYQQsAAAAwwhYAAAAhhGwAAAADCNgAQAAGEbAAgAAMCxgA1ZhYaEyMzOVmpqqtLQ0TZs2TWVlZfr666%2BVkJCglJQUn68//vGP3n3ffPNNXX/99erSpYuGDBmi999/31urr6/XggUL1L9/f3Xv3l1jxozR/v37vXWPx6Ps7GylpaWpV69emjFjhqqrq8/ouQMAgOAWsAFr/Pjxstvt2rhxo1599VV99dVXevLJJ711p9Pp8zVmzBhJ0o4dOzR16lRNmTJF//znPzV69GjdfffdOnDggCTp5Zdf1po1a5Sbm6tNmzapdevWyszMlGVZkqSZM2eqqqpKa9eu1erVq1VQUKB58%2Bad%2BQYA
AICgFZABq6ysTMnJyZo8ebKaN2%2BuCy%2B8UDfffLM%2B%2BuijBvdduXKl0tPTlZ6ervDwcN1www3q0KGD3njjDUlSXl6eRo8erbZt2yoiIkI5OTkqKCjQtm3bdPDgQW3YsEE5OTlyOByKi4vTxIkTtXr1atXU1Jzu0wYAAGcJm78XcDJ2u11z5szxGSsqKlJsbKz33%2B%2B77z598MEHqq2t1bBhw5SVlaXzzjtPLpdL6enpPvsmJibK6XSqurpau3btUmJiorcWERGhSy65RE6nU%2BXl5QoLC1NCQoK3npSUpCNHjmj37t0%2B4z%2BmuLhYJSUlPmM2WzOf9ZsQFhaQ%2BfgH2WyBv97jPQ223gYq%2BmkePTWPnppHTwM0YH2f0%2BnUihUrtGTJEjVp0kRdunTRgAED9Nhjj2nHjh2aNGmSbDab7rnnHnk8HkVFRfnsHxUVpV27dunw4cOyLOukdbfbrejoaEVERCgkJMSnJklut7vR683Ly9OiRYt8xjIzM5WVlfVTT/2sEhPT3N9LaDS7/Xx/L%2BGsQj/No6fm0VPzzuWeBnzA%2BvjjjzVhwgRNnjxZaWlpkqRXXnnFW%2B/UqZPGjRunpUuX6p577pEk7/NUP%2BTH6g3t2xgjRoxQv379fMZstmZyuytP%2BdjfFWzvDEyf/%2BkQFhYqu/18lZVVqa6u3t/LCXr00zx6ah49NS%2BQeuqvN/cBHbA2btyo3/3ud5o5c6ZuuummH9wuPj5eBw8elGVZiomJkcfj8al7PB45HA5FR0crNDT0pPUWLVrI4XCooqJCdXV1CgsL89YkqUWLFo1ed2xs7Am3A0tKylVbe27/4AbT%2BdfV1QfVegMd/TSPnppHT807l3sasJdAPvnkE02dOlXPPPOMT7jaunWrlixZ4rPt7t27FR8fr5CQECUnJys/P9%2Bn7nQ61blzZ4WHh6t9%2B/ZyuVzeWllZmfbt26dOnTqpY8eOsixLO3fu9NnXbrerTZs2p%2BlMAQDA2SYgA1Ztba0eeOABTZkyRb169fKpRUZGavHixXr99ddVU1Mjp9OpP/7xj8rIyJAkDR8%2BXB988IE2b96so0ePatWqVdq7d69uuOEGSVJGRoaWL1%2BugoICVVRUaN68eerYsaNSUlLkcDg0aNAgPf300yotLdWBAwe0ePFiDR06VDZbQF/sAwAAASTEMvHQkWEfffSRfv3rX6tJkyYn1N566y1t375dixYt0t69exUZGanf/OY3uvPOOxUa%2Bm1efPvttzV//nwVFhaqXbt2mjFjhrp37y7p22esFi5cqFdeeUWVlZVKTU3Vww8/rAsvvFCSVF5erlmzZmnTpk0677zzdN1112natGknXctPUVJSfkr7n4zNFqoB894zftzTZV12T38voUE2W6hiYprL7a48Zy9rm0Q/zaOn5tFT8wKppxdcEOmXeQMyYJ2NCFgErHMR/TSPnppHT80LpJ76K2AF5C1CAACAYEbAAgAAMIyABQAAYBgBCwAAwDACFgAAgGEELAAAAMMIWAAAAIYRsAAAAAwjYAEAABhmPGD169dPixYtUlFRkelDAwAABAXjAeuWW27Rm2%2B%2Bqauuukpjx47V22%2B/rdraWtPTAAAABCzjASszM1Nvvvmm/vKXv6h9%2B/Z6/PHHlZ6errlz52rPnj2mpwMAAAg4p%2B0ZrKSkJE2dOlWbNm3S9OnT9Ze//EXXXnutxowZo88///x0TQsAAOB3py1g1dTU6M0339Sdd96pqVOnKi4uTvfff786duyo0aNHa82aNadragAAAL%2BymT5gQUGBVq1apddee02VlZUaNGiQXnzxRXXr1s27Tffu3TV79mxdf/31pqcHAADwO%2BMBa/DgwWrTpo3GjRunm266SdHR0Sdsk56ertLSUtNTAwAABATjAWv58uXq0aNHg9tt27bN9NQAAAABwfgzWAkJCRo/frw2bNjgHfvv//5v3XnnnfJ4PKanAwAACDjGA9acOXNUXl6udu3aecf69Omj%2Bvp6PfHEE6anAwAACDjGbxG%2B//77WrNmjWJiYrxjrVu31rx583TdddeZng4AACDgGL%2BCVV1drfDw8BMnCg1VVVWV6ekAAAACjvGA1b17dz3xxBM6fPiwd%2Bybb77RQw895PNRDQAAAGcr47cIp0%2BfrjvuuEO/%2BtWvFBERofr6elVWVqpVq1Z66aWXTE8HAAAQcIwHrFatWulvf/ub3n33Xe3bt0%2BhoaFq06aNevXqpbCwMNPTAQAABBzjAUuSmjRpoquuuup0HBoAACDgGQ9Y%2B/fv1/z58/XVV1%2Bpurr6hPo777xjekoAAICAclqewSouLlavXr3UrFkz04cHAAAIeMYDVn5%2Bvt555x05HA7ThwYAAAgKxj%2BmoUWLFly5AgAA5zTjAWvcuHFatGiRLMsyfWgAAICgYPwW4bvvvqtPPvlEr776qi6%2B%2BGKFhvpmuFdeecX0lAAAAAHFeMCKiIhQ7969T/k4hYWFevzxx/XRRx8pLCxMvXv31vTp02W327Vjxw499thj2rFjh1q0aKGRI0fqjjvu8O775ptvasmSJfr666/Vpk0b3XvvverVq5ckqb6%2BXs8884zWrl2rsrIyderUSbNnz1arVq0kSR6PR7Nnz9aHH36o0NBQpaena%2BbMmWratOkpnxMAADg3GA9Yc%2BbMMXKc8ePHKzk5WRs3blR5ebkyMzP15JNPaubMmRo3bpyGDx%2Bu3Nxc7dmzR3fccYcuvvhiDRw4UDt27NDUqVO1aNEiXXHFFVq/fr3uvvtuvfXWW7rwwgv18ssva82aNXr%2B%2BecVFxenBQsWKDMzU6%2B//rpCQkI0c%2BZMHTt2TGvXrlVNTY3uuecezZs3Tw888ICR8wIAAGc/489gSdLu3bu1cOFC3X///d6xTz/9tNH7l5WVKTk5WZMnT1bz5s114YUX6uabb9ZHH32kzZs3q6amRhMmTFCzZs2UlJSkYcOGKS8vT5K0cuVKpaenKz09XeHh4brhhhvUoUMHvfHGG5KkvLw8jR49Wm3btlVERIRycnJUUFCgbdu26eDBg9qwYYNycnLkcDgUFxeniRMnavXq1aqpqTHbJAAAcNYyfgVr69atuvPOO9WmTRvt3btXc%2BbM0f79%2BzVq1Cg9/fTT6t%2B/f4PHsNvtJ1wJKyoqUmxsrFwulxISEnz%2B7E5iYqJWrlwpSXK5XEpPT/fZNzExUU6nU9XV1dq1a5cSExO9tYiICF1yySVyOp0qLy9XWFiYEhISvPWkpCQdOXJEu3fv9hn/McXFxSopKfEZs9maKTY2tlH7N1ZY2GnJx6eNzRb46z3e02DrbaCin%2BbRU/PoqXn09DQErAULFuh3v/udbr/9dnXq1EnSt3%2Bf8IknntDixYsbFbC%2Bz%2Bl0asWKFVqyZInWrVsnu93uU4%2BOjpbH41F9fb08Ho%2BioqJ86lFRUdq1a5cOHz4sy7J
OWne73YqOjlZERIRCQkJ8apLkdrsbvd68vDwtWrTIZywzM1NZWVmNPsbZKCamub%2BX0Gh2%2B/n%2BXsJZhX6aR0/No6fmncs9NR6wvvzyS61YsUKSfILK1VdfrenTp//k43388ceaMGGCJk%2BerLS0NK1bt%2B6k2313roY%2BIuLH6iY%2BXmLEiBHq16%2Bfz5jN1kxud%2BUpH/u7gu2dgenzPx3CwkJlt5%2BvsrIq1dXV%2B3s5QY9%2BmkdPzaOn5gVST/315t54wIqMjFR1dbWaNGniM15cXHzCWEM2btyo3/3ud5o5c6ZuuukmSZLD4dDevXt9tvN4PIqOjlZoaKhiYmLk8XhOqDscDu82J6u3aNFCDodDFRUVqqur896CPL5tixYtGr3u2NjYE24HlpSUq7b23P7BDabzr6urD6r1Bjr6aR49NY%2Bemncu99T4JZCuXbvq8ccfV0VFhXdsz549mjp1qn71q181%2BjiffPKJpk6dqmeeecYbriQpOTlZX3zxhWpra71jTqdTnTt39tbz8/N9jnW8Hh4ervbt28vlcnlrZWVl2rdvnzp16qSOHTvKsizt3LnTZ1%2B73a42bdo0vgkAAOCcZjxg3X///fr000%2BVmpqqo0ePqmvXrrr22mvl8Xg0bdq0Rh2jtrZWDzzwgKZMmeL9/Krj0tPTFRERoSVLlqiqqkrbtm3TqlWrlJGRIUkaPny4PvjgA23evFlHjx7VqlWrtHfvXt1www2SpIyMDC1fvlwFBQWqqKjQvHnz1LFjR6WkpMjhcGjQoEF6%2BumnVVpaqgMHDmjx4sUaOnSobDbjF/sAAMBZKsQ6DX/TpqamRlu2bNGePXvUtGlTtWnTRj179vR5TurHfPTRR/r1r3990luKb731liorKzVr1izl5%2BerZcuWuvPOO3Xrrbd6t3n77bc1f/58FRYWql27dpoxY4a6d%2B8u6dtnrBYuXKhXXnlFlZWVSk1N1cMPP6wLL7xQklReXq5Zs2Zp06ZNOu%2B883Tddddp2rRpP/n25veVlJSf0v4nY7OFasC894wf93RZl93T30tokM0WqpiY5nK7K8/Zy9om0U/z6Kl59NS8QOrpBRdE%2BmXe0xKwcCICFgHrXEQ/zaOn5tFT8wKpp/4KWMbve/Xr1%2B9Hr1S98847pqcEAAAIKMYD1rXXXusTsOrq6rRnzx45nU7dfvvtpqcDAAAIOMYD1pQpU046vn79ev3rX/8yPR0AAEDAOWOfVHnVVVfpb3/725maDgAAwG/OWMDavn27kU9JBwAACHTGbxGOHDnyhLGqqioVFBRo4MCBpqcDAAAIOMYDVuvWrU/4LcLw8HANHTpUw4YNMz0dAABAwDEesJ544gnThwQAAAgqxgPWa6%2B91uhtv/s3BgEAAM4WxgPWjBkzVF9ff8ID7SEhIT5jISEhBCwAAHBWMh6w/vCHP%2BiFF17Q%2BPHjlZCQIMuy9MUXX%2Bj555/XbbfdptTUVNNTAgAABJTT8gxWbm6u4uLivGOXX365WrVqpTFjxmjt2rWmpwQAAAgoxj8Ha%2B/evYqKijph3G63q7Cw0PR0AAAAAcd4wIqPj9cTTzwht9vtHSsrK9P8%2BfP1i1/8wvR0AAAAAcf4LcLp06dr8uTJysvLU/PmzRUaGqqKigo1bdpUixcvNj0dAABAwDEesHr16qXNmzdry5YtOnDggCzLUlxcnK688kpFRkaang4AACDgGA9YknT%2B%2Beerf//%2BOnDggFq1anU6pgAAAAhYxp/Bqq6u1tSpU9WlSxddc801kr59Bmvs2LEqKyszPR0AAEDAMR6w5s6dqx07dmjevHkKDf2/w9fV1WnevHmmpwMAAAg4xgPW%2BvXr9eyzz%2Brqq6/2/tFnu92uOXPm6O233zY9HQAAQMAxHrAqKyvVunXrE8YdDoeOHDliejoAAICAYzxg/eIXv9C//vUvSfL524NvvfWW/t//%2B3%2BmpwMAAAg4xn%2BL8NZbb9WkSZN0yy23qL6%2BXsuWLVN%2Bfr7Wr1%2BvGTNmmJ4OAAAg4BgPWCNGjJDNZtOKFSsUFham5557Tm3atNG8efN09dVXm54OAAAg4BgPWKWlpbrlllt0yy23mD40AABAUDD%2BDFb//v19nr0CAAA41xgPWKmpqVq3bp3pwwIAAAQN47cIL7roIj322GPKzc3VL37xC5133nk%2B9fnz55ueEgAAIKAYD1i7du3SL3/5S0mS2%2B02fXgAAICAZyxg5eTkaMGCBXrppZe8Y4sXL1ZmZqapKQAAAIKCsWewNm7ceMJYbm7uKR3zvffeU1pamnJycnzGX331VV166aVKSUnx%2Bfr8888lSfX19VqwYIH69%2B/Ba4eaAAAcoUlEQVSv7t27a8yYMdq/f793f4/Ho%2BzsbKWlpalXr16aMWOGqqurvfUdO3botttuU7du3TRw4EC98MILp3QeAADg3GIsYJ3sNwdP5bcJn3/%2BeT366KO65JJLTlrv3r27nE6nz1enTp0kSS%2B//LLWrFmj3Nxcbdq0Sa1bt1ZmZqZ3PTNnzlRVVZXWrl2r1atXq6CgwPuHqKurqzVu3DhdccUVeu%2B997RgwQItXbqUv6MIAAAazVjAOv6HnRsaa6zw8HCtWrXqBwPWj8nLy9Po0aPVtm1bRUREKCcnRwUFBdq2bZsOHjyoDRs2KCcnRw6HQ3FxcZo4caJWr16tmpoabd68WTU1NZowYYKaNWumpKQkDRs2THl5eT/7XAAAwLnF%2BMc0mDJq1ChFRkb%2BYL2oqEi//e1v1b17d/Xv31%2Bvv/66pG%2BvQO3atUuJiYnebSMiInTJJZfI6XRqx44dCgsLU0JCgreelJSkI0eOaPfu3XK5XEpISFBYWJi3npiYqPz8/NNwlgAA4Gxk/LcIzwSHw6HWrVvr3nvvVbt27fT3v/9d9913n2JjY/XLX/5SlmUpKirKZ5%2BoqCi53W5FR0crIiLC5%2Bra8W3dbrc8Ho/sdrvPvtHR0fJ4PKqvr1doaMOZtLi4WCUlJT5jNlszxcbG/txTPqmwsIDNxydlswX%2Beo/3NNh6G6jop3n01Dx6ah49NRiwampqNHny5AbHTHwOVp8%2BfdSnTx/vvw8ePFh///vf9eqrr2rKlCmSfvz5r5/zbNhPud2Zl5enRYsW%2BYxlZmYqKyvrJ897NomJae7vJTSa3X6%2Bv5dwVqGf5tFT8%2BipeedyT40FrG7duqm4uLjBsdMlPj5e%2Bfn5io6OVmhoqDwej0/d4/GoRYsWcjgcqqioUF1dnfc24PFtj9f37t17wr7Hj9sYI0aMUL9%2B/XzGbLZmcrsrf%2BbZnVywvTMwff6nQ1hYqOz281VWVqW6unp/Lyfo0U/z6Kl59NS8QOqpv97cGwtY3/38q9Ptz3/%2Bs6KionTttdd6xwoKCtSqVSuFh4erffv2crlc6tGjhySprKxM%2B/btU6dOnRQfHy/LsrRz504lJS
VJkpxOp%2Bx2u9q0aaPk5GT9%2Bc9/Vm1trWw2m7feuXPnRq8vNjb2hNuBJSXlqq09t39wg%2Bn86%2Brqg2q9gY5%2BmkdPzaOn5p3LPQ2uSyD/69ixY3rkkUfkdDpVU1OjtWvX6t1339XIkSMlSRkZGVq%2BfLkKCgpUUVGhefPmqWPHjkpJSZHD4dCgQYP09NNPq7S0VAcOHNDixYs1dOhQ2Ww2paenKyIiQkuWLFFVVZW2bdumVatWKSMjw89nDQAAgkXAPuSekpIiSaqtrZUkbdiwQdK3V5NGjRqlyspK3XPPPSopKdHFF1%2BsxYsXKzk5WZI0cuRIlZSU6De/%2BY0qKyuVmprq80zUww8/rFmzZql///4677zzdN1113k/zLRJkyZ67rnnNGvWLOXm5qply5bKycnxeeYLAADgx4RYp/JpoGi0kpJy48e02UI1YN57xo97uqzL7unvJTTIZgtVTExzud2V5%2BxlbZPop3n01Dx6al4g9fSCC374I59Op6C8RQgAABDICFgAAACGEbAAAAAMI2ABAAAYRsACAAAwjIAFAABgGAELAADAMAIWAACAYQQsAAAAwwhYAAAAhhGwAAAADCNgAQAAGEbAAgAAMIyABQAAYBgBCwAAwDACFgAAgGEELAAAAMMIWAAAAIYRsAAAAAwjYAEAABhGwAIAADCMgAUAAGAYAQsAAMAwAhYAAIBhBCwAAADDCFgAAACGEbAAAAAMI2ABAAAYRsACAAAwjIAFAABgGAELAADAsIAOWO%2B9957S0tKUk5NzQu3NN9/U9ddfry5dumjIkCF6//33vbX6%2BnotWLBA/fv3V/fu3TVmzBjt37/fW/d4PMrOzlZaWpp69eqlGTNmqLq62lvfsWOHbrvtNnXr1k0DBw7UCy%2B8cHpPFAAAnFUCNmA9//zzevTRR3XJJZecUNuxY4emTp2qKVOm6J///KdGjx6tu%2B%2B%2BWwcOHJAkvfzyy1qzZo1yc3O1adMmtW7dWpmZmbIsS5I0c%2BZMVVVVae3atVq9erUKCgo0b948SVJ1dbXGjRunK664Qu%2B9954WLFigpUuX6u233z5zJw8AAIJawAas8PBwrVq16qQBa%2BXKlUpPT1d6errCw8N1ww03qEOHDnrjjTckSXl5eRo9erTatm2riIgI5eTkqKCgQNu2bdPBgwe1YcMG5eTkyOFwKC4uThMnTtTq1atVU1OjzZs3q6amRhMmTFCzZs2UlJSkYcOGKS8v70y3AAAABKmADVijRo1SZGTkSWsul0uJiYk%2BY4mJiXI6naqurtauXbt86hEREbrkkkvkdDq1Y8cOhYWFKSEhwVtPSkrSkSNHtHv3brlcLiUkJCgsLMzn2Pn5%2BYbPEAAAnK1s/l7Az%2BHxeBQVFeUzFhUVpV27dunw4cOyLOukdbfbrejoaEVERCgkJMSnJklut1sej0d2u91n3%2BjoaHk8HtXX1ys0tOFMWlxcrJKSEp8xm62ZYmNjf9J5NiQsLGDz8UnZbIG/3uM9DbbeBir6aR49NY%2BemkdPgzRgSfI%2BT/Vz6g3tezLfDWQNycvL06JFi3zGMjMzlZWV9ZPnPZvExDT39xIazW4/399LOKvQT/PoqXn01LxzuadBGbBiYmLk8Xh8xjwejxwOh6KjoxUaGnrSeosWLeRwOFRRUaG6ujrvbcDj2x6v792794R9jx%2B3MUaMGKF%2B/fr5jNlszeR2V/6U02xQsL0zMH3%2Bp0NYWKjs9vNVVlalurp6fy8n6NFP8%2BipefTUvEDqqb/e3AdlwEpOTj7hmSin06nBgwcrPDxc7du3l8vlUo8ePSRJZWVl2rdvnzp16qT4%2BHhZlqWdO3cqKSnJu6/dblebNm2UnJysP//5z6qtrZXNZvPWO3fu3Oj1xcbGnnA7sKSkXLW15/YPbjCdf11dfVCtN9DRT/PoqXn01LxzuafBdQnkfw0fPlwffPCBNm/erKNHj2rVqlXau3evbrjhBklSRkaGli9froKCAlVUVGjevHnq2LGjUlJS5HA4NGjQID399NMqLS3VgQMHtHjxYg0dOlQ2m03p6emKiIjQkiVLVFVVpW3btmnVqlXKyMjw81kDAIBgEbBXsFJSUiRJtbW1kqQNGzZI%2BvZqUocOHTRv3jzNmTNHhYWFateunZYuXaoLLrhAkjRy5EiVlJToN7/5jSorK5WamurzTNTDDz%2BsWbNmqX///jrvvPN03XXXeT/MtEmTJnruuec0a9Ys5ebmqmXLlsrJyVGfPn3O4NkDAIBgFmL9nCe%2B8ZOVlJQbP6bNFqoB894zftzTZV12T38voUE2W6hiYprL7a48Zy9rm0Q/zaOn5tFT8wKppxdccPKPfDrdgvIWIQAAQCAjYAEAABhGwAIAADCMgAUAAGAYAQsAAMAwAhYAAIBhBCwAAADDCFgAAACGEbAAAAAMI2ABAAAYRsACAAAwjIAFAABgGAELAADAMAIWAACAYQQsAAAAwwhYAAAAhhGwAAAADCNgAQAAGEbAAgAAMIyABQAAYBgBCwAAwDACFgAAgGEELAAAAMMIWAAAAIYRsAAAAAwjYAEAABhGwAIAADCMgAUAAGAYAQsAAMAwAhYAAIBhBCwAAADDgjZgJSQkKDk5WSkpKd6vRx55RJK0detWDR06VF27dtXgwYP1xhtv%2BOy7fPlyDRo0SF27dlVGRoby8/O9taNHj%2BrBBx9U7969lZqaqqysLLnd7jN6bgAAILjZ/L2AU/HWW2/p4osv9hkrLi7WxIkTNWPGDF1//fX6%2BOOPNWHCBLVp00YpKSnauHGjFi5cqD/84Q9KSEjQ8uXLNX78eL399ttq1qyZFixYIJfLpby8PJ1//vmaOXOm7r//fj333HN%2BOksAABBsgvYK1g9Zs2aNWrduraFDhyo8PFxpaWnq16%2BfVq5cKUnKy8vTkCFD1LlzZzVt2lRjx46VJG3atEm1tbVatWqVJk6cqIsuukjR0dHKzs7W5s2b9c033/jztAAAQBAJ6itY8%2BfP16effqqKigpdc801mjZtmlwulxITE322S0xM1Lp16yRJLpdL1157rbcWGhqqjh07yul0qmPHjiovL1dSUpK33rZtWzVt2lQul0txcXGNWldxcbFKSkp8xmy2ZoqNjf25p3pSYWHBlY9ttsBf7/GeBltvAxX9NI%2BemkdPzaOnQRywLrvsMqWlpenJJ5/U/v37lZ2drYceekgej%2BeEIBQdHe19jsrj8SgqKsqnHhUVJbfbLY/HI0my2%2B0%2Bdbvd/pOew8rLy9OiRYt8xjIzM5WVldXoY5yNYmKa%2B3sJjWa3n%2B/vJZxV6Kd59NQ8emreudzToA1YeXl53v/dtm1bTZkyRRMmTFC3bt0a3NeyrFOqN2TEiBHq16%2Bfz5jN1kxud%2BUpHff7gu2dgenzPx3CwkJlt5%2BvsrIq1dXV%2B3s5QY9%2BmkdPzaOn5gVST/315j5oA9b3XXzxxaqrq1NoaKj
3StRxbrdbDodDkhQTE3NC3ePxqH379t5tPB6Pmjf/v/9DDh8%2BrBYtWjR6LbGxsSfcDiwpKVdt7bn9gxtM519XVx9U6w109NM8emoePTXvXO5pcF0C%2BV/bt2/XE0884TNWUFCgJk2aKD093edjFyQpPz9fnTt3liQlJyfL5XJ5a3V1ddq%2Bfbs6d%2B6sVq1aKSoqyqf%2B5Zdf6tixY0pOTj6NZwQAAM4mQRmwWrRooby8POXm5urYsWPas2ePnnnmGY0YMUI33nijCgsLtXLlSh09elRbtmzRli1bNHz4cElSRkaGXnvtNX322WeqqqrSkiVL1KRJE/Xp00dhYWEaPny4nnvuORUVFcntduupp57SgAED1LJlSz%2BfNQAACBZBeYswLi5Oubm5mj9/vjcg3XzzzcrJyVF4eLiWLl2qRx99VA899JDi4%2BM1d%2B5cXXrppZKk3r17695771V2drYOHTqklJQU5ebmqmnTppKkrKwsVVZW6sYbb1Rtba369u2r2bNn%2B/FsAQBAsAmxTvWJbjRKSUm58WPabKEaMO8948c9XdZl9/T3Ehpks4UqJqa53O7Kc/a5AZPop3n01Dx6al4g9fSCCyL9Mm9Q3iIEAAAIZAQsAAAAwwhYAAAAhhGwAAAADCNgAQAAGEbAAgAAMIyABQAAYBgBCwAAwDACFgAAgGEELAAAAMMIWAAAAIYRsAAAAAwjYAEAABhGwAIAADCMgAUAAGAYAQsAAMAwAhYAAIBhBCwAAADDCFgAAACGEbAAAAAMI2ABAAAYRsACAAAwjIAFAABgGAELAADAMAIWAACAYQQsAAAAwwhYAAAAhhGwAAAADCNgAQAAGEbAAgAAMIyABQAAYBgB6yQKCwt11113KTU1VX379tXcuXNVX1/v72UBAIAgYfP3AgLRpEmTlJSUpA0bNujQoUMaN26cWrZsqd/%2B9rf%2BXhoAAAgCXMH6HqfTqZ07d2rKlCmKjIxU69atNXr0aOXl5fl7aQAAIEhwBet7XC6X4uPjFRUV5R1LSkrSnj17VFFRoYiIiAaPUVxcrJKSEp8xm62ZYmNjja41LCy48rHNFvjrPd7TYOttoKKf5tFT8%2BipefSUgHUCj8cju93uM3Y8bLnd7kYFrLy8PC1atMhn7O6779akSZPMLVTfBrnbL/xKI0aMMB7ezlXFxcV68cU/0FND6Kd59NQ8emoePeUW4UlZlnVK%2B48YMUKvvvqqz9eIESMMre7/lJSUaNGiRSdcLcPPR0/Nop/m0VPz6Kl59JQrWCdwOBzyeDw%2BYx6PRyEhIXI4HI06Rmxs7Dmb2AEAAFewTpCcnKyioiKVlpZ6x5xOp9q1a6fmzZv7cWUAACBYELC%2BJzExUSkpKZo/f74qKipUUFCgZcuWKSMjw99LAwAAQSJs9uzZs/29iEBz5ZVXau3atXrkkUf0t7/9TUOHDtWYMWMUEhLi76WdoHnz5urRowdX1wyip2bRT/PoqXn01Lxzvach1qk%2B0Q0AAAAf3CIEAAAwjIAFAABgGAELAADAMAIWAACAYQQsAAAAwwhYAAAAhhGwAAAADCNgAQAAGEbAAgAAMIyAFYQKCwt11113KTU1VX379tXcuXNVX1/v72WdcYWFhcrMzFRqaqrS0tI0bdo0lZWVSZJ27Nih2267Td26ddPAgQP1wgsv%2BOz75ptv6vrrr1eXLl00ZMgQvf/%2B%2B95afX29FixYoP79%2B6t79%2B4aM2aM9u/f7617PB5lZ2crLS1NvXr10owZM1RdXe2tNzR3MHj88ceVkJDg/fetW7dq6NCh6tq1qwYPHqw33njDZ/vly5dr0KBB6tq1qzIyMpSfn%2B%2BtHT16VA8%2B%2BKB69%2B6t1NRUZWVlye12e%2BsNfT83NHcwWLJkiXr16qXLLrtMo0eP1tdffy2Jvv4c27dv16hRo3T55ZerZ8%2BemjJlikpLSyXRz5/ivffeU1pamnJyck6o%2BfP18VTmDjgWgs7NN99sPfDAA1ZZWZm1Z88ea%2BDAgdYLL7zg72Wdcdddd501bdo0q6KiwioqKrKGDBliTZ8%2B3aqqqrKuvPJKa%2BHChVZlZaWVn59v9ejRw1q/fr1lWZa1fft2Kzk52dq8ebNVXV1tvf7661bnzp2toqIiy7Isa/ny5Vbfvn2tXbt2WeXl5dbDDz9sXX/99VZ9fb1lWZZ19913W3fddZd16NAh68CBA9aIESOsRx55xLIsq8G5g8H27dutHj16WB06dLAsy7K%2B%2BeYb67LLLrNWrlxpVVdXW//4xz%2BsTp06WZ9//rllWZb1zjvvWJdffrn12WefWVVVVdbSpUutnj17WpWVlZZlWdacOXOsIUOGWP/5z38st9tt3X333da4ceO88/3Y93NDcweDFStWWFdffbVVUFBglZeXW4888oj1yCOP0NefoaamxurZs6c1f/586%2BjRo1Zpaan129/%2B1po0aRL9/Alyc3OtgQMHWiNHjrSys7N9av58fTzVuQMNASvIfP7551bHjh0tj8fjHfvTn/5kDRo0yI%2BrOvMOHz5sTZs2zSopKfGOvfTSS9bAgQOtdevWWVdccYVVW1vrrc2dO9e64447LMuyrIceesjKzMz0Od6wYcOspUuXWpZlWYMHD7ZefPFFb628vNxKTEy0Pv30U6ukpMS69NJLrR07dnjrW7ZssS677DLr2LFjDc4d6Orq6qxhw4ZZ//Vf/%2BUNWH/4wx%2Bsm266yWe77Oxsa%2BbMmZZlWdZdd91lPf744z7H6Nmzp7V27VqrpqbG6tatm7VhwwZvfdeuXVZCQoJ14MCBBr%2BfG5o7GPTr1%2B%2BkAZu%2B/nT/%2Bc9/rA4dOli7du3yjv3pT3%2ByrrrqKvr5E7z44otWWVmZNXXq1BMClj9fH09l7kDELcIg43K5FB8fr6ioKO9YUlKS9uzZo4qKCj%2Bu7Myy2%2B2aM2eOWrZs6R0rKipSbGysXC6XEhISFBYW5q0lJiZ6bwe4XC4lJib6HC8xMVFOp1PV1dXatWuXTz0iIkKXXHKJnE6nduzYobCwMJ/bZ0lJSTpy5Ih2797d4NyB7pVXXlF4eLiuv/5679gP9euH%2BhkaGqqOHTvK6XRq3759Ki8vV1JSkrfetm1bNW3aVC6Xq8Hv54bmDnTffPONvv76ax0%2BfFjXXnut99ZTaWkpff0Z4uLi1LFjR%2BXl5amyslKHDh3S22%2B/rT59%2BtDPn2DUqFGKjIw8ac2fr4%2BnMncgImAFGY/HI7vd7jN2/If%2Bu88LnGucTqdWrFihCRMmnLRH0dHR8ng8qq%2Bvl8fj8XmhlL7todvt1uHDh2VZ1g/WPR6PIiIiFBIS4lOT5K3/2NyB7ODBg1q4cKFmzZrlM/5D53T8%2B%2B3H%2BunxeCTphP3tdvsP9qsx/QyW7/UDBw5Ikt566y0tW7ZMr7/%2Bug4cOKAHHniAvv4MoaGhWrhwod555x117dpVaWlp
qq2t1eTJk%2BmnIf58fTyVuQMRASsIWZbl7yUElI8//lhjxozR5MmTlZaW9oPbffeHvqEe/lj95/T/u3MHqjlz5mjIkCFq167dT973TPczWBw/t7FjxyouLk4XXnihJk2apI0bN/6k/X9O/Wzs67FjxzR%2B/HhdffXV%2Buijj/Tuu%2B8qMjJSU6ZMadT%2B9LNx/Pn6eCpzBxoCVpBxOBzed1vHeTwehYSEyOFw%2BGlV/rNx40bdddddmj59ukaNGiXp2x59/x2Nx%2BNRdHS0QkNDFRMTc9IeOhwO7zYnq7do0UIOh0MVFRWqq6vzqUny1n9s7kC1detWffrpp8rMzDyhdrJ%2Bud1u7/fbj/Xz%2BDbfrx8%2BfNjbrx/7fm5o7kB3/Bb2d9%2B1x8fHy7Is1dTU0NefaOvWrfr666917733KjIyUnFxccrKytLf//73k/7c0s%2Bfzp%2Bvj6cydyAK3Fd8nFRycrKKioq8v5YsfXt7rF27dmrevLkfV3bmffLJJ5o6daqeeeYZ3XTTTd7x5ORkffHFF6qtrfWOOZ1Ode7c2Vv//rMRx%2Bvh4eFq3769XC6Xt1ZWVqZ9%2B/apU6dO6tixoyzL0s6dO332tdvtatOmTYNzB6o33nhDhw4dUt%2B%2BfZWamqohQ4ZIklJTU9WhQ4cT%2BpWfn%2B/Tz%2B/2q66uTtu3b1fnzp3VqlUrRUVF%2BdS//PJLHTt2TMnJyQ1%2BP6ekpPzo3IHuwgsvVEREhHbs2OEdKyws1Hnnnaf09HT6%2BhPV1dWpvr7e5yrGsWPHJElpaWn00wB/vj6eytwB6Uw9TQ9zhg0bZk2fPt0qLy%2B3du3aZfXr189asWKFv5d1RtXU1FjXXHON9corr5xQO3r0qNW3b1/r2WeftY4cOWJ99tln1uWXX25t2rTJsizL%2BuKLL6yUlBRr06ZNVnV1tbVy5UqrS5cuVnFxsWVZ3/52UJ8%2Bfby/Cjxz5kzrlltu8R4/OzvbGjt2rHXo0CGrqKjIuuWWW6wnnniiUXMHKo/HYxUVFXm/Pv30U6tDhw5WUVGRVVhYaHXp0sX6y1/%2BYlVXV1ubN2%2B2OnXq5P1NoS1btljdunWzPv30U%2BvIkSPWwoULrfT0dKuqqsqyrG9/S%2Bjmm2%2B2/vOf/1ilpaXWuHHjrEmTJnnn/rHv54MHD/7o3MHg8ccft/r372/t3bvXOnjwoDVixAhr2rRpDZ4bfT1RaWmp1aNHD%2Bupp56yjhw5YpWWllrjx4%2B3fv3rX9PPn%2BFkv0Xoz9fHU5070BCwglBRUZE1duxYq1OnTlZaWpr17LPPBuzngJwu//M//2N16NDBSk5OPuHr66%2B/tr744gtr5MiRVnJystWnTx/r5Zdf9tl//fr11sCBA62kpCTrxhtvtD788ENvrb6%2B3nrmmWesX/3qV1anTp2sO%2B%2B80/s5LJZlWWVlZVZOTo512WWXWd27d7ceeugh6%2BjRo956Q3MHg/3793s/psGyLOvDDz%2B0brjhBispKckaOHDgCR878PLLL1vp6elWcnKylZGRYX3xxRfe2tGjR63Zs2db3bt3t7p06WLde%2B%2B9VllZmbfe0PdzQ3MHuu%2Be/2WXXWZNnTrVqqiosCyLvv4cTqfTuu2226zLL7/cSktLs7Kzs60DBw5YlkU/G%2Bv4a%2BWll15qXXrppd5/P86fr4%2BnMnegCbGsIHpiDAAAIAjwDBYAAIBhBCwAAADDCFgAAACGEbAAAAAMI2ABAAAYRsACAAAwjIAFAABgGAELAADAMAIWAACAYQQsAAAAwwhYAAAAhhGwAAAADCNgAQAAGPb/AWKwlzKvpdGtAAAAAElFTkSuQmCC\"/>\n        </div>\n        <div role=\"tabpanel\" class=\"tab-pane col-md-12\" id=\"common4614937497148659063\">\n            \n<table class=\"freq table table-hover\">\n    <thead>\n    <tr>\n        <td class=\"fillremaining\">Value</td>\n        <td class=\"number\">Count</td>\n        <td class=\"number\">Frequency (%)</td>\n        <td style=\"min-width:200px\">&nbsp;</td>\n    </tr>\n    </thead>\n    <tr class=\"\">\n        <td class=\"fillremaining\">3</td>\n        <td class=\"number\">1211</td>\n        <td class=\"number\">3.3%</td>\n        <td>\n            <div class=\"bar\" style=\"width:5%\">&nbsp;</div>\n        </td>\n</tr><tr class=\"\">\n        <td class=\"fillremaining\">2</td>\n        <td class=\"number\">1201</td>\n        <td class=\"number\">3.3%</td>\n        <td>\n            <div class=\"bar\" style=\"width:5%\">&nbsp;</div>\n        </td>\n</tr><tr class=\"\">\n        <td class=\"fillremaining\">1</td>\n        <td class=\"number\">1199</td>\n        <td class=\"number\">3.3%</td>\n        <td>\n            <div class=\"bar\" style=\"width:5%\">&nbsp;</div>\n        </td>\n</tr><tr class=\"\">\n        <td class=\"fillremaining\">4</td>\n        <td class=\"number\">1186</td>\n        <td class=\"number\">3.2%</td>\n        <td>\n            <div class=\"bar\" style=\"width:5%\">&nbsp;</div>\n        </td>\n</tr><tr class=\"\">\n        <td class=\"fillremaining\">5</td>\n        <td class=\"number\">1113</td>\n        <td class=\"number\">3.0%</td>\n        <td>\n            <div class=\"bar\" style=\"width:5%\">&nbsp;</div>\n        </td>\n</tr><tr class=\"\">\n        <td 
class=\"fillremaining\">6</td>\n        <td class=\"number\">1088</td>\n        <td class=\"number\">3.0%</td>\n        <td>\n            <div class=\"bar\" style=\"width:5%\">&nbsp;</div>\n        </td>\n</tr><tr class=\"\">\n        <td class=\"fillremaining\">7</td>\n        <td class=\"number\">993</td>\n        <td class=\"number\">2.7%</td>\n        <td>\n            <div class=\"bar\" style=\"width:4%\">&nbsp;</div>\n        </td>\n</tr><tr class=\"\">\n        <td class=\"fillremaining\">8</td>\n        <td class=\"number\">951</td>\n        <td class=\"number\">2.6%</td>\n        <td>\n            <div class=\"bar\" style=\"width:4%\">&nbsp;</div>\n        </td>\n</tr><tr class=\"\">\n        <td class=\"fillremaining\">9</td>\n        <td class=\"number\">936</td>\n        <td class=\"number\">2.6%</td>\n        <td>\n            <div class=\"bar\" style=\"width:4%\">&nbsp;</div>\n        </td>\n</tr><tr class=\"\">\n        <td class=\"fillremaining\">11</td>\n        <td class=\"number\">854</td>\n        <td class=\"number\">2.3%</td>\n        <td>\n            <div class=\"bar\" style=\"width:4%\">&nbsp;</div>\n        </td>\n</tr><tr class=\"other\">\n        <td class=\"fillremaining\">Other values (1715)</td>\n        <td class=\"number\">25782</td>\n        <td class=\"number\">70.6%</td>\n        <td>\n            <div class=\"bar\" style=\"width:100%\">&nbsp;</div>\n        </td>\n</tr>\n</table>\n        </div>\n        <div role=\"tabpanel\" class=\"tab-pane col-md-12\"  id=\"extreme4614937497148659063\">\n            <p class=\"h4\">Minimum 5 values</p>\n            \n<table class=\"freq table table-hover\">\n    <thead>\n    <tr>\n        <td class=\"fillremaining\">Value</td>\n        <td class=\"number\">Count</td>\n        <td class=\"number\">Frequency (%)</td>\n        <td style=\"min-width:200px\">&nbsp;</td>\n    </tr>\n    </thead>\n    <tr class=\"\">\n        <td class=\"fillremaining\">0</td>\n        <td class=\"number\">96</td>\n        <td class=\"number\">0.3%</td>\n        <td>\n            <div class=\"bar\" style=\"width:8%\">&nbsp;</div>\n        </td>\n</tr><tr class=\"\">\n        <td class=\"fillremaining\">1</td>\n        <td class=\"number\">1199</td>\n        <td class=\"number\">3.3%</td>\n        <td>\n            <div class=\"bar\" style=\"width:99%\">&nbsp;</div>\n        </td>\n</tr><tr class=\"\">\n        <td class=\"fillremaining\">2</td>\n        <td class=\"number\">1201</td>\n        <td class=\"number\">3.3%</td>\n        <td>\n            <div class=\"bar\" style=\"width:99%\">&nbsp;</div>\n        </td>\n</tr><tr class=\"\">\n        <td class=\"fillremaining\">3</td>\n        <td class=\"number\">1211</td>\n        <td class=\"number\">3.3%</td>\n        <td>\n            <div class=\"bar\" style=\"width:100%\">&nbsp;</div>\n        </td>\n</tr><tr class=\"\">\n        <td class=\"fillremaining\">4</td>\n        <td class=\"number\">1186</td>\n        <td class=\"number\">3.2%</td>\n        <td>\n            <div class=\"bar\" style=\"width:97%\">&nbsp;</div>\n        </td>\n</tr>\n</table>\n            <p class=\"h4\">Maximum 5 values</p>\n            \n<table class=\"freq table table-hover\">\n    <thead>\n    <tr>\n        <td class=\"fillremaining\">Value</td>\n        <td class=\"number\">Count</td>\n        <td class=\"number\">Frequency (%)</td>\n        <td style=\"min-width:200px\">&nbsp;</td>\n    </tr>\n    </thead>\n    <tr class=\"\">\n        <td class=\"fillremaining\">247108</td>\n        <td 
class=\"number\">1</td>\n        <td class=\"number\">0.0%</td>\n        <td>\n            <div class=\"bar\" style=\"width:100%\">&nbsp;</div>\n        </td>\n</tr><tr class=\"\">\n        <td class=\"fillremaining\">304689</td>\n        <td class=\"number\">1</td>\n        <td class=\"number\">0.0%</td>\n        <td>\n            <div class=\"bar\" style=\"width:100%\">&nbsp;</div>\n        </td>\n</tr><tr class=\"\">\n        <td class=\"fillremaining\">526122</td>\n        <td class=\"number\">1</td>\n        <td class=\"number\">0.0%</td>\n        <td>\n            <div class=\"bar\" style=\"width:100%\">&nbsp;</div>\n        </td>\n</tr><tr class=\"\">\n        <td class=\"fillremaining\">680823</td>\n        <td class=\"number\">1</td>\n        <td class=\"number\">0.0%</td>\n        <td>\n            <div class=\"bar\" style=\"width:100%\">&nbsp;</div>\n        </td>\n</tr><tr class=\"\">\n        <td class=\"fillremaining\">1029527</td>\n        <td class=\"number\">1</td>\n        <td class=\"number\">0.0%</td>\n        <td>\n            <div class=\"bar\" style=\"width:100%\">&nbsp;</div>\n        </td>\n</tr>\n</table>\n        </div>\n    </div>\n</div>\n</div>\n    <div class=\"row headerrow highlight\">\n        <h1>Sample</h1>\n    </div>\n    <div class=\"row variablerow\">\n    <div class=\"col-md-12\" style=\"overflow:scroll; width: 100%%; overflow-y: hidden;\">\n        <table border=\"1\" class=\"dataframe sample\">\n  <thead>\n    <tr style=\"text-align: right;\">\n      <th></th>\n      <th>average_rating</th>\n      <th>is_ebook</th>\n      <th>num_pages</th>\n      <th>publication_year</th>\n      <th>ratings_count</th>\n    </tr>\n  </thead>\n  <tbody>\n    <tr>\n      <th>0</th>\n      <td>3.83</td>\n      <td>false</td>\n      <td>80</td>\n      <td>1887</td>\n      <td>3</td>\n    </tr>\n    <tr>\n      <th>1</th>\n      <td>3.83</td>\n      <td>false</td>\n      <td>128</td>\n      <td>2015</td>\n      <td>37</td>\n    </tr>\n    <tr>\n      <th>2</th>\n      <td>4.38</td>\n      <td>false</td>\n      <td>NaN</td>\n      <td>2008</td>\n      <td>45</td>\n    </tr>\n    <tr>\n      <th>3</th>\n      <td>3.71</td>\n      <td>false</td>\n      <td>190</td>\n      <td>1964</td>\n      <td>115</td>\n    </tr>\n    <tr>\n      <th>4</th>\n      <td>5.00</td>\n      <td>false</td>\n      <td>118</td>\n      <td>2015</td>\n      <td>9</td>\n    </tr>\n  </tbody>\n</table>\n    </div>\n</div>\n</div>\n</body>\n</html>"
  },
  {
    "path": "recommender/results/profiler_books_metadata_2.html",
    "content": "<!doctype html>\n\n<html lang=\"en\">\n<head>\n  <meta charset=\"utf-8\">\n\n  <title>Profile report</title>\n  <meta name=\"description\" content=\"Profile report generated by pandas-profiling. See GitHub.\">\n  <meta name=\"author\" content=\"pandas-profiling\">\n    <script src=\"https://ajax.googleapis.com/ajax/libs/jquery/1.11.3/jquery.min.js\"></script>\n\n    <link rel=\"stylesheet\" href=\"https://maxcdn.bootstrapcdn.com/bootstrap/3.3.6/css/bootstrap.min.css\"\n          integrity=\"sha384-1q8mTJOASx8j1Au+a5WDVnPi2lkFfwwEAa8hDDdjZlpLegxhjVME1fgjWPGmkzs7\" crossorigin=\"anonymous\">\n    <link rel=\"stylesheet\" href=\"https://maxcdn.bootstrapcdn.com/bootstrap/3.3.6/css/bootstrap-theme.min.css\"\n          integrity=\"sha384-fLW2N01lMqjakBkx3l/M9EahuwpSfeNvV63J5ezn3uZzapT0u7EYsXMjQV+0En5r\" crossorigin=\"anonymous\">\n    <script src=\"https://maxcdn.bootstrapcdn.com/bootstrap/3.3.6/js/bootstrap.min.js\" integrity=\"sha384-0mSbJDEHialfmuBBQP6A4Qrprq5OVfW37PRR3j5ELqxss1yVqOtnepnHVP9aJ7xS\" crossorigin=\"anonymous\"></script>\n    <script>\n       $(function () {\n              $('[data-toggle=\"tooltip\"]').tooltip()\n        })\n    </script>\n</head>\n\n<body>\n    <meta charset=\"UTF-8\">\n\n<style>\n\n        .variablerow {\n            border: 1px solid #e1e1e8;\n            border-top: hidden;\n            padding-top: 2em;\n            padding-bottom: 2em;\n            padding-left: 1em;\n            padding-right: 1em;\n        }\n\n        .headerrow {\n            border: 1px solid #e1e1e8;\n            background-color: #f5f5f5;\n            padding: 2em;\n        }\n        .namecol {\n            margin-top: -1em;\n            overflow-x: auto;\n        }\n\n        .dl-horizontal dt {\n            text-align: left;\n            padding-right: 1em;\n            white-space: normal;\n        }\n\n        .dl-horizontal dd {\n            margin-left: 0;\n        }\n\n        .ignore {\n            opacity: 0.4;\n        }\n\n        .container.pandas-profiling {\n            max-width:975px;\n        }\n\n        .col-md-12 {\n            padding-left: 2em;\n        }\n\n        .indent {\n            margin-left: 1em;\n        }\n\n        /* Table example_values */\n            table.example_values {\n                border: 0;\n            }\n\n            .example_values th {\n                border: 0;\n                padding: 0 ;\n                color: #555;\n                font-weight: 600;\n            }\n\n            .example_values tr, .example_values td{\n                border: 0;\n                padding: 0;\n                color: #555;\n            }\n\n        /* STATS */\n            table.stats {\n                border: 0;\n            }\n\n            .stats th {\n                border: 0;\n                padding: 0 2em 0 0;\n                color: #555;\n                font-weight: 600;\n            }\n\n            .stats tr {\n                border: 0;\n            }\n\n            .stats tr:hover{\n                text-decoration: underline;\n            }\n\n            .stats td{\n                color: #555;\n                padding: 1px;\n                border: 0;\n            }\n\n\n        /* Sample table */\n            table.sample {\n                border: 0;\n                margin-bottom: 2em;\n                margin-left:1em;\n            }\n            .sample tr {\n                border:0;\n            }\n            .sample td, .sample th{\n                padding: 0.5em;\n                white-space: 
nowrap;\n                border: none;\n\n            }\n\n            .sample thead {\n                border-top: 0;\n                border-bottom: 2px solid #ddd;\n            }\n\n            .sample td {\n                width:100%;\n            }\n\n\n        /* There is no good solution available to make the divs equal height and then center ... */\n            .histogram {\n                margin-top: 3em;\n            }\n        /* Freq table */\n\n            table.freq {\n                margin-bottom: 2em;\n                border: 0;\n            }\n            table.freq th, table.freq tr, table.freq td {\n                border: 0;\n                padding: 0;\n            }\n\n            .freq thead {\n                font-weight: 600;\n                white-space: nowrap;\n                overflow: hidden;\n                text-overflow: ellipsis;\n\n            }\n\n            td.fillremaining{\n                width:auto;\n                max-width: none;\n            }\n\n            td.number, th.number {\n                text-align:right ;\n            }\n\n        /* Freq mini */\n            .freq.mini td{\n                width: 50%;\n                padding: 1px;\n                font-size: 12px;\n\n            }\n            table.freq.mini {\n                 width:100%;\n            }\n            .freq.mini th {\n                overflow: hidden;\n                text-overflow: ellipsis;\n                white-space: nowrap;\n                max-width: 5em;\n                font-weight: 400;\n                text-align:right;\n                padding-right: 0.5em;\n            }\n\n            .missing {\n                color: #a94442;\n            }\n            .alert, .alert > th, .alert > td {\n                color: #a94442;\n            }\n\n\n        /* Bars in tables */\n            .freq .bar{\n                float: left;\n                width: 0;\n                height: 100%;\n                line-height: 20px;\n                color: #fff;\n                text-align: center;\n                background-color: #337ab7;\n                border-radius: 3px;\n                margin-right: 4px;\n            }\n            .other .bar {\n                background-color: #999;\n            }\n            .missing .bar{\n                background-color: #a94442;\n            }\n            .tooltip-inner {\n                width: 100%;\n                white-space: nowrap;\n                text-align:left;\n            }\n\n            .extrapadding{\n                padding: 2em;\n            }\n\n</style>\n\n<div class=\"container pandas-profiling\">\n    <div class=\"row headerrow highlight\">\n        <h1>Overview</h1>\n    </div>\n    <div class=\"row variablerow\">\n    <div class=\"col-md-6 namecol\">\n        <p class=\"h4\">Dataset info</p>\n        <table class=\"stats\" style=\"margin-left: 1em;\">\n            <tbody>\n            <tr>\n                <th>Number of variables</th>\n                <td>5 </td>\n            </tr>\n            <tr>\n                <th>Number of observations</th>\n                <td>36514 </td>\n            </tr>\n            <tr>\n                <th>Total Missing (%)</th>\n                <td>0.0% </td>\n            </tr>\n            <tr>\n                <th>Total size in memory</th>\n                <td>928.0 KiB </td>\n            </tr>\n            <tr>\n                <th>Average record size in memory</th>\n                <td>26.0 B </td>\n            </tr>\n            </tbody>\n        
</table>\n    </div>\n    <div class=\"col-md-6 namecol\">\n        <p class=\"h4\">Variables types</p>\n        <table class=\"stats\" style=\"margin-left: 1em;\">\n            <tbody>\n            <tr>\n                <th>Numeric</th>\n                <td>2 </td>\n            </tr>\n            <tr>\n                <th>Categorical</th>\n                <td>3 </td>\n            </tr>\n            <tr>\n                <th>Date</th>\n                <td>0 </td>\n            </tr>\n            <tr>\n                <th>Text (Unique)</th>\n                <td>0 </td>\n            </tr>\n            <tr>\n                <th>Rejected</th>\n                <td>0 </td>\n            </tr>\n            </tbody>\n        </table>\n    </div>\n    <div class=\"col-md-12\" style=\"padding-left: 1em;\">\n        <p class=\"h4\">Warnings</p>\n        <ul class=\"list-unstyled\"><li><code>is_ebook</code> has 33658 / 92.2% zeros</li><li><code>publication_year</code> has a high cardinality: 202 distinct values  <span class=\"label label-warning\">Warning</span></li><li>Dataset has 29002 duplicate rows <span class=\"label label-warning\">Warning</span></li> </ul>\n    </div>\n</div>\n    <div class=\"row headerrow highlight\">\n        <h1>Variables</h1>\n    </div>\n    <div class=\"row variablerow\">\n    <div class=\"col-md-3 namecol\">\n        <p class=\"h4\">average_rating<br/>\n            <small>Numeric</small>\n        </p>\n    </div><div class=\"col-md-6\">\n    <div class=\"row\">\n        <div class=\"col-sm-6\">\n            <table class=\"stats \">\n                <tr>\n                    <th>Distinct count</th>\n                    <td>10</td>\n                </tr>\n                <tr>\n                    <th>Unique (%)</th>\n                    <td>0.0%</td>\n                </tr>\n                <tr class=\"ignore\">\n                    <th>Missing (%)</th>\n                    <td>0.0%</td>\n                </tr>\n                <tr class=\"ignore\">\n                    <th>Missing (n)</th>\n                    <td>0</td>\n                </tr>\n                <tr class=\"ignore\">\n                    <th>Infinite (%)</th>\n                    <td>0.0%</td>\n                </tr>\n                <tr class=\"ignore\">\n                    <th>Infinite (n)</th>\n                    <td>0</td>\n                </tr>\n            </table>\n\n        </div>\n        <div class=\"col-sm-6\">\n            <table class=\"stats \">\n\n                <tr>\n                    <th>Mean</th>\n                    <td>4.0685</td>\n                </tr>\n                <tr>\n                    <th>Minimum</th>\n                    <td>0</td>\n                </tr>\n                <tr>\n                    <th>Maximum</th>\n                    <td>5</td>\n                </tr>\n                <tr class=\"ignore\">\n                    <th>Zeros (%)</th>\n                    <td>0.0%</td>\n                </tr>\n            </table>\n        </div>\n    </div>\n</div>\n<div class=\"col-md-3 collapse in\" id=\"minihistogram4156873781702074801\">\n    <img 
src=\"data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAAAMgAAABLCAYAAAA1fMjoAAAABHNCSVQICAgIfAhkiAAAAAlwSFlzAAAPYQAAD2EBqD%2BnaQAAADl0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uIDIuMi40LCBodHRwOi8vbWF0cGxvdGxpYi5vcmcv7US4rQAAARZJREFUeJzt28EJwkAQQFEjlmQR9uTZnizCntYG5INCzGreuwfm8pkcdpYxxjgALx23HgBmdtp6AH7D%2BXp/%2B5vH7bLCJN9lg0AQCASBQBAIBIFAEAgEgUAQCASBQBAIBIFAEAgEgUAQCASBQHAPwmr%2B4YbEBoEgEAgCgSAQCAKBIBAIAoEgEAgCgSAQCAKBIBAIAoEgEAgCgSAQCA6mduiTQ6a9skEgCASCQCAIBIJAIAgEgkAgCASCQCAIBIKnJn/A05H12CAQbJDJ2AZzWcYYY%2BshYFZ%2BsSAIBIJAIAgEgkAgCASCQCAIBIJAIAgEgkAgCASCQCAIBIJAIAgEgkAgCASCQCAIBIJAIAgEgkAgCASCQCA8AW1OEsybR9ArAAAAAElFTkSuQmCC\">\n\n</div>\n<div class=\"col-md-12 text-right\">\n    <a role=\"button\" data-toggle=\"collapse\" data-target=\"#descriptives4156873781702074801,#minihistogram4156873781702074801\"\n       aria-expanded=\"false\" aria-controls=\"collapseExample\">\n        Toggle details\n    </a>\n</div>\n<div class=\"row collapse col-md-12\" id=\"descriptives4156873781702074801\">\n    <ul class=\"nav nav-tabs\" role=\"tablist\">\n        <li role=\"presentation\" class=\"active\"><a href=\"#quantiles4156873781702074801\"\n                                                  aria-controls=\"quantiles4156873781702074801\" role=\"tab\"\n                                                  data-toggle=\"tab\">Statistics</a></li>\n        <li role=\"presentation\"><a href=\"#histogram4156873781702074801\" aria-controls=\"histogram4156873781702074801\"\n                                   role=\"tab\" data-toggle=\"tab\">Histogram</a></li>\n        <li role=\"presentation\"><a href=\"#common4156873781702074801\" aria-controls=\"common4156873781702074801\"\n                                   role=\"tab\" data-toggle=\"tab\">Common Values</a></li>\n        <li role=\"presentation\"><a href=\"#extreme4156873781702074801\" aria-controls=\"extreme4156873781702074801\"\n                                   role=\"tab\" data-toggle=\"tab\">Extreme Values</a></li>\n\n    </ul>\n\n    <div class=\"tab-content\">\n        <div role=\"tabpanel\" class=\"tab-pane active row\" id=\"quantiles4156873781702074801\">\n            <div class=\"col-md-4 col-md-offset-1\">\n                <p class=\"h4\">Quantile statistics</p>\n                <table class=\"stats indent\">\n                    <tr>\n                        <th>Minimum</th>\n                        <td>0</td>\n                    </tr>\n                    <tr>\n                        <th>5-th percentile</th>\n                        <td>3.5</td>\n                    </tr>\n                    <tr>\n                        <th>Q1</th>\n                        <td>4</td>\n                    </tr>\n                    <tr>\n                        <th>Median</th>\n                        <td>4</td>\n                    </tr>\n                    <tr>\n                        <th>Q3</th>\n                        <td>4.5</td>\n                    </tr>\n                    <tr>\n                        <th>95-th percentile</th>\n                        <td>4.5</td>\n                    </tr>\n                    <tr>\n                        <th>Maximum</th>\n                        <td>5</td>\n                    </tr>\n                    <tr>\n                        <th>Range</th>\n                        <td>5</td>\n                    </tr>\n                    <tr>\n                        <th>Interquartile range</th>\n                        <td>0.5</td>\n                    </tr>\n                </table>\n            </div>\n            <div 
class=\"col-md-4 col-md-offset-2\">\n                <p class=\"h4\">Descriptive statistics</p>\n                <table class=\"stats indent\">\n                    <tr>\n                        <th>Standard deviation</th>\n                        <td>0.4289</td>\n                    </tr>\n                    <tr>\n                        <th>Coef of variation</th>\n                        <td>0.10542</td>\n                    </tr>\n                    <tr>\n                        <th>Kurtosis</th>\n                        <td>3.6622</td>\n                    </tr>\n                    <tr>\n                        <th>Mean</th>\n                        <td>4.0685</td>\n                    </tr>\n                    <tr>\n                        <th>MAD</th>\n                        <td>0.31287</td>\n                    </tr>\n                    <tr class=\"\">\n                        <th>Skewness</th>\n                        <td>-0.76383</td>\n                    </tr>\n                    <tr>\n                        <th>Sum</th>\n                        <td>148560</td>\n                    </tr>\n                    <tr>\n                        <th>Variance</th>\n                        <td>0.18396</td>\n                    </tr>\n                    <tr>\n                        <th>Memory size</th>\n                        <td>285.3 KiB</td>\n                    </tr>\n                </table>\n            </div>\n        </div>\n        <div role=\"tabpanel\" class=\"tab-pane col-md-8 col-md-offset-2\" id=\"histogram4156873781702074801\">\n            <img src=\"data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAAAlgAAAGQCAYAAAByNR6YAAAABHNCSVQICAgIfAhkiAAAAAlwSFlzAAAPYQAAD2EBqD%2BnaQAAADl0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uIDIuMi40LCBodHRwOi8vbWF0cGxvdGxpYi5vcmcv7US4rQAAIABJREFUeJzt3Xt0VOW9//FPkiGBXCYXIFFDBAqYkguUIsZGSrjjXUFu6VGgYuUSjURQBKSIouAhNFrDEYOVSvXYqYEWRAUaFUtPaSu24hDBS4AFUjCDzBgSA%2BQyvz/8Oe0ISpBnO5nh/VprVsrz7Hn2d3%2B7V/ysvfdMwrxer1cAAAAwJjzQBQAAAIQaAhYAAIBhBCwAAADDCFgAAACGEbAAAAAMI2ABAAAYRsACAAAwjIAFAABgGAELAADAMAIWAACAYQQsAAAAwwhYAAAAhhGwAAAADCNgAQAAGEbAAgAAMIyABQAAYBgBCwAAwDACFgAAgGEELAAAAMMIWAAAAIYRsAAAAAwjYAEAABhGwAIAADCMgAUAAGAYAQsAAMAwAhYAAIBhBCwAAADDCFgAAACGEbAAAAAMI2ABAAAYRsACAAAwjIAFAABgGAELAADAMAIWAACAYQQsAAAAwwhYAAAAhhGwAAAADCNgAQAAGEbAAgAAMMwW6ALOFy7XMeNrhoeHKSkpRkeP1qm52Wt8/fMZvbUOvbUW/bUOvbWOlb3t2DHO6HotxRWsIBYeHqawsDCFh4cFupSQQ2%2BtQ2%2BtRX%2BtQ2%2BtE4q9JWABAAAYRsACAAAwjIAFAABgGAELAADAMAIWAACAYQQsAAAAwwhYAAAAhhGwAAAADCNgAQAAGEbAAgAAMIyABQAAYBgBCwAAwDACFgAAgGG2QBcAAECwuHTexkCXcFZenXFFoEs4b3EFCwAAwDACFgAAgGEELAAAAMMIWAAAAIa16oC1detW5ebmqqioyG/8/vvvV3Z2tt8rIyNDc%2BbMkSTdd999ysjI8Ju/9NJLfe/3eDyaMWOGcnNz1b9/f82bN0/Hjx/3ze/atUs333yz%2Bvbtq%2BHDh%2BuZZ575bg4YAACEhFYbsFauXKlFixapc%2BfOp8wtWrRITqfT9/rnP/%2Bp733ve7ryyit920ybNs1vm%2B3bt/vm5s%2Bfr/r6em3YsEFr1qxRVVWViouLJUnHjx/XlClTdPnll2vr1q0qKSnRU089pc2bN1t/0AAAICS02oAVFRWl8vLy0wasr3r22Wd10UUXKS8v74zbHjlyRBUVFSoqKlJSUpJSUlI0ffp0rVmzRg0NDdqyZYsaGho0bdo0RUdHKzMzU2PGjJHD4TBxWAAA4DzQagPWhAkTFBcXd8btampqtGLFCt1zzz1%2B43/961914403qk%2BfPho9erR27twp6YvbfxEREUpPT/dtm5mZqc8//1x79uxRZWWl0tPTFRER4ZvPyMjwvR8AAOBMgv6LRp977jn169dPPXr08I2lpaUpPDxcd911l2JiYlRaWqpbb71VmzZtksfjUWxsrMLCwnzbx8fHS5Lcbrc8Ho/sdrvfPhISEuTxeNTc3Kzw8DNn0urqarlcLr8xmy1aycnJ53Kop4iICPf7CXPorXXorbXor3WCsac2W3DUHIrnbVAHrKamJj3//PNatmyZ33hBQYHfv%2B%2B55x5t2LBBFRUVatu2rbxe71nv6z8D2Zk4HA6VlpaeUlNhYeFZ77cl7PZ2lqwLemslemst%2BgtJSkyMCXQJZyWUztugDlhvvfWWTp486fcJwdOJiIjQhRdeqOrqav3gBz9QbW2tmpqafLcBPR6PJKl9%2B/ZKSkrSvn37/N7v8XiUkJDQoqtXkjRu3DgNHjzYb8xmi5bbXdfCI2uZiIhw
2e3tVFNTr6amZqNrn%2B/orXXorbXor3WC8eqK6f/uWMXK8zZQITOoA9Zrr72myy%2B/XDbbvw/D6/VqyZIlGjlypL7//e9Lkk6ePKn9%2B/crLS1NPXv2lNfr1e7du5WZmSlJcjqdstvt6tq1q7KysvTCCy%2BosbHRt67T6VTv3r1bXFdycvIptwNdrmNqbLTml11TU7Nla5/v6K116K216C8kBd05EErnbfDF8f%2Bwa9cuderUyW8sLCxMH3/8sRYuXKhPPvlEdXV1Ki4uVps2bTR06FAlJSVpxIgReuyxx3T06FEdPnxYy5cv1%2BjRo2Wz2ZSXl6fY2Fg9%2BeSTqq%2Bv144dO1ReXq78/PwAHSUAAAg2rfYKVnZ2tiSpsbFRklRRUSHpi6tJX3K5XOrQocMp73344Yf16KOPatSoUaqtrVWvXr307LPPKjo6WpL04IMPasGCBRoyZIjatGmja6%2B91vdlppGRkVqxYoUWLFigsrIydejQQUVFRRo4cKCVhwsAAEJImPfbPPGNs%2BZyHTO%2Bps0WrsTEGLnddSFzSbW1oLfWobfWor/WsdnCNax4a6DLOCuvzrgi0CW0iJXnbceOZ/7KJysE9S1CAACA1oiABQAAYBgBCwAAwDACFgAAgGEELAAAAMMIWAAAAIYRsAAAAAwjYAEAABhGwAIAADCMgAUAAGAYAQsAAMAwAhYAAIBhBCwAAADDCFgAAACGEbAAAAAMI2ABAAAYRsACAAAwjIAFAABgGAELAADAMAIWAACAYQQsAAAAwwhYAAAAhhGwAAAADCNgAQAAGEbAAgAAMIyABQAAYBgBCwAAwDACFgAAgGEELAAAAMMIWAAAAIYRsAAAAAxr1QFr69atys3NVVFRkd/42rVr9f3vf1/Z2dl%2Br3fffVeS1NzcrJKSEg0ZMkT9%2BvXT5MmTdeDAAd/7PR6PZsyYodzcXPXv31/z5s3T8ePHffO7du3SzTffrL59%2B2r48OF65plnvpsDBgAAIaHVBqyVK1dq0aJF6ty582nn%2B/XrJ6fT6ffq1auXJOn555/XSy%2B9pLKyMr3xxhvq0qWLCgoK5PV6JUnz589XfX29NmzYoDVr1qiqqkrFxcWSpOPHj2vKlCm6/PLLtXXrVpWUlOipp57S5s2bv5sDBwAAQa/VBqyoqCiVl5d/bcD6Jg6HQ5MmTVK3bt0UGxuroqIiVVVVaceOHTpy5IgqKipUVFSkpKQkpaSkaPr06VqzZo0aGhq0ZcsWNTQ0aNq0aYqOjlZmZqbGjBkjh8NhwVECAIBQ1GoD1oQJExQXF/e184cOHdJPf/pT9evXT0OGDNG6deskfXEF6qOPPlJGRoZv29jYWHXu3FlOp1O7du1SRESE0tPTffOZmZn6/PPPtWfPHlVWVio9PV0RERG%2B%2BYyMDO3cudOCowQAAKHIFugCvo2kpCR16dJFd999t7p3764//vGPuvfee5WcnKzvfe978nq9io%2BP93tPfHy83G63EhISFBsbq7CwML85SXK73fJ4PLLb7X7vTUhIkMfjUXNzs8LDz5xJq6ur5XK5/MZstmglJyd/20M%2BrYiIcL%2BfMIfeWofeWov%2BWicYe2qzBUfNoXjeBmXAGjhwoAYOHOj79zXXXKM//vGPWrt2rWbNmiVJvuetTueb5r7OfwayM3E4HCotLfUbKygoUGFh4VnvtyXs9naWrAt6ayV6ay36C0lKTIwJdAlnJZTO26AMWKeTmpqqnTt3KiEhQeHh4fJ4PH7zHo9H7du3V1JSkmpra9XU1OS7Dfjltl/O79u375T3frluS4wbN06DBw/2G7PZouV2133Lozu9iIhw2e3tVFNTr6amZqNrn%2B/orXXorbXor3WC8eqK6f/uWMXK8zZQITMoA9YLL7yg%2BPh4XX311b6xqqoqpaWlKSoqSj169FBlZaUuu%2BwySVJNTY3279%2BvXr16KTU1VV6vV7t371ZmZqYkyel0ym63q2vXrsrKytILL7ygxsZG2Ww233zv3r1bXF9ycvIptwNdrmNqbLTml11TU7Nla5/v6K116K216C8kBd05EErnbfDFcUknT57UQw89JKfTqYaGBm3YsEF/%2BtOfNH78eElSfn6%2BVq9eraqqKtXW1qq4uFg9e/ZUdna2kpKSNGLECD322GM6evSoDh8%2BrOXLl2v06NGy2WzKy8tTbGysnnzySdXX12vHjh0qLy9Xfn5%2BgI8aAAAEi1Z7BSs7O1uS1NjYKEmqqKiQ9MXVpAkTJqiurk533XWXXC6XOnXqpOXLlysrK0uSNH78eLlcLt1yyy2qq6tTTk6O3zNRDz74oBYsWKAhQ4aoTZs2uvbaa31fZhoZGakVK1ZowYIFKisrU4cOHVRUVOT3zBcAAMA3CfN%2Bmye%2BcdZcrmPG17TZwpWYGCO3uy5kLqm2FvTWOvTWWvTXOjZbuIYVbw10GWfl1RlXBLqEFrHyvO3Y8eu/8slKQXmLEAAAoDUjYAEAABhGwAIAADCMgAUAAGAYAQsAAMAwAhYAAIBhBCwAAADDCFgAAACGEbAAAAAMI2ABAAAYRsACAAAwjIAFAABgGAELAADAMAIWAACAYQQsAAAAwwhYAAAAhhGwAAAADCNgAQAAGEbAAgAAMIyABQAAYBgBCwAAwDACFgAAgGEELAAAAMMIWAAAAIYRsAAAAAwjYAEAABhGwAIAADCMgAUAAGAYAQsAAMAwAhYAAIBhBCwAAADDCFgAAACGteqAtXXrVuXm5qqoqOiUuc2bN%2Bv6669Xnz59NGLECP3ud7/zzT3xxBPq2bOnsrOz/V5HjhyRJJ04cUI///nPNWDAAOXk5KiwsFBut9v3/oMHD%2Br2229XTk6OBg0apKVLl6q5udn6AwYAACGh1QaslStXatGiRercufMpc%2B%2B%2B%2B65mzZqlwsJCvfXWW5o7d64efPBBbd%2B%2B3bfNDTfcIKfT6ffq0KGDJKmkpESVlZVyOBzatGmTvF6v5syZ43vvnXfeqZSUFFVUVGjVqlWqqKjQs88%2Ba/1BAwCAkNBqA1ZUVJTKy8tPG7A8Ho%2BmTJmioUOHymazKS8vT5dccolfwPo6jY2NKi8v1/Tp03XhhRcqISFBM2bM0JYtW/TJJ5/I6XRq9%2B7dmjVrluLi4tSlSxdNmjRJDofDisMEAAAhyBboAr7OhAkTvnZuwIABGjBggO/fjY2NcrlcSklJ8Y29//77Gj9%2BvD744ANdeOGFmjNnjvr376/9%2B/fr2LFjyszM9G3brVs3tW3bVpWVlaqurlZqaqri4%2BN985mZmdq7d69qa2sVGxt7xtqrq6vlcrn8xmy2aCUnJ7fo2FsqIiLc7yfMobfWobfWor/WCcae2mzBUXMonretNmCdjeLiYkVHR%2Bvqq6%2BWJF1wwQVKS0vTzJkzlZycLIfDoalTp2r9%2BvXyeDySJLvd7reG3W6X2%2B2Wx%2BM5Ze7LsOV2u1sUsBwOh0pLS/3GCgoKV
FhY%2BK2P8ZvY7e0sWRf01kr01lr0F5KUmBgT6BLOSiidt0EdsLxer4qLi7VhwwatXr1aUVFRkqQxY8ZozJgxvu0mTZqkl19%2BWevXr/dd%2BfJ6vd%2B47rkYN26cBg8e7Ddms0XL7a47p3W/KiIiXHZ7O9XU1KupiYfwTaK31qG31qK/1gnGqyum/7tjFSvP20CFzKANWM3NzZozZ47effddvfDCC0pLS/vG7VNTU1VdXa2kpCRJXzzHFRPz76Z/9tlnat%2B%2BvZqamnxXub7k8XgUFhbme%2B%2BZJCcnn3I70OU6psZGa37ZNTU1W7b2%2BY7eWofeWov%2BQlLQnQOhdN4GXxz//x555BF9%2BOGHpw1X//M//6Nt27b5jVVVVSktLU1paWmKj49XZWWlb%2B6DDz7QyZMnlZWVpaysLB06dEhHjx71zTudTnXv3t0vkAEAAHydoAxYb7/9ttavX6%2BysjIlJCScMu/xeLRw4ULt2bNHJ06c0DPPPKP9%2B/dr5MiRioiI0NixY7VixQodOnRIbrdbv/jFLzRs2DB16NBBGRkZys7O1rJly1RbW6uqqiqtWrVK%2Bfn5AThSAAAQjFrtLcLs7GxJX3xCUJIqKiokfXE1ac2aNTp27JgGDRrk955%2B/frpmWee0cyZMyV98eyVx%2BNR9%2B7d9etf/1oXXHCBJKmwsFB1dXW64YYb1NjYqEGDBumBBx7wrfPLX/5S8%2BfP1xVXXKHY2FiNHz9eP/nJT6w%2BZAAAECLCvOf6RDdaxOU6ZnxNmy1ciYkxcrvrQuaedWtBb61Db61Ff61js4VrWPHWQJdxVl6dcUWgS2gRK8/bjh3jjK7XUkF5ixAAAKA1I2ABAAAYRsACAAAwjIAFAABgWKv9FCEAADg3Vz32f4EuocW2P3xloEswiitYAAAAhhGwAAAADCNgAQAAGEbAAgAAMIyABQAAYBgBCwAAwDDjAWvw4MEqLS3VoUOHTC8NAAAQFIwHrJtuukmvvPKKhg4dqttuu02bN29WY2Oj6d0AAAC0WsYDVkFBgV555RX97ne/U48ePfTII48oLy9PS5cu1d69e03vDgAAoNWx7BmszMxMzZ49W2%2B88Ybmzp2r3/3ud7r66qs1efJkvfvuu1btFgAAIOAsC1gNDQ165ZVX9LOf/UyzZ89WSkqK5syZo549e2rSpEl66aWXrNo1AABAQBn/W4RVVVUqLy/XH/7wB9XV1WnEiBF69tln1bdvX982/fr10wMPPKDrrrvO9O4BAAACznjAuuaaa9S1a1dNmTJFN954oxISEk7ZJi8vT0ePHjW9awAAgFbBeMBavXq1LrvssjNut2PHDtO7BgAAaBWMP4OVnp6uqVOnqqKiwjf261//Wj/72c/k8XhM7w4AAKDVMR6wFi9erGPHjql79%2B6%2BsYEDB6q5uVlLliwxvTsAAIBWx/gtwj//%2Bc966aWXlJiY6Bvr0qWLiouLde2115reHQAAQKtj/ArW8ePHFRUVdeqOwsNVX19vencAAACtjvGA1a9fPy1ZskSfffaZb%2ByTTz7RwoUL/b6qAQAAIFQZv0U4d%2B5c3XrrrfrRj36k2NhYNTc3q66uTmlpafrNb35jencAAACtjvGAlZaWppdffll/%2BtOftH//foWHh6tr167q37%2B/IiIiTO8OAACg1TEesCQpMjJSQ4cOtWJpAACAVs94wDpw4ICWLVumDz/8UMePHz9l/rXXXjO9SwAAgFbFkmewqqur1b9/f0VHR5teHgAAoNUzHrB27typ1157TUlJSaaXBgAACArGv6ahffv2XLkCAADnNeMBa8qUKSotLZXX6z3ntbZu3arc3FwVFRWdMvfKK6/ouuuuU58%2BfTRq1Cj9%2Bc9/9s01NzerpKREQ4YMUb9%2B/TR58mQdOHDAN%2B/xeDRjxgzl5uaqf//%2Bmjdvnt/zYrt27dLNN9%2Bsvn37avjw4XrmmWfO%2BVgAAMD5w/gtwj/96U/6xz/%2BobVr16pTp04KD/fPcL/97W9btM7KlStVXl6uzp07nzK3a9cuzZ49W6Wlpbr88su1adMm3XHHHdq4caMuuOACPf/883rppZe0cuVKpaSkqKSkRAUFBVq3bp3CwsI0f/58nTx5Uhs2bFBDQ4PuuusuFRcX6/7779fx48c1ZcoUjR07VmVlZdq7d69uvfVWderUScOHDzfSIwAAENqMX8GKjY3VgAEDlJeXp27duqlr165%2Br5aKior62oD14osvKi8vT3l5eYqKitL111%2BvSy65ROvXr5ckORwOTZo0Sd26dVNsbKyKiopUVVWlHTt26MiRI6qoqFBRUZGSkpKUkpKi6dOna82aNWpoaNCWLVvU0NCgadOmKTo6WpmZmRozZowcDoexHgEAgNBm/ArW4sWLjawzYcKEr52rrKxUXl6e31hGRoacTqeOHz%2Bujz76SBkZGb652NhYde7cWU6nU8eOHVNERITS09N985mZmfr888%2B1Z88eVVZWKj093e9LUTMyMvTiiy%2B2uPbq6mq5XC6/MZstWsnJyS1eoyUiIsL9fsIcemsdemst%2Bmsdemq9UOqxJV80umfPHr388sv617/%2B5Qtc//znP9WnTx8j63s8HsXHx/uNxcfH66OPPtJnn30mr9d72nm3262EhATFxsYqLCzMb06S3G63PB6P7Ha733sTEhLk8XjU3Nx8yi3P03E4HCotLfUbKygoUGFh4VkdZ0vZ7e0sWRf01kr01lr0F8EolM5b4wFr27Zt%2BtnPfqauXbtq3759Wrx4sQ4cOKAJEyboscce05AhQ4zs50wP0X/T/Ld5AP8/A9mZjBs3ToMHD/Ybs9mi5XbXnfV%2Bv0lERLjs9naqqalXU1Oz0bXPd/TWOvTWWvTXOqF0daW1suK8TUyMMbpeSxkPWCUlJbrnnns0ceJE9erVS9IXf59wyZIlWr58uZGAlZiYKI/H4zfm8XiUlJSkhIQEhYeHn3a%2Bffv2SkpKUm1trZqamny3Ab/c9sv5ffv2nfLeL9dtieTk5FNuB7pcx9TYaM0vu6amZsvWPt/RW%2BvQW2vRXwSjUDpvjcfxDz74QPn5%2BZL8r/pceeWVqqqqMrKPrKws7dy502/M6XSqd%2B/eioqKUo8ePVRZWembq6mp0f79%2B9WrVy/17NlTXq9Xu3fv9nuv3W5X165dlZWVpffff1%2BNjY2nrA0AANASxgNWXFzcaf8GYXV1tSIjI43sY%2BzYsfrLX/6iLVu26MSJEyovL9e%2Bfft0/fXXS5Ly8/O1evVqVVVVqba2VsXFxerZs6eys7OVlJSkESNG6LHHHtPRo0d1%2BPBhLV%2B%2BXKNHj5bNZlNeXp5iY2P15JNPqr6%2BXjt27FB5ebkvNAIAAJyJ8VuEP/zhD/XII4/o/vvv943t3btXCxYs0I9%2B9KMWr5OdnS1JvitJFRUVkr64mnTJJZeouLhYixcv1sGDB9W9e3c99dRT6tix
oyRp/PjxcrlcuuWWW1RXV6ecnBy/h84ffPBBLViwQEOGDFGbNm107bXX%2Br7MNDIyUitWrNCCBQtUVlamDh06qKioSAMHDjynvgAAgPNHmNfEV67/h8OHD2vixIn6%2BOOP1dTUpOjoaNXX16tHjx5asWKFLrroIpO7Cxou1zHja9ps4UpMjJHbXRcy96xbC3prHXprLfprHZstXMOKtwa6jJC1/eErLTlvO3aMM7peSxm/gnXBBRdow4YNevPNN7V37161bdtWXbt21RVXXHFWn8QDAAAIVpZ8D1abNm00dOhQK5YGAABo9YwHrMGDB3/jlarXXnvN9C4BAABaFeMB6%2Bqrr/YLWE1NTdq7d6%2BcTqcmTpxoencAAACtjvGANWvWrNOOb9q0SX/7299M7w4AAKDV%2Bc6%2B93/o0KF6%2BeWXv6vdAQAABMx3FrDee%2B%2B9b/U3AAEAAIKN8VuE48ePP2Wsvr5eVVVVGj58uOndAQAAtDrGA1aXLl1O%2BRRhVFSURo8erTFjxpjeHQAAQKtjPGAtWbLE9JIAAABBxXjA%2BsMf/tDibW%2B88UbTuwcAAAg44wFr3rx5am5uPuWB9rCwML%2BxsLAwAhYAAAhJxgPW008/rWeeeUZTp05Venq6vF6v3n//fa1cuVI333yzcnJyTO8SAACgVbHkGayysjKlpKT4xi699FKlpaVp8uTJ2rBhg%2BldAgAAtCrGvwdr3759io%2BPP2Xcbrfr4MGDpncHAADQ6hgPWKmpqVqyZIncbrdvrKamRsuWLdPFF19sencAAACtjvFbhHPnztXMmTPlcDgUExOj8PBw1dbWqm3btlq%2BfLnp3QEAALQ6xgNW//79tWXLFr355ps6fPiwvF6vUlJS9OMf/1hxcXGmdwcAANDqGA9YktSuXTsNGTJEhw8fVlpamhW7AAAAaLWMP4N1/PhxzZ49W3369NFVV10l6YtnsG677TbV1NSY3h0AAECrYzxgLV26VLt27VJxcbHCw/%2B9fFNTk4qLi03vDgAAoNUxHrA2bdqkX/7yl7ryyit9f/TZbrdr8eLF2rx5s%2BndAQAAtDrGA1ZdXZ26dOlyynhSUpI%2B//xz07sDAABodYwHrIsvvlh/%2B9vfJMnvbw9u3LhRF110kendAQAAtDrGP0X4k5/8RHfeeaduuukmNTc3a9WqVdq5c6c2bdqkefPmmd4dAABAq2M8YI0bN042m03PPfecIiIitGLFCnXt2lXFxcW68sorTe8OAACg1TEesI4ePaqbbrpJN910k%2BmlAQAAgoLxZ7CGDBni9%2BwVAADA%2BcZ4wMrJydGrr75qelkAAICgYfwW4YUXXqiHH35YZWVluvjii9WmTRu/%2BWXLlpneJQAAQKtiPGB99NFH%2Bt73vidJcrvdppcHAABo9YwFrKKiIpWUlOg3v/mNb2z58uUqKCgwtQuft956S7feeqvfmNfrVUNDg1avXq0JEyYoMjLSb/6///u/fX8bcfXq1Xr%2B%2BeflcrmUnp6uefPmKSsrS5J04sQJPfzww9qyZYtOnDihnJwcLVy4UImJicaPAwAAhCZjAev1118/ZaysrMySgNWvXz85nU6/sRUrVmj37t2SpNTU1NPW82WdTzzxhJ5%2B%2Bmmlp6dr9erVmjp1qjZv3qzo6GiVlJSosrJSDodD7dq10/z58zVnzhytWLHC%2BHEAAIDQZOwh99N9cvC7%2BjThv/71L61atUr33nvvGbd1OBwaNWqUevfurbZt2%2Bq2226TJL3xxhtqbGxUeXm5pk%2BfrgsvvFAJCQmaMWOGtmzZok8%2B%2BcTqwwAAACHC2BWsL/%2Bw85nGrPD444/rpptu0kUXXaQDBw6orq5OBQUF2r59uyIjI3Xrrbdq0qRJCgsLU2Vlpa6%2B%2Bmrfe8PDw9WzZ085nU717NlTx44dU2Zmpm%2B%2BW7duatu2rSorK5WSktKieqqrq%2BVyufzGbLZoJScnmzng/y8iItzvJ8yht9aht9aiv9ahp9YLpR4bf8j9u/bxxx9r8%2BbN2rx5syQpNjZWl1xyiSZOnKiSkhL9/e9/11133aW4uDiNHj1aHo9H8fHxfmvEx8fL7XbL4/FIkux2u9%2B83W4/qwf2HQ6HSktL/cYKCgpUWFj4bQ7xjOz2dpasC3prJXprLfqLYBRK523QB6znn39ew4cPV8eOHSVJmZmZfg/a9%2B/fX%2BPHj9fatWs1evRoSWe%2BdXmutzbHjRunwYMH%2B43ZbNFyu%2BvOad2viogIl93eTjU19Wpqaja69vmO3lqH3lqL/lonlK6utFZWnLeJiTFG12spYwGroaFBM2fOPOOY6e/B2rRpk2bPnv2N26SmpmrTpk2SpMTERN%2BVqi8PvTf1AAATsklEQVR5PB716NFDSUlJvn/HxPz7/5DPPvtM7du3b3FNycnJp9wOdLmOqbHRml92TU3Nlq19vqO31qG31qK/CEahdN4ai%2BN9%2B/ZVdXW13%2Bt0Yybt2rVLBw8e1BVXXOEbe/XVV/W///u/ftvt2bNHaWlpkqSsrCxVVlb65pqamvTee%2B%2Bpd%2B/eSktLU3x8vN/8Bx98oJMnT/q%2BxgEAAOBMjF3B%2Bs/bct%2BV9957TwkJCYqNjfWNtWnTRo8%2B%2Bqguvvhi5eTk6O9//7vWrFmjRx99VJKUn5%2Bvu%2B%2B%2BW9dee63S09P1q1/9SpGRkRo4cKAiIiI0duxYrVixQtnZ2Wrbtq1%2B8YtfaNiwYerQocN3fnwAACA4BfUzWEeOHPE9e/WloUOHau7cuXrooYd06NAhdejQQXPnztXw4cMlSQMGDNDdd9%2BtGTNm6NNPP1V2drbKysrUtm1bSVJhYaHq6up0ww03qLGxUYMGDdIDDzzwXR8aAAAIYmHe7%2BrLqs5zLtcx42vabOFKTIyR210XMvesWwt6ax16ay36ax2bLVzDircGuoyQtf3hKy05bzt2jDO6XkvxkQgAAADDCFgAAACGEbAAAAAMI2ABAAAYRsACAAAwjIAFAABgGAELAADAMAIWAACAYQQsAAAAwwhYAAAAhhGwAAAADCNgAQAAGEbAAgAAMIyABQAAYBgBCwAAwDACFgAAgGEELAAAAMMIWAAAAIYRsAAAAAwjYAEAABhGwAIAADCMgAUAAGAYAQsAAMAwAhYAAIBhBCwAAADDCFgAAACGEbAAAAAMI2ABAAAYZgt0AQCA89dVj/1foEsALMEVLAAAAMMIWAAAAIYFbcBKT09XVlaWsrOzfa%2BHHnpIkrRt2zaNHj1aP/zhD3XNNddo/fr1fu9dvXq1RowYoR/%2B8IfKz8/Xzp07fXMnTpzQz3/%2Bcw0YMEA5OTkqLCyU2%2B3%2BTo8NAAAEt6B%2BBmvjxo3q1KmT31h1dbWmT5%2BuefPm6brrrtPbb7%2BtadOmqWvXrsrOztbrr7%2BuJ554Qk8//bTS09O1evVqTZ06VZs
3b1Z0dLRKSkpUWVkph8Ohdu3aaf78%2BZozZ45WrFgRoKMEAADBJmivYH2dl156SV26dNHo0aMVFRWl3NxcDR48WC%2B%2B%2BKIkyeFwaNSoUerdu7fatm2r2267TZL0xhtvqLGxUeXl5Zo%2BfbouvPBCJSQkaMaMGdqyZYs%2B%2BeSTQB4WAAAIIkEdsJYtW6aBAwfq0ksv1fz581VXV6fKykplZGT4bZeRkeG7DfjV%2BfDwcPXs2VNOp1P79%2B/XsWPHlJmZ6Zvv1q2b2rZtq8rKyu/moAAAQNAL2luEP/jBD5Sbm6tHH31UBw4c0IwZM7Rw4UJ5PB6lpKT4bZuQkOB7jsrj8Sg%2BPt5vPj4%2BXm63Wx6PR5Jkt9v95u12%2B1k9h1VdXS2Xy%2BU3ZrNFKzk5ucVrtERERLjfT5hDb61Db61FfxHMQum8DdqA5XA4fP%2B7W7dumjVrlqZNm6a%2Bffue8b1er/ec5ltSW2lpqd9YQUGBCgsLz2ndr2O3t7NkXdBbK9Fba9FfBKNQOm%2BDNmB9VadOndTU1KTw8HDflagvud1uJSUlSZISExNPmfd4POrRo4dvG4/Ho5iYGN/8Z599pvbt27e4lnHjxmnw4MF%2BYzZbtNzuurM6pjOJiAiX3d5ONTX1ampqNrr2%2BY7eWofeWov%2BIphZcd4mJsaceSMLBGXAeu%2B997R%2B/Xrdd999vrGqqipFRkYqLy9Pv//97/2237lzp3r37i1JysrKUmVlpUaOHClJampq0nvvvafRo0crLS1N8fHxqqysVGpqqiTpgw8%2B0MmTJ5WVldXi%2BpKTk0%2B5HehyHVNjozW/7Jqami1b%2B3xHb61Db61FfxGMQum8Dcqbne3bt5fD4VBZWZlOnjypvXv36vHHH9e4ceN0ww036ODBg3rxxRd14sQJvfnmm3rzzTc1duxYSVJ%2Bfr7%2B8Ic/6J133lF9fb2efPJJRUZGauDAgYqIiNDYsWO1YsUKHTp0SG63W7/4xS80bNgwdejQIcBHDQAAgkVQXsFKSUlRWVmZli1b5gtII0eOVFFRkaKiovTUU09p0aJFWrhwoVJTU7V06VJ9//vflyQNGDBAd999t2bMmKFPP/1U2dnZKisrU9u2bSVJhYWFqqur0w033KDGxkYNGjRIDzzwQACPFgAABJsw77k%2B0Y0WcbmOGV/TZgtXYmKM3O66kLmk2lrQW%2BvQW2sFW3/5Y8/40vaHr7TkvO3YMc7oei0VlLcIAQAAWjMCFgAAgGEELAAAAMMIWAAAAIYRsAAAAAwjYAEAABhGwAIAADCMgAUAAGAYAQsAAMAwAhYAAIBhBCwAAADDCFgAAACGEbAAAAAMI2ABAAAYRsACAAAwjIAFAABgGAELAADAMAIWAACAYQQsAAAAwwhYAAAAhhGwAAAADCNgAQAAGEbAAgAAMIyABQAAYBgBCwAAwDACFgAAgGEELAAAAMMIWAAAAIYRsAAAAAwjYAEAABhGwAIAADAsaAPWwYMHVVBQoJycHOXm5uq%2B%2B%2B5TTU2NPv74Y6Wnpys7O9vv9atf/cr33ldeeUXXXXed%2BvTpo1GjRunPf/6zb665uVklJSUaMmSI%2BvXrp8mTJ%2BvAgQOBOEQAABCkgjZgTZ06VXa7Xa%2B//rrWrl2rDz/8UI8%2B%2Bqhv3ul0%2Br0mT54sSdq1a5dmz56tWbNm6a9//asmTZqkO%2B64Q4cPH5YkPf/883rppZdUVlamN954Q126dFFBQYG8Xm9AjhMAAASfoAxYNTU1ysrK0syZMxUTE6MLLrhAI0eO1Pbt28/43hdffFF5eXnKy8tTVFSUrr/%2Bel1yySVav369JMnhcGjSpEnq1q2bYmNjVVRUpKqqKu3YscPqwwIAACEiKAOW3W7X4sWL1aFDB9/YoUOHlJyc7Pv3vffeq/79%2B%2Bvyyy/XsmXL1NDQIEmqrKxURkaG33oZGRlyOp06fvy4PvroI7/52NhYde7cWU6n0%2BKjAgAAocIW6AJMcDqdeu655/Tkk08qMjJSffr00bBhw/Twww9r165duvPOO2Wz2XTXXXfJ4/EoPj7e7/3x8fH66KOP9Nlnn8nr9Z523u12t7ie6upquVwuvzGbLdovAJoQERHu9xPm0Fvr0Ftr0V8Es1A6b4M%2BYL399tuaNm2aZs6cqdzcXEnSb3/7W998r169NGXKFD311FO66667JOmMz1Od6/NWDodDpaWlfmMFBQUqLCw8p3W/jt3ezpJ1QW%2BtRG%2BtRX8RjELpvA3qgPX666/rnnvu0fz583XjjTd%2B7Xapqak6cuSIvF6vEhMT5fF4/OY9Ho%2BSkpKUkJCg8PDw0863b9%2B%2BxXWNGzdOgwcP9huz2aLldte1eI2WiIgIl93eTjU19Wpqaja69vmO3lqH3lqL/iKYWXHeJibGGF2vpYI2YP3jH//Q7Nmz9fjjj6t///6%2B8W3btumdd97RtGnTfGN79uxRamqqwsLClJWVpZ07d/qt5XQ6dc011ygqKko9evRQZWWlLrvsMklfPFC/f/9%2B9erVq8W1JScnn3I70OU6psZGa37ZNTU1W7b2%2BY7eWofeWov%2BIhiF0nkblDc7Gxsbdf/992vWrFl%2B4UqS4uLitHz5cq1bt04NDQ1yOp361a9%2Bpfz8fEnS2LFj9Ze//EVbtmzRiRMnVF5ern379un666%2BXJOXn52v16tWqqqpSbW2tiouL1bNnT2VnZ3/nxwkAAIJTUF7Beuedd1RVVaVFixZp0aJFfnMbN25USUmJSktL9fOf/1xxcXG65ZZbNHHiREnSJZdcouLiYi1evFgHDx5U9%2B7d9dRTT6ljx46SpPHjx8vlcumWW25RXV2dcnJyTnmeCgAA4JuEefkGze%2BEy3XM%2BJo2W7gSE2PkdteFzCXV1oLeWofeWivY%2BnvVY/8X6BLQSmx/%2BEpLztuOHeOMrtdSQXmLEAAAoDUjYAEAABhGwAIAADCMgAUAAGAYAQsAAMAwAhYAAIBhBCwAAADDCFgAAACGEbAAAAAMI2ABAAAYRsACAAAwjIAFAABgGAELAADAMAIWAACAYQQsAAAAwwhYAAAAhhGwAAAADCNgAQAAGEbAAgAAMIyABQAAYBgBCwAAwDBboAsAAJh16byNgS4BOO9xBQsAAMAwAhYAAIBhBCwAAADDCFgAAACGEbAAAAAMI2ABAAAYRsACAAAwjIAFAABgGAELAADAMAIWAACAYQSs0zh48KBuv/125eTkaNCgQVq6dKmam5sDXRYAAAgS/C3C07jzzjuVmZmpiooKffrpp5oyZYo6dOign/70p4EuDUAAXPXY/wW6BABBhitYX%2BF0OrV7927NmjVLcXFx6tKliyZNmiSHwxHo0gAAQJDgCtZXVFZWKjU1VfHx8b6xzMxM7d27V7W1tYqNjT3jGtXV1XK5XH5jNlu0kpOTjdYaERHu9xPm0FvrRESE69J5GwNdBoBWKJ
R%2B5xKwvsLj8chut/uNfRm23G53iwKWw%2BFQaWmp39gdd9yhO%2B%2B801yh%2BiLIPfvs0xo3bpzx8Ha%2Bo7fWqa6u1sQLPqS3FqmurpbD4aC/FqC31qmurtYTTzwRUr0NnahokNfrPaf3jxs3TmvXrvV7jRs3zlB1/%2BZyuVRaWnrK1TKcO3prHXprLfprHXprnVDsLVewviIpKUkej8dvzOPxKCwsTElJSS1aIzk5OWQSOAAAOHtcwfqKrKwsHTp0SEePHvWNOZ1Ode/eXTExMQGsDAAABAsC1ldkZGQoOztby5YtU21traqqqrRq1Srl5%2BcHujQAABAkIh544IEHAl1Ea/PjH/9YGzZs0EMPPaSXX35Zo0eP1uTJkxUWFhbo0k4RExOjyy67jKtrFqC31qG31qK/1qG31gm13oZ5z/WJbgAAAPjhFiEAAIBhBCwAAADDCFgAAACGEbAAAAAMI2ABAAAYRsACAAAwjIAFAABgGAELAADAMAIWAACAYQSsIHTw4EHdfvvtysnJ0aBBg7R06VI1NzcHuqyQsXXrVuXm5qqoqCjQpYScgwcPqqCgQDk5OcrNzdV9992nmpqaQJcVEnbv3q2JEyeqb9%2B%2Bys3N1YwZM%2BRyuQJdVsh55JFHlJ6eHugyQkp6erqysrKUnZ3tez300EOBLuucEbCC0J133qmUlBRVVFRo1apVqqio0LPPPhvoskLCypUrtWjRInXu3DnQpYSkqVOnym636/XXX9fatWv14Ycf6tFHHw10WUHv5MmTuvXWW3XZZZdp27Zt2rBhgz799FPxp2bN2rVrl9atWxfoMkLSxo0b5XQ6fa/58%2BcHuqRzRsAKMk6nU7t379asWbMUFxenLl26aNKkSXI4HIEuLSRERUWpvLycgGWBmpoaZWVlaebMmYqJidEFF1ygkSNHavv27YEuLejV19erqKhIU6ZMUWRkpJKSkjRs2DB9%2BOGHgS4tZDQ3N2vBggWaNGlSoEtBkCBgBZnKykqlpqYqPj7eN5aZmam9e/eqtrY2gJWFhgkTJiguLi7QZYQku92uxYsXq0OHDr6xQ4cOKTk5OYBVhYb4%2BHiNGTNGNptNkrRnzx79/ve/11VXXRXgykLHb3/7W0VFRem6664LdCkhadmyZRo4cKAuvfRSzZ8/X3V1dYEu6ZwRsIKMx%2BOR3W73G/sybLnd7kCUBHwrTqdTzz33nKZNmxboUkLGwYMHlZWVpauvvlrZ2dkqLCwMdEkh4ciRI3riiSe0YMGCQJcSkn7wgx8oNzdXmzdvlsPh0DvvvKOFCxcGuqxzRsAKQl6vN9AlAOfk7bff1uTJkzVz5kzl5uYGupyQkZqaKqfTqY0bN2rfvn269957A11SSFi8eLFGjRql7t27B7qUkORwODRmzBhFRkaqW7dumjVrljZs2KCTJ08GurRzQsAKMklJSfJ4PH5jHo9HYWFhSkpKClBVQMu9/vrruv322zV37lxNmDAh0OWEnLCwMHXp0kVFRUXasGGDjh49GuiSgtq2bdv0z3/%2BUwUFBYEu5bzRqVMnNTU16dNPPw10KeeEgBVksrKydOjQIb9fmk6nU927d1dMTEwAKwPO7B//%2BIdmz56txx9/XDfeeGOgywkZ27Zt04gRI/y%2BriU8/Itf723atAlUWSFh/fr1%2BvTTTzVo0CDl5ORo1KhRkqScnBy9/PLLAa4u%2BL333ntasmSJ31hVVZUiIyOD/vlMAlaQycjIUHZ2tpYtW6ba2lpVVVVp1apVys/PD3RpwDdqbGzU/fffr1mzZql///6BLiekZGVlqba2VkuXLlV9fb2OHj2qJ554Qpdeeikf2jhH9913nzZt2qR169Zp3bp1KisrkyStW7dOgwcPDnB1wa99%2B/ZyOBwqKyvTyZMntXfvXj3%2B%2BOMaN26cIiIiAl3eOQnz8kBP0Dl8%2BLDmz5%2Bvv//974qNjdX48eN1xx13KCwsLNClBb3s7GxJX4QBSb5PZTmdzoDVFCq2b9%2Bu//qv/1JkZOQpcxs3blRqamoAqgod77//vhYtWqR3331X0dHRuvzyy3XfffcpJSUl0KWFlI8//lhDhgzR%2B%2B%2B/H%2BhSQsZbb72lZcuW6f3331dkZKRGjhypoqIiRUVFBbq0c0LAAgAAMIxbhAAAAIYRsAAAAAwjYAEAABhGwAIAADCMgAUAAGAYAQsAAMAwAhYAAIBhBCwAAADDCFgAAACGEbAAAAAMI2ABAAAYRsACAAAwjIAFAABg2P8D5yfL8VDAn2EAAAAASUVORK5CYII%3D\"/>\n        </div>\n        <div role=\"tabpanel\" class=\"tab-pane col-md-12\" id=\"common4156873781702074801\">\n            \n<table class=\"freq table table-hover\">\n    <thead>\n    <tr>\n        <td class=\"fillremaining\">Value</td>\n        <td class=\"number\">Count</td>\n        <td class=\"number\">Frequency (%)</td>\n        <td style=\"min-width:200px\">&nbsp;</td>\n    </tr>\n    </thead>\n    <tr class=\"\">\n        <td class=\"fillremaining\">4.0</td>\n        <td class=\"number\">18066</td>\n        <td class=\"number\">49.5%</td>\n        <td>\n            <div class=\"bar\" style=\"width:100%\">&nbsp;</div>\n        </td>\n</tr><tr class=\"\">\n        <td class=\"fillremaining\">4.5</td>\n        <td class=\"number\">10685</td>\n        <td class=\"number\">29.3%</td>\n        <td>\n            <div class=\"bar\" style=\"width:59%\">&nbsp;</div>\n        </td>\n</tr><tr class=\"\">\n        <td class=\"fillremaining\">3.5</td>\n        <td class=\"number\">5455</td>\n        <td class=\"number\">14.9%</td>\n        <td>\n            <div class=\"bar\" style=\"width:30%\">&nbsp;</div>\n        </td>\n</tr><tr class=\"\">\n        <td class=\"fillremaining\">5.0</td>\n        <td 
class=\"number\">1182</td>\n        <td class=\"number\">3.2%</td>\n        <td>\n            <div class=\"bar\" style=\"width:7%\">&nbsp;</div>\n        </td>\n</tr><tr class=\"\">\n        <td class=\"fillremaining\">3.0</td>\n        <td class=\"number\">895</td>\n        <td class=\"number\">2.5%</td>\n        <td>\n            <div class=\"bar\" style=\"width:5%\">&nbsp;</div>\n        </td>\n</tr><tr class=\"\">\n        <td class=\"fillremaining\">2.5</td>\n        <td class=\"number\">175</td>\n        <td class=\"number\">0.5%</td>\n        <td>\n            <div class=\"bar\" style=\"width:1%\">&nbsp;</div>\n        </td>\n</tr><tr class=\"\">\n        <td class=\"fillremaining\">2.0</td>\n        <td class=\"number\">37</td>\n        <td class=\"number\">0.1%</td>\n        <td>\n            <div class=\"bar\" style=\"width:1%\">&nbsp;</div>\n        </td>\n</tr><tr class=\"\">\n        <td class=\"fillremaining\">0.0</td>\n        <td class=\"number\">11</td>\n        <td class=\"number\">0.0%</td>\n        <td>\n            <div class=\"bar\" style=\"width:1%\">&nbsp;</div>\n        </td>\n</tr><tr class=\"\">\n        <td class=\"fillremaining\">1.5</td>\n        <td class=\"number\">5</td>\n        <td class=\"number\">0.0%</td>\n        <td>\n            <div class=\"bar\" style=\"width:1%\">&nbsp;</div>\n        </td>\n</tr><tr class=\"\">\n        <td class=\"fillremaining\">1.0</td>\n        <td class=\"number\">3</td>\n        <td class=\"number\">0.0%</td>\n        <td>\n            <div class=\"bar\" style=\"width:1%\">&nbsp;</div>\n        </td>\n</tr>\n</table>\n        </div>\n        <div role=\"tabpanel\" class=\"tab-pane col-md-12\"  id=\"extreme4156873781702074801\">\n            <p class=\"h4\">Minimum 5 values</p>\n            \n<table class=\"freq table table-hover\">\n    <thead>\n    <tr>\n        <td class=\"fillremaining\">Value</td>\n        <td class=\"number\">Count</td>\n        <td class=\"number\">Frequency (%)</td>\n        <td style=\"min-width:200px\">&nbsp;</td>\n    </tr>\n    </thead>\n    <tr class=\"\">\n        <td class=\"fillremaining\">0.0</td>\n        <td class=\"number\">11</td>\n        <td class=\"number\">0.0%</td>\n        <td>\n            <div class=\"bar\" style=\"width:7%\">&nbsp;</div>\n        </td>\n</tr><tr class=\"\">\n        <td class=\"fillremaining\">1.0</td>\n        <td class=\"number\">3</td>\n        <td class=\"number\">0.0%</td>\n        <td>\n            <div class=\"bar\" style=\"width:2%\">&nbsp;</div>\n        </td>\n</tr><tr class=\"\">\n        <td class=\"fillremaining\">1.5</td>\n        <td class=\"number\">5</td>\n        <td class=\"number\">0.0%</td>\n        <td>\n            <div class=\"bar\" style=\"width:3%\">&nbsp;</div>\n        </td>\n</tr><tr class=\"\">\n        <td class=\"fillremaining\">2.0</td>\n        <td class=\"number\">37</td>\n        <td class=\"number\">0.1%</td>\n        <td>\n            <div class=\"bar\" style=\"width:21%\">&nbsp;</div>\n        </td>\n</tr><tr class=\"\">\n        <td class=\"fillremaining\">2.5</td>\n        <td class=\"number\">175</td>\n        <td class=\"number\">0.5%</td>\n        <td>\n            <div class=\"bar\" style=\"width:100%\">&nbsp;</div>\n        </td>\n</tr>\n</table>\n            <p class=\"h4\">Maximum 5 values</p>\n            \n<table class=\"freq table table-hover\">\n    <thead>\n    <tr>\n        <td class=\"fillremaining\">Value</td>\n        <td class=\"number\">Count</td>\n        <td class=\"number\">Frequency (%)</td>\n       
 <td style=\"min-width:200px\">&nbsp;</td>\n    </tr>\n    </thead>\n    <tr class=\"\">\n        <td class=\"fillremaining\">3.0</td>\n        <td class=\"number\">895</td>\n        <td class=\"number\">2.5%</td>\n        <td>\n            <div class=\"bar\" style=\"width:5%\">&nbsp;</div>\n        </td>\n</tr><tr class=\"\">\n        <td class=\"fillremaining\">3.5</td>\n        <td class=\"number\">5455</td>\n        <td class=\"number\">14.9%</td>\n        <td>\n            <div class=\"bar\" style=\"width:30%\">&nbsp;</div>\n        </td>\n</tr><tr class=\"\">\n        <td class=\"fillremaining\">4.0</td>\n        <td class=\"number\">18066</td>\n        <td class=\"number\">49.5%</td>\n        <td>\n            <div class=\"bar\" style=\"width:100%\">&nbsp;</div>\n        </td>\n</tr><tr class=\"\">\n        <td class=\"fillremaining\">4.5</td>\n        <td class=\"number\">10685</td>\n        <td class=\"number\">29.3%</td>\n        <td>\n            <div class=\"bar\" style=\"width:59%\">&nbsp;</div>\n        </td>\n</tr><tr class=\"\">\n        <td class=\"fillremaining\">5.0</td>\n        <td class=\"number\">1182</td>\n        <td class=\"number\">3.2%</td>\n        <td>\n            <div class=\"bar\" style=\"width:7%\">&nbsp;</div>\n        </td>\n</tr>\n</table>\n        </div>\n    </div>\n</div>\n</div><div class=\"row variablerow\">\n    <div class=\"col-md-3 namecol\">\n        <p class=\"h4\">is_ebook<br/>\n            <small>Numeric</small>\n        </p>\n    </div><div class=\"col-md-6\">\n    <div class=\"row\">\n        <div class=\"col-sm-6\">\n            <table class=\"stats \">\n                <tr>\n                    <th>Distinct count</th>\n                    <td>2</td>\n                </tr>\n                <tr>\n                    <th>Unique (%)</th>\n                    <td>0.0%</td>\n                </tr>\n                <tr class=\"ignore\">\n                    <th>Missing (%)</th>\n                    <td>0.0%</td>\n                </tr>\n                <tr class=\"ignore\">\n                    <th>Missing (n)</th>\n                    <td>0</td>\n                </tr>\n                <tr class=\"ignore\">\n                    <th>Infinite (%)</th>\n                    <td>0.0%</td>\n                </tr>\n                <tr class=\"ignore\">\n                    <th>Infinite (n)</th>\n                    <td>0</td>\n                </tr>\n            </table>\n\n        </div>\n        <div class=\"col-sm-6\">\n            <table class=\"stats \">\n\n                <tr>\n                    <th>Mean</th>\n                    <td>0.078217</td>\n                </tr>\n                <tr>\n                    <th>Minimum</th>\n                    <td>0</td>\n                </tr>\n                <tr>\n                    <th>Maximum</th>\n                    <td>1</td>\n                </tr>\n                <tr class=\"alert\">\n                    <th>Zeros (%)</th>\n                    <td>92.2%</td>\n                </tr>\n            </table>\n        </div>\n    </div>\n</div>\n<div class=\"col-md-3 collapse in\" id=\"minihistogram-8506970005563105745\">\n    <img 
src=\"data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAAAMgAAABLCAYAAAA1fMjoAAAABHNCSVQICAgIfAhkiAAAAAlwSFlzAAAPYQAAD2EBqD%2BnaQAAADl0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uIDIuMi40LCBodHRwOi8vbWF0cGxvdGxpYi5vcmcv7US4rQAAAQlJREFUeJzt1cEJAlEQBUEVQzIIc/JsTgaxOY13kQaF5YtU3QfepZnjzMwBeOu0egD8svPqAa8ut8fHN9v9usMS8EEgCQSCQCAIBIJAIAgEgkAgCASCQCAIBIJAIAgEgkAgCASCQCAIBIJAIAgEgkAgCASCQCAIBIJAIAgEgkAgCASCQCAIBIJAIAgEgkAgCASCQCAIBIJAIAgEgkAgCASCQCAIBIJAIAgEwnn1AP7X5fb4%2BGa7X3dY8j0fBIJAIAgEwnFmZvUI%2BFU%2BCASBQBAIBIFAEAgEgUAQCASBQBAIBIFAEAgEgUAQCASBQBAIBIFAEAgEgUAQCASBQBAIBIFAEAgEgUAQCIQn0uAOkS8Z/gAAAAAASUVORK5CYII%3D\">\n\n</div>\n<div class=\"col-md-12 text-right\">\n    <a role=\"button\" data-toggle=\"collapse\" data-target=\"#descriptives-8506970005563105745,#minihistogram-8506970005563105745\"\n       aria-expanded=\"false\" aria-controls=\"collapseExample\">\n        Toggle details\n    </a>\n</div>\n<div class=\"row collapse col-md-12\" id=\"descriptives-8506970005563105745\">\n    <ul class=\"nav nav-tabs\" role=\"tablist\">\n        <li role=\"presentation\" class=\"active\"><a href=\"#quantiles-8506970005563105745\"\n                                                  aria-controls=\"quantiles-8506970005563105745\" role=\"tab\"\n                                                  data-toggle=\"tab\">Statistics</a></li>\n        <li role=\"presentation\"><a href=\"#histogram-8506970005563105745\" aria-controls=\"histogram-8506970005563105745\"\n                                   role=\"tab\" data-toggle=\"tab\">Histogram</a></li>\n        <li role=\"presentation\"><a href=\"#common-8506970005563105745\" aria-controls=\"common-8506970005563105745\"\n                                   role=\"tab\" data-toggle=\"tab\">Common Values</a></li>\n        <li role=\"presentation\"><a href=\"#extreme-8506970005563105745\" aria-controls=\"extreme-8506970005563105745\"\n                                   role=\"tab\" data-toggle=\"tab\">Extreme Values</a></li>\n\n    </ul>\n\n    <div class=\"tab-content\">\n        <div role=\"tabpanel\" class=\"tab-pane active row\" id=\"quantiles-8506970005563105745\">\n            <div class=\"col-md-4 col-md-offset-1\">\n                <p class=\"h4\">Quantile statistics</p>\n                <table class=\"stats indent\">\n                    <tr>\n                        <th>Minimum</th>\n                        <td>0</td>\n                    </tr>\n                    <tr>\n                        <th>5-th percentile</th>\n                        <td>0</td>\n                    </tr>\n                    <tr>\n                        <th>Q1</th>\n                        <td>0</td>\n                    </tr>\n                    <tr>\n                        <th>Median</th>\n                        <td>0</td>\n                    </tr>\n                    <tr>\n                        <th>Q3</th>\n                        <td>0</td>\n                    </tr>\n                    <tr>\n                        <th>95-th percentile</th>\n                        <td>1</td>\n                    </tr>\n                    <tr>\n                        <th>Maximum</th>\n                        <td>1</td>\n                    </tr>\n                    <tr>\n                        <th>Range</th>\n                        <td>1</td>\n                    </tr>\n                    <tr>\n                        <th>Interquartile range</th>\n                        <td>0</td>\n                    </tr>\n                </table>\n            </div>\n            <div 
class=\"col-md-4 col-md-offset-2\">\n                <p class=\"h4\">Descriptive statistics</p>\n                <table class=\"stats indent\">\n                    <tr>\n                        <th>Standard deviation</th>\n                        <td>0.26852</td>\n                    </tr>\n                    <tr>\n                        <th>Coef of variation</th>\n                        <td>3.433</td>\n                    </tr>\n                    <tr>\n                        <th>Kurtosis</th>\n                        <td>7.8711</td>\n                    </tr>\n                    <tr>\n                        <th>Mean</th>\n                        <td>0.078217</td>\n                    </tr>\n                    <tr>\n                        <th>MAD</th>\n                        <td>0.1442</td>\n                    </tr>\n                    <tr class=\"\">\n                        <th>Skewness</th>\n                        <td>3.1418</td>\n                    </tr>\n                    <tr>\n                        <th>Sum</th>\n                        <td>2856</td>\n                    </tr>\n                    <tr>\n                        <th>Variance</th>\n                        <td>0.072101</td>\n                    </tr>\n                    <tr>\n                        <th>Memory size</th>\n                        <td>285.3 KiB</td>\n                    </tr>\n                </table>\n            </div>\n        </div>\n        <div role=\"tabpanel\" class=\"tab-pane col-md-8 col-md-offset-2\" id=\"histogram-8506970005563105745\">\n            <img src=\"data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAAAlgAAAGQCAYAAAByNR6YAAAABHNCSVQICAgIfAhkiAAAAAlwSFlzAAAPYQAAD2EBqD%2BnaQAAADl0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uIDIuMi40LCBodHRwOi8vbWF0cGxvdGxpYi5vcmcv7US4rQAAIABJREFUeJzt3Xt4VOW5/vE7yUiAJJMDELARCQVMyYEUEEMDJRwErAgoRSDWQyoqQjQmQkVFxHoougmiEjYQrbapbk0BSwVFKXIoVnpAt3QIUE2A4k7BRJkxJCSQw/z%2B4Me0IygTfRczE7%2Bf68rV5n3XetezHojcWWtlJcTtdrsFAAAAY0L9XQAAAEBbQ8ACAAAwjIAFAABgGAELAADAMAIWAACAYQQsAAAAwwhYAAAAhhGwAAAADCNgAQAAGEbAAgAAMIyABQAAYBgBCwAAwDACFgAAgGEELAAAAMMIWAAAAIYRsAAAAAwjYAEAABhGwAIAADCMgAUAAGAYAQsAAMAwAhYAAIBhBCwAAADDCFgAAACGEbAAAAAMI2ABAAAYRsACAAAwjIAFAABgGAELAADAMAIWAACAYQQsAAAAwwhYAAAAhhGwAAAADCNgAQAAGEbAAgAAMIyABQAAYBgBCwAAwDACFgAAgGEELAAAAMMCNmDt27dPN910kwYOHKjMzEzl5%2Berurpaf/nLX5SUlKS0tDSvjw0bNnj2LSkp0dixYzVgwABlZ2dr9%2B7dnrkTJ07owQcf1LBhw5SRkaG8vDw5nU7PfGVlpW677TZlZGRoxIgRWrRokVpaWs7ruQMAgOAW4na73f4u4otOnjyp4cOH6yc/%2BYluvfVW1dbW6q677pLdbteNN96o%2B%2B67T5s3bz7rvps3b9bcuXP13HPPKSkpSSUlJSopKdHGjRvVsWNHPf744/rb3/6moqIidejQQfPnz1djY6NWrFghSZo0aZJSUlJ0zz336LPPPtOMGTM0bdo0/fSnP/1G51Rdfewb7X82oaEhiouL0NGjdWppCbg/xqBGb61Db61Ff61Db61jZW%2B7dIkyup6vAvIKVn19vQoKCjRjxgy1a9dOcXFxGj16tD766KNz7ltaWqpJkyYpPT1d7du31y233CJJ2rJli5qamrR69WrNmjVLF154oWJiYpSfn6%2BtW7fqk08%2BkcPh0L59%2BzRnzhxFRUUpMTFROTk5Ki0ttfqUv5bQ0BCFhIQoNDTE36W0OfTWOvTWWvTXOvTWOm2xtzZ/F3A20dHRuvbaaz2f79%2B/X7/73e/0ox/9SJJUV1en3Nxc7dy5U%2B3atdPNN9%2BsnJwchYSEqKysTFdeeaVn39DQUPXt21cOh0N9%2B/bVsWPHlJKS4pnv1auX2rdvr7KyMlVVVSkhIUHR0dGe%2BZSUFB04cEC1tbWKjIz0qf6qqipVV1d7jdlsHRUfH/%2B1%2BvFlwsJCvf4X5tBb69Bba9Ff69Bb67TF3gZkwDqtsrJSY8eOVVNTk6ZMmaK8vDzt27dPl1xyiW666SYtWbJEf/3rX3XXXXcpKipKkydPlsvl8gpI0qnA5nQ65XK5JEl2u91r3m63e%2Ba/OHd6LafT6XPAKi0tVVFRkddYbm6u8vLyWnX%2BvrLbO1iyLuitleitteivdeitddpSbwM6YCUkJMjhcOif//ynHnzwQd1zzz1avHixfvOb33i2GTp0qKZNm6ZXX31VkydPliSd67Gyr5o38Uja1KlTNXLkSK8xm62jnM66b7z2fwoLC5Xd3kE1NfVqbuZBfJPorXXorbXor3XorXWs7G1sbITR9XwV0AFLkkJCQpSYmKiCggJNmzZN8%2BbNU1xcnNc2CQkJeuuttyRJsbGxnitVp7lcLvXp08ezn8vlUkTE
vxv%2B%2Beefq1OnTmpubj7rviEhIWcc86vEx8efcTuwuvqYmpqs%2BYJsbm6xbO1vO3prHXprLfprHXprnbbU24C82bljxw6NHTvW6/UIoaGnSt22bZv%2B53/%2Bx2v7/fv3q3v37pKk1NRUlZWVeeaam5u1Z88epaenq3v37oqOjvaa//DDD3Xy5EmlpqYqNTVVhw8f1tGjRz3zDodDvXv39gpkAAAAXyUgA1Zqaqpqa2u1aNEi1dfX6%2BjRo1q6dKkuvfRSRUVF6YknntA777yjxsZG/elPf9KaNWuUnZ0tScrOztbatWv1wQcfqL6%2BXsuXL1e7du00fPhwhYWFacqUKVqxYoUOHz4sp9OpJ598UqNHj1bnzp2VnJystLQ0LV68WLW1taqoqNALL7zgWRsAAMAXAXmLMCoqSs8//7weffRRDR48WB07dtTgwYP12GOPqWvXrrr//vv1yCOP6PDhw%2BrcubPuv/9%2BjRkzRpI0bNgw3X333crPz9dnn32mtLQ0FRcXq3379pKkvLw81dXVaeLEiWpqatKIESP00EMPeY79zDPPaP78%2BRoyZIgiIyM1bdo0XXfddf5oAwAACFIB%2BaLRtsiKF43abKGKjY2Q01nXZu5ZBwp6ax16ay36ax16ax0re8uLRgEAANoIAhYAAIBhBCwAAADDCFgAAACGEbAAAAAMC8jXNMB3l857098l%2BGxD/hB/lwAAwHnBFSwAAADDCFgAAACGEbAAAAAMI2ABAAAYRsACAAAwjIAFAABgGAELAADAMAIWAACAYQQsAAAAwwhYAAAAhhGwAAAADCNgAQAAGEbAAgAAMIyABQAAYBgBCwAAwDACFgAAgGEELAAAAMMIWAAAAIYRsAAAAAwjYAEAABhGwAIAADCMgAUAAGAYAQsAAMAwAhYAAIBhBCwAAADDCFgAAACGEbAAAAAMI2ABAAAYFrABa9%2B%2Bfbrppps0cOBAZWZmKj8/X9XV1ZKkHTt2aPLkyRowYIDGjRun1157zWvfkpISjR07VgMGDFB2drZ2797tmTtx4oQefPBBDRs2TBkZGcrLy5PT6fTMV1ZW6rbbblNGRoZGjBihRYsWqaWl5fycNAAAaBMCMmCdPHlSN998sy677DLt2LFD69ev12effaaHHnpIVVVVmjVrlqZNm6YdO3Zo3rx5mj9/vhwOhyRp8%2BbNWrp0qf7rv/5L7777rkaMGKHbb79dx48flyQtWbJEZWVlKi0t1VtvvSW326377rvPc%2Bw777xTXbt21aZNm/TCCy9o06ZN%2BvWvf%2B2XPgAAgOAUkAGrvr5eBQUFmjFjhtq1a6e4uDiNHj1aH330kdatW6fExERNnjxZ4eHhyszM1MiRI7Vq1SpJUmlpqSZNmqT09HS1b99et9xyiyRpy5Ytampq0urVqzVr1ixdeOGFiomJUX5%2BvrZu3apPPvlEDodD%2B/bt05w5cxQVFaXExETl5OSotLTUn%2B0AAABBxubvAs4mOjpa1157refz/fv363e/%2B51%2B9KMfqaysTMnJyV7bJycna8OGDZKksrIyXXnllZ650NBQ9e3bVw6HQ3379tWxY8eUkpLime/Vq5fat2%2BvsrIyVVVVKSEhQdHR0Z75lJQUHThwQLW1tYqMjPSp/qqqKs/tzNNsto6Kj4/3vQk%2BCAsLyHz8pWy24Kn3dG%2BDrcfBgN5ai/5ah95apy32NiAD1mmVlZUaO3asmpqaNGXKFOXl5enWW29V165dvbaLiYnxPEflcrm8ApJ0KrA5nU65XC5Jkt1u95q32%2B2e%2BS/OnV7L6XT6HLBKS0tVVFTkNZabm6u8vDyf9m%2BrYmMj/F1Cq9ntHfxdQptFb61Ff61Db63Tlnob0AErISFBDodD//znP/Xggw/qnnvu8Wk/t9v9tefPta8vpk6dqpEjR3qN2Wwd5XTWfeO1/1OwJX3T52%2BlsLBQ2e0dVFNTr%2BZmfsjBJHprLfprHXprHSt7669v7gM6YElSSEiIEhMTVVBQoGnTpikrK8tzJeo0p9OpuLg4SVJsbOwZ8y6XS3369PFs43K5FBHx74Z//vnn6tSpk5qbm8%2B6b0hIiGdfX8THx59xO7C6%2Bpiamr7dX5DBeP7NzS1BWXcwoLfWor/WobfWaUu9DchLIDt27NDYsWO9Xo8QGnqq1H79%2Bnm9dkGSdu/erfT0dElSamqqysrKPHPNzc3as2eP0tPT1b17d0VHR3vNf/jhhzp58qRSU1OVmpqqw4cP6%2BjRo555h8Oh3r17ewUyAACArxKQASs1NVW1tbVatGiR6uvrdfToUS1dulSXXnqpsrOzVVlZqVWrVunEiRPatm2btm3bpilTpkiSsrOztXbtWn3wwQeqr6/X8uXL1a5dOw0fPlxhYWGaMmWKVqxYocOHD8vpdOrJJ5/U6NGj1blzZyUnJystLU2LFy9WbW2tKioq9MILLyg7O9vPHQEAAMEkIANWVFSUnn/%2Bee3evVuDBw/WuHHjFBUVpSeffFKdOnXSypUr9eKLL2rgwIH6xS9%2BoUWLFul73/ueJGnYsGG6%2B%2B67lZ%2Bfr8suu0zvvvuuiouL1b59e0lSXl6e0tPTNXHiRI0aNUoRERF67LHHPMd%2B5plnVFVVpSFDhujGG2/U1Vdfreuuu84vfQAAAMEpxG3iqW6cU3X1MeNr2myhGl243fi6VtmQP8TfJfjMZgtVbGyEnM66NvM8QKCgt9aiv9aht9axsrddukQZXc9XAXkFCwAAIJgRsAAAAAwjYAEAABhGwAIAADCMgAUAAGAYAQsAAMAwAhYAAIBhBCwAAADDCFgAAACGEbAAAAAMI2ABAAAYRsACAAAwjIAFAABgGAELAADAMAIWAACAYQQsAAAAwwhYAAAAhhGwAAAADCNgAQAAGEbAAgAAMIyABQAAYBgBCwAAwDACFgAAgGEELAAAAMMIWAAAAIYRsAAAAAwjYAEAABhGwAIAADCMgAUAAGAYAQsAAMAwAhYAAIBhBCwAAADDCFgAAACGEbAAAAAMC9iAVVlZqdzcXGVkZCgzM1P33nuvampq9H//939KSkpSWlqa18cvf/lLz75vvPGGxo8fr/79%2B2vSpEl65513PHMtLS1asmSJRo0apUGDBmn69On6%2BOOPPfMul0v5%2BfnKzMzU0KFDNW/ePDU0NJzXcwcAAMEtYAPW7bffLrvdrs2bN%2BvVV1/VRx99pCeeeMIz73A4vD6mT58uSdq7d6/mzp2rOXPm6M9//rNycnJ0xx136MiRI5Kkl156SevWrVNxcbG2bNmixMRE5ebmyu12S5Lmz5%2Bv%2Bvp6rV%2B/XmvWrFFFRYUKCwvPfwMAAEDQCsiAVVNTo9TUVM2ePVsRERHq1q2brrnmGu3cufOc%2B65atUpZWVnKyspSeHi4JkyYoEsuuUSvvfaaJKm0tFQ5OTnq1auXIiMjVVBQoIqKCu3atUuffvqpNm3apIKCAsXFxalr166aNWuW1
qxZo8bGRqtPGwAAtBEBGbDsdrsWLlyozp07e8YOHz6s%2BPh4z%2Bf33HOPhg4dqsGDB2vx4sWeAFRWVqbk5GSv9ZKTk%2BVwONTQ0KDy8nKv%2BcjISPXo0UMOh0N79%2B5VWFiYkpKSPPMpKSk6fvy49u/fb9XpAgCANsbm7wJ84XA49OKLL2r58uVq166d%2Bvfvr9GjR%2Buxxx7T3r17deedd8pms%2Bmuu%2B6Sy%2BVSdHS01/7R0dEqLy/X559/LrfbfdZ5p9OpmJgYRUZGKiQkxGtOkpxOp8/1VlVVqbq62mvMZuvoFRBNCAsLyHz8pWy24Kn3dG%2BDrcfBgN5ai/5ah95apy32NuAD1nvvvaeZM2dq9uzZyszMlCS98sornvl%2B/fppxowZWrlype666y5J8jxP9WW%2Bav5c%2B/qitLRURUVFXmO5ubnKy8v7xmsHs9jYCH%2BX0Gp2ewd/l9Bm0Vtr0V/r0FvrtKXeBnTA2rx5s372s59p/vz5uvrqq790u4SEBH366adyu92KjY2Vy%2BXymne5XIqLi1NMTIxCQ0PPOt%2BpUyfFxcWptrZWzc3NCgsL88xJUqdOnXyue%2BrUqRo5cqTXmM3WUU5nnc9r%2BCLYkr7p87dSWFio7PYOqqmpV3Nzi7/LaVPorbXor3XorXWs7K2/vrkP2ID1/vvva%2B7cuXr66ac1dOhQz/iOHTv0wQcfaObMmZ6x/fv3KyEhQSEhIUpNTdXu3bu91nI4HBo3bpzCw8PVp08flZWV6bLLLpN06oH6Q4cOqV%2B/fkpISJDb7da%2BffuUkpLi2ddut6tnz54%2B1x4fH3/G7cDq6mNqavp2f0EG4/k3N7cEZd3BgN5ai/5ah95apy31NiAvgTQ1NemBBx7QnDlzvMKVJEVFRWnZsmX6/e9/r8bGRjkcDv3yl79Udna2JGnKlCl69913tXXrVp04cUKrV6/WwYMHNWHCBElSdna2SkpKVFFRodraWhUWFqpv375KS0tTXFycxo4dq6eeekpHjx7VkSNHtGzZMk2ePFk2W8BmUQAAEGBC3CYeOjJs586d%2BslPfqJ27dqdMffmm29qz549Kioq0sGDBxUVFaUbbrhBt956q0JDT%2BXFjRs3avHixaqsrFTv3r01b948DRo0SNKpZ6yWLl2qV155RXV1dcrIyNDDDz%2Bsbt26SZKOHTumBQsWaMuWLbrgggt01VVX6d577z1rLa1RXX3sG%2B1/NjZbqEYXbje%2BrlU25A/xdwk%2Bs9lCFRsbIaezrs18NxUo6K216K916K11rOxtly5RRtfzVUAGrLaIgEXAwin01lr01zr01jptMWAF5C1CAACAYEbAAgAAMIyABQAAYBgBCwAAwDACFgAAgGEELAAAAMMIWAAAAIYRsAAAAAwjYAEAABhGwAIAADCMgAUAAGAYAQsAAMAwAhYAAIBhBCwAAADDCFgAAACGEbAAAAAMI2ABAAAYRsACAAAwjIAFAABgGAELAADAMAIWAACAYQQsAAAAwwhYAAAAhhGwAAAADCNgAQAAGEbAAgAAMIyABQAAYJjxgDVy5EgVFRXp8OHDppcGAAAICsYD1o9//GO98cYbuvzyy3XLLbdo48aNampqMn0YAACAgGU8YOXm5uqNN97Qb3/7W/Xp00e/%2BMUvlJWVpUWLFunAgQOmDwcAABBwLHsGKyUlRXPnztWWLVt0//3367e//a2uvPJKTZ8%2BXX//%2B9%2BtOiwAAIDfWRawGhsb9cYbb%2BjWW2/V3Llz1bVrV913333q27evcnJytG7dOqsODQAA4Fc20wtWVFRo9erVWrt2rerq6jR27Fj9%2Bte/1sCBAz3bDBo0SA899JDGjx9v%2BvAAAAB%2BZzxgjRs3Tj179tSMGTN09dVXKyYm5oxtsrKydPToUdOHBgAACAjGbxGWlJRow4YNysnJOWu4Om3Xrl1fuU5lZaVyc3OVkZGhzMxM3XvvvaqpqZEk7d27V9dff70GDhyoMWPG6Pnnn/fa94033tD48ePVv39/TZo0Se%2B8845nrqWlRUuWLNGoUaM0aNAgTZ8%2BXR9//LFn3uVyKT8/X5mZmRo6dKjmzZunhoaGr9MKAADwLWU8YCUlJen222/Xpk2bPGO/%2BtWvdOutt8rlcvm8zu233y673a7Nmzfr1Vdf1UcffaQnnnhCDQ0NmjFjhgYPHqzt27dryZIlWrlypTZu3CjpVPiaO3eu5syZoz//%2Bc/KycnRHXfcoSNHjkiSXnrpJa1bt07FxcXasmWLEhMTlZubK7fbLUmaP3%2B%2B6uvrtX79eq1Zs0YVFRUqLCw02CEAANDWGQ9YCxcu1LFjx9S7d2/P2PDhw9XS0qLHH3/cpzVqamqUmpqq2bNnKyIiQt26ddM111yjnTt3auvWrWpsbNTMmTPVsWNHpaSk6Nprr1VpaakkadWqVcrKylJWVpbCw8M1YcIEXXLJJXrttdckSaWlpcrJyVGvXr0UGRmpgoICVVRUaNeuXfr000%2B1adMmFRQUKC4uTl27dtWsWbO0Zs0aNTY2mm4VAABoo4wHrHfeeUdFRUVKTEz0jCUmJqqwsFDbt2/3aQ273a6FCxeqc%2BfOnrHDhw8rPj5eZWVlSkpKUlhYmGcuOTlZu3fvliSVlZUpOTnZa73k5GQ5HA41NDSovLzcaz4yMlI9evSQw%2BHQ3r17FRYWpqSkJM98SkqKjh8/rv3797eqDwAA4NvL%2BEPuDQ0NCg8PP2M8NDRU9fX1X2tNh8OhF198UcuXL9eGDRtkt9u95mNiYuRyudTS0iKXy6Xo6Giv%2BejoaJWXl%2Bvzzz%2BX2%2B0%2B67zT6VRMTIwiIyMVEhLiNSdJTqfT53qrqqpUXV3tNWazdVR8fLzPa/giLCy4fpWkzRY89Z7ubbD1OBjQW2vRX%2BvQW%2Bu0xd4aD1iDBg3S448/rtmzZ3vCySeffKInnnjC61UNvnrvvfc0c%2BZMzZ49W5mZmdqwYcNZt/vPUHT6eaov81Xz59rXF6WlpSoqKvIay83NVV5e3jdeO5jFxkb4u4RWs9s7%2BLuENoveWov%2BWofeWqct9dZ4wLr//vt188036wc/%2BIEiIyPV0tKiuro6de/eXb/5zW9atdbmzZv1s5/9TPPnz9fVV18tSYqLi9PBgwe9tnO5XIqJiVFoaKhiY2PPeJje5XIpLi7Os83Z5jt16qS4uDjV1taqubnZcwvy9LadOnXyue6pU6dq5MiRXmM2W0c5nXU%2Br%2BGLYEv6ps/fSmFhobLbO6impl7NzS3%2BLqdNobfWor/WobfWsbK3/vrm3njA6t69u15//XX98Y9/1KFDhxQaGqqePXtq6NChXs9Nncv777%2BvuXPn6umnn9bQoUM946mpqXr55ZfV1NQkm%2B1U%2BQ6HQ%2Bnp6Z75089jneZwODRu3DiFh4erT58%2BKisr02WXXSbp1AP1hw4dUr9%2B/ZSQkCC326
19%2B/YpJSXFs6/dblfPnj19rj0%2BPv6M24HV1cfU1PTt/oIMxvNvbm4JyrqDAb21Fv21Dr21TlvqrSWXQNq1a6fLL79cN998s3JycpSVldWqcNXU1KQHHnhAc%2BbM8QpX0qmXlEZGRmr58uWqr6/Xrl27tHr1amVnZ0uSpkyZonfffVdbt27ViRMntHr1ah08eFATJkyQJGVnZ6ukpEQVFRWqra1VYWGh%2Bvbtq7S0NMXFxWns2LF66qmndPToUR05ckTLli3T5MmTPWEOAADgXELcJh46%2Bg8ff/yxFi9erI8%2B%2BuisL%2Bh8%2B%2B23z7nGzp079ZOf/ETt2rU7Y%2B7NN99UXV2dFixYoN27d6tz58669dZbdd1113m22bhxoxYvXqzKykr17t1b8%2BbN06BBgySdesZq6dKleuWVV1RXV6eMjAw9/PDD6tatmyTp2LFjWrBggbZs2aILLrhAV111le69996z1tIa1dXHvtH%2BZ2OzhWp0oW8/mRkINuQP8XcJPrPZQhUbGyGns67NfDcVKOitteivdeitdazsbZcuUUbX85XxgHXDDTeoqqpKQ4cOVceOHc%2BYnz17tsnDBQ0CFgELp9Bba9Ff69Bb67TFgGX8vtfu3bv19ttvKy4uzvTSAAAAQcH4M1idOnU665UrAACAbwvjAWvGjBkqKioy8j4pAACAYGT8FuEf//hHvf/%2B%2B3r11Vd10UUXKTTUO8O98sorpg8JAAAQUIwHrMjISA0bNsz0sgAAAEHDeMBauHCh6SUBAACCiiUvGt2/f7%2BWLl2q%2B%2B67zzP2v//7v1YcCgAAIOAYD1g7duzQhAkTtHHjRq1fv17SqZeP3njjjT69ZBQAACDYGQ9YS5Ys0c9%2B9jOtW7dOISEhkk79fsLHH39cy5YtM304AACAgGM8YH344Yee3wt4OmBJ0hVXXKGKigrThwMAAAg4xgNWVFTUWX8HYVVV1Tf%2BfX4AAADBwHjAGjBggH7xi1%2BotrbWM3bgwAHNnTtXP/jBD0wfDgAAIOAYf03Dfffdp5tuukkZGRlqbm7WgAEDVF9frz59%2Bujxxx83fTgAAICAYzxgdevWTevXr9e2bdt04MABtW/fXj179tSQIUO8nskCAABoq4wHLEm64IILdPnll1uxNAAAQMAzHrBGjhz5lVeqeBcWAABo64wHrCuvvNIrYDU3N%2BvAgQNyOBy66aabTB8OAAAg4BgPWHPmzDnr%2BFtvvaW//OUvpg8HAAAQcCz5XYRnc/nll%2Bv1118/X4cDAADwm/MWsPbs2SO3232%2BDgcAAOA3xm8RTps27Yyx%2Bvp6VVRUaMyYMaYPBwAAEHCMB6zExMQzfoowPDxckydP1rXXXmv6cAAAAAHHeMDibe0AAODbznjAWrt2rc/bXn311aYPDwAA4HfGA9a8efPU0tJyxgPtISEhXmMhISEELAAA0CYZD1jPPfecnn/%2Bed1%2B%2B%2B1KSkqS2%2B3WP/7xDz377LO6/vrrlZGRYfqQAAAAAcWSZ7CKi4vVtWtXz9ill16q7t27a/r06Vq/fr3pQwIAAAQU4%2B/BOnjwoKKjo88Yt9vtqqysNH04AACAgGM8YCUkJOjxxx%2BX0%2Bn0jNXU1Gjx4sW6%2BOKLTR8OAAAg4Bi/RXj//fdr9uzZKi0tVUREhEJDQ1VbW6v27dtr2bJlpg8HAAAQcIwHrKFDh2rr1q3atm2bjhw5Irfbra5du%2BqHP/yhoqKiTB8OAAAg4BgPWJLUoUMHjRo1SkeOHFH37t2tOAQAAEDAMv4MVkNDg%2BbOnav%2B/fvrRz/6kaRTz2DdcsstqqmpMX04AACAgGM8YC1atEh79%2B5VYWGhQkP/vXxzc7MKCwtNHw4AACDgGA9Yb731lp555hldccUVnl/6bLfbtXDhQm3cuNH04QAAAAKO8YBVV1enxMTEM8bj4uJ0/PjxVq21fft2ZWZmqqCgwGv81Vdf1fe%2B9z2lpaV5ffz973%2BXJLW0tGjJkiUaNWqUBg0apOnTp%2Bvjjz/27O9yuZSfn6/MzEwNHTpU8%2BbNU0NDg2d%2B7969uv766zVw4ECNGTNGzz//fKvqBgAA327GA9bFF1%2Bsv/zlL5Lk9bsH33zzTX3nO9/xeZ1nn31Wjz76qHr06HHW%2BUGDBsnhcHh99OvXT5L00ksvad26dSouLtaWLVuUmJio3NxcTz3z589XfX291q9frzVr1qiiosJz%2B7KhoUEzZszQ4MGDtX37di1ZskQrV67k6hsAAPCZ8YB13XXX6c4779QTTzyhlpYWvfDCC5o9e7buv/9%2B3XTTTT6vEx4ertWrV39pwPoqpaWlysnJUa9evRQZGamCggJVVFRo165d%2BvTTT7Vp0yYVFBQoLi5OXbvYKv5pAAAZvElEQVR21axZs7RmzRo1NjZq69atamxs1MyZM9WxY0elpKTo2muvVWlpaavrAAAA307GX9MwdepU2Ww2vfjiiwoLC9OKFSvUs2dPFRYW6oorrvB5nRtvvPEr5w8fPqyf/vSn2r17t%2Bx2u/Ly8jRx4kQ1NDSovLxcycnJnm0jIyPVo0cPORwOHTt2TGFhYUpKSvLMp6Sk6Pjx49q/f7/KysqUlJSksLAwz3xycrJWrVrlc%2B1VVVWqrq72GrPZOio%2BPt7nNXwRFmY8H1vKZgueek/3Nth6HAzorbXor3XorXXaYm%2BNB6yjR4/qxz/%2BsX784x%2BbXtojLi5OiYmJuvvuu9W7d2/94Q9/0D333KP4%2BHh997vfldvtPuP3IUZHR8vpdComJkaRkZGeB/BPz0mS0%2BmUy%2BWS3W732jcmJkYul0stLS1ePxn5ZUpLS1VUVOQ1lpubq7y8vK97ym1CbGyEv0toNbu9g79LaLPorbXor3XorXXaUm%2BNB6xRo0bp/fff9wowpg0fPlzDhw/3fD5u3Dj94Q9/0Kuvvqo5c%2BZI8n7%2B64u%2Bau7LtOZ8pk6dqpEjR3qN2Wwd5XTWtfq4XyXYkr7p87dSWFio7PYOqqmpV3Nzi7/LaVPorbXor3XorXWs7K2/vrk3HrAyMjK0YcMGXXnllaaX/koJCQnavXu3YmJiFBoaKpfL5TXvcrnUqVMnxcXFqba2Vs3NzZ7bgKe3PT1/8ODBM/Y9va4v4uPjz7gdWF19TE1N3%2B4vyGA8/%2BbmlqCsOxjQW2vRX%2BvQW%2Bu0pd4aD1gXXnihHnvsMRUXF%2Bviiy/WBRdc4DW/ePHib3yMl19%2BWdHR0V4hrqKiQt27d1d4eLj69OmjsrIyXXbZZZJOvUn%2B0KFD6tevnxISEuR2u7Vv3z6lpKRIkhwOh%2Bx2u3r27KnU1FS9/PLLampqks1m88ynp6d/47oBAMC3g/F7TOXl5frud7%2BrqKgoOZ1OVVVVeX2YcPLkST3yyCNyOBxqbGzU%2BvXr9cc//lHTpk2TJGVnZ6ukpEQVFRWqra1VY
WGh%2Bvbtq7S0NMXFxWns2LF66qmndPToUR05ckTLli3T5MmTZbPZlJWVpcjISC1fvlz19fXatWuXVq9erezsbCO1AwCAts/YFayCggItWbJEv/nNbzxjy5YtU25u7tdaLy0tTZLU1NQkSdq0aZOkU1eTbrzxRtXV1emuu%2B5SdXW1LrroIi1btkypqamSpGnTpqm6ulo33HCD6urqlJGR4fXQ%2BcMPP6wFCxZo1KhRuuCCC3TVVVd5Xmbarl07rVixQgsWLFBxcbE6d%2B6sgoICr2e%2BAAAAvkqI%2B%2Bs88X0W6enp2rVr1znHvq2qq48ZX9NmC9Xowu3G17XKhvwh/i7BZzZbqGJjI%2BR01rWZ5wECBb21Fv21Dr21jpW97dIlyuh6vjJ2i/BsOc1QdgMAAAgqxgLW2V5jYOWrGgAAAAJVcL1ICQAAIAgQsAAAAAwz9lOEjY2Nmj179jnHTLwHCwAAIJAZC1gDBw484z1XZxsDAABo64wFrP98/xUAAMC3Gc9gAQAAGEbAAgAAMIyABQAAYBgBCwAAwDACFgAAgGEELAAAAMMIWAAAAIYRsAAAAAwjYAEAABhGwAIAADCMgAUAAGAYAQsAAMAwAhYAAIBhBCwAAADDCFgAAACGEbAAAAAMI2ABAAAYRsACAAAwjIAFAABgGAELAADAMAIWAACAYQQsAAAAwwhYAAAAhhGwAAAADCNgAQAAGEbAAgAAMIyABQAAYFhAB6zt27crMzNTBQUFZ8y98cYbGj9%2BvPr3769JkybpnXfe8cy1tLRoyZIlGjVqlAYNGqTp06fr448/9sy7XC7l5%2BcrMzNTQ4cO1bx589TQ0OCZ37t3r66//noNHDhQY8aM0fPPP2/tiQIAgDYlYAPWs88%2Bq0cffVQ9evQ4Y27v3r2aO3eu5syZoz//%2Bc/KycnRHXfcoSNHjkiSXnrpJa1bt07FxcXasmWLEhMTlZubK7fbLUmaP3%2B%2B6uvrtX79eq1Zs0YVFRUqLCyUJDU0NGjGjBkaPHiwtm/friVLlmjlypXauHHj%2BTt5AAAQ1AI2YIWHh2v16tVnDVirVq1SVlaWsrKyFB4ergkTJuiSSy7Ra6%2B9JkkqLS1VTk6OevXqpcjISBUUFKiiokK7du3Sp59%2Bqk2bNqmgoEBxcXHq2rWrZs2apTVr1qixsVFbt25VY2OjZs6cqY4dOyolJUXXXnutSktLz3cLAABAkLL5u4Avc%2BONN37pXFlZmbKysrzGkpOT5XA41NDQoPLyciUnJ3vmIiMj1aNHDzkcDh07dkxhYWFKSkryzKekpOj48ePav3%2B/ysrKlJSUpLCwMK%2B1V61a5XPtVVVVqq6u9hqz2ToqPj7e5zV8ERYWsPn4rGy24Kn3dG%2BDrcfBgN5ai/5ah95apy32NmAD1ldxuVyKjo72GouOjlZ5ebk%2B//xzud3us847nU7FxMQoMjJSISEhXnOS5HQ65XK5ZLfbvfaNiYmRy%2BVSS0uLQkPP/YdfWlqqoqIir7Hc3Fzl5eW16jzbmtjYCH%2BX0Gp2ewd/l9Bm0Vtr0V/r0FvrtKXeBmXAkuR5nurrzJ9r37P5z0B2LlOnTtXIkSO9xmy2jnI661p93K8SbEnf9PlbKSwsVHZ7B9XU1Ku5ucXf5bQp9NZa9Nc69NY6VvbWX9/cB2XAio2Nlcvl8hpzuVyKi4tTTEyMQkNDzzrfqVMnxcXFqba2Vs3NzZ7bgKe3PT1/8ODBM/Y9va4v4uPjz7gdWF19TE1N3%2B4vyGA8/%2BbmlqCsOxjQW2vRX%2BvQW%2Bu0pd4G1yWQ/y81NVW7d%2B/2GnM4HEpPT1d4eLj69OmjsrIyz1xNTY0OHTqkfv36qW/fvnK73dq3b5/Xvna7XT179lRqaqr%2B8Y9/qKmp6Yy1AQAAfBGUAWvKlCl69913tXXrVp04cUKrV6/WwYMHNWHCBElSdna2SkpKVFFRodraWhUWFqpv375KS0tTXFycxo4dq6eeekpHjx7VkSNHtGzZMk2ePFk2m01ZWVmKjIzU8uXLVV9fr127dmn16tXKzs7281kDAIBgEbC3CNPS0iTJcyVp06ZNkk5dTbrkkktUWFiohQsXqrKyUr1799bKlSvVpUsXSdK0adNUXV2tG264QXV1dcrIyPB66Pzhhx/WggULNGrUKF1wwQW66qqrPC8zbdeunVasWKEFCxaouLhYnTt3VkFBgYYPH34ezx4AAASzEPfXeeIbrVZdfcz4mjZbqEYXbje%2BrlU25A/xdwk%2Bs9lCFRsbIaezrs08DxAo6K216K916K11rOxtly5RRtfzVVDeIgQAAAhkBCwAAADDCFgAAACGEbAAAAAMI2ABAAAYRsACAAAwjIAFAABgGAELAADAMAIWAACAYQQsAAAAwwhYAAAAhhGwAAAADCNgAQAAGEbAAgAAMIyABQAAYBgBCwAAwDACFgAAgGEELAAAAMMIWAAAAIYRsAAAAAwjYAEAABhGwAIAADCMgAUAAGAYAQsAAMAwAhYAAIBhBCwAAADDCFgAAACGEbAAAAAMI2ABAAAYRsACAAAwjIAFAABgGAELAADAMAIWAACAYQQsAAAAw4I2YCUlJSk1NVVpaWmej0ceeUSStGPHDk2ePFkDBgzQuHHj9Nprr3ntW1JSorFjx2rAgAHKzs7W7t27PXMnTpzQgw8%2BqGHDhikjI0N5eXlyOp3n9dwAAEBws/m7gG/izTff1EUXXeQ1VlVVpVmzZmnevHkaP3683nvvPc2cOVM9e/ZUWlqaNm/erKVLl%2Bq5555TUlKSSkpKdPvtt2vjxo3q2LGjlixZorKyMpWWlqpDhw6aP3%2B%2B7rvvPq1YscJPZwkAAIJN0F7B%2BjLr1q1TYmKiJk%2BerPDwcGVmZmrkyJFatWqVJKm0tFSTJk1Senq62rdvr1tuuUWStGXLFjU1NWn16tWaNWuWLrzwQsXExCg/P19bt27VJ5984s/TAgAAQSSoA9bixYs1fPhwXXrppZo/f77q6upUVlam5ORkr%2B2Sk5M9twG/OB8aGqq%2BffvK4XDo0KFDOnbsmFJSUjzzvXr1Uvv27VVWVnZ%2BTgoAAAS9oL1F%2BP3vf1%2BZmZl64okn9PHHHys/P18///nP5XK51LVrV69tY2JiPM9RuVwuRUdHe81HR0fL6XTK5XJJkux2u9e83W5v1XNYVVVVqq6u9hqz2ToqPj7e5zV8ERYWXPnYZgueek/3Nth6HAzorbXor3XorXXaYm%2BDNmCVlpZ6/n%2BvXr00Z84czZw5UwMHDjznvm63%2BxvN%2B1JbUVGR11hubq7y8vK%2B0brBLjY2wt8ltJrd3sHfJbRZ9NZa9Nc69NY6bam3QRuwvuiiiy5Sc3OzQkNDPVeiTnM6nYqLi5MkxcbGnjHvcrnUp08fzzYul0sREf8OA59//rk6derkcy1Tp07VyJEjvcZsto5yOutadU7nEmxJ3/T5WyksLFR2ewfV1NSrubnF3%2B
W0KfTWWvTXOvTWOlb21l/f3AdlwNqzZ49ee%2B013XvvvZ6xiooKtWvXTllZWfrd737ntf3u3buVnp4uSUpNTVVZWZmuueYaSVJzc7P27NmjyZMnq3v37oqOjlZZWZkSEhIkSR9%2B%2BKFOnjyp1NRUn%2BuLj48/43ZgdfUxNTV9u78gg/H8m5tbgrLuYEBvrUV/rUNvrdOWehtcl0D%2Bv06dOqm0tFTFxcU6efKkDhw4oKefflpTp07VxIkTVVlZqVWrVunEiRPatm2btm3bpilTpkiSsrOztXbtWn3wwQeqr6/X8uXL1a5dOw0fPlxhYWGaMmWKVqxYocOHD8vpdOrJJ5/U6NGj1blzZz%2BfNQAACBZBeQWra9euKi4u1uLFiz0B6ZprrlFBQYHCw8O1cuVKPfroo/r5z3%2BuhIQELVq0SN/73vckScOGDdPdd9%2Bt/Px8ffbZZ0pLS1NxcbHat28vScrLy1NdXZ0mTpyopqYmjRgxQg899JAfzxYAAASbEPc3faIbPqmuPmZ8TZstVKMLtxtf1yob8of4uwSf2Wyhio2NkNNZ12YuVwcKemst%2BmsdemsdK3vbpUuU0fV8FZS3CAEAAAIZAQsAAMAwAhYAAIBhBCwAAADDCFgAAACGEbAAAAAMI2ABAAAYRsACAAAwjIAFAABgGAELAADAMAIWAACAYQQsAAAAwwhYAAAAhtn8XQAAALDGj576k79L8NnOx67wdwlGcQULAADAMAIWAACAYQQsAAAAwwhYAAAAhhGwAAAADCNgAQAAGEbAAgAAMIyABQAAYBgBCwAAwDACFgAAgGEELAAAAMMIWAAAAIYRsAAAAAwjYAEAABhGwAIAADCMgAUAAGAYAQsAAMAwAhYAAIBhBCwAAADDCFgAAACGEbAAAAAMI2CdRWVlpW677TZlZGRoxIgRWrRokVpaWvxdFgAACBI2fxcQiO68806lpKRo06ZN%2BuyzzzRjxgx17txZP/3pT/1dGgAACAJcwfoCh8Ohffv2ac6cOYqKilJiYqJycnJUWlrq79IAAECQ4ArWF5SVlSkhIUHR0dGesZSUFB04cEC1tbWKjIw85xpVVVWqrq72GrPZOio%2BPt5orWFhwZWPbbbgqfd0b4Otx8GA3lqL/lqH3lqvLfWWgPUFLpdLdrvda%2Bx02HI6nT4FrNLSUhUVFXmN3XHHHbrzzjvNFapTQe6mbh9p6tSpxsPbt11VVZV%2B/evn6K0F6K216K91grG3Ox%2B7wt8l%2BKSqqkpLly4Nqt6eS9uJiga53e5vtP/UqVP16quven1MnTrVUHX/Vl1draKiojOuluGbo7fWobfWor/WobfWaYu95QrWF8TFxcnlcnmNuVwuhYSEKC4uzqc14uPj20wCBwAArccVrC9ITU3V4cOHdfToUc%2BYw%2BFQ7969FRER4cfKAABAsCBgfUFycrLS0tK0ePFi1dbWqqKiQi%2B88IKys7P9XRoAAAgSYQ899NBD/i4i0Pzwhz/U%2BvXr9cgjj%2Bj111/X5MmTNX36dIWEhPi7tDNERETosssu4%2BqaBeitdeitteivdeitddpab0Pc3/SJbgAAAHjhFiEAAIBhBCwAAADDCFgAAACGEbAAAAAMI2ABAAAYRsACAAAwjIAFAABgGAELAADAMAIWAACAYQSsAFdZWanbbrtNGRkZGjFihBYtWqSWlpazbltSUqKxY8dqwIABys7O1u7du89ztcGlNb19%2BeWXNXbsWPXv318TJ07Upk2bznO1waU1vT3tk08%2BUf/%2B/bV06dLzVGXwak1/KyoqdMMNNyg9PV1ZWVn61a9%2BdX6LDTK%2B9ralpUXPPPOMRo4cqf79%2B2v8%2BPF64403/FBxcNm%2BfbsyMzNVUFDwldu1tLRoyZIlGjVqlAYNGqTp06fr448/Pk9VGuJGQLvmmmvcDzzwgLumpsZ94MAB95gxY9zPP//8Gdu9/fbb7ksvvdT9wQcfuOvr690rV650DxkyxF1XV%2BeHqoODr71988033QMHDnTv3LnTffLkSfdvf/tbd0pKivvQoUN%2BqDo4%2BNrb/3THHXe4Bw4c6H7mmWfOU5XBy9f%2B1tfXu4cPH%2B5%2B9tln3cePH3fv2rXLPW7cOHd5ebkfqg4Ovvb2xRdfdA8dOtRdUVHhbmpqcm/evNmdnJzs3rt3rx%2BqDg7FxcXuMWPGuKdNm%2BbOz8//ym1LSkrcI0aMcJeXl7uPHTvmfvjhh93jx493t7S0nKdqvzmuYAUwh8Ohffv2ac6cOYqKilJiYqJycnJUWlp6xralpaWaNGmS0tPT1b59e91yyy2SpC1btpzvsoNCa3rb0NCgu%2B%2B%2BWwMHDtQFF1yga6%2B9VhEREfrggw/8UHnga01vT9u2bZvKy8s1fPjw81dokGpNfzds2KDIyEjdcsst6tChg/r166f169erV69efqg88LWmt2VlZRo4cKC%2B%2B93vKiwsTCNGjFBMTIz%2B8Y9/%2BKHy4BAeHq7Vq1erR48e59y2tLRUOTk56tWrlyIjI1VQUKCKigrt2rXrPFRqBgErgJWVlSkhIUHR0dGesZSUFB04cEC1tbVnbJucnOz5PDQ0VH379pXD4Thv9QaT1vR24sSJuu666zyf19TUqK6uTl27dj1v9QaT1vRWOhVgH374YS1YsEA2m%2B18lhqUWtPf9957T5dcconuu%2B8%2BXXrppbriiiv02muvne%2BSg0Zrejt8%2BHD99a9/1d69e3Xy5Em9/fbbqq%2Bv12WXXXa%2Byw4aN954o6Kios65XUNDg8rLy73%2BTYuMjFSPHj2C6t80AlYAc7lcstvtXmOnv/CdTucZ2/7nfxROb/vF7XBKa3r7n9xutx544AGlp6fzH9Iv0dreLlu2TN///vc1ePDg81JfsGtNf48cOaK3335bmZmZ2r59u2bMmKG5c%2Bdqz549563eYNKa3o4ZM0ZTp07V1VdfrbS0NM2ePVsLFy7UhRdeeN7qbas%2B//xzud3uoP83jW8XA5zb7bZkW7S%2BX42Njbr33ntVXl6ukpISi6pqG3ztbXl5uVatWqV169ZZXFHb4mt/3W63UlJSNH78eEnSNddco1deeUVvvvmm19UB/JuvvV27dq3Wrl2rVatWKSkpSTt27NDs2bN14YUXql%2B/fhZX%2Be0Q7P%2BmcQUrgMXFxcnlcnmNuVwuhYSEKC4uzms8Njb2rNt%2BcTuc0preSqcuWc%2BYMUP/%2Bte/9NJLL6lz587nq9Sg42tv3W63HnroId15553q0qXL%2BS4zaLXm726XLl3OuCWTkJCg6upqy%2BsMRq3p7YsvvqipU6eqX79%2BCg8P1/DhwzV48GBuwRoQExOj0NDQs/5ZdOrUyU9VtR4BK4Clpqbq8OHDOnr0qGfM4XCod%2B/eioiIOGPbsrIyz%2BfNzc3as2eP0tPTz1u9waQ1vXW73SooKJDNZtOvfvUrxcbGnu9yg4q
vvf3Xv/6lv/3tb3rmmWeUkZGhjIwMvf7663ruued0zTXX%2BKP0oNCav7u9evXShx9%2B6HUloLKyUgkJCeet3mDSmt62tLSoubnZa%2BzkyZPnpc62Ljw8XH369PH6N62mpkaHDh0KqquDBKwAlpycrLS0NC1evFi1tbWqqKjQCy%2B8oOzsbEnSFVdcoZ07d0qSsrOztXbtWn3wwQeqr6/X8uXL1a5dO34q60u0prfr1q1TeXm5nn76aYWHh/uz7KDga2%2B7deumbdu26fe//73nY%2BTIkZo2bZqKi4v9fBaBqzV/dydMmCCn06kVK1aooaFB69evV1lZmSZMmODPUwhYrentyJEjtXr1au3bt09NTU165513tGPHDo0aNcqfpxC0PvnkE11xxRWed11lZ2erpKREFRUVqq2tVWFhofr27au0tDQ/V%2Bo7nsEKcM8884zmz5%2BvIUOGKDIyUtOmTfP8RNuBAwd0/PhxSdKwYcN09913Kz8/X5999pnS0tJUXFys9u3b%2B7P8gOZrb9esWaPKysozHmqfOHGiHn300fNedzDwpbdhYWHq1q2b134dOnRQZGQktwzPwde/u127dtXKlSv12GOP6b//%2B7/1ne98R8uWLdPFF1/sz/IDmq%2B9nTFjhpqampSbm6ujR48qISFBjz76qH7wgx/4s/yAdjocNTU1SZLnhc0Oh0ONjY06cOCA5yrgtGnTVF1drRtuuEF1dXXKyMhQUVGRfwr/mkLcwf4UGQAAQIDhFiEAAIBhBCwAAADDCFgAAACGEbAAAAAMI2ABAAAYRsACAAAwjIAFAABgGAELAADAMAIWAACAYQQsAAAAwwhYAAAAhhGwAAAADCNgAQAAGPb/APy%2BdyrjS2LcAAAAAElFTkSuQmCC\"/>\n        </div>\n        <div role=\"tabpanel\" class=\"tab-pane col-md-12\" id=\"common-8506970005563105745\">\n            \n<table class=\"freq table table-hover\">\n    <thead>\n    <tr>\n        <td class=\"fillremaining\">Value</td>\n        <td class=\"number\">Count</td>\n        <td class=\"number\">Frequency (%)</td>\n        <td style=\"min-width:200px\">&nbsp;</td>\n    </tr>\n    </thead>\n    <tr class=\"\">\n        <td class=\"fillremaining\">0.0</td>\n        <td class=\"number\">33658</td>\n        <td class=\"number\">92.2%</td>\n        <td>\n            <div class=\"bar\" style=\"width:100%\">&nbsp;</div>\n        </td>\n</tr><tr class=\"\">\n        <td class=\"fillremaining\">1.0</td>\n        <td class=\"number\">2856</td>\n        <td class=\"number\">7.8%</td>\n        <td>\n            <div class=\"bar\" style=\"width:9%\">&nbsp;</div>\n        </td>\n</tr>\n</table>\n        </div>\n        <div role=\"tabpanel\" class=\"tab-pane col-md-12\"  id=\"extreme-8506970005563105745\">\n            <p class=\"h4\">Minimum 5 values</p>\n            \n<table class=\"freq table table-hover\">\n    <thead>\n    <tr>\n        <td class=\"fillremaining\">Value</td>\n        <td class=\"number\">Count</td>\n        <td class=\"number\">Frequency (%)</td>\n        <td style=\"min-width:200px\">&nbsp;</td>\n    </tr>\n    </thead>\n    <tr class=\"\">\n        <td class=\"fillremaining\">0.0</td>\n        <td class=\"number\">33658</td>\n        <td class=\"number\">92.2%</td>\n        <td>\n            <div class=\"bar\" style=\"width:100%\">&nbsp;</div>\n        </td>\n</tr><tr class=\"\">\n        <td class=\"fillremaining\">1.0</td>\n        <td class=\"number\">2856</td>\n        <td class=\"number\">7.8%</td>\n        <td>\n            <div class=\"bar\" style=\"width:9%\">&nbsp;</div>\n        </td>\n</tr>\n</table>\n            <p class=\"h4\">Maximum 5 values</p>\n            \n<table class=\"freq table table-hover\">\n    <thead>\n    <tr>\n        <td class=\"fillremaining\">Value</td>\n        <td class=\"number\">Count</td>\n        <td class=\"number\">Frequency (%)</td>\n        <td style=\"min-width:200px\">&nbsp;</td>\n    </tr>\n    </thead>\n    <tr class=\"\">\n        <td class=\"fillremaining\">0.0</td>\n        <td class=\"number\">33658</td>\n        <td class=\"number\">92.2%</td>\n        <td>\n            <div class=\"bar\" style=\"width:100%\">&nbsp;</div>\n        </td>\n</tr><tr class=\"\">\n        <td class=\"fillremaining\">1.0</td>\n        <td class=\"number\">2856</td>\n        <td class=\"number\">7.8%</td>\n        <td>\n            <div 
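The per-variable layout used here (overview, quantile and descriptive statistics, histogram, common and extreme values with toggleable details) matches the HTML emitted by the pandas-profiling package, so a report of this kind can plausibly be regenerated in a few lines. The sketch below assumes that package and a hypothetical "books.csv" input; it is not necessarily the project's actual build step.

    # Minimal sketch, assuming the pandas-profiling package; the input file
    # name is a hypothetical placeholder, not taken from this report.
    import pandas as pd
    from pandas_profiling import ProfileReport

    df = pd.read_csv("books.csv")
    profile = ProfileReport(df)
    profile.to_file("profile_report.html")  # writes an HTML report like the one shown here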
class=\"bar\" style=\"width:9%\">&nbsp;</div>\n        </td>\n</tr>\n</table>\n        </div>\n    </div>\n</div>\n</div><div class=\"row variablerow\">\n    <div class=\"col-md-3 namecol\">\n        <p class=\"h4\">num_pages<br/>\n            <small>Categorical</small>\n        </p>\n    </div><div class=\"col-md-3\">\n    <table class=\"stats \">\n        <tr class=\"\">\n            <th>Distinct count</th>\n            <td>10</td>\n        </tr>\n        <tr>\n            <th>Unique (%)</th>\n            <td>0.0%</td>\n        </tr>\n        <tr class=\"ignore\">\n            <th>Missing (%)</th>\n            <td>0.0%</td>\n        </tr>\n        <tr class=\"ignore\">\n            <th>Missing (n)</th>\n            <td>0</td>\n        </tr>\n    </table>\n</div>\n<div class=\"col-md-6 collapse in\" id=\"minifreqtable401320848774450276\">\n    <table class=\"mini freq\">\n        <tr class=\"\">\n    <th>(-11.961, 437.44]</th>\n    <td>\n        <div class=\"bar\" style=\"width:100%\" data-toggle=\"tooltip\" data-placement=\"right\" data-html=\"true\"\n             data-delay=500 title=\"Percentage: 93.6%\">\n            34172\n        </div>\n        \n    </td>\n</tr><tr class=\"\">\n    <th>(437.44, 875.88]</th>\n    <td>\n        <div class=\"bar\" style=\"width:6%\" data-toggle=\"tooltip\" data-placement=\"right\" data-html=\"true\"\n             data-delay=500 title=\"Percentage: 5.2%\">\n            &nbsp;\n        </div>\n        1914\n    </td>\n</tr><tr class=\"\">\n    <th>(875.88, 1314.32]</th>\n    <td>\n        <div class=\"bar\" style=\"width:1%\" data-toggle=\"tooltip\" data-placement=\"right\" data-html=\"true\"\n             data-delay=500 title=\"Percentage: 0.8%\">\n            &nbsp;\n        </div>\n        295\n    </td>\n</tr><tr class=\"other\">\n    <th>Other values (22)</th>\n    <td>\n        <div class=\"bar\" style=\"width:1%\" data-toggle=\"tooltip\" data-placement=\"right\" data-html=\"true\"\n             data-delay=500 title=\"Percentage: 0.4%\">\n            &nbsp;\n        </div>\n        133\n    </td>\n</tr>\n    </table>\n</div>\n<div class=\"col-md-12 text-right\">\n    <a role=\"button\" data-toggle=\"collapse\" data-target=\"#freqtable401320848774450276, #minifreqtable401320848774450276\"\n       aria-expanded=\"true\" aria-controls=\"collapseExample\">\n        Toggle details\n    </a>\n</div>\n<div class=\"col-md-12 extrapadding collapse\" id=\"freqtable401320848774450276\">\n    \n<table class=\"freq table table-hover\">\n    <thead>\n    <tr>\n        <td class=\"fillremaining\">Value</td>\n        <td class=\"number\">Count</td>\n        <td class=\"number\">Frequency (%)</td>\n        <td style=\"min-width:200px\">&nbsp;</td>\n    </tr>\n    </thead>\n    <tr class=\"\">\n        <td class=\"fillremaining\">(-11.961, 437.44]</td>\n        <td class=\"number\">34172</td>\n        <td class=\"number\">93.6%</td>\n        <td>\n            <div class=\"bar\" style=\"width:100%\">&nbsp;</div>\n        </td>\n</tr><tr class=\"\">\n        <td class=\"fillremaining\">(437.44, 875.88]</td>\n        <td class=\"number\">1914</td>\n        <td class=\"number\">5.2%</td>\n        <td>\n            <div class=\"bar\" style=\"width:6%\">&nbsp;</div>\n        </td>\n</tr><tr class=\"\">\n        <td class=\"fillremaining\">(875.88, 1314.32]</td>\n        <td class=\"number\">295</td>\n        <td class=\"number\">0.8%</td>\n        <td>\n            <div class=\"bar\" style=\"width:1%\">&nbsp;</div>\n        </td>\n</tr><tr class=\"\">\n        <td 
class=\"fillremaining\">(1314.32, 1752.76]</td>\n        <td class=\"number\">76</td>\n        <td class=\"number\">0.2%</td>\n        <td>\n            <div class=\"bar\" style=\"width:1%\">&nbsp;</div>\n        </td>\n</tr><tr class=\"\">\n        <td class=\"fillremaining\">(1752.76, 2191.2]</td>\n        <td class=\"number\">27</td>\n        <td class=\"number\">0.1%</td>\n        <td>\n            <div class=\"bar\" style=\"width:1%\">&nbsp;</div>\n        </td>\n</tr><tr class=\"\">\n        <td class=\"fillremaining\">(2191.2, 2629.64]</td>\n        <td class=\"number\">11</td>\n        <td class=\"number\">0.0%</td>\n        <td>\n            <div class=\"bar\" style=\"width:1%\">&nbsp;</div>\n        </td>\n</tr><tr class=\"\">\n        <td class=\"fillremaining\">(2629.64, 3068.08]</td>\n        <td class=\"number\">11</td>\n        <td class=\"number\">0.0%</td>\n        <td>\n            <div class=\"bar\" style=\"width:1%\">&nbsp;</div>\n        </td>\n</tr><tr class=\"\">\n        <td class=\"fillremaining\">(3068.08, 3506.52]</td>\n        <td class=\"number\">4</td>\n        <td class=\"number\">0.0%</td>\n        <td>\n            <div class=\"bar\" style=\"width:1%\">&nbsp;</div>\n        </td>\n</tr><tr class=\"\">\n        <td class=\"fillremaining\">(3506.52, 3944.96]</td>\n        <td class=\"number\">3</td>\n        <td class=\"number\">0.0%</td>\n        <td>\n            <div class=\"bar\" style=\"width:1%\">&nbsp;</div>\n        </td>\n</tr><tr class=\"\">\n        <td class=\"fillremaining\">(10521.56, 10960.0]</td>\n        <td class=\"number\">1</td>\n        <td class=\"number\">0.0%</td>\n        <td>\n            <div class=\"bar\" style=\"width:1%\">&nbsp;</div>\n        </td>\n</tr>\n</table>\n</div>\n</div><div class=\"row variablerow\">\n    <div class=\"col-md-3 namecol\">\n        <p class=\"h4\">publication_year<br/>\n            <small>Categorical</small>\n        </p>\n    </div><div class=\"col-md-3\">\n    <table class=\"stats \">\n        <tr class=\"alert\">\n            <th>Distinct count</th>\n            <td>202</td>\n        </tr>\n        <tr>\n            <th>Unique (%)</th>\n            <td>0.6%</td>\n        </tr>\n        <tr class=\"ignore\">\n            <th>Missing (%)</th>\n            <td>0.0%</td>\n        </tr>\n        <tr class=\"ignore\">\n            <th>Missing (n)</th>\n            <td>0</td>\n        </tr>\n    </table>\n</div>\n<div class=\"col-md-6 collapse in\" id=\"minifreqtable-1127510253279857554\">\n    <table class=\"mini freq\">\n        <tr class=\"\">\n    <th>2100</th>\n    <td>\n        <div class=\"bar\" style=\"width:22%\" data-toggle=\"tooltip\" data-placement=\"right\" data-html=\"true\"\n             data-delay=500 title=\"Percentage: 15.9%\">\n            5816\n        </div>\n        \n    </td>\n</tr><tr class=\"\">\n    <th>2013</th>\n    <td>\n        <div class=\"bar\" style=\"width:7%\" data-toggle=\"tooltip\" data-placement=\"right\" data-html=\"true\"\n             data-delay=500 title=\"Percentage: 4.7%\">\n            &nbsp;\n        </div>\n        1719\n    </td>\n</tr><tr class=\"\">\n    <th>2014</th>\n    <td>\n        <div class=\"bar\" style=\"width:7%\" data-toggle=\"tooltip\" data-placement=\"right\" data-html=\"true\"\n             data-delay=500 title=\"Percentage: 4.6%\">\n            &nbsp;\n        </div>\n        1669\n    </td>\n</tr><tr class=\"other\">\n    <th>Other values (199)</th>\n    <td>\n        <div class=\"bar\" style=\"width:100%\" data-toggle=\"tooltip\" 
data-placement=\"right\" data-html=\"true\"\n             data-delay=500 title=\"Percentage: 74.8%\">\n            27310\n        </div>\n        \n    </td>\n</tr>\n    </table>\n</div>\n<div class=\"col-md-12 text-right\">\n    <a role=\"button\" data-toggle=\"collapse\" data-target=\"#freqtable-1127510253279857554, #minifreqtable-1127510253279857554\"\n       aria-expanded=\"true\" aria-controls=\"collapseExample\">\n        Toggle details\n    </a>\n</div>\n<div class=\"col-md-12 extrapadding collapse\" id=\"freqtable-1127510253279857554\">\n    \n<table class=\"freq table table-hover\">\n    <thead>\n    <tr>\n        <td class=\"fillremaining\">Value</td>\n        <td class=\"number\">Count</td>\n        <td class=\"number\">Frequency (%)</td>\n        <td style=\"min-width:200px\">&nbsp;</td>\n    </tr>\n    </thead>\n    <tr class=\"\">\n        <td class=\"fillremaining\">2100</td>\n        <td class=\"number\">5816</td>\n        <td class=\"number\">15.9%</td>\n        <td>\n            <div class=\"bar\" style=\"width:34%\">&nbsp;</div>\n        </td>\n</tr><tr class=\"\">\n        <td class=\"fillremaining\">2013</td>\n        <td class=\"number\">1719</td>\n        <td class=\"number\">4.7%</td>\n        <td>\n            <div class=\"bar\" style=\"width:10%\">&nbsp;</div>\n        </td>\n</tr><tr class=\"\">\n        <td class=\"fillremaining\">2014</td>\n        <td class=\"number\">1669</td>\n        <td class=\"number\">4.6%</td>\n        <td>\n            <div class=\"bar\" style=\"width:10%\">&nbsp;</div>\n        </td>\n</tr><tr class=\"\">\n        <td class=\"fillremaining\">2015</td>\n        <td class=\"number\">1570</td>\n        <td class=\"number\">4.3%</td>\n        <td>\n            <div class=\"bar\" style=\"width:9%\">&nbsp;</div>\n        </td>\n</tr><tr class=\"\">\n        <td class=\"fillremaining\">2012</td>\n        <td class=\"number\">1545</td>\n        <td class=\"number\">4.2%</td>\n        <td>\n            <div class=\"bar\" style=\"width:9%\">&nbsp;</div>\n        </td>\n</tr><tr class=\"\">\n        <td class=\"fillremaining\">2009</td>\n        <td class=\"number\">1440</td>\n        <td class=\"number\">3.9%</td>\n        <td>\n            <div class=\"bar\" style=\"width:9%\">&nbsp;</div>\n        </td>\n</tr><tr class=\"\">\n        <td class=\"fillremaining\">2011</td>\n        <td class=\"number\">1393</td>\n        <td class=\"number\">3.8%</td>\n        <td>\n            <div class=\"bar\" style=\"width:8%\">&nbsp;</div>\n        </td>\n</tr><tr class=\"\">\n        <td class=\"fillremaining\">2016</td>\n        <td class=\"number\">1382</td>\n        <td class=\"number\">3.8%</td>\n        <td>\n            <div class=\"bar\" style=\"width:8%\">&nbsp;</div>\n        </td>\n</tr><tr class=\"\">\n        <td class=\"fillremaining\">2010</td>\n        <td class=\"number\">1334</td>\n        <td class=\"number\">3.7%</td>\n        <td>\n            <div class=\"bar\" style=\"width:8%\">&nbsp;</div>\n        </td>\n</tr><tr class=\"\">\n        <td class=\"fillremaining\">2008</td>\n        <td class=\"number\">1332</td>\n        <td class=\"number\">3.6%</td>\n        <td>\n            <div class=\"bar\" style=\"width:8%\">&nbsp;</div>\n        </td>\n</tr><tr class=\"other\">\n        <td class=\"fillremaining\">Other values (192)</td>\n        <td class=\"number\">17314</td>\n        <td class=\"number\">47.4%</td>\n        <td>\n            <div class=\"bar\" style=\"width:100%\">&nbsp;</div>\n        
</td>\n</tr>\n</table>\n</div>\n</div><div class=\"row variablerow\">\n    <div class=\"col-md-3 namecol\">\n        <p class=\"h4\">ratings_count<br/>\n            <small>Categorical</small>\n        </p>\n    </div><div class=\"col-md-3\">\n    <table class=\"stats \">\n        <tr class=\"\">\n            <th>Distinct count</th>\n            <td>25</td>\n        </tr>\n        <tr>\n            <th>Unique (%)</th>\n            <td>0.1%</td>\n        </tr>\n        <tr class=\"ignore\">\n            <th>Missing (%)</th>\n            <td>0.0%</td>\n        </tr>\n        <tr class=\"ignore\">\n            <th>Missing (n)</th>\n            <td>0</td>\n        </tr>\n    </table>\n</div>\n<div class=\"col-md-6 collapse in\" id=\"minifreqtable4614937497148659063\">\n    <table class=\"mini freq\">\n        <tr class=\"\">\n    <th>(-0.001, 2.0]</th>\n    <td>\n        <div class=\"bar\" style=\"width:9%\" data-toggle=\"tooltip\" data-placement=\"right\" data-html=\"true\"\n             data-delay=500 title=\"Percentage: 6.8%\">\n            &nbsp;\n        </div>\n        2496\n    </td>\n</tr><tr class=\"\">\n    <th>(5.0, 7.0]</th>\n    <td>\n        <div class=\"bar\" style=\"width:7%\" data-toggle=\"tooltip\" data-placement=\"right\" data-html=\"true\"\n             data-delay=500 title=\"Percentage: 5.7%\">\n            &nbsp;\n        </div>\n        2081\n    </td>\n</tr><tr class=\"\">\n    <th>(8.0, 10.0]</th>\n    <td>\n        <div class=\"bar\" style=\"width:6%\" data-toggle=\"tooltip\" data-placement=\"right\" data-html=\"true\"\n             data-delay=500 title=\"Percentage: 4.9%\">\n            &nbsp;\n        </div>\n        1781\n    </td>\n</tr><tr class=\"other\">\n    <th>Other values (22)</th>\n    <td>\n        <div class=\"bar\" style=\"width:100%\" data-toggle=\"tooltip\" data-placement=\"right\" data-html=\"true\"\n             data-delay=500 title=\"Percentage: 82.6%\">\n            30156\n        </div>\n        \n    </td>\n</tr>\n    </table>\n</div>\n<div class=\"col-md-12 text-right\">\n    <a role=\"button\" data-toggle=\"collapse\" data-target=\"#freqtable4614937497148659063, #minifreqtable4614937497148659063\"\n       aria-expanded=\"true\" aria-controls=\"collapseExample\">\n        Toggle details\n    </a>\n</div>\n<div class=\"col-md-12 extrapadding collapse\" id=\"freqtable4614937497148659063\">\n    \n<table class=\"freq table table-hover\">\n    <thead>\n    <tr>\n        <td class=\"fillremaining\">Value</td>\n        <td class=\"number\">Count</td>\n        <td class=\"number\">Frequency (%)</td>\n        <td style=\"min-width:200px\">&nbsp;</td>\n    </tr>\n    </thead>\n    <tr class=\"\">\n        <td class=\"fillremaining\">(-0.001, 2.0]</td>\n        <td class=\"number\">2496</td>\n        <td class=\"number\">6.8%</td>\n        <td>\n            <div class=\"bar\" style=\"width:13%\">&nbsp;</div>\n        </td>\n</tr><tr class=\"\">\n        <td class=\"fillremaining\">(5.0, 7.0]</td>\n        <td class=\"number\">2081</td>\n        <td class=\"number\">5.7%</td>\n        <td>\n            <div class=\"bar\" style=\"width:11%\">&nbsp;</div>\n        </td>\n</tr><tr class=\"\">\n        <td class=\"fillremaining\">(8.0, 10.0]</td>\n        <td class=\"number\">1781</td>\n        <td class=\"number\">4.9%</td>\n        <td>\n            <div class=\"bar\" style=\"width:9%\">&nbsp;</div>\n        </td>\n</tr><tr class=\"\">\n        <td class=\"fillremaining\">(10.0, 12.0]</td>\n        <td class=\"number\">1570</td>\n        <td 
class=\"number\">4.3%</td>\n        <td>\n            <div class=\"bar\" style=\"width:8%\">&nbsp;</div>\n        </td>\n</tr><tr class=\"\">\n        <td class=\"fillremaining\">(21.0, 25.0]</td>\n        <td class=\"number\">1548</td>\n        <td class=\"number\">4.2%</td>\n        <td>\n            <div class=\"bar\" style=\"width:8%\">&nbsp;</div>\n        </td>\n</tr><tr class=\"\">\n        <td class=\"fillremaining\">(40.0, 49.0]</td>\n        <td class=\"number\">1522</td>\n        <td class=\"number\">4.2%</td>\n        <td>\n            <div class=\"bar\" style=\"width:8%\">&nbsp;</div>\n        </td>\n</tr><tr class=\"\">\n        <td class=\"fillremaining\">(18.0, 21.0]</td>\n        <td class=\"number\">1515</td>\n        <td class=\"number\">4.1%</td>\n        <td>\n            <div class=\"bar\" style=\"width:8%\">&nbsp;</div>\n        </td>\n</tr><tr class=\"\">\n        <td class=\"fillremaining\">(179.0, 285.0]</td>\n        <td class=\"number\">1464</td>\n        <td class=\"number\">4.0%</td>\n        <td>\n            <div class=\"bar\" style=\"width:8%\">&nbsp;</div>\n        </td>\n</tr><tr class=\"\">\n        <td class=\"fillremaining\">(614.0, 1029527.0]</td>\n        <td class=\"number\">1460</td>\n        <td class=\"number\">4.0%</td>\n        <td>\n            <div class=\"bar\" style=\"width:8%\">&nbsp;</div>\n        </td>\n</tr><tr class=\"\">\n        <td class=\"fillremaining\">(285.0, 614.0]</td>\n        <td class=\"number\">1455</td>\n        <td class=\"number\">4.0%</td>\n        <td>\n            <div class=\"bar\" style=\"width:8%\">&nbsp;</div>\n        </td>\n</tr><tr class=\"other\">\n        <td class=\"fillremaining\">Other values (15)</td>\n        <td class=\"number\">19622</td>\n        <td class=\"number\">53.7%</td>\n        <td>\n            <div class=\"bar\" style=\"width:100%\">&nbsp;</div>\n        </td>\n</tr>\n</table>\n</div>\n</div>\n    <div class=\"row headerrow highlight\">\n        <h1>Sample</h1>\n    </div>\n    <div class=\"row variablerow\">\n    <div class=\"col-md-12\" style=\"overflow:scroll; width: 100%%; overflow-y: hidden;\">\n        <table border=\"1\" class=\"dataframe sample\">\n  <thead>\n    <tr style=\"text-align: right;\">\n      <th></th>\n      <th>average_rating</th>\n      <th>is_ebook</th>\n      <th>num_pages</th>\n      <th>publication_year</th>\n      <th>ratings_count</th>\n    </tr>\n  </thead>\n  <tbody>\n    <tr>\n      <th>0</th>\n      <td>4.0</td>\n      <td>0.0</td>\n      <td>(-11.961, 437.44]</td>\n      <td>1887</td>\n      <td>(2.0, 3.0]</td>\n    </tr>\n    <tr>\n      <th>1</th>\n      <td>4.0</td>\n      <td>0.0</td>\n      <td>(-11.961, 437.44]</td>\n      <td>2015</td>\n      <td>(34.0, 40.0]</td>\n    </tr>\n    <tr>\n      <th>2</th>\n      <td>4.5</td>\n      <td>0.0</td>\n      <td>(-11.961, 437.44]</td>\n      <td>2008</td>\n      <td>(40.0, 49.0]</td>\n    </tr>\n    <tr>\n      <th>3</th>\n      <td>3.5</td>\n      <td>0.0</td>\n      <td>(-11.961, 437.44]</td>\n      <td>1964</td>\n      <td>(94.0, 125.0]</td>\n    </tr>\n    <tr>\n      <th>4</th>\n      <td>5.0</td>\n      <td>0.0</td>\n      <td>(-11.961, 437.44]</td>\n      <td>2015</td>\n      <td>(8.0, 10.0]</td>\n    </tr>\n  </tbody>\n</table>\n    </div>\n</div>\n</div>\n</body>\n</html>"
  },
  {
    "path": "recommender/results/profiler_interactions.html",
    "content": "<!doctype html>\n\n<html lang=\"en\">\n<head>\n  <meta charset=\"utf-8\">\n\n  <title>Profile report</title>\n  <meta name=\"description\" content=\"Profile report generated by pandas-profiling. See GitHub.\">\n  <meta name=\"author\" content=\"pandas-profiling\">\n    <script src=\"https://ajax.googleapis.com/ajax/libs/jquery/1.11.3/jquery.min.js\"></script>\n\n    <link rel=\"stylesheet\" href=\"https://maxcdn.bootstrapcdn.com/bootstrap/3.3.6/css/bootstrap.min.css\"\n          integrity=\"sha384-1q8mTJOASx8j1Au+a5WDVnPi2lkFfwwEAa8hDDdjZlpLegxhjVME1fgjWPGmkzs7\" crossorigin=\"anonymous\">\n    <link rel=\"stylesheet\" href=\"https://maxcdn.bootstrapcdn.com/bootstrap/3.3.6/css/bootstrap-theme.min.css\"\n          integrity=\"sha384-fLW2N01lMqjakBkx3l/M9EahuwpSfeNvV63J5ezn3uZzapT0u7EYsXMjQV+0En5r\" crossorigin=\"anonymous\">\n    <script src=\"https://maxcdn.bootstrapcdn.com/bootstrap/3.3.6/js/bootstrap.min.js\" integrity=\"sha384-0mSbJDEHialfmuBBQP6A4Qrprq5OVfW37PRR3j5ELqxss1yVqOtnepnHVP9aJ7xS\" crossorigin=\"anonymous\"></script>\n    <script>\n       $(function () {\n              $('[data-toggle=\"tooltip\"]').tooltip()\n        })\n    </script>\n</head>\n\n<body>\n    <meta charset=\"UTF-8\">\n\n<style>\n\n        .variablerow {\n            border: 1px solid #e1e1e8;\n            border-top: hidden;\n            padding-top: 2em;\n            padding-bottom: 2em;\n            padding-left: 1em;\n            padding-right: 1em;\n        }\n\n        .headerrow {\n            border: 1px solid #e1e1e8;\n            background-color: #f5f5f5;\n            padding: 2em;\n        }\n        .namecol {\n            margin-top: -1em;\n            overflow-x: auto;\n        }\n\n        .dl-horizontal dt {\n            text-align: left;\n            padding-right: 1em;\n            white-space: normal;\n        }\n\n        .dl-horizontal dd {\n            margin-left: 0;\n        }\n\n        .ignore {\n            opacity: 0.4;\n        }\n\n        .container.pandas-profiling {\n            max-width:975px;\n        }\n\n        .col-md-12 {\n            padding-left: 2em;\n        }\n\n        .indent {\n            margin-left: 1em;\n        }\n\n        /* Table example_values */\n            table.example_values {\n                border: 0;\n            }\n\n            .example_values th {\n                border: 0;\n                padding: 0 ;\n                color: #555;\n                font-weight: 600;\n            }\n\n            .example_values tr, .example_values td{\n                border: 0;\n                padding: 0;\n                color: #555;\n            }\n\n        /* STATS */\n            table.stats {\n                border: 0;\n            }\n\n            .stats th {\n                border: 0;\n                padding: 0 2em 0 0;\n                color: #555;\n                font-weight: 600;\n            }\n\n            .stats tr {\n                border: 0;\n            }\n\n            .stats tr:hover{\n                text-decoration: underline;\n            }\n\n            .stats td{\n                color: #555;\n                padding: 1px;\n                border: 0;\n            }\n\n\n        /* Sample table */\n            table.sample {\n                border: 0;\n                margin-bottom: 2em;\n                margin-left:1em;\n            }\n            .sample tr {\n                border:0;\n            }\n            .sample td, .sample th{\n                padding: 0.5em;\n                white-space: 
nowrap;\n                border: none;\n\n            }\n\n            .sample thead {\n                border-top: 0;\n                border-bottom: 2px solid #ddd;\n            }\n\n            .sample td {\n                width:100%;\n            }\n\n\n        /* There is no good solution available to make the divs equal height and then center ... */\n            .histogram {\n                margin-top: 3em;\n            }\n        /* Freq table */\n\n            table.freq {\n                margin-bottom: 2em;\n                border: 0;\n            }\n            table.freq th, table.freq tr, table.freq td {\n                border: 0;\n                padding: 0;\n            }\n\n            .freq thead {\n                font-weight: 600;\n                white-space: nowrap;\n                overflow: hidden;\n                text-overflow: ellipsis;\n\n            }\n\n            td.fillremaining{\n                width:auto;\n                max-width: none;\n            }\n\n            td.number, th.number {\n                text-align:right ;\n            }\n\n        /* Freq mini */\n            .freq.mini td{\n                width: 50%;\n                padding: 1px;\n                font-size: 12px;\n\n            }\n            table.freq.mini {\n                 width:100%;\n            }\n            .freq.mini th {\n                overflow: hidden;\n                text-overflow: ellipsis;\n                white-space: nowrap;\n                max-width: 5em;\n                font-weight: 400;\n                text-align:right;\n                padding-right: 0.5em;\n            }\n\n            .missing {\n                color: #a94442;\n            }\n            .alert, .alert > th, .alert > td {\n                color: #a94442;\n            }\n\n\n        /* Bars in tables */\n            .freq .bar{\n                float: left;\n                width: 0;\n                height: 100%;\n                line-height: 20px;\n                color: #fff;\n                text-align: center;\n                background-color: #337ab7;\n                border-radius: 3px;\n                margin-right: 4px;\n            }\n            .other .bar {\n                background-color: #999;\n            }\n            .missing .bar{\n                background-color: #a94442;\n            }\n            .tooltip-inner {\n                width: 100%;\n                white-space: nowrap;\n                text-align:left;\n            }\n\n            .extrapadding{\n                padding: 2em;\n            }\n\n</style>\n\n<div class=\"container pandas-profiling\">\n    <div class=\"row headerrow highlight\">\n        <h1>Overview</h1>\n    </div>\n    <div class=\"row variablerow\">\n    <div class=\"col-md-6 namecol\">\n        <p class=\"h4\">Dataset info</p>\n        <table class=\"stats\" style=\"margin-left: 1em;\">\n            <tbody>\n            <tr>\n                <th>Number of variables</th>\n                <td>2 </td>\n            </tr>\n            <tr>\n                <th>Number of observations</th>\n                <td>2734350 </td>\n            </tr>\n            <tr>\n                <th>Total Missing (%)</th>\n                <td>0.0% </td>\n            </tr>\n            <tr>\n                <th>Total size in memory</th>\n                <td>41.7 MiB </td>\n            </tr>\n            <tr>\n                <th>Average record size in memory</th>\n                <td>16.0 B </td>\n            </tr>\n            </tbody>\n        
</table>\n    </div>\n    <div class=\"col-md-6 namecol\">\n        <p class=\"h4\">Variables types</p>\n        <table class=\"stats\" style=\"margin-left: 1em;\">\n            <tbody>\n            <tr>\n                <th>Numeric</th>\n                <td>1 </td>\n            </tr>\n            <tr>\n                <th>Categorical</th>\n                <td>1 </td>\n            </tr>\n            <tr>\n                <th>Date</th>\n                <td>0 </td>\n            </tr>\n            <tr>\n                <th>Text (Unique)</th>\n                <td>0 </td>\n            </tr>\n            <tr>\n                <th>Rejected</th>\n                <td>0 </td>\n            </tr>\n            </tbody>\n        </table>\n    </div>\n    <div class=\"col-md-12\" style=\"padding-left: 1em;\">\n        <p class=\"h4\">Warnings</p>\n        <ul class=\"list-unstyled\"><li><code>rating</code> has 1505291 / 55.1% zeros</l><li>Dataset has 2734343 duplicate rows <span class=\"label label-warning\">Warning</span></l> </ul>\n    </div>\n</div>\n    <div class=\"row headerrow highlight\">\n        <h1>Variables</h1>\n    </div>\n    <div class=\"row variablerow\">\n    <div class=\"col-md-3 namecol\">\n        <p class=\"h4\">is_read<br/>\n            <small>Categorical</small>\n        </p>\n    </div><div class=\"col-md-3\">\n    <table class=\"stats \">\n        <tr class=\"\">\n            <th>Distinct count</th>\n            <td>2</td>\n        </tr>\n        <tr>\n            <th>Unique (%)</th>\n            <td>0.0%</td>\n        </tr>\n        <tr class=\"ignore\">\n            <th>Missing (%)</th>\n            <td>0.0%</td>\n        </tr>\n        <tr class=\"ignore\">\n            <th>Missing (n)</th>\n            <td>0</td>\n        </tr>\n    </table>\n</div>\n<div class=\"col-md-6 collapse in\" id=\"minifreqtable-2446476274456877104\">\n    <table class=\"mini freq\">\n        <tr class=\"\">\n    <th>false</th>\n    <td>\n        <div class=\"bar\" style=\"width:100%\" data-toggle=\"tooltip\" data-placement=\"right\" data-html=\"true\"\n             data-delay=500 title=\"Percentage: 52.0%\">\n            1420740\n        </div>\n        \n    </td>\n</tr><tr class=\"\">\n    <th>true</th>\n    <td>\n        <div class=\"bar\" style=\"width:92%\" data-toggle=\"tooltip\" data-placement=\"right\" data-html=\"true\"\n             data-delay=500 title=\"Percentage: 48.0%\">\n            1313610\n        </div>\n        \n    </td>\n</tr>\n    </table>\n</div>\n<div class=\"col-md-12 text-right\">\n    <a role=\"button\" data-toggle=\"collapse\" data-target=\"#freqtable-2446476274456877104, #minifreqtable-2446476274456877104\"\n       aria-expanded=\"true\" aria-controls=\"collapseExample\">\n        Toggle details\n    </a>\n</div>\n<div class=\"col-md-12 extrapadding collapse\" id=\"freqtable-2446476274456877104\">\n    \n<table class=\"freq table table-hover\">\n    <thead>\n    <tr>\n        <td class=\"fillremaining\">Value</td>\n        <td class=\"number\">Count</td>\n        <td class=\"number\">Frequency (%)</td>\n        <td style=\"min-width:200px\">&nbsp;</td>\n    </tr>\n    </thead>\n    <tr class=\"\">\n        <td class=\"fillremaining\">false</td>\n        <td class=\"number\">1420740</td>\n        <td class=\"number\">52.0%</td>\n        <td>\n            <div class=\"bar\" style=\"width:100%\">&nbsp;</div>\n        </td>\n</tr><tr class=\"\">\n        <td class=\"fillremaining\">true</td>\n        <td class=\"number\">1313610</td>\n        <td 
class=\"number\">48.0%</td>\n        <td>\n            <div class=\"bar\" style=\"width:92%\">&nbsp;</div>\n        </td>\n</tr>\n</table>\n</div>\n</div><div class=\"row variablerow\">\n    <div class=\"col-md-3 namecol\">\n        <p class=\"h4\">rating<br/>\n            <small>Numeric</small>\n        </p>\n    </div><div class=\"col-md-6\">\n    <div class=\"row\">\n        <div class=\"col-sm-6\">\n            <table class=\"stats \">\n                <tr>\n                    <th>Distinct count</th>\n                    <td>6</td>\n                </tr>\n                <tr>\n                    <th>Unique (%)</th>\n                    <td>0.0%</td>\n                </tr>\n                <tr class=\"ignore\">\n                    <th>Missing (%)</th>\n                    <td>0.0%</td>\n                </tr>\n                <tr class=\"ignore\">\n                    <th>Missing (n)</th>\n                    <td>0</td>\n                </tr>\n                <tr class=\"ignore\">\n                    <th>Infinite (%)</th>\n                    <td>0.0%</td>\n                </tr>\n                <tr class=\"ignore\">\n                    <th>Infinite (n)</th>\n                    <td>0</td>\n                </tr>\n            </table>\n\n        </div>\n        <div class=\"col-sm-6\">\n            <table class=\"stats \">\n\n                <tr>\n                    <th>Mean</th>\n                    <td>1.8248</td>\n                </tr>\n                <tr>\n                    <th>Minimum</th>\n                    <td>0</td>\n                </tr>\n                <tr>\n                    <th>Maximum</th>\n                    <td>5</td>\n                </tr>\n                <tr class=\"alert\">\n                    <th>Zeros (%)</th>\n                    <td>55.1%</td>\n                </tr>\n            </table>\n        </div>\n    </div>\n</div>\n<div class=\"col-md-3 collapse in\" id=\"minihistogram1908170423746513921\">\n    <img src=\"data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAAAMgAAABLCAYAAAA1fMjoAAAABHNCSVQICAgIfAhkiAAAAAlwSFlzAAAPYQAAD2EBqD%2BnaQAAADl0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uIDIuMi40LCBodHRwOi8vbWF0cGxvdGxpYi5vcmcv7US4rQAAASFJREFUeJzt3cEJwjAYgFEVR3IId/LsTg7hTvEu8qGCNMT37oUc%2BvE3bSH7McbYAS8dtl4AzOy49QKenS63j6%2B5X88/WAmYIJAEAkEgEAQCQSAQBAJBIBAEAkEgEAQCQSAQBAJBIBAEAkEgEAQCQSAQBAJBIBAEAkEgEAQCQSAQBAJBIBAEAkEgEAQCQSAQBAJhuuMPWMcKR1mYIBAEAsEjFm/55nFpBSYIBIFAEAgEgUCwSV/Apxvo2b41zMwEgSAQCAKBIBAIAoHgLdYPrfA367/720DcvLxjP8YYWy8CZmUPAkEgEAQCQSAQBAJBIBAEAkEgEAQCQSAQBAJBIBAEAkEgEAQCQSAQBAJBIBAEAkEgEAQCQSAQBAJBIBAEAuEBWGQZAJHGWO4AAAAASUVORK5CYII%3D\">\n\n</div>\n<div class=\"col-md-12 text-right\">\n    <a role=\"button\" data-toggle=\"collapse\" data-target=\"#descriptives1908170423746513921,#minihistogram1908170423746513921\"\n       aria-expanded=\"false\" aria-controls=\"collapseExample\">\n        Toggle details\n    </a>\n</div>\n<div class=\"row collapse col-md-12\" id=\"descriptives1908170423746513921\">\n    <ul class=\"nav nav-tabs\" role=\"tablist\">\n        <li role=\"presentation\" class=\"active\"><a href=\"#quantiles1908170423746513921\"\n                                                  aria-controls=\"quantiles1908170423746513921\" role=\"tab\"\n                                                  data-toggle=\"tab\">Statistics</a></li>\n        <li role=\"presentation\"><a href=\"#histogram1908170423746513921\" aria-controls=\"histogram1908170423746513921\"\n                                   role=\"tab\" data-toggle=\"tab\">Histogram</a></li>\n 
       <li role=\"presentation\"><a href=\"#common1908170423746513921\" aria-controls=\"common1908170423746513921\"\n                                   role=\"tab\" data-toggle=\"tab\">Common Values</a></li>\n        <li role=\"presentation\"><a href=\"#extreme1908170423746513921\" aria-controls=\"extreme1908170423746513921\"\n                                   role=\"tab\" data-toggle=\"tab\">Extreme Values</a></li>\n\n    </ul>\n\n    <div class=\"tab-content\">\n        <div role=\"tabpanel\" class=\"tab-pane active row\" id=\"quantiles1908170423746513921\">\n            <div class=\"col-md-4 col-md-offset-1\">\n                <p class=\"h4\">Quantile statistics</p>\n                <table class=\"stats indent\">\n                    <tr>\n                        <th>Minimum</th>\n                        <td>0</td>\n                    </tr>\n                    <tr>\n                        <th>5-th percentile</th>\n                        <td>0</td>\n                    </tr>\n                    <tr>\n                        <th>Q1</th>\n                        <td>0</td>\n                    </tr>\n                    <tr>\n                        <th>Median</th>\n                        <td>0</td>\n                    </tr>\n                    <tr>\n                        <th>Q3</th>\n                        <td>4</td>\n                    </tr>\n                    <tr>\n                        <th>95-th percentile</th>\n                        <td>5</td>\n                    </tr>\n                    <tr>\n                        <th>Maximum</th>\n                        <td>5</td>\n                    </tr>\n                    <tr>\n                        <th>Range</th>\n                        <td>5</td>\n                    </tr>\n                    <tr>\n                        <th>Interquartile range</th>\n                        <td>4</td>\n                    </tr>\n                </table>\n            </div>\n            <div class=\"col-md-4 col-md-offset-2\">\n                <p class=\"h4\">Descriptive statistics</p>\n                <table class=\"stats indent\">\n                    <tr>\n                        <th>Standard deviation</th>\n                        <td>2.1232</td>\n                    </tr>\n                    <tr>\n                        <th>Coef of variation</th>\n                        <td>1.1635</td>\n                    </tr>\n                    <tr>\n                        <th>Kurtosis</th>\n                        <td>-1.6116</td>\n                    </tr>\n                    <tr>\n                        <th>Mean</th>\n                        <td>1.8248</td>\n                    </tr>\n                    <tr>\n                        <th>MAD</th>\n                        <td>2.0215</td>\n                    </tr>\n                    <tr class=\"\">\n                        <th>Skewness</th>\n                        <td>0.4371</td>\n                    </tr>\n                    <tr>\n                        <th>Sum</th>\n                        <td>4989606</td>\n                    </tr>\n                    <tr>\n                        <th>Variance</th>\n                        <td>4.5081</td>\n                    </tr>\n                    <tr>\n                        <th>Memory size</th>\n                        <td>20.9 MiB</td>\n                    </tr>\n                </table>\n            </div>\n        </div>\n        <div role=\"tabpanel\" class=\"tab-pane col-md-8 col-md-offset-2\" 
id=\"histogram1908170423746513921\">\n            <img src=\"data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAAAlgAAAGQCAYAAAByNR6YAAAABHNCSVQICAgIfAhkiAAAAAlwSFlzAAAPYQAAD2EBqD%2BnaQAAADl0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uIDIuMi40LCBodHRwOi8vbWF0cGxvdGxpYi5vcmcv7US4rQAAIABJREFUeJzt3Xt0VOW9//FPkiEBEibJAImelBLKreRSKgqxgQIJIhYEBcMlHotpoYBEcoikBxAjYKnQEkQFFhB7pNJ67JRg5aJcVsql%2BAPbaisOI6hEOGoWkEhmgMQEyOX3h4tpRyIZZO%2BMM75fa/EHz7P383znuyB82LNnT0hTU1OTAAAAYJhQfxcAAAAQbAhYAAAABiNgAQAAGIyABQAAYDACFgAAgMEIWAAAAAYjYAEAABiMgAUAAGAwAhYAAIDBCFgAAAAGI2ABAAAYjIAFAABgMAIWAACAwQhYAAAABiNgAQAAGIyABQAAYDACFgAAgMEIWAAAAAYjYAEAABiMgAUAAGAwAhYAAIDBCFgAAAAGI2ABAAAYjIAFAABgMAIWAACAwQhYAAAABiNgAQAAGIyABQAAYDACFgAAgMEIWAAAAAYjYAEAABiMgAUAAGAwAhYAAIDBCFgAAAAGI2ABAAAYjIAFAABgMAIWAACAwQhYAAAABiNgAQAAGMzi7wK%2BKSorLxi%2BZmhoiGy2SFVV1aixscnw9b/J6K156K256K956K15zOxt584dDF3PV1zBCmChoSEKCQlRaGiIv0sJOvTWPPTWXPTXPPTWPMHYWwIWAACAwQhYAAAABiNgAQAAGIyABQAAYDACFgAAgMEIWAAAAAYjYAEAABiMgAUAAGAwAhYAAIDBCFgAAAAGI2ABAAAYjIAFAABgMAIWAACAwSz%2BLgA35rYFO/1dgs92zB7o7xIAAGgVXMECAAAwGAELAADAYAQsAAAAgxGwAAAADEbAAgAAMBgBCwAAwGAELAAAAIMRsAAAAAxGwAIAADAYAQsAAMBgBCwAAACDEbAAAAAMRsACAAAwGAELAADAYAQsAAAAgxGwAAAADGZqwDpw4IDS09OVn5//pcfU1NRo6NChmjdvnmessbFRK1eu1LBhw9S/f39NmTJFH3/8sWfe7XZr9uzZSk9P16BBg7RgwQLV1dV55o8ePaoHHnhAt956q%2B688049//zzXnu%2B9tprGj16tG655RaNGzdOr7/%2Bus97AwAAtMS0gPXcc89pyZIl6tq16zWPW7Vqlaqrq73GXnzxRW3btk3FxcXau3evEhMTlZubq6amJklSYWGhamtrtX37dm3evFllZWUqKiqSJNXV1Wn69Om6/fbbdeDAAa1cuVLr16/X7t27JX0evubOnauCggK98cYbysnJ0cMPP6zTp0/7tDcAAEBLTAtYERERKikpuWbAOnbsmLZv366xY8d6jdvtduXk5Kh79%2B6KiopSfn6%2BysrKdPjwYX366acqLS1Vfn6%2BbDab4uPjNXPmTG3evFmXL1/Wvn37dPnyZT300ENq3769kpOTNX78eNntdknSpk2bNGTIEA0ZMkQREREaM2aMevXqpa1bt7a4NwAAgC9MC1iTJ09Whw4dvnS%2BqalJixYtUn5%2BvqxWq2e8rq5Ox48fV1JSkmcsKipKXbt2lcPh0NGjRxUWFqbevXt75pOTk/XZZ5/pww8/lNPpVO/evRUWFuaZT0pK0pEjRyRJTqfTa%2B0r8w6Ho8W9AQAAfGHx18Z2u10hISEaN26cVq9e7Rk/d%2B6cmpqaFB0d7XV8dHS0XC6XYmJiFBUVpZCQEK85SXK5XHK73V6BTZJiYmLkdrvV2Ngot9vd7NrHjx9vcW9fVVRUqLKy0mvMYmmvuLg4n9fwRVhYYH1GwWIJnHqv9DbQehwI6K256K956K15grG3fglYZ8%2Be1TPPPKPf/va3XkHp313rnqevcj/Uv%2B/T0vk3er%2BV3W73Co2SlJubq7y8vBtaN9DFxkb6u4TrZrW283cJQYvemov%2BmofemieYeuuXgLVs2TLde%2B%2B9Xm/zXRETE6PQ0FC53W6vcbfbrY4dO8pms6m6uloNDQ2etwGvHHtl/uTJk1ede2Xd2NjYZte22Wwt7u2riRMnKjMz02vMYmkvl6vG5zV8EWhJ3%2BjXb6awsFBZre10/nytGhoa/V1OUKG35qK/5qG35jGzt/76z71fAtbWrVtltVr18ssvS/r8vqvGxkbt3btXf/3rX9WzZ085nU4NGDBAknT%2B/Hl99NFH%2Bt73vqeEhAQ1NTXp2LFjSk5OliQ5HA5ZrVZ169ZNKSkpeumll1RfXy%2BLxeKZ79u3ryQpJSXFcz/WFQ6HQ6NGjVJERMQ19/ZVXFzcVW8HVlZeUH39N/svZCC%2B/oaGxoCsOxDQW3PRX/PQW/MEU2/9cglk//792rZtm7Zs2aItW7Zo0qRJyszM1JYtWyRJ2dnZ2rhxo8rKylRdXa2ioiL16dNHqampstlsGjFihJ5%2B%2BmlVVVXp9OnTWrNmjbKysmSxWDRkyBBFRUVp7dq1qq2t1eHDh1VSUqLs7GxJ0oQJE3Tw4EHt27dPFy9eVElJiU6ePKkxY8a0uDcAAIAvTLuCdSWQ1NfXS5JKS0slfX616KabbvI6NioqSu3atfOMT5o0SZWVlfrxj3%2BsmpoapaWled3T9MQTT2jhwoUaNmyY2rRpo7vvvtvzMNPw8HCtW7dOCxcuVHFxsTp16qT8/HwNHTpUktSrVy8VFRVp6dKlKi8vV48ePbR%2B/Xp17tzZp70BAABaEtLEEzRbRWXlBcPXtFhCNbzogOHrmmXH7IH%2BLsFnFkuoYmMj5XLVBM3l6q8Lemsu%2BmseemseM3vbufOXPzLKTIF1lzQAAEAAIGABAAAYjIAFAABgMAIWAACAwQhYAAAABiNgAQAAGIyABQAAYDACFgAAgMEIWAAAAAYjYAEAABiMgAUAAGAwAhYAAIDBCFgAAAAGI2ABAAAYjIAFAABgMAIWAACAwQhYAAAABiNgAQAAGIyABQAAYDACFgAAgMEIWAAAAAYjYAEAABiMgAUAAGAwAhYAAIDBCFgAAAAGI2ABAAAYjIAFAABgMAIWAACAwQhYAAAABiNgAQAAGMzUgHXgwAGlp6crPz//qrndu3drzJgxuuWWWzRixAj98Y9/9JrfuHGjRowYoX79%2Bik7O1tHjhzxzF28eFGPP/64Bg8erLS0NOXl5cnlcnnmy8vLNW3aNKWlpSkjI0PLly9XY2OjZ/7QoUPKyspSv379NGrUKG3dutXnvQEAAFpiWsB67rnntGTJEnXt2vWquXfeeUcFBQXKy8vT3//%2Bdz366KN64okn9Oabb0qS9uzZo1WrVunXv/61Dh48qIyMDM2YMUOfffaZJGnlypVyOp
2y2%2B3atWuXmpqaNH/%2BfM/6s2bNUnx8vEpLS7VhwwaVlpbqhRdekCRVVFRo5syZmjRpkg4dOqQFCxaosLBQDofDp70BAABaYlrAioiIUElJSbMBy%2B12a/r06brjjjtksVg0ZMgQ9erVyxOw7Ha7xo0bp759%2B6pt27aaOnWqJGnv3r2qr69XSUmJZs6cqZtvvlkxMTGaPXu29u3bpzNnzsjhcOjYsWMqKChQhw4dlJiYqJycHNntdknStm3blJiYqKysLEVERCg9PV2ZmZnatGlTi3sDAAD4wmLWwpMnT/7SucGDB2vw4MGe39fX16uyslLx8fGSJKfTqZEjR3rmQ0ND1adPHzkcDvXp00cXLlxQcnKyZ7579%2B5q27atnE6nKioqlJCQoOjoaM98cnKyTpw4oerqajmdTiUlJXnVk5SUpB07drS496hRo3x67RUVFaqsrPQas1jaKy4uzqfzfRUWFli30FksgVPvld4GWo8DAb01F/01D701TzD21rSAdT2KiorUvn17T7Bxu91eAUmSoqOj5XK55Ha7JUlWq9Vr3mq1eua/OHdlrSvzV4LcFTExMZ57uK61t6/sdrtWr17tNZabm6u8vDyf1whGsbGR/i7hulmt7fxdQtCit%2Baiv%2Baht%2BYJpt76NWA1NTWpqKhI27dv18aNGxUREeE119K5X2XO17puxMSJE5WZmek1ZrG0l8tVc0PrflGgJX2jX7%2BZwsJCZbW20/nztWpoaGz5BPiM3pqL/pqH3prHzN766z/3fgtYjY2Nmj9/vt555x299NJL6tKli2cuNjbWc6XqCrfbrZ49e8pms3l%2BHxn5r6adO3dOHTt2VENDQ7PnhoSEyGazNbu2y%2BXyrHutvX0VFxd31duBlZUXVF//zf4LGYivv6GhMSDrDgT01lz01zz01jzB1Fu/XQJ58skn9cEHH1wVriQpJSVFTqfT8/uGhga9%2B%2B676tu3r7p06aLo6Giv%2Bffff1%2BXLl1SSkqKUlJSdOrUKVVVVXnmHQ6HevToocjISKWmpl712IUjR46ob9%2B%2BLe4NAADgC78ErLfeektbt25VcXGxYmJirprPzs7WK6%2B8orffflu1tbVau3atwsPDNXToUIWFhWnChAlat26dTp06JZfLpaeeekrDhw9Xp06dlJSUpNTUVK1YsULV1dUqKyvThg0blJ2dLUkaPXq0ysvLtWnTJl28eFH79%2B/X/v37NWHChBb3BgAA8IVpbxGmpqZK%2BvwTgpJUWloq6fOrSZs3b9aFCxeUkZHhdU7//v31/PPPa/DgwXrkkUc0e/ZsnT17VqmpqSouLlbbtm0lSXl5eaqpqdE999yj%2Bvp6ZWRkaNGiRZ51nn32WRUWFmrgwIGKiorSpEmTdP/990uSOnbsqPXr12vJkiVavHixEhIStHz5cn33u9%2BVpBb3BgAAaElI043e0Q2fVFZeMHxNiyVUw4sOGL6uWXbMHujvEnxmsYQqNjZSLldN0NwP8HVBb81Ff81Db81jZm87d%2B5g6Hq%2BCqyPoQEAAAQAAhYAAIDBCFgAAAAGI2ABAAAYjIAFAABgMAIWAACAwQhYAAAABiNgAQAAGIyABQAAYDACFgAAgMEIWAAAAAYjYAEAABiMgAUAAGAwAhYAAIDBCFgAAAAGI2ABAAAYjIAFAABgMAIWAACAwQhYAAAABiNgAQAAGIyABQAAYDACFgAAgMEIWAAAAAYjYAEAABiMgAUAAGAwAhYAAIDBCFgAAAAGI2ABAAAYjIAFAABgMFMD1oEDB5Senq78/Pyr5l577TWNHj1at9xyi8aNG6fXX3/dM9fY2KiVK1dq2LBh6t%2B/v6ZMmaKPP/7YM%2B92uzV79mylp6dr0KBBWrBggerq6jzzR48e1QMPPKBbb71Vd955p55//nnD9gYAAGiJaQHrueee05IlS9S1a9er5o4ePaq5c%2BeqoKBAb7zxhnJycvTwww/r9OnTkqQXX3xR27ZtU3Fxsfbu3avExETl5uaqqalJklRYWKja2lpt375dmzdvVllZmYqKiiRJdXV1mj59um6//XYdOHBAK1eu1Pr167V7925D9gYAAGiJaQErIiJCJSUlzQasTZs2aciQIRoyZIgiIiI0ZswY9erVS1u3bpUk2e125eTkqHv37oqKilJ%2Bfr7Kysp0%2BPBhffrppyotLVV%2Bfr5sNpvi4%2BM1c%2BZMbd68WZcvX9a%2Bfft0%2BfJlPfTQQ2rfvr2Sk5M1fvx42e32G94bAADAF6YFrMmTJ6tDhw7NzjmdTiUlJXmNJSUlyeFwqK6uTsePH/eaj4qKUteuXeVwOHT06FGFhYWpd%2B/envnk5GR99tln%2BvDDD%2BV0OtW7d2%2BFhYV5rX3kyJEb3hsAAMAXFn9s6na7FR0d7TUWHR2t48eP69y5c2pqamp23uVyKSYmRlFRUQoJCfGakySXyyW32y2r1ep1bkxMjNxutxobG29ob19VVFSosrLSa8xiaa%2B4uDif1/BFWFhgfUbBYgmceq/0NtB6HAjorbnor3norXmCsbd%2BCViSWryn6VrzX%2BV%2BqH8PZDeyty/sdrtWr17tNZabm6u8vLwbWjfQxcZG%2BruE62a1tvN3CUGL3pqL/pqH3ponmHrrl4AVGxsrt9vtNeZ2u2Wz2RQTE6PQ0NBm5zt27Cibzabq6mo1NDR43ga8cuyV%2BZMnT1517pV1b2RvX02cOFGZmZleYxZLe7lcNT6v4YtAS/pGv34zhYWFymptp/Pna9XQ0OjvcoIKvTUX/TUPvTWPmb3113/u/RKwUlJSPPdEXeFwODRq1ChFRESoZ8%2BecjqdGjBggCTp/Pnz%2Buijj/S9731PCQkJampq0rFjx5ScnOw512q1qlu3bkpJSdFLL72k%2Bvp6WSwWz3zfvn1veG9fxcXFXfV2YGXlBdXXf7P/Qgbi629oaAzIugMBvTUX/TUPvTVPMPXWL5dAJkyYoIMHD2rfvn26ePGiSkpKdPLkSY0ZM0aSlJ2drY0bN6qsrEzV1dUqKipSnz59lJqaKpvNphEjRujpp59WVVWVTp8%2BrTVr1igrK0sWi0VDhgxRVFSU1q5dq9raWh0%2BfFglJSXKzs6%2B4b0BAAB8YdoVrCuBpL6%2BXpJUWloq6fOrRb169VJRUZGWLl2q8vJy9ejRQ%2BvXr1fnzp0lSZMmTVJlZaV%2B/OMfq6amRmlpaV73ND3xxBNauHChhg0bpjZt2ujuu%2B/2PMw0PDxc69at08KFC1VcXKxOnTopPz9fQ4cOlaQb3hsAAKAlIU08QbNVVFZeMHxNiyVUw4sOGL6uWXbMHujvEnxmsYQqNjZSLldN0Fyu/rqgt%2Baiv%2Baht%2BYxs7edOzf/yCizBdZd0gAAAAGAgAUAAGAwAhYAAIDBmg1YmZmZWr16tU6dOtXa9QAAAAS8ZgPWfffdp9dee0133HGHpk6dqt27d3s%2BDQgAAIBrazZg5
ebm6rXXXtMf//hH9ezZU08%2B%2BaSGDBmi5cuX68SJE61dIwAAQEC55j1YycnJmjt3rvbu3atHH31Uf/zjHzVy5EhNmTJF77zzTmvVCAAAEFCuGbAuX76s1157TT/72c80d%2B5cxcfHa/78%2BerTp49ycnK0bdu21qoTAAAgYDT7JPeysjKVlJTolVdeUU1NjUaMGKEXXnhBt956q%2BeY/v37a9GiRRo9enSrFQsAABAImg1Yo0aNUrdu3TR9%2BnTde%2B%2B9iomJueqYIUOGqKqqyvQCAQAAAk2zAWvjxo0aMGBAiycfPnzY8IIAAAACXbP3YPXu3VszZszwfEGzJP32t7/Vz372M7nd7lYrDgAAIBA1G7CWLl2qCxcuqEePHp6xoUOHqrGxUcuWLWu14gAAAAJRs28Rvv7669q2bZtiY2M9Y4mJiSoqKtLdd9/dasUBAAAEomavYNXV1SkiIuLqg0NDVVtba3pRAAAAgazZgNW/f38tW7ZM586d84ydOXNGixcv9npUAwAAAK7W7FuEjz76qH7605/qBz/4gaKiotTY2Kiamhp16dJFv/vd71q7RgAAgIDSbMDq0qWLXn31Vf3lL3/RRx99pNDQUHXr1k2DBg1SWFhYa9cIAAAQUJoNWJIUHh6uO%2B64ozVrAQAACArNBqyPP/5YK1as0AcffKC6urqr5v/85z%2BbXhgAAECg%2BtJ7sCoqKjRo0CC1b9%2B%2BtWsCAAAIaM0GrCNHjujPf/6zbDZba9cDAAAQ8Jp9TEPHjh25cgUAAPAVNRuwpk%2BfrtWrV6upqam16wEAAAh4zb5F%2BJe//EX/%2BMc/9PLLL%2Btb3/qWQkO9c9gf/vCHVikOAAAgEDUbsKKiojR48ODWrgUAACAoNBuwli5d2tp1AAAABI1m78GSpA8//FCrVq3S/PnzPWP//Oc/W6UoAACAQNZswDp06JDGjBmj3bt3a/v27ZI%2Bf/jo5MmTecgoAABAC5oNWCtXrtTPf/5zbdu2TSEhIZI%2B/37CZcuWac2aNa1aIAAAQKBpNmC9//77ys7OliRPwJKku%2B66S2VlZYZt/u6772ry5Mm67bbbNHDgQBUUFKiqqkrS51fRsrKy1K9fP40aNUpbt271Onfjxo0aMWKE%2BvXrp%2BzsbB05csQzd/HiRT3%2B%2BOMaPHiw0tLSlJeXJ5fL5ZkvLy/XtGnTlJaWpoyMDC1fvlyNjY2e%2BZb2BgAAuJZmA1aHDh2a/Q7CiooKhYeHG7JxfX29pk2bpu9///s6ePCgtm/frqqqKi1atEgVFRWaOXOmJk2apEOHDmnBggUqLCyUw%2BGQJO3Zs0erVq3Sr3/9ax08eFAZGRmaMWOGPvvsM0mfX4FzOp2y2%2B3atWuXmpqavO4lmzVrluLj41VaWqoNGzaotLRUL7zwguc1XmtvAACAljQbsPr166cnn3xS1dXVnrETJ05o7ty5%2BsEPfmDIxpWVlaqsrNQ999yj8PBwxcbGavjw4Tp69Ki2bdumxMREZWVlKSIiQunp6crMzNSmTZskSXa7XePGjVPfvn3Vtm1bTZ06VZK0d%2B9e1dfXq6SkRDNnztTNN9%2BsmJgYzZ49W/v27dOZM2fkcDh07NgxFRQUqEOHDkpMTFROTo7sdrsktbg3AABAS5p9TMP8%2BfP14IMPKi0tTQ0NDerXr59qa2vVs2dPLVu2zJCN4%2BPj1adPH9ntdv3Xf/2X6urqtHv3bg0dOlROp1NJSUlexyclJWnHjh2SJKfTqZEjR3rmQkND1adPHzkcDvXp00cXLlxQcnKyZ7579%2B5q27atnE6nKioqlJCQoOjoaM98cnKyTpw4oerq6hb39kVFRYUqKyu9xiyW9oqLi/N5DV%2BEhX3ph0C/liyWwKn3Sm8DrceBgN6ai/6ah96aJxh722zAuummm7R9%2B3bt379fJ06cUNu2bdWtWzcNHDjQ656sGxEaGqpVq1YpJyfH8/bcgAEDNGfOHM2cOVPx8fFex8fExHjuo3K73V4BSZKio6PlcrnkdrslSVar1WvearV65r84d2WtK/PX2tsXdrtdq1ev9hrLzc1VXl6ez2sEo9jYSH%2BXcN2s1nb%2BLiFo0Vtz0V/z0FvzBFNvmw1YktSmTRvdcccdpm186dIlzZgxQ3fddZfn/qnFixeroKDAp/Nb%2Bp7Ea82b/R2LEydOVGZmpteYxdJeLleNofsEWtI3%2BvWbKSwsVFZrO50/X6uGhsaWT4DP6K256K956K15zOytv/5z32zAyszMvOaVKiOehXXo0CF98skneuSRRxQWFqYOHTooLy9P99xzj374wx96rkRd4XK5ZLPZJEmxsbFXzbvdbvXs2dNzjNvtVmTkv5p67tw5dezYUQ0NDc2eGxISIpvN1uza/763L%2BLi4q56O7Cy8oLq67/ZfyED8fU3NDQGZN2BgN6ai/6ah96aJ5h622zAGjlypFfAamho0IkTJ%2BRwOPTggw8asnFDQ4MaGxu9riZdunRJkpSenq4//elPXscfOXJEffv2lSSlpKTI6XRq7NixnrXeffddZWVlqUuXLoqOjpbT6VRCQoKkzx87cenSJaWkpKiiokKnTp1SVVWVJzQ5HA716NFDkZGRSk1N1ebNm790bwAAgJY0G7C%2B7G26Xbt26a9//ashG99yyy1q3769Vq1apRkzZqiurk5r165V//79dc8992j16tXatGmTxowZozfeeEP79%2B/3fNIvOztbjzzyiO6%2B%2B2717t1b//M//6Pw8HANHTpUYWFhmjBhgtatW6fU1FS1bdtWTz31lIYPH65OnTqpU6dOSk1N1YoVKzR//nydOXNGGzZs0E9/%2BlNJ0ujRo/Xss89%2B6d4AAAAtCWm6jhuSGhoalJ6ebljIOnLkiH71q1/p2LFjCg8P14ABAzRv3jzFx8fr73//u5YsWaKysjIlJCRozpw5uvPOOz3n/u///q%2BKi4t19uxZpaamatGiRerVq5ekz6%2BELV26VK%2B%2B%2Bqrq6%2BuVkZGhRYsWqUOHDpKk06dPq7CwUH/7298UFRWlSZMm6eGHH/ZctWtp76%2BisvLCDZ3fHIslVMOLDhi%2Brll2zB7o7xJ8ZrGEKjY2Ui5XTdBcrv66oLfmor/mobfmMbO3nTt3MHQ9X11XwHI4HJoyZYr%2B9re/mVlTUCJgEbDwOXprLvprHnprnmAMWM2%2BRThp0qSrxmpra1VWVnbDV3IAAACCXbMBKzEx8apPEUZERCgrK0vjx49vlcIAAAACVbMBy6intQMAAHwTNRuwXnnlFZ8XuPfeew0rBgAAIBg0G7AWLFhw1TOqJCkkJMRrLCQkhIAFAADwBc0GrN/85jd6/vnnNWPGDPXu3VtNTU1677339Nxzz%2BmBBx5QWlpaa9cJAAAQML70Hqzi4mKvLz2%2B7bbb1KVLF02ZMkXbt29vtQIBAAACTbPfFnzy5ElFR0dfNW61WlVeXm56UQAAAI
Gs2YCVkJCgZcuWyeVyecbOnz%2BvFStW6Nvf/narFQcAABCImn2L8NFHH9WcOXNkt9sVGRmp0NBQVVdXq23btlqzZk1r1wgAABBQmg1YgwYN0r59%2B7R//36dPn1aTU1Nio%2BP1w9/%2BEPP9/kBAACgec0GLElq166dhg0bptOnT6tLly6tWRMAAEBAa/YerLq6Os2dO1e33HKLfvSjH0n6/B6sqVOn6vz5861aIAAAQKBpNmAtX75cR48eVVFRkUJD/3VIQ0ODioqKWq04AACAQNRswNq1a5eeffZZ3XXXXZ4vfbZarVq6dKl2797dqgUCAAAEmmYDVk1NjRITE68at9ls%2Buyzz8yuCQAAIKA1G7C%2B/e1v669//askeX334M6dO/Uf//EfrVMZAABAgGr2U4T333%2B/Zs2apfvuu0%2BNjY3asGGDjhw5ol27dmnBggWtXSMAAEBAaTZgTZw4URaLRb///e8VFhamdevWqVu3bioqKtJdd93V2jUCAAAElGYDVlVVle677z7dd999rV0PAABAwGv2Hqxhw4Z53XsFAAAA3zUbsNLS0rRjx47WrgUAACAoNPsW4c0336xf/vKXKi4u1re//W21adPGa37FihWtUhwAAEAgajZgHT9%2BXN/5znckSS6Xq1ULAgAACHReASs/P18rV67U7373O8/YmjVrlJub2%2BqFAQAABCqve7D27Nlz1QHFxcWtVgwAAEAw8ApYzX1ykE8TAgAAXB%2BvgHXli51bGgMAAMCXa/YxDQAAAPjq/B6w1q5dq0GDBun73/%2B%2BcnJy9Mknn0iSDh06pKysLPXr10%2BjRo3S1q1bvc7buHGjRowYoX5ubQjKAAAZKUlEQVT9%2Bik7O1tHjhzxzF28eFGPP/64Bg8erLS0NOXl5Xl9GrK8vFzTpk1TWlqaMjIytHz5cjU2NnrmW9obAADgWrw%2BRXj58mXNmTNHLY0Z9RysF198UVu3btXGjRsVFxenp59%2BWr/97W81bdo0zZw5UwsWLNDo0aP11ltv6aGHHlK3bt2UmpqqPXv2aNWqVfrNb36j3r17a%2BPGjZoxY4Z2796t9u3ba%2BXKlXI6nbLb7WrXrp0KCws1f/58rVu3TpI0a9YsJScnq7S0VGfPntX06dPVqVMn/eQnP1FFRcU19wYAAGiJ1xWsW2%2B9VRUVFV6/mhszyvPPP6/8/Hx95zvfUVRUlB577DE99thj2rZtmxITE5WVlaWIiAilp6crMzNTmzZtkiTZ7XaNGzdOffv2Vdu2bTV16lRJ0t69e1VfX6%2BSkhLNnDlTN998s2JiYjR79mzt27dPZ86ckcPh0LFjx1RQUKAOHTooMTFROTk5stvtktTi3gAAAC3xuoL178%2B/MtuZM2f0ySef6Ny5cxo5cqTOnj2rtLQ0LVq0SE6nU0lJSV7HJyUleb6%2Bx%2Bl0auTIkZ650NBQ9enTRw6HQ3369NGFCxeUnJzsme/evbvatm0rp9OpiooKJSQkKDo62jOfnJysEydOqLq6usW9AQAAWtLsk9xbw%2BnTpyVJO3fu1IYNG9TU1KS8vDw99thjqqurU3x8vNfxMTExnvuo3G63V0CSpOjoaLlcLrndbkmS1Wr1mrdarZ75L85dWevK/LX29kVFRYUqKyu9xiyW9oqLi/N5DV%2BEhfn9FrrrYrEETr1XehtoPQ4E9NZc9Nc89NY8wdhbvwWsK8/Xmjp1qifQzJo1Sz/72c%2BUnp7u8/lfZd7sZ3vZ7XatXr3aayw3N1d5eXmm7vt1Fxsb6e8SrpvV2s7fJQQtemsu%2BmseemueYOqt3wJWp06dJHlfaUpISFBTU5MuX77suRJ1hcvlks1mkyTFxsZeNe92u9WzZ0/PMW63W5GR//oH/dy5c%2BrYsaMaGhqaPTckJEQ2m63Ztf99b19MnDhRmZmZXmMWS3u5XDU%2Br%2BGLQEv6Rr9%2BM4WFhcpqbafz52vV0NDY8gnwGb01F/01D701j5m99dd/7v0WsG666SZFRUXp6NGjnvulysvL1aZNGw0ZMkRbtmzxOv7IkSPq27evJCklJUVOp1Njx46VJDU0NOjdd99VVlaWunTpoujoaDmdTiUkJEiS3n//fV26dEkpKSmqqKjQqVOnVFVV5QlNDodDPXr0UGRkpFJTU7V58%2BYv3dsXcXFxV70dWFl5QfX13%2By/kIH4%2BhsaGgOy7kBAb81Ff81Db80TTL312yUQi8WirKwsrVu3Tv/3f/%2Bns2fPas2aNRo9erTGjh2r8vJybdq0SRcvXtT%2B/fu1f/9%2BTZgwQZKUnZ2tV155RW%2B//bZqa2u1du1ahYeHa%2BjQoQoLC9OECRO0bt06nTp1Si6XS0899ZSGDx%2BuTp06KSkpSampqVqxYoWqq6tVVlamDRs2KDs7W5I0evToa%2B4NAADQEr9dwZKkOXPm6NKlSxo/frwuX76sESNG6LHHHlNkZKTWr1%2BvJUuWaPHixUpISNDy5cv13e9%2BV5I0ePBgPfLII5o9e7bOnj2r1NRUFRcXq23btpKkvLw81dTU6J577lF9fb0yMjK0aNEiz77PPvusCgsLNXDgQEVFRWnSpEm6//77JUkdO3a85t4AAAAtCWni25xbRWXlBcPXtFhCNbzogOHrmmXH7IH%2BLsFnFkuoYmMj5XLVBM3l6q8Lemsu%2BmseemseM3vbuXMHQ9fzVWDdJQ0AABAACFgAAAAGI2ABAAAYjIAFAABgMAIWAACAwQhYAAAABiNgAQAAGIyABQAAYDACFgAAgMEIWAAAAAYjYAEAABiMgAUAAGAwAhYAAIDBCFgAAAAGs/i7AAAAYI4fPf3//F2Cz9785V3%2BLsFQXMECAAAwGAELAADAYAQsAAAAgxGwAAAADEbAAgAAMBgBCwAAwGAELAAAAIMRsAAAAAxGwAIAADAYAQsAAMBgBCwAAACDEbAAAAAMRsACAAAwGAELAADAYAQsAAAAg30tAtaTTz6p3r17e35/6NAhZWVlqV%2B/fho1apS2bt3qdfzGjRs1YsQI9evXT9nZ2Tpy5Ihn7uLFi3r88cc1ePBgpaWlKS8vTy6XyzNfXl6uadOmKS0tTRkZGVq%2BfLkaGxt93hsAAKAlfg9YR48e1ZYtWzy/r6io0MyZMzVp0iQdOnRICxYsUGFhoRwOhyRpz549WrVqlX7961/r4MGDysjI0IwZM/TZZ59JklauXCmn0ym73a5du3apqalJ8%2BfP96w/a9YsxcfHq7S0VBs2bFBpaaleeOEFn/YGAADwhV8DVmNjoxYuXKicnBzP2LZt25SYmKisrCxFREQoPT1dmZmZ2rRpkyTJbrdr3Lhx6tu3r9q2baupU6dKkvbu3av6%2BnqVlJRo5syZuvnmmxUTE6PZs2dr3759OnPmjBwOh44dO6aCggJ16NBBiYmJysnJkd1u92lvAAAAX1j8ufkf/vAHRUREaPTo0Xr66aclSU6nU0lJSV7HJSUlaceOHZ7
5kSNHeuZCQ0PVp08fORwO9enTRxcuXFBycrJnvnv37mrbtq2cTqcqKiqUkJCg6Ohoz3xycrJOnDih6urqFvf2VUVFhSorK73GLJb2iouLu651WhIW5vcLkNfFYgmceq/0NtB6HAjorbnor3nCwkJ124Kd/i4jqAXTn1u/BaxPP/1Uq1at0u9%2B9zuvcbfbrfj4eK%2BxmJgYz31UbrfbKyBJUnR0tFwul9xutyTJarV6zVutVs/8F%2BeurHVl/lp7%2B8put2v16tVeY7m5ucrLy7uudYJNbGykv0u4blZrO3%2BXELTorbnoLwJRMP259VvAWrp0qcaNG6cePXrok08%2Bua5zm5qavvJ8S%2BcaYeLEicrMzPQas1jay%2BWqMXSfQEv6Rr9%2BM4WFhcpqbafz52vV0NDY8gnwGb01F/01T6D9zA1EZvy59dd/7v0SsA4dOqR//vOf2r59%2B1VzsbGxnitRV7hcLtlsti%2Bdd7vd6tmzp%2BcYt9utyMh/NfTcuXPq2LGjGhoamj03JCRENputxb19FRcXd9XbgZWVF1Rf/83%2BYReIr7%2BhoTEg6w4E9NZc9BeBKJj%2B3Poljm/dulVnz55VRkaG0tLSNG7cOElSWlqaevXq5fXYBUk6cuSI%2BvbtK0lKSUmR0%2Bn0zDU0NOjdd99V37591aVLF0VHR3vNv//%2B%2B7p06ZJSUlKUkpKiU6dOqaqqyjPvcDjUo0cPRUZGKjU19Zp7AwAA%2BMIvAWvevHnatWuXtmzZoi1btqi4uFiStGXLFo0ePVrl5eXatGmTLl68qP3792v//v2aMGGCJCk7O1uvvPKK3n77bdXW1mrt2rUKDw/X0KFDFRYWpgkTJmjdunU6deqUXC6XnnrqKQ0fPlydOnVSUlKSUlNTtWLFClVXV6usrEwbNmxQdna2JLW4NwAAgC/88hZhdHS0143q9fX1kqSbbrpJkrR%2B/XotWbJEixcvVkJCgpYvX67vfve7kqTBgwfrkUce0ezZs3X27FmlpqaquLhYbdu2lSTl5eWppqZG99xzj%2Brr65WRkaFFixZ59nr22WdVWFiogQMHKioqSpMmTdL9998vSerYseM19wYAAPBFSFNr3PUNVVZeMHxNiyVUw4sOGL6uWXbMHujvEnxmsYQqNjZSLldN0NwP8HVBb81Ff80TaD9zA82bv7zLlD%2B3nTt3MHQ9X/GRCAAAAIMRsAAAAAxGwAIAADAYAQsAAMBgBCwAAACDEbAAAAAMRsACAAAwGAELAADAYAQsAAAAgxGwAAAADEbAAgAAMBgBCwAAwGAELAAAAIMRsAAAAAxGwAIAADAYAQsAAMBgBCwAAACDEbAAAAAMRsACAAAwGAELAADAYAQsAAAAgxGwAAAADEbAAgAAMBgBCwAAwGAELAAAAIMRsAAAAAxm8XcBAABj3bZgp79L8NmO2QP9XQJgCq5gAQAAGIyABQAAYDC/Bqzy8nLl5uYqLS1N6enpmjdvns6fPy9JOnr0qB544AHdeuutuvPOO/X88897nfvaa69p9OjRuuWWWzRu3Di9/vrrnrnGxkatXLlSw4YNU//%2B/TVlyhR9/PHHnnm3263Zs2crPT1dgwYN0oIFC1RXV%2BeZb2lvAACAa/FrwJoxY4asVqv27Nmjl19%2BWR988IF%2B9atfqa6uTtOnT9ftt9%2BuAwcOaOXKlVq/fr12794t6fMANHfuXBUUFOiNN95QTk6OHn74YZ0%2BfVqS9OKLL2rbtm0qLi7W3r17lZiYqNzcXDU1NUmSCgsLVVtbq%2B3bt2vz5s0qKytTUVGRJLW4NwAAQEv8FrDOnz%2BvlJQUzZkzR5GRkbrppps0duxYvfnmm9q3b58uX76shx56SO3bt1dycrLGjx8vu90uSdq0aZOGDBmiIUOGKCIiQmPGjFGvXr20detWSZLdbldOTo66d%2B%2BuqKgo5efnq6ysTIcPH9ann36q0tJS5efny2azKT4%2BXjNnztTmzZt1%2BfLlFvcGAABoid8CltVq1dKlS9WpUyfP2KlTpxQXFyen06nevXsrLCzMM5eUlKQjR45IkpxOp5KSkrzWS0pKksPhUF1dnY4fP%2B41HxUVpa5du8rhcOjo0aMKCwtT7969PfPJycn67LPP9OGHH7a4NwAAQEu%2BNo9pcDgc%2Bv3vf6%2B1a9dqx44dslqtXvMxMTFyu91qbGyU2%2B1WdHS013x0dLSOHz%2Buc%2BfOqampqdl5l8ulmJgYRUVFKSQkxGtOklwul9xu9zX3Dg1tOZNWVFSosrLSa8xiaa%2B4uLiWG3EdwsIC6zMKFkvg1Hult4HW40BAb80VaH0NxJ8LME8w9fhrEbDeeustPfTQQ5ozZ47S09O1Y8eOZo/791B05X6qL3Ot%2BZbObWnvltjtdq1evdprLDc3V3l5ede9bzCJjY30dwnXzWpt5%2B8Sgha9hRSYPxdgnmD6ueD3gLVnzx79/Oc/V2Fhoe69915Jks1m08mTJ72Oc7vdiomJUWhoqGJjY%2BV2u6%2Bat9lsnmOam%2B/YsaNsNpuqq6vV0NDgeRvwyrFX5q%2B1ty8mTpyozMxMrzGLpb1crhqfzvdVoCV9o1%2B/mcLCQmW1ttP587VqaGj0dzlBhd6ai58L5gm03gYiM34u%2BCvE%2BzVg/eMf/9DcuXP1zDPPaNCgQZ7xlJQUvfTSS6qvr5fF8nmJDodDffv29cx/8Z4oh8OhUaNGKSIiQj179pTT6dSAAQMkfX5D/UcffaTvfe97SkhIUFNTk44dO6bk5GTPuVarVd26dWtxb1/ExcVd9XZgZeUF1dd/s/8xCcTX39DQGJB1BwJ6Cykwfy7APMH0c8Fvcby%2Bvl6PPfaYCgoKvMKVJA0ZMkRRUVFau3atamtrdfjwYZWUlCg7O1uSNGHCBB08eFD79u3TxYsXVVJSopMnT2rMmDGSpOzsbG3cuFFlZWWqrq5WUVGR%2BvTpo9TUVNlsNo0YMUJPP/20qqqqdPr0aa1Zs0ZZWVmyWCwt7g0AANASv13Bevvtt1VWVqYlS5ZoyZIlXnM7d%2B7UunXrtHDhQhUXF6tTp07Kz8/X0KFDJUm9evVSUVGRli5dqvLycvXo0UPr169X586dJUmTJk1SZWWlfvzjH6umpkZpaWle90Q98cQTWrhwoYYNG6Y2bdro7rvvVn5%2BviQpPDz8mnsDAAC0JKTpq9zxjetWWXnB8DUtllANLzpg%2BLpmCaQvdbVYQhUbGymXqyZoLld/XdBbc/FzwTyB1ttA8%2BYv7zLl50Lnzh0MXc9X3LEHAABgMAIWAACAwQhYAAAABiNgAQAAGIyABQAAYDACFgAAgMEIWAAAAAYjYAEAABiMgAUAAGAwAhYAAIDBCFgAAAAGI2ABAAAYjIAFAABgMAIWAACAwQhYAAAABiNgAQAAGIyABQAAYDACFgAAgMEIWAAAAAYjYAEAABiMgAUAAGAwi78LAPDNc9uCnf4u4b
rsmD3Q3yUACDBcwQIAADAYAQsAAMBgBCwAAACDEbAAAAAMRsACAAAwGAELAADAYAQsAAAAgxGwAAAADMaDRptRXl6uxYsX6/Dhw2rfvr1GjhypOXPmKDSUPPpNwsMwAQBfFQGrGbNmzVJycrJKS0t19uxZTZ8%2BXZ06ddJPfvITf5cGAAACAJdkvsDhcOjYsWMqKChQhw4dlJiYqJycHNntdn%2BXBgAAAgRXsL7A6XQqISFB0dHRnrHk5GSdOHFC1dXVioqKanGNiooKVVZWeo1ZLO0VFxdnaK1hYYGVjy2WwKk30HorBU5/6a25Aq2/9Bb/Lph6TMD6ArfbLavV6jV2JWy5XC6fApbdbtfq1au9xh5%2B%2BGHNmjXLuEL1eZB78KYPNHHiRMPD2zcdvTUPvTUX/TUPvTVPRUWFVq1aFVS9DZ6oaKCmpqYbOn/ixIl6%2BeWXvX5NnDjRoOr%2BpbKyUqtXr77qahluHL01D701F/01D701TzD2litYX2Cz2eR2u73G3G63QkJCZLPZfFojLi4uaBI4AAC4flzB%2BoKUlBSdOnVKVVVVnjGHw6EePXooMjLSj5UBAIBAQcD6gqSkJKWmpmrFihWqrq5WWVmZNmzYoOzsbH%2BXBgAAAkTYokWLFvm7iK%2BbH/7wh9q%2Bfbt%2B8Ytf6NVXX1VWVpamTJmikJAQf5d2lcjISA0YMICrayagt%2Baht%2Baiv%2Baht%2BYJtt6GNN3oHd0AAADwwluEAAAABiNgAQAAGIyABQAAYDACFgAAgMEIWAAAAAYjYAEAABiMgAUAAGAwAhYAAIDBCFgAAAAGI2AFoPLyck2bNk1paWnKyMjQ8uXL1djY6O%2BygsaBAweUnp6u/Px8f5cSdMrLy5Wbm6u0tDSlp6dr3rx5On/%2BvL/LCgrHjh3Tgw8%2BqFtvvVXp6emaPXu2Kisr/V1W0HnyySfVu3dvf5cRVHr37q2UlBSlpqZ6fv3iF7/wd1k3jIAVgGbNmqX4%2BHiVlpZqw4YNKi0t1QsvvODvsoLCc889pyVLlqhr167%2BLiUozZgxQ1arVXv27NHLL7%2BsDz74QL/61a/8XVbAu3Tpkn76059qwIABOnTokLZv366zZ8%2BKr5o11tGjR7VlyxZ/lxGUdu7cKYfD4flVWFjo75JuGAErwDgcDh07dkwFBQXq0KGDEhMTlZOTI7vd7u/SgkJERIRKSkoIWCY4f/68UlJSNGfOHEVGRuqmm27S2LFj9eabb/q7tIBXW1ur/Px8TZ8%2BXeHh4bLZbBo%2BfLg%2B%2BOADf5cWNBobG7Vw4ULl5OT4uxQECAJWgHE6nUpISFB0dLRnLDk5WSdOnFB1dbUfKwsOkydPVocOHfxdRlCyWq1aunSpOnXq5Bk7deqU4uLi/FhVcIiOjtb48eNlsVgkSR9%2B%2BKH%2B9Kc/6Uc/%2BpGfKwsef/jDHxQREaHRo0f7u5SgtGLFCg0dOlS33XabCgsLVVNT4%2B%2BSbhgBK8C43W5ZrVavsSthy%2BVy%2BaMk4CtxOBz6/e9/r4ceesjfpQSN8vJypaSkaOTIkUpNTVVeXp6/SwoKn376qVatWqWFCxf6u5Sg9P3vf1/p6enavXu37Ha73n77bS1evNjfZd0wAlYAampq8ncJwA156623NGXKFM2ZM0fp6en%2BLidoJCQkyOFwaOfOnTp58qT%2B%2B7//298lBYWlS5dq3Lhx6tGjh79LCUp2u13jx49XeHi4unfvroKCAm3fvl2XLl3yd2k3hIAVYGw2m9xut9eY2%2B1WSEiIbDabn6oCfLdnzx5NmzZNjz76qCZPnuzvcoJOSEiIEhMTlZ%2Bfr%2B3bt6uqqsrfJQW0Q4cO6Z///Kdyc3P9Xco3xre%2B9S01NDTo7Nmz/i7lhhCwAkxKSopOnTrl9UPT4XCoR48eioyM9GNlQMv%2B8Y9/aO7cuXrmmWd07733%2BrucoHHo0CGNGDHC63EtoaGf/3hv06aNv8oKClu3btXZs2eVkZGhtLQ0jRs3TpKUlpamV1991c/VBb53331Xy5Yt8xorKytTeHh4wN%2BfScAKMElJSUpNTdWKFStUXV2tsrIybdiwQdnZ2f4uDbim%2Bvp6PfbYYyooKNCgQYP8XU5QSUlJUXV1tZYvX67a2lpVVVVp1apVuu222/jQxg2aN2%2Bedu3apS1btmjLli0qLi6WJG3ZskWZmZl%2Bri7wdezYUXa7XcXFxbp06ZJOnDihZ555RhMnTlRYWJi/y7shIU3c0BNwTp8%2BrcLCQv3tb39TVFSUJk2apIcfflghISH%2BLi3gpaamSvo8DEjyfCrL4XD4raZg8eabb%2Bo///M/FR4eftXczp07lZCQ4Ieqgsd7772nJUuW6J133lH79u11%2B%2B23a968eYqPj/d3aUHlk08%2B0bBhw/Tee%2B/5u5Sg8fe//10rVqzQe%2B%2B9p/DwcI0dO1b5%2BfmKiIjwd2k3hIAFAABgMN4iBAAAMBgBCwAAwGAELAAAAIMRsAAAAAxGwAIAADAYAQsAAMBgBCwAAACDEbAAAAAMRsACAAAwGAELAADAYAQsAAAAgxGwAAAADEbAAgAAMNj/B/spe0NaBEcBAAAAAElFTkSuQmCC\"/>\n        </div>\n        <div role=\"tabpanel\" class=\"tab-pane col-md-12\" id=\"common1908170423746513921\">\n            \n<table class=\"freq table table-hover\">\n    <thead>\n    <tr>\n        <td class=\"fillremaining\">Value</td>\n        <td class=\"number\">Count</td>\n        <td class=\"number\">Frequency (%)</td>\n        <td style=\"min-width:200px\">&nbsp;</td>\n    </tr>\n    </thead>\n    <tr class=\"\">\n        <td class=\"fillremaining\">0</td>\n        <td class=\"number\">1505291</td>\n        <td class=\"number\">55.1%</td>\n        <td>\n            <div class=\"bar\" style=\"width:100%\">&nbsp;</div>\n        </td>\n</tr><tr class=\"\">\n        <td class=\"fillremaining\">5</td>\n        <td class=\"number\">500971</td>\n        <td class=\"number\">18.3%</td>\n        <td>\n            <div class=\"bar\" 
style=\"width:33%\">&nbsp;</div>\n        </td>\n</tr><tr class=\"\">\n        <td class=\"fillremaining\">4</td>\n        <td class=\"number\">405565</td>\n        <td class=\"number\">14.8%</td>\n        <td>\n            <div class=\"bar\" style=\"width:27%\">&nbsp;</div>\n        </td>\n</tr><tr class=\"\">\n        <td class=\"fillremaining\">3</td>\n        <td class=\"number\">237942</td>\n        <td class=\"number\">8.7%</td>\n        <td>\n            <div class=\"bar\" style=\"width:16%\">&nbsp;</div>\n        </td>\n</tr><tr class=\"\">\n        <td class=\"fillremaining\">2</td>\n        <td class=\"number\">64084</td>\n        <td class=\"number\">2.3%</td>\n        <td>\n            <div class=\"bar\" style=\"width:5%\">&nbsp;</div>\n        </td>\n</tr><tr class=\"\">\n        <td class=\"fillremaining\">1</td>\n        <td class=\"number\">20497</td>\n        <td class=\"number\">0.7%</td>\n        <td>\n            <div class=\"bar\" style=\"width:2%\">&nbsp;</div>\n        </td>\n</tr>\n</table>\n        </div>\n        <div role=\"tabpanel\" class=\"tab-pane col-md-12\"  id=\"extreme1908170423746513921\">\n            <p class=\"h4\">Minimum 5 values</p>\n            \n<table class=\"freq table table-hover\">\n    <thead>\n    <tr>\n        <td class=\"fillremaining\">Value</td>\n        <td class=\"number\">Count</td>\n        <td class=\"number\">Frequency (%)</td>\n        <td style=\"min-width:200px\">&nbsp;</td>\n    </tr>\n    </thead>\n    <tr class=\"\">\n        <td class=\"fillremaining\">0</td>\n        <td class=\"number\">1505291</td>\n        <td class=\"number\">55.1%</td>\n        <td>\n            <div class=\"bar\" style=\"width:100%\">&nbsp;</div>\n        </td>\n</tr><tr class=\"\">\n        <td class=\"fillremaining\">1</td>\n        <td class=\"number\">20497</td>\n        <td class=\"number\">0.7%</td>\n        <td>\n            <div class=\"bar\" style=\"width:2%\">&nbsp;</div>\n        </td>\n</tr><tr class=\"\">\n        <td class=\"fillremaining\">2</td>\n        <td class=\"number\">64084</td>\n        <td class=\"number\">2.3%</td>\n        <td>\n            <div class=\"bar\" style=\"width:5%\">&nbsp;</div>\n        </td>\n</tr><tr class=\"\">\n        <td class=\"fillremaining\">3</td>\n        <td class=\"number\">237942</td>\n        <td class=\"number\">8.7%</td>\n        <td>\n            <div class=\"bar\" style=\"width:16%\">&nbsp;</div>\n        </td>\n</tr><tr class=\"\">\n        <td class=\"fillremaining\">4</td>\n        <td class=\"number\">405565</td>\n        <td class=\"number\">14.8%</td>\n        <td>\n            <div class=\"bar\" style=\"width:27%\">&nbsp;</div>\n        </td>\n</tr>\n</table>\n            <p class=\"h4\">Maximum 5 values</p>\n            \n<table class=\"freq table table-hover\">\n    <thead>\n    <tr>\n        <td class=\"fillremaining\">Value</td>\n        <td class=\"number\">Count</td>\n        <td class=\"number\">Frequency (%)</td>\n        <td style=\"min-width:200px\">&nbsp;</td>\n    </tr>\n    </thead>\n    <tr class=\"\">\n        <td class=\"fillremaining\">1</td>\n        <td class=\"number\">20497</td>\n        <td class=\"number\">0.7%</td>\n        <td>\n            <div class=\"bar\" style=\"width:5%\">&nbsp;</div>\n        </td>\n</tr><tr class=\"\">\n        <td class=\"fillremaining\">2</td>\n        <td class=\"number\">64084</td>\n        <td class=\"number\">2.3%</td>\n        <td>\n            <div class=\"bar\" style=\"width:13%\">&nbsp;</div>\n        </td>\n</tr><tr 
class=\"\">\n        <td class=\"fillremaining\">3</td>\n        <td class=\"number\">237942</td>\n        <td class=\"number\">8.7%</td>\n        <td>\n            <div class=\"bar\" style=\"width:48%\">&nbsp;</div>\n        </td>\n</tr><tr class=\"\">\n        <td class=\"fillremaining\">4</td>\n        <td class=\"number\">405565</td>\n        <td class=\"number\">14.8%</td>\n        <td>\n            <div class=\"bar\" style=\"width:81%\">&nbsp;</div>\n        </td>\n</tr><tr class=\"\">\n        <td class=\"fillremaining\">5</td>\n        <td class=\"number\">500971</td>\n        <td class=\"number\">18.3%</td>\n        <td>\n            <div class=\"bar\" style=\"width:100%\">&nbsp;</div>\n        </td>\n</tr>\n</table>\n        </div>\n    </div>\n</div>\n</div>\n    <div class=\"row headerrow highlight\">\n        <h1>Sample</h1>\n    </div>\n    <div class=\"row variablerow\">\n    <div class=\"col-md-12\" style=\"overflow:scroll; width: 100%%; overflow-y: hidden;\">\n        <table border=\"1\" class=\"dataframe sample\">\n  <thead>\n    <tr style=\"text-align: right;\">\n      <th></th>\n      <th>is_read</th>\n      <th>rating</th>\n    </tr>\n  </thead>\n  <tbody>\n    <tr>\n      <th>0</th>\n      <td>true</td>\n      <td>4</td>\n    </tr>\n    <tr>\n      <th>1</th>\n      <td>true</td>\n      <td>4</td>\n    </tr>\n    <tr>\n      <th>2</th>\n      <td>true</td>\n      <td>5</td>\n    </tr>\n    <tr>\n      <th>3</th>\n      <td>false</td>\n      <td>0</td>\n    </tr>\n    <tr>\n      <th>4</th>\n      <td>true</td>\n      <td>3</td>\n    </tr>\n  </tbody>\n</table>\n    </div>\n</div>\n</div>\n</body>\n</html>"
  }
]